Hannah Ost

China's 'Social Credit System' is Too Close to Home

A lot of people don't know what's going on in China right now. Most people know about the Communist dictatorship, but what they don't know is how scarily close its practices are to those of our own civilisation in the West.



What's going on?

Citizens in China are being measured by a social credit system. Millions of cameras monitor and collect data on each citizen, and every action has an assigned value that raises or lowers your 'social credit score'. For example, buying nappies suggests you are a parent and gains you points for being a responsible member of society. Buying a crate of alcohol loses you points for being irresponsible; playing video games loses you points for being an idle citizen. Everything is measured against the Chinese government's view of a perfect citizen.


The point of the points? High scorers get rewards like better job prospects, seating upgrades on transport and decreased insurance premiums. Low-scoring citizens could find their children kicked out of school, have their interest rates increased and be turned down at job interviews. Even so much as associating with people who have a lower score lowers yours, leading to social isolation and the disownment of low-scoring family members.


An even scarier fact: those with low scores cannot use public transport, meaning they cannot leave the country. The system forces you to become its version of 'better' before you can escape it. You have to consume and conform.



Now, you're probably thinking, wow, that sounds horrific, like something out of a dystopian novel. Boy, am I glad we're safe in the West. Think again.


In actual fact, this 'social credit system' is not at all dissimilar to schemes already in place in the UK, where I am from, and in the US and Australia too! For example, car insurance is higher if you've had an accident, if you're a new driver, or if you have any driving-related criminal convictions. Furthermore, we already track drivers' actions with a points system! Using your phone at the wheel, not stopping at a crossing - there are laws of the road that we all follow, or else we face points on our licence and the resulting, usually financial, punishments.


There is no real problem with raising insurance rates for bad drivers. Punishing someone for speeding encourages them not to do it again; it makes our roads safer and may make them a better driver. So this is one arguably positive use of a scoring system. It is, however, an example of information being collected by an independent company and used against consumers in order to make them improve.


Don't believe this "perfect customer" ideology is widespread? Hold onto your hats. Here are some examples of systems we have in place, not too far off the social credit score system in China...


Credit Scores

Credit reference agencies collect data on your previous financial behaviour and supply it to the lender you're applying to, producing a score that affects your ability to get loans and mortgages in the future. If your score is low but your application is still successful, your interest rates will likely be higher. This system is built around financial credibility and encourages customers to be better with their money. So, what stops this system from being applied anywhere else? Absolutely nothing.


Uber

Uber is used by millions and has a function for rating drivers and passengers. For drivers, car models, conversation and complimentary water or gum can affect whether they get travellers and, ultimately, get paid. For a passenger, closing the door too hard, politeness and putting your feet on the seat can all affect your ability to get somewhere. Taking this further, it doesn't even have to be you who did it! Did you book your friend Cheryl a car home and she was so drunk she threw up on the back seat? That's your score going down. Low score? Nobody will want you in their car.


Restaurants

Restaurants can ban you from eating there for what they deem to be inappropriate behaviour. They can hold data on you - CCTV footage, names and photos, for example - and publish a blacklist for everyone to see. Although nobody wants antisocial behaviour and you should always treat your servers with respect, there is nothing to stop a restaurant from blacklisting you for, say, missing your reservation. Australia's booking service 'Dimmi' allows restaurants to stop no-show customers from booking again. It doesn't care about your excuse, and it may even charge you for not attending your reservation! Restaurants can even share your data with other businesses, who may blacklist you even if you've never set foot in their establishment! But it works: since its introduction, the number of no-shows has dropped considerably, saving businesses millions of dollars.


Employability and Social Media

Employers will check your social media before they hire you - you could even lose your job over someone's opinion of the photos you post. Work in a school and have a bikini holiday photo on your Insta feed? You could be fired for that. Don't believe me? Here's an article about a cheerleader who was fired for posting a so-called "sexy" picture on Instagram.

In the same way, we can't control what others post about us online. If we are involuntarily dragged into a spat on Facebook, or a revealing picture makes its way onto a friend's Instagram - or even, at the extreme, 'revenge porn' - all of these can affect our employability.

Furthermore, who decides when a photo is too sexy to be appropriate? If Instagram deems it appropriate, why is there one rule for them and another for the company you're applying to? With each company having its own set of rules, is it actually sensible to have one points system where every action has a single score?


Working with Children

Employers also ask about previous criminal convictions, which can restrict your ability to work, especially with children. A woman who has previously worked in the sex industry, for example, would most likely never be allowed to work in a primary school. Not because national law dictates this, but because companies and corporations have made their own rules about what counts as acceptable behaviour. Someone at the top has made a choice, based on their own opinion, about what a perfect primary teacher looks like, and they are acting on it.



Now, while I am most definitely not condoning illegal or antisocial behaviour, I am questioning the power of private companies, who can make their own rules at will, totally separate from government legislation. Companies are already collecting our data, monitoring our behaviour and restricting what we can and cannot do because of that data. Although it hasn't reached a governmental level (yet) as it has in China, with technology now allowing it and independent companies already making use of it, who's to say it couldn't happen here?


To present a slightly controversial argument, who's to say it shouldn't? It's working in China - people are becoming better citizens. Our systems are working here too - we try to be nicer to our servers and our Uber drivers, and we aim to improve our credit scores. I have worked with children and want to continue to do so, so I constantly monitor what I put on the internet and try to see it from an employer's point of view. Perhaps monitoring our actions and developing a social score system would help us to become better people in general!


In theory, yes. An advanced version of this could (and I need to say this again) T H E O R E T I C A L L Y work! It could make us better humans, HOWEVER... only in the eyes of the person making the rules.


The Chinese Communist Party thinks being a mother equals responsibility and buying alcohol equals addictive tendencies. This can't, and doesn't, reflect well on single mothers, same-sex couples, or people bringing a couple of crates of beer to a barbecue! The major flaw of China's system is that it doesn't take into account why people do what they do. Maybe someone is buying nappies for a friend; maybe they're buying a crate of alcohol for a big office party when they themselves are teetotal.


Systems here do the same thing. Picture these scenarios:


You've made a reservation at a restaurant when, all of a sudden, you get a phone call. A close family member has suddenly fallen ill and you rush straight to the hospital. The last thing you're thinking about is that dinner reservation. But you might just find yourself blacklisted for your no-show. The restaurant doesn't take into account the reasons behind your actions, only the final outcome.


Your friend Cheryl never usually gets drunk. But, unbeknownst to everyone, her drink was spiked tonight. As soon as she gets home, she passes out completely and her mother calls for an ambulance. The Uber driver didn't know any of that; he assumed she was a drunk student and gave you a bad rating for not just walking her home.


When we put a value on an action, we fail to acknowledge the deeper meaning behind it. When we put a value on a person, we fail to acknowledge who they are.



That's all from me today. If you have any questions, comments, or want to know my sources for information, feel free to comment below or contact me here!


Hannah






©2020 Hannah Ost

Student Journalist