
Op-Ed: Chinese ‘Social Credit’, or how AI can rule your life right now

The ramifications of these systems are truly hideous, and China may have scored an own goal. ABC Australia's highly respected Foreign Correspondent offers fascinating coverage of China's "social credit" system. This is a points system which gives citizens a score out of 800-900. Your score can affect everything from your finances to your travel to your friends and family. The wrong words and moves can affect anyone you know.
Under tight biometric surveillance, and using 200 million surveillance cameras, every move, from shopping to personal relationships, can be monitored using AI. (There's an irony here; the word "Ai" is also Mandarin for "love". Tough love if ever there was.) Every observed action can affect a social credit score in real time. You could go out one day with a score of 900, and if you have a few negative encounters with anything or anyone, your score could be shot to pieces by the time you return home.
(Please note: The purpose of this article is to discuss social credit and this type of use of AI in societies. I'm not about to regurgitate the entire Foreign Correspondent coverage. It's a must-see, exceptionally well presented, and there's a lot of information. Examples of how social credit works include a model citizen and a so-called bad citizen: an investigative journalist who exposed corruption, then fell foul of a defamation case which has basically trashed his social credit. This material needs to be seen to be even barely believed.)
Social Credit Dynamics
Social credit is based on, you guessed it, algorithms. These algorithms are formulaic rules which translate observed actions into score changes, and from the look of the sheer level of detail covered, the range is vast. While many experts would debate the value and accuracy of algorithms in correctly assessing human behaviour, the fact is that this is how AI systems work. They need constantly running algorithms. The Chinese social credit system needs to operate the way it does to work at all.
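The mechanics described above, observed events mapped to score adjustments in real time, can be sketched as a toy scoring loop. To be clear, every event name and point value below is an illustrative assumption; the real system's rules are not public.

```python
# Toy sketch of a real-time "social credit" scoring loop.
# Event names and point values are illustrative assumptions only;
# the actual system's rules are not public.

EVENT_POINTS = {
    "volunteer_work": +10,
    "paid_bills_on_time": +5,
    "jaywalking": -5,
    "defamation_ruling": -100,  # e.g. the journalist's court case
}

SCORE_MIN, SCORE_MAX = 0, 900

def apply_event(score: int, event: str) -> int:
    """Adjust a score for one observed event, clamped to the allowed range."""
    delta = EVENT_POINTS.get(event, 0)  # unknown events change nothing
    return max(SCORE_MIN, min(SCORE_MAX, score + delta))

# Leave home with a score of 900; a few bad encounters later...
score = 900
for event in ["jaywalking", "jaywalking", "defamation_ruling"]:
    score = apply_event(score, event)
print(score)  # well below where the day started
```

The "real time" aspect is simply this loop running continuously against camera and transaction feeds, and the point values are whatever the operator decides they are, which is rather the point of the article.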
Despite the monolithic look of social credit, it also has some serious liabilities built in:
• It could be a major target for hostile cyber-attacks. The sheer level of disruption a successful attack could cause makes social credit a prime target by definition.
• Any algorithm can be “gamed”, that is, cheated, using software or even something simple like a jamming device or many other methods.
• Biometrics are not 100% infallible. In one well-publicized failure, an AI system couldn't tell the difference between a human being and a gorilla, for example.
• The social credit system is surveillance-dependent. Any significant damage to the system could simply derail it.
• The system itself could become a focal point for social resistance. That's not happening yet, but it's a predictable possible response.
• Ironically, in the very credentials-conscious Chinese Communist Party (CCP) it could be used against Party members in power plays. That’s a potentially deadly use of social credit, and it’s odd the CCP seems to have ignored it.
• Surveillance and observation of themselves do not necessarily lead to the right decisions. Information can be misinterpreted, distorted, or simply inaccurate.
How useful is social credit?
Social credit, in theory, rewards the good guys and punishes the bad guys. You don't need to be a surveillance fanatic to see a few positives and negatives in that approach. The same theory effectively means that you're conducting surveillance on good people who really don't need much surveillance in the first place.
Not to be totally negative, you could use a similar system for some very positive things, too:

• Finding lost kids and people with dementia, or people in trouble. A positive ID would save a lot of time and cut out much of the laborious work of locating them.

• Crime prevention is likely to be a lot better with a system like this, although you’d still need a good enforcement and judicial system to make that work.

• Reliable court evidence of a person’s whereabouts and actions could be provided, if that evidence is acceptable in a court of law.

• Social credit-like surveillance could establish a safety net for anyone, keeping track of them and raising the alarm if they get into trouble.

• AI surveillance of this kind could be potentially very useful in the hideous cases of kidnapped kids or people trafficking, to cite just two basic examples.

Social credit and the West
In Western countries, the culture is different, some might say non-existent, but resistance to government control is typically ferocious. Libertarians and "leftists", right-wingers and apathetic idiots alike: nobody is likely to take too kindly to this level of surveillance.
It is, however, possible to introduce this type of surveillance in the West, and quickly. Most major Western cities are wired up like Christmas trees with surveillance cameras, and adding a layer of AI software would be extremely easy. Put it this way: it'd be marginally tougher than changing a light globe, but not much.
Given the abysmal track record of our current crop of useless scumbag politicians in the West on every subject, giving them that sort of power is, to put it mildly, questionable. It's unlikely any electorate, even the most delusional, would trust any political government with that scope of intrusion.
So far, sounds OK, doesn’t it? The problem is that a range of legislative excuses, notably law enforcement, national security, and other barely credible options for introducing something of this kind already exist. There are multiple back doors for our egregious duly elected morons to be clever.
Consider for a moment that many corporations conduct maniacal and expensive audits of toilet time in workplaces in the name of efficiency. A mentality like that, very similar to the political mentality, could easily be talked into using a surveillance system on a national basis.
Given the possible positives and negatives, a social credit-level surveillance system would have to have severe checks and balances. Information from these systems would have to be considered private by default, to protect the rights of citizens. Warrants would have to be required, with strict judicial oversight of the use of this information to make it trustworthy and accountable within legal processes.
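The checks and balances just described, private by default, warrant required, judicial oversight, amount to a simple access-control rule. Here is a minimal sketch of that rule, with purely hypothetical names; nothing here reflects any real system's API.

```python
# Hypothetical sketch of "private by default, warrant required" record access.
# All names (Warrant, can_access, subject_id) are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Warrant:
    subject_id: str        # the specific person the warrant names
    judge_approved: bool   # judicial oversight: a judge signed off

def can_access(record_subject: str, warrant: Optional[Warrant]) -> bool:
    """Surveillance records are private by default; access requires a
    judge-approved warrant naming that specific subject."""
    if warrant is None:
        return False  # private by default
    return warrant.judge_approved and warrant.subject_id == record_subject

print(can_access("citizen-42", None))  # no warrant: denied
print(can_access("citizen-42", Warrant("citizen-42", judge_approved=True)))  # granted
```

The design point is that denial is the default path and every grant requires both conditions at once, which is what "strict judicial oversight" would mean in practice.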
Which leaves us with a few sadly not rhetorical questions:
How competent and trustworthy do we consider our governments?
How do we define what rights are infringed, if they are?
What are the risks to individuals if this information is accessed by hackers, etc.?
What recourse to legal remedy would people under surveillance have, if they suffer some sort of legal injury?
What happens when these systems become obsolete, and easy to penetrate?
Exactly how efficient, and how credible, are these systems in the real world?
How do you protect the public from corrupt operators and information handlers? (In the Foreign Correspondent story, the Chinese journalist exposed actual corruption. That is an ongoing, very high priority for Beijing, and yet a man with a good record was penalized for a mistake. Seems wrong, in too many possible ways.)

So there you have it: something even Big Brother didn't think of, up and running. What to do about it? Interesting question, and the answer will have to be damn good indeed.
Just one more thing: in Chinese, the word for freedom sounds rather like "see you". The problem for the Chinese comes when speaking English: people saying goodbye in English are, near enough, saying "freedom" in Mandarin.

Written By

Editor-at-Large based in Sydney, Australia.
