The UK police are testing technology to ‘assess the risk of someone committing a crime’
The government is adopting Facebook’s business model: we are all, in effect, being treated as statistics. I am calling it ‘FacebookAbility’.
The UK police have announced they will start using artificial intelligence to help identify offending behaviour early on. Potentially, even babies could be targeted for “early intervention”.
The idea is that the police will identify people suspected of being likely to turn to crime and give them appropriate education and other support, to help stop them growing into fully fledged gangsters.
On the face of it, trying to identify the sources of crime early sounds like a fantastic idea. But whenever I see a government agency trying to act like a social media company, I naturally get worried.
Social media companies don’t have “people”. Instead, they have “numbers”: serial numbers, statistical numbers, viewing numbers, click-through numbers and, of course, dollar numbers. For social media companies, people are a tool whose role is to make up the numbers, which is absolutely fine.
However, when a government agency such as the police starts to turn into a mini Facebook by treating people as numbers in a big statistical exercise, the story is quite different.
Whatever the good intentions going in, you don’t need to be a genius to realise that the losers from stereotyping are always the same “usual suspects”.
Ethnic minorities, new immigrants, poor people and everyone else unfortunate enough to live on a demographically crime-prone street will now be treated as potential criminals, or baby criminals, or criminals in waiting.
The condemned (or the needy, depending on who you ask) will receive group treatment, as they should, for the benefit of the rest of society, to prevent them turning into criminals like their next-door neighbours.
To understand the harsh consequences of this FacebookAbility of the poor and deprived, and how it compounds their initial misfortune, we need to understand how the increasing use of artificial intelligence affects Facebook users’ ability to change, grow and progress.
Facebook users are constantly fed “personally tailored” political, religious and other information. “Personally tailored” is another name for artificial intelligence that extrapolates from past behaviour and assumes that people never want to change.
If, for example, Facebook considers you to be from a financially poor background, the information you are fed is likely to be very basic and unchallenging, which means the likelihood that you will ever progress, advance and grow as an individual is slim. No challenge, no growth.
Part of the misfortune is that artificial intelligence dictates that your current position, your assumed intelligence and your assumed general knowledge are cemented in place.
Facebook, YouTube, Google and now the police will all conspire to ensure that individuals stay as they have always been, or perhaps as artificial intelligence says they should be.
Artificial intelligence might be great for Facebook, particularly for boosting its advertising revenue. However, let’s remember that history is full of individuals who did exceptional things under exceptional circumstances because they managed to change. They changed what they were statistically meant to become, and they did so by challenging themselves.
If artificial intelligence is to be used widely by the police to identify and “treat” potential criminals, individuals will be labelled, perhaps from birth, as potential criminals. The chance of those individuals then excelling will be even lower than it already is.
Artificial intelligence should be used very carefully by police authorities. There is no harm in using it for statistical analysis of factual matters to assist decision-making, but using artificial intelligence as a preventative measure to identify and label individuals is both patronising and dangerous.