Technology

Should we trust governments with our faces?

Why the latest developments in facial recognition should leave us worried.

London is one of the world’s most surveilled cities. Recently, China has begun using facial recognition to monitor the Muslim population of one of its regions. Ethically speaking, I believe we can all agree that this is one of the worst uses of modern technology we have seen to date. At the beginning of this year I decided to get an iPhone X, one of whose flagship features is Face ID. It is a great feature and works flawlessly, letting me use my phone the moment I pick it up from the table. However, one might ask: how are companies and governments going to use my data?

Practically speaking, this is quite an easy question to answer; ethically, not so much. We can discuss examples of the uses of facial recognition software and judge whether they are right or wrong. But does the law of this country, or of any other, distinguish between the right and wrong cases? As a computing student who enjoys development in his spare time, this is a question I ask myself often – most recently at ICHack 18, this year’s edition of the Imperial College Hackathon organised by DoCSoc.

“However, one might ask: how are companies and governments going to use my data?”

One example of facial recognition in practice is Microsoft’s Azure platform. Microsoft was one of the many sponsors behind ICHack, providing a whole suite of solutions for application development – ranging from facial recognition to judging a person’s emotions from their expression. Now, this all sounds great in theory, and I have nothing against Microsoft or its platform; I believe everyone should have access to great technology. However, in all this excitement it is easy to forget how these services can be misused. Almost every idea built on them works best when the user is recorded without their knowledge; after all, you get the truest picture of how someone is feeling when they are not posing for a selfie. If you were browsing the app store and saw an app that recorded you all the time, would you download it? However much it might improve your life, would you say yes? How can you be sure that the company or individual behind it won’t misuse your data?

Algorithms can now recognize far more than just age or gender // Mac Observer

In a perfect world we would not have to resort to tracking every person; in this broken world, we might just have to. A lot of good can come of it: we could stop terrorist attacks, burglaries and violent assaults, to name a few, by ensuring the locations of dangerous individuals are known to the authorities. Once again, though, we are coming to a crossroads where we must decide whether this good comes at the cost of too much of our privacy. Furthermore, we must ask ourselves whether the grainy CCTV pictures we always see on our TVs are good enough for an artificial intelligence to judge reliably whether someone was present. Ultimately, at no point in this debate can we resort to tracking minorities and vulnerable groups, however much that limits the scope of any mass surveillance. The actions taken in China are worthy of condemnation, and we must never go down a similar path.

One of the greatest privacy concerns around facial recognition is that it can collect its data in a fraction of a second. In Russia, the technology is already widespread: an app called ‘FindFace’ lets you find a person’s profile on Russia’s leading social media network, VKontakte, from a single photograph. With our personalities and pictures increasingly out there, it will only become easier to discover who we are and what we are like.

“The actions taken in China are worthy of condemnation and we must never go down a similar path”

In one way, we are already giving this power to our governments. A lot of people get emotional about privacy, yet few of us act as if it were of any importance. Surely it is not as simple as ‘if you’ve got nothing to hide, you have nothing to worry about’? This is something worth thinking about, especially as more cognitive-enabled applications are created. Artificial intelligence is causing a massive stir in our society, from robots taking over our jobs to its weaponisation to end human lives more efficiently.