A few weeks ago, former Senator Scott Ludlam highlighted an article about a new CCTV and Wi-Fi system being installed by the City of Darwin council. The article made several claims, including that there would be virtual fences, facial recognition and, most outlandishly, a social credit system like that found in China. The council was forced to deny these claims, stating that the facial recognition feature would not be used and that there would, of course, be no social credit system.
China is a real-world, dystopian end-point for facial recognition, but it and other tracking methods are quietly making their way into everyday life in Australia and other democracies. In London, during a facial recognition trial, Metropolitan Police officers stopped people who covered their faces. There was a large system at the Gold Coast Commonwealth Games last year, but it was mostly ineffective. And Australia has a national facial recognition database, known as ‘The Capability’, into which the federal and state governments can feed faces captured from CCTV for matching.
It’s not just the government that is into facial recognition: shopping centres are among the leading sites of public surveillance. Westfield centres have smart advertising billboards that use facial detection to estimate your age and gender. Shopping centres also track you through their free Wi-Fi. To connect, you are often required to sign in with your Facebook account, which hands over your demographic details, and the centre can then track your location via your phone as you walk around. They mostly want this information for aggregate analytics, but there’s nothing to stop them tracking an individual. Even if you never connect, your phone broadcasts probe requests looking for your home Wi-Fi network, and these can be tracked anonymously.
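To make the probe-request risk concrete, here is a minimal sketch in Python (the MAC addresses and network names are made up for illustration) of how a tracker could fingerprint a device by the set of networks it probes for, which often stays stable even when other identifiers change:

```python
import hashlib

def probe_fingerprint(ssids):
    """Derive a stable fingerprint from the set of SSIDs a phone
    broadcasts in its Wi-Fi probe requests. The order and any
    duplicates don't matter, so two captures of the same phone
    produce the same value."""
    canonical = ",".join(sorted(set(ssids)))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical captures of the same phone on two separate visits,
# each time presenting a different MAC address.
visit_1 = {"mac": "da:a1:19:42:7b:01", "ssids": ["HomeNet", "OfficeWiFi", "CafeGuest"]}
visit_2 = {"mac": "f2:3c:91:0e:aa:58", "ssids": ["CafeGuest", "HomeNet", "OfficeWiFi"]}

# The fingerprints match, so the two visits can be linked
# despite the changed MAC address.
print(probe_fingerprint(visit_1["ssids"]) == probe_fingerprint(visit_2["ssids"]))  # True
```

The point of the sketch is that the tracking doesn’t depend on any single hardware identifier: the pattern of networks your phone asks for is itself identifying.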
Educational institutions are also looking at students’ faces. A start-up called LoopLearn marks attendance using facial recognition, which led the Department of Education and Training Victoria to ban the technology from state schools unless the school performs a rigorous privacy assessment and receives explicit consent from all parents. I also know of a university that is considering using Wi-Fi location analytics to track the drop-off in student attendance over a semester, so it can relocate classes to smaller rooms and close off wings of buildings to save power.
Even setting aside its applications, facial recognition is far from a neutral technology. There are good arguments that it is inherently flawed, even racist, since it is worse at identifying women and people of colour because of the image databases it is trained on. Even if it were perfect, should we accept a technology that can track our movements wherever we go? Some pushback at the government level is occurring: San Francisco has banned the use of facial recognition by police and other city government departments, though this doesn’t affect business use. A proposal put to Amazon shareholders asking the company to stop selling facial recognition technology to government agencies won only 2.4% support, while 27.5% voted for a separate proposal to investigate how the technology could harm civil rights and privacy.
But what can we do if, or until, these technologies are regulated? One option is CV Dazzle makeup, designed to confuse facial recognition algorithms and prevent them from matching your features. For Wi-Fi tracking, I use an Android app that turns my phone’s Wi-Fi off when it is not near a network it knows about (e.g. home and work). Also, since version 8, Android uses a random MAC address when scanning, and in Android Q it will use a random MAC address when connecting to a network. Unfortunately, Q will also prevent apps from turning Wi-Fi on and off programmatically, which is a privacy loss.
iOS has had MAC address randomisation since iOS 8. The initial implementation activated only in a few circumstances, but recent testing shows it is active most of the time when the phone is not connected to a Wi-Fi network, although the probe requests can still reveal the networks the phone has previously connected to. In addition, both operating systems’ randomisation can be broken fairly easily, although in some cases only by an active attack.
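Randomised MAC addresses are also easy to spot: under IEEE 802 addressing rules they set the ‘locally administered’ bit (the second-least-significant bit of the first octet), which distinguishes them from manufacturer-assigned addresses. A small sketch in Python, using made-up example addresses:

```python
def is_locally_administered(mac: str) -> bool:
    """Return True if the MAC address has the locally administered bit
    set. Randomised addresses set this bit, per IEEE 802 addressing
    rules, so a tracker can at least tell that an address is random."""
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

print(is_locally_administered("da:a1:19:42:7b:01"))  # True: 0xda has the 0x02 bit set
print(is_locally_administered("3c:22:fb:42:7b:01"))  # False: a globally unique (vendor) address
```

So randomisation hides which device you are, but not the fact that a randomising device is present, and as noted above it can often be undone entirely.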
More broadly, as IT professionals, we have a duty of care to scrutinise technologies such as this and hold them to account. Facial recognition technology has a high likelihood of contravening the SAGE-AU Code of Ethics, and it appears to be largely flying under the radar.