In education IT we have a phrase — ‘technical solution to a people problem’ — to describe the use of technology to solve a behavioural problem that would be better solved simply by having people behave more sensibly or by following existing rules… rather than trying to implement increasingly baroque limits with technology, which are invariably bypassed because the problem is that the user isn’t respecting the rules in the first place.

When it comes to COVID-19 contact tracing, it has become clear that Bluetooth-based contact tracing is a technical solution to a public health problem. This applies both to the COVIDSafe app, which is technically deficient in multiple ways, and to the Google/Apple Exposure Notification framework, which is conceptually deficient.

Both of these technologies (I hesitate to call them solutions) are based on mobile phones broadcasting short-lived, low-energy Bluetooth identifiers that rotate roughly every 15 minutes, and recording the identifiers they hear. There’s a lot of cryptography to make it private and secure. If a user tests positive for COVID-19, they can trigger the app on their phone so that the health authorities can notify other app users who were in close proximity that they may have been exposed to COVID-19 and should get tested.
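
To make the mechanism concrete, here is a minimal sketch in Python of how rotating identifiers work. This is not the actual BlueTrace or GAEN key schedule; the HMAC-over-an-interval-counter derivation and the 15-minute rotation constant are illustrative assumptions only.

```python
import hashlib
import hmac
import os

ROTATION_SECONDS = 15 * 60  # broadcast identifiers rotate roughly every 15 minutes

def new_daily_key() -> bytes:
    """A fresh random key, generated on the phone and kept there."""
    return os.urandom(16)

def rolling_identifier(daily_key: bytes, timestamp: float) -> bytes:
    """Derive the short-lived identifier broadcast over Bluetooth LE.

    The interval number changes every 15 minutes, so observers cannot link
    broadcasts across intervals without knowing the daily key.
    """
    interval = int(timestamp // ROTATION_SECONDS)
    return hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Each phone broadcasts its current identifier and logs the identifiers it
# hears, along with a timestamp and the signal strength used to estimate proximity.
heard_log = []  # entries like (identifier, timestamp, rssi)
```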

The technical problems of COVIDSafe have been well documented by a group of Australian technologists including Jim Mussared, Vanessa Teague and Geoffrey Huntley (who recently received a Linux Australia recognition award for their work).

Based on the BlueTrace protocol developed in Singapore, the app has always had issues with iPhone-to-iPhone communication while devices are sleeping. There have also been problems with interference with other Bluetooth devices, battery drain, various cryptography flaws, broken automatic updates, a lack of transparency in the server-side code and so on. Some of these were fixed, only to be reintroduced by the adoption of VMware's Herald protocol. At this point it’s clear that it can’t be made to work and should simply be abandoned.

What about the Google/Apple Exposure Notification system (GAEN)? Unfortunately, while it will notify individual users of possible COVID-19 exposure, by design it won’t automatically notify public health authorities of that close contact.

The ring-fencing of close contacts by public health organisations has been the linchpin of Australia’s strategy of aggressive suppression of the coronavirus. Being able to quickly order testing and isolation of not just first-level close contacts, but second- and sometimes even third-level close contacts has delivered results.
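
To see what “second- and third-level” means in practice: contact tracers are effectively walking a contact graph outwards from the index case. The sketch below uses an invented contact graph and a simple breadth-first walk purely for illustration.

```python
from collections import deque

# Hypothetical contact graph: person -> people they were in close contact with.
contacts = {
    "case0": ["alice", "bob"],
    "alice": ["carol"],
    "bob": ["dave", "erin"],
    "carol": ["frank"],
}

def close_contacts(index_case: str, depth: int):
    """Breadth-first walk of the contact graph out to `depth` levels.

    depth=1 gives first-level close contacts, depth=2 adds their contacts,
    and so on; everyone returned is ordered testing and isolation.
    """
    seen = {index_case}
    queue = deque([(index_case, 0)])
    to_trace = []
    while queue:
        person, level = queue.popleft()
        if level == depth:
            continue
        for contact in contacts.get(person, []):
            if contact not in seen:
                seen.add(contact)
                to_trace.append((contact, level + 1))
                queue.append((contact, level + 1))
    return to_trace

print(close_contacts("case0", 2))  # first- and second-level close contacts
```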

Yet exposure notification only works with positive tests, requiring the receipt of a test result (which can take days during an outbreak) before contacts can be notified. This is too much lost time when dealing with a virus that can be infectious for days before symptoms appear.

Practical evidence of the success of exposure notification apps is missing — at one point Switzerland was hailed as a success, but it has had a significant second wave since October. At the time Sang-il Kim, head of the digital transformation division at the Swiss Federal Office of Public Health, said, “We have proof that the app works”, pointing to 100 users who had been tested after being notified by the app since June. While technically true, it’s hardly a large contribution in a country where hundreds were testing positive per day in October (peaking at 10,000 in November).

I don’t think it’s fair to blame Google and Apple entirely for the failure of the exposure notification protocol; in fact they should be applauded for working together to provide a cross-platform solution. However, over time it’s become clear that it’s no silver bullet, and in fact misses key populations who don’t have mobile phones — students, the elderly, low-income households — leaving gaps in coverage. Combined with the delay from requiring a positive test result, it’s no solution for Australia either.

So what does work? The state-based QR code attendance registry apps. Having a central database that can be quickly queried is a boon for contact tracers, with Victoria integrating its version directly into its contact tracing system. Victoria also uses App Clip technology on iOS, meaning that scanning the QR code launches a small version of the app without having to install it from the App Store. Use of these apps can be made mandatory by imposing conditions on the venues, rather than relying on people to install and continue to run a Bluetooth app.
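
The value to tracers is just how mundane the query is. The sketch below is a hypothetical, heavily simplified registry (an in-memory SQLite table with invented fields); the real state systems differ, but the point is that an exposure window at a venue becomes a single query.

```python
import sqlite3
from datetime import datetime

# Hypothetical schema for a central check-in registry.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE checkins (venue_id TEXT, name TEXT, phone TEXT, visited_at TEXT)"
)

def check_in(venue_id: str, name: str, phone: str, when: datetime) -> None:
    """Record one QR scan in the central database."""
    db.execute(
        "INSERT INTO checkins VALUES (?, ?, ?, ?)",
        (venue_id, name, phone, when.isoformat(timespec="minutes")),
    )

def visitors(venue_id: str, start: datetime, end: datetime):
    """What a contact tracer needs: everyone at a venue during an exposure window."""
    return db.execute(
        "SELECT name, phone, visited_at FROM checkins "
        "WHERE venue_id = ? AND visited_at BETWEEN ? AND ?",
        (venue_id, start.isoformat(timespec="minutes"), end.isoformat(timespec="minutes")),
    ).fetchall()

check_in("cafe-123", "Alex", "0400 000 000", datetime(2021, 2, 1, 9, 30))
print(visitors("cafe-123", datetime(2021, 2, 1, 9, 0), datetime(2021, 2, 1, 11, 0)))
```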

The downside, of course, is privacy: some people object to the government tracking where they’ve been. To which I would say, if the government really wants to track you, it will just purchase your location information from a third-party location broker, as the US military does to bypass case law that forbids it from gathering that data directly without a warrant.

Some countries have QR check-in apps which just store a list of locations on your phone, and then notify you if any of those locations are later published as exposure sites. This model falls down in the same way as GAEN in that it only alerts the individual; there’s no way for public health authorities to automatically do multiple-level contact tracing before positive test results arrive.
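
A sketch of that decentralised model, with invented venues and times, shows both how it works and why it stops there: the match happens entirely on the phone, so the health authority never gets a list of attendees to trace forward from.

```python
from datetime import datetime

# Check-ins kept only on the phone: (venue, time of visit); all values invented.
local_checkins = [
    ("cafe-123", datetime(2021, 2, 1, 9, 30)),
    ("gym-456", datetime(2021, 2, 2, 18, 0)),
]

# Exposure sites published by the health authority: venue and exposure window.
exposure_sites = {
    "cafe-123": (datetime(2021, 2, 1, 9, 0), datetime(2021, 2, 1, 11, 0)),
}

def matches(checkins, sites):
    """Return the check-ins that overlap a published exposure window."""
    hits = []
    for venue, visited in checkins:
        if venue in sites:
            start, end = sites[venue]
            if start <= visited <= end:
                hits.append((venue, visited))
    return hits

# The phone alerts its owner, but the authority never learns who was there,
# so there is nothing to trace forward from.
print(matches(local_checkins, exposure_sites))
```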

GAEN was designed to store contact data on your phone and avoid a central database, as a way of encouraging privacy-conscious users to be comfortable installing it. One advantage of COVIDSafe was the strong legislation that prohibited any use of its data other than for contact tracing. Even then, ASIO inadvertently captured some COVIDSafe data from users’ phones, which it then had to delete. The same could easily happen with locally stored QR check-ins.

GAEN has potential for privacy violations too. By design there must be a public list of the identifiers broadcast by users who have tested positive; while these are cryptographically protected, anyone (including the government) with access to your phone can match them against its stored contact log and de-anonymise those encounters.
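
Continuing the invented identifier scheme from the earlier sketch (again, not the real GAEN key schedule): once the keys of positive cases are published, anyone holding a phone’s stored log can re-derive the identifiers and learn exactly when each encounter happened.

```python
import hashlib
import hmac

ROTATION_SECONDS = 15 * 60
INTERVALS_PER_DAY = 24 * 60 * 60 // ROTATION_SECONDS  # 96 intervals per day

def rolling_identifier(daily_key: bytes, timestamp: int) -> bytes:
    """Same invented HMAC-over-interval derivation as the earlier sketch."""
    interval = timestamp // ROTATION_SECONDS
    return hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def identifiers_for_day(daily_key: bytes, day_start: int) -> set:
    """Re-derive every identifier a published daily key would have produced."""
    return {
        rolling_identifier(daily_key, day_start + i * ROTATION_SECONDS)
        for i in range(INTERVALS_PER_DAY)
    }

def matched_encounters(heard_log, published_keys, day_start):
    """Match a phone's stored (identifier, timestamp) log against published keys.

    The phone's owner learns they were exposed, which is the intended use.
    But anyone else holding the phone can run the same match and learn the
    exact time of each encounter with a positive case, which is where the
    de-anonymisation risk comes from.
    """
    positive_ids = set()
    for key in published_keys:
        positive_ids |= identifiers_for_day(key, day_start)
    return [(ident, ts) for ident, ts in heard_log if ident in positive_ids]
```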

It is the law that will protect privacy, not technological solutions. So we should be arguing for strong laws to protect the privacy of the state QR check-in databases. Singapore is a good warning here — its police force gained access to the database to assist with a murder investigation. This access is now being codified into law with a “serious crimes only” limitation, but I would argue for prevention of all police access.

Australian health authorities take their privacy obligations seriously — see how the SA health department refused to provide detailed notes from a contact tracing interview that could have led to criminal charges.

As IT professionals, we must always be aware of the limitations of the technologies we work with, and be prepared to advocate for stronger enforcement rather than just technological workarounds.