We’re at a crucial juncture in discussions about technology and how our data can be used for good and evil. Now is a good time to think about where we’re headed.
This week, Australia had a turning point of sorts. There was a sprinkle of caution mixed in with hope that we may have avoided the worst of the COVID-19 pandemic.
We’ve seen some unlikely scenes in the past few weeks. Scenes that Australians would never have thought possible given this country’s “larrikin” free spirit. In particular, COVID-19 continues to test the extent to which people accept the State’s imposition on their personal liberties, whether in the form of laws or “in-the-public-interest” cajoling.
It appears that in times of crisis and fear, people are willing to trade their personal freedoms for some concept of safety. But how far are people willing to let this trade-off extend into other facets of life?
Tracing You (and Others)
This week, the Australian PM suggested that everyone should download a coronavirus tracing app called “TraceTogether”, adding that he wouldn’t rule out making it mandatory. This veiled threat offended more than a few people, culminating in a tweet in which the PM walked it all back:
The App we are working on to help our health workers trace people who have been in contact with coronavirus will not be mandatory.
— Scott Morrison (@ScottMorrisonMP) April 18, 2020
In short, the app uses Bluetooth on your mobile phone to detect whether you’ve been near another person (who also has the app) for 15 minutes or more, being the “period defined as a contact”. This would give health authorities the ability to trace people who have been in close contact with confirmed COVID-19 cases.
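To make the mechanism concrete, here’s a minimal sketch of the contact rule described above: count two phones as a “close contact” if their Bluetooth sightings of each other add up to a continuous encounter of 15 minutes or more. The function names, the gap tolerance, and the data format are all my assumptions for illustration; the app’s actual implementation hasn’t been published.

```python
from datetime import datetime, timedelta

CONTACT_THRESHOLD = timedelta(minutes=15)  # the "period defined as a contact"
MAX_GAP = timedelta(minutes=2)             # assumed: a longer gap breaks the encounter

def longest_encounter(sightings):
    """Return the longest continuous encounter in a sorted list of
    datetimes at which the other phone's Bluetooth beacon was seen."""
    if not sightings:
        return timedelta(0)
    longest = timedelta(0)
    start = prev = sightings[0]
    for t in sightings[1:]:
        if t - prev > MAX_GAP:
            # Too long since the last sighting: start a new encounter.
            start = t
        prev = t
        longest = max(longest, prev - start)
    return longest

def is_close_contact(sightings):
    """True if any single continuous encounter lasted 15 minutes or more."""
    return longest_encounter(sightings) >= CONTACT_THRESHOLD
```

Even this toy version shows why the design matters: the phone only needs to log anonymous sightings locally, and the 15-minute rule can be evaluated on the device itself, without sending anything anywhere.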
According to a spokesperson for Stuart Robert MP, “the app’s data will be fully encrypted, and ‘close contacts’ will be shared with health authorities only after an individual has tested positive and consents to sharing their information.”
Unfortunately, this sort of assurance from someone who, just weeks ago, couldn’t distinguish between a DDoS (Distributed Denial of Service) attack and sub-standard server capacity doesn’t inspire the greatest confidence. ‘Nuff said.
Anyhoo, what about the bigger picture?
The TraceTogether app fits into a broader conversation about data privacy.
At the end of 2019, the sentiment across society was that Big Tech such as Google and Facebook had gained far too much power over our data. It was time to bring in legislation to limit data sharing, aggregation and use for purposes that we could no longer meaningfully consent to. The ACCC’s Digital Platforms Inquiry and final report in July 2019 was one step in this direction.
All this seems to have been swept under the carpet while we’ve been collectively torpedoed by COVID-19. But if you’re keen as beans, it’s a good time for us all—particularly lawyers—to start reading up and becoming more digitally savvy.
Brain Control event
In December 2019, I attended a fantastic event in Melbourne called “Brain Control: The impact of science and technology on our mental health, law and privacy”. It was moderated by ABC radio presenter Jon Faine. The panelists:
- Mr Sven Bluemmel – Victorian Information Commissioner
- Prof Judy Illes – Neuroethics Canada, University of British Columbia
- Prof Mark Andrejevic – Professor of Media Studies, Monash University
- Ms Vrinda Edan – Chief Operating Officer, Victorian Mental Illness Awareness Council
The panelists touched on most of the thinking in 2019 about data privacy and use, revolving around these key areas:
- Consent – Our privacy laws are outdated. They don’t consider how we can effectively provide consent, or make an informed choice about our data. Two examples in the biometric space in 2019 included (1) an employee’s refusal to hand over his fingerprint information to his employer; and (2) the use of facial recognition cameras in schools.
- Ownership – Should companies be allowed to sell your data? For example, Google now owns your FitBit data after acquiring Fitbit in 2019. What sort of opt-in processes should be in place or legislated?
- Predictive AI – The extent to which we use artificial intelligence to “predict” criminal behaviour or health outcomes—and how it could run roughshod over people’s rights. It’s also a significant problem when we’re talking about in-built biases in the code.
- Algorithmic bias – Dealing with bias in algorithms, as well as combatting the illogical inferences often drawn from small datasets (or by programmers). One example came from a small-scale study where the AI detected a link between people who used strongly religious language and Type 2 diabetes. Clearly nonsensical stuff.
- Brain control – Use of data for manipulating people’s emotions (think Cambridge Analytica) or purchasing habits.
- Balancing privacy concerns without impeding innovation – Where does the balance lie?
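The small-dataset point above is easy to demonstrate. In the sketch below, all the data is randomly generated (nothing here reflects any real study): with only 10 people but 200 candidate “words”, some word will almost always appear strongly linked to the outcome purely by chance. This is exactly how an AI can “discover” a link between religious language and Type 2 diabetes.

```python
import random

random.seed(1)
N_PEOPLE, N_WORDS = 10, 200  # tiny sample, many candidate features

# Randomly generated data: whether each person uses each word,
# and whether each person has the condition. No real signal exists.
uses_word = [[random.random() < 0.5 for _ in range(N_WORDS)]
             for _ in range(N_PEOPLE)]
has_condition = [random.random() < 0.5 for _ in range(N_PEOPLE)]

def agreement(word_idx):
    """Fraction of people whose word usage matches their condition label."""
    matches = sum(uses_word[p][word_idx] == has_condition[p]
                  for p in range(N_PEOPLE))
    return matches / N_PEOPLE

# Cherry-pick the best-"predicting" word, as a naive analysis would.
best = max(range(N_WORDS), key=agreement)
print(f"word #{best} 'predicts' the condition for {agreement(best):.0%} of people")
```

Run it and the best word will typically “predict” the condition for 90–100% of people, despite the data being pure noise. Scale the sample up and the spurious link evaporates.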
The good news is that you can watch a recording of the Brain Control event on YouTube:
I’ll leave you with a few words from one of the panelists about the ongoing privacy discussion and why facial recognition cameras should not be the new normal at schools:
“I predict with hope that the privacy discussion will become what it means to be human…
Last year there were some issues about some schools in Victoria potentially having facial recognition cameras… and I was asked about my opinion about that. My opinion was—it’s appalling. And it’s appalling notwithstanding all the safeguards…
I’m worried [about facial recognition cameras] because what will that do to the next generation growing up? Is that they think that surveillance spy machines and authority is normal. What will that do to our public discourse? What will that do to the ability of the next generation and the one after that, and so on, to become critical thinkers? To not be coerced into, or becoming compliant, and all the same? Because if that happens, we’re all doomed. Because every great advance in humanity was originally highly controversial and unorthodox. And if we bring up generations of kids—that because of surveillance being pervasive—stop challenging orthodoxy, then we’re in trouble.”
— Mr Sven Bluemmel, Victorian Information Commissioner
(see video at 50:25 onwards)
Want more resources? Sign up to the LB mailing list.