Everything is wrong with the framing of this story. This is a global health emergency, and it demands solutions that are both global and deeply human. Instead, politicians are arguing over it and governments are alarming privacy and human rights campaigners, and all of this risks losing sight of the bigger picture.
Scrolling through socials over the course of March, it's been impossible to miss that the celebrations of International Women's Month have two distinct aims. Many mark the achievements women have already made, but the bulk of the focus rightly remains on empowering future generations and tackling the problems still facing women today. It's been a challenging few weeks to say the least, and as frustrated as I've felt by the continued misogyny facing us, I've also been truly inspired by the strength of women who are, yet again, coming together to stand up for one another.
Deepfakes are a relatively new phenomenon, but they're becoming ever more prevalent as our social worlds migrate further and further online. If you haven't come across one before, the concept is pretty much what it says on the tin: fake videos of real people. They're often created for humor, but the dangerous truth is that, when well made, they are difficult to distinguish from reality.
As more people grow concerned about the ways their data privacy is at risk, there remains a lack of general awareness of exactly why we should worry. We all know it's creepy when we suddenly get ads for Mummy and Me classes after buying one baby shower gift. But in the end, as Gilad Edelman points out, "There is no human being snooping through your laundry, just a machine trying to sell you more stuff."
Clubhouse, the latest invite-only, all-audio social network, is all the rage. But experts express serious concerns about how app users’ data is at risk. Is listening to your favourite celeb chatting live worth risking your personal info (or that of your entire address book)?
Singapore has confirmed that its police can access the country’s COVID-19 contact tracing data if they wish. This latest development seems unnecessary at best and evidence of a creeping state panopticon at worst.
Salaat First, an app that reminds Muslims when to pray, has been caught recording and selling users' location data. The source who provided the dataset of precise user movements was concerned that such sensitive information could be exploited by bad actors.