Google Sent Police After Man Who Took Naked Pictures of His Toddler for the Doctor
Google uses AI to determine whether the pictures you take are abusive in nature. But when the system gets it wrong, innocent people face dire consequences. Is this a case for decentralisation?
Dear readers,
As I always say, I like to talk about subjects that are either being discussed a lot or not being discussed enough in our community. Today’s subject is one such absurd incident, one that lays bare the importance of privacy and decentralised data control, and the sheer power of Big Tech.
TL;DR - Mark’s toddler had a painful, swollen penis. His wife contacted their doctor’s office, where a nurse asked Mark to send a picture of the toddler’s penis because the pandemic was raging and the doctor wasn’t seeing patients in person. Mark’s phone synced the photo to his Google Photos account, and Google’s scanning tools automatically detected the picture of a child’s penis and turned Mark in to the SFPD, accusing him of molesting his son. The SFPD understood the situation. But Google has still not given Mark his account back. Mark is completely locked out of his digital life, without access to his phone number or email address.
I am borrowing the content of the story written by Kashmir Hill for The New York Times and very well summarised by Cory Doctorow for this letter, so a special thanks to them for highlighting the issue.
A Sick Toddler, a Worried Parent, a Helpful Doctor, and an Unintended Consequence of AI-Based Software
One night in February 2021, when the world was still grappling with the second wave of Covid, Mark’s wife was dialling the doctor’s office to book an emergency consultation for the next morning. Mark, a stay-at-home dad, and his wife took pictures of their toddler’s penis, which was swollen and causing him a lot of pain that night, to share with the doctor.
The child had a bacterial infection, which was quickly cured with antibiotics that the doctor prescribed via telemedicine. But what followed for Mark was a huge problem.
One of the photos the couple took on Mark’s Android phone had Mark’s hand in it. The photo was uploaded to Google Photos automatically. That’s when Google’s algorithms detected the photo and marked it as child sexual abuse material (CSAM).
Google refused to listen to Mark's explanation. Instead, they terminated his account, seizing more than a decade's worth of personal and business email, cloud files, and calendar entries.
He lost all the family photos he’d synced with Google Photos (including every photo of his toddler from birth onward). He even lost his mobile plan, because he’s a Google Fi user. Since he lost his email address and phone number, he could no longer access his accounts on other digital platforms. He couldn’t reset the login credentials of those accounts either, because the reset links and codes would be sent to the old email address and phone number.
Mark was completely locked out of his digital life.
Mark received an envelope from the San Francisco Police Department telling him that Google had contacted them accusing him of producing CSAM, and that the company had secretly given the police full access to all of his files and data, including his location and search history, as well as all his photos and videos.
Google had already shut down his phone number, so the police couldn’t reach him by phone and had to send this information by post.
The SFPD looked into the case, understood what Mark had actually been doing, and concluded that he wasn’t a child molester.
Google, on the other hand, still holds all his data. A few days after Mark filed an appeal, Google responded that it would not reinstate the account, with no further explanation.
Same Problem, Another Family
Around the same time, another Google user, Cassio, was going through a similar problem.
On February 22, 2021, Google disabled Cassio’s account, saying he had seriously violated its policies. Cassio wrote:
Thinking about the recent activities that might have triggered Google to detect this, the only thing that comes to my mind is that in the previous two days, I took pictures of my son's infection in his intimate parts to send to his pediatrician who was following daily updates. I tried to appeal Google's initial decision through their review process form, but I got a negative answer.
Both dads were cleared by the police but lost access to their Google accounts.
Writing for The New York Times, Kashmir Hill spoke to the Electronic Frontier Foundation’s Jon Callas, who called the scanning intrusive because a family photo album on someone’s personal device should be a “private sphere.”
Google claims that it only scans your photos when you take an "affirmative action" related to them, but this includes automatically uploading your photos to Google Photos, which is the default behaviour on Android devices, as Cory Doctorow explains.
Kate Klonick, a cyberlaw professor and expert on content moderation, also chimed in with her views in the NYT article. She pointed out that this was "doubly dangerous in that it also results in someone being reported to law enforcement," suggesting that it could have resulted in a loss of custody had the police been a little less measured.
As for the updates in both dads’ cases: Cassio was told by a customer support representative earlier this year that sending the pictures to his wife using Google Hangouts violated the chat service’s terms of service.
And Mark was told that reviewers had not detected a rash or redness in the photos he took, and that a subsequent review of his account turned up a video from six months earlier that Google also considered problematic, of a young child lying in bed with an unclothed woman.
Mark did not remember this video and no longer had access to it, but he said it sounded like a private moment he would have been inspired to capture, not realizing it would ever be viewed or judged by anyone else.
Not everything about this service is bad, though. In 2021, Google filed over 600,000 reports of child abuse material and disabled the accounts of over 270,000 users as a result. The two dads’ experience is a drop in the bucket relative to those numbers. But even with all the follow-ups, the fact remains that both Mark and Cassio are still locked out of their Google accounts.
Hill writes that a Google spokeswoman said the company stands by its decisions, even though law enforcement cleared the two men.
Apple also announced a similar service that would scan iCloud photos for CSAM, but the rollout was delayed indefinitely after resistance from privacy groups.
A Case for Decentralized Content Storage?
Google, Apple, and Facebook have tons of data collected from each of their users. The above story is just one example showcasing how powerful Google is. The company has created a suite of tools that you cannot live a digital life without, but at the same time it is able to scan every little photo of yours, every file you store, and every message you send.
There are no good alternatives that can replace these products immediately. We are quite literally at the mercy of these giants to live our lives.
But is decentralized storage and file-sharing a solution? Something that comes very close to a working distributed peer-to-peer system today is IPFS, and a lot of Web3 projects actually model their decentralised storage architecture on it.
IPFS is a peer-to-peer (p2p) storage network. Content is accessible through peers located anywhere in the world, which may relay information, store it, or do both. Torrents are still one of the best-known means of peer-to-peer file sharing.
Slap a bit of encryption on top of the files stored on these peer networks and you have yourself a fairly secure, distributed network of file copies that no one but the user holding the decryption key can access.
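To make the idea concrete, here is a minimal sketch of that “encrypt first, then store on a peer network” flow. It assumes the ipfs command-line tool and the Python cryptography package are installed; the file name and the key handling are purely illustrative, not a recommendation for how to manage keys.

```python
# Minimal sketch: encrypt a photo locally, then add the ciphertext to IPFS.
# Assumes a running `ipfs` CLI and the `cryptography` package; the photo path
# and key handling below are illustrative only.
import subprocess
from pathlib import Path

from cryptography.fernet import Fernet


def encrypt_and_add(photo_path: str, key: bytes) -> str:
    """Encrypt the file with a symmetric key, write the ciphertext to disk,
    and add it to IPFS. Returns the content identifier (CID)."""
    plaintext = Path(photo_path).read_bytes()
    ciphertext = Fernet(key).encrypt(plaintext)

    encrypted_path = Path(photo_path).with_suffix(".enc")
    encrypted_path.write_bytes(ciphertext)

    # `ipfs add -Q` prints only the CID of the added file.
    result = subprocess.run(
        ["ipfs", "add", "-Q", str(encrypted_path)],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


if __name__ == "__main__":
    key = Fernet.generate_key()  # keep this key offline; losing it means losing the data
    cid = encrypt_and_add("toddler_rash.jpg", key)
    print(f"Encrypted copy available at CID: {cid}")
```

The point of the design is that the peers only ever see opaque ciphertext; whoever holds the key, and only they, can turn the CID back into the original photo.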
The problem isn’t that secure P2P solutions aren’t available; the problem is that they are not as easy to work with as the products a tech giant like Google or Apple can build.
Then there is a regulatory problem. Tools to identify child abuse material have existed for over a decade now. We can replace the services that use such tools, but what if the government eventually pushes for scanning every file on these P2P storage servers? Who is to say that one such photo on a storage node won’t be incorrectly flagged?
We honestly don’t know.
But one thing we can possibly rely on decentralization for is the distribution of process. Imagine a scenario where, instead of just Google, a community were to decide the course of action. What if Mark or Cassio had been given an opportunity to explain themselves, and the decision on their data had been taken through collective effort instead of through an opaque communication channel? What if there were decentralized governance to control these processes?
These are pipe dreams as of now, but they could very well be our future if we push hard for them.
For now, though, to protect yourself: stop syncing your photos to Google Photos automatically if you use an Android phone, and delete intimate photos from your devices and the cloud. Back up your data and keep copies across multiple devices and services. Do not rely only on Google services for your online signups and setups.
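If you want a starting point for that backup habit, here is a minimal sketch that copies photos to an external drive and verifies each copy with a checksum. The source and destination paths are hypothetical; adjust them to your own setup, and note that any sync tool (rsync, for instance) would do the same job.

```python
# Minimal sketch of a local backup routine: copy photos to an external drive
# and verify each copy with a checksum. SOURCE and DEST are illustrative paths.
import hashlib
import shutil
from pathlib import Path

SOURCE = Path.home() / "Pictures"          # where your phone exports land (hypothetical)
DEST = Path("/mnt/external_drive/photos")  # mounted external drive (hypothetical)


def sha256(path: Path) -> str:
    """Hash a file so the copy can be checked against the original."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def backup() -> None:
    DEST.mkdir(parents=True, exist_ok=True)
    for photo in SOURCE.rglob("*.jpg"):
        target = DEST / photo.relative_to(SOURCE)
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(photo, target)  # copy2 preserves timestamps
        if sha256(photo) != sha256(target):
            raise RuntimeError(f"copy of {photo} is corrupt")


if __name__ == "__main__":
    backup()
```

The point is simply that the copies live on hardware you control, so losing a cloud account doesn’t mean losing a decade of family photos.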
And one of the most important things you can do today is subscribe to this newsletter. If you are already subscribed, I thank you from the bottom of my heart and ask you to share this with whoever you feel should learn from it.
P.S. Leave a comment. I am listening (well, reading).
I don’t have any media sync: Google, Apple, nothing. I take backups and put them on my external hard drive. It’s more work, but better for privacy. Then there are folks who save their seed phrase in Gmail drafts; that’s a disaster waiting to happen. When you use cloud storage, you’re essentially delegating your media management to Big Tech for convenience. If you want convenience, you’ll have to compromise on privacy.