Google Photos flags father's pictures of son as CSAM, starts police investigation

Archived alternative in case the original gets taken down:
https://archive.ph/tL6wk#selection-1153.27-1153.172

There’s a LOT to unpack here. Going to try and do this without getting this page flagged by someone/thing, so apologies for gratuitous censorship.

Mark noticed something amiss with his toddler. His son’s [groin] looked swollen and was hurting him. Mark, a stay-at-home dad in San Francisco, grabbed his Android smartphone and took photos to document the problem so he could track its progression.

OK, this is completely reasonable imo. Everyone’s gone to the doctor with a rash or abscess or whatever, and one of the first questions you get is “has it changed at all since you first noticed it?”. As a relatively new father, I’m constantly fielding questions from my wife like “Does this red spot look bigger? Did he have this scratch before?” etc.

With help from the photos, the doctor diagnosed the issue and prescribed antibiotics, which quickly cleared it up.

Tada. OK, this all seems legitimate and above-board to me.

After setting up a Gmail account in the mid-aughts, Mark, who is in his 40s, came to rely heavily on Google. He synced appointments with his wife on Google Calendar. His Android smartphone camera backed up his photos and videos to the Google cloud. He even had a phone plan with Google Fi.

Also feels like a pretty reasonable place to be for most people. Everyone wants an integrated ecosystem because it’s convenient and/or required. There are really only three options (if you count Microsoft), so you have to choose. Even at this stage in my life, I’m still using a Google account to access a couple of paid applications on the Play Store, and for academic-software Google Groups where that’s the only option.

Two days after taking the photos of his son, Mark’s phone made a blooping notification noise: His account had been disabled because of “harmful content” that was “a severe violation of Google’s policies and might be illegal.” A “learn more” link led to a list of possible reasons, including “child sexual abuse & exploitation.”

OK, so this is where the story got a little confusing to me; I’ll spare you the confusion here. At the bottom, it says he was using Google Photos, and this is where the scanning for CSAM occurs. Perhaps this isn’t news, although it’s new to me. A few years ago I moved everything off Google Photos to NextCloud, but Google Photos has some killer features (actually good OCR, the ability to text-search, and good sorting) and my family all still use it. It’s a huge selling point of Android.

OK this next sentence everyone already knows is coming:

A few days after Mark filed the appeal, Google responded that it would not reinstate the account, with no further explanation.

…but what I didn’t expect:

Mark didn’t know it, but Google’s review team had also flagged a video he made and the San Francisco Police Department had already started to investigate him.

OK so how does this end? The police find nothing, and Google won’t reinstate his account (but he stops trying, which is understandable). There are a few details, though, that are pretty interesting and that I don’t want to get missed:

When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them.

This has been the entire sales pitch this whole time in the discussion about CSAM scanning. No human intervention is necessary (so no one is looking at your photos or reading your messages), but the AI can still detect new, never-before-seen child porn. It’s a win-win because think of the children, and also because it doesn’t invade your privacy since no human is involved in flagging it.

Except…

A human content moderator for Google would have reviewed the photos after they were flagged by the artificial intelligence to confirm they met the federal definition of child sexual abuse material.

So, to recap up to this point:

  • Father takes images of child’s genitalia to track infection/send to doctor
  • Google AI flags it
  • Google employee views non-sexual, medical images of a couple’s naked child without explicit permission
  • Google employee decides it is sexual, and forwards it to the police (indirectly, apparently)

Am I just so blown away because I never really internalized what this scanning means re: false positives? Is it because I’m a relatively new father? I always thought the scanning was a bad idea, but in my head, it went something like:

  • Upload image of CSAM
  • AI detects and flags. False positives. Maybe law enforcement. Automatic account deletion.

Somehow it didn’t occur to me that there’s a Google employee looking at photos of people’s naked children, and/or medical information. Now, Google isn’t a healthcare provider so I’m not sure HIPAA applies, but what I don’t understand is how on earth it is not illegal for a Google employee to look at a legal photo of a naked child that isn’t theirs? How is that not distributing CSAM? Is Google magically exempt from the laws surrounding this? At the very least, this is a gross, abso-fucking-lutely insane invasion of privacy. Google is literally taking other people’s private photos and distributing them to their employees.

One more little tidbit:

In December 2021, Mark received a manila envelope in the mail from the San Francisco Police Department. It contained a letter informing him that he had been investigated as well as copies of the search warrants served on Google and his internet service provider. An investigator, whose contact information was provided, had asked for everything in Mark’s Google account: his internet searches, his location history, his messages and any document, photo and video he’d stored with the company.
The search, related to “child exploitation videos,” had taken place in February, within a week of his taking the photos of his son.

Hope he didn’t have anything else to hide.

7 Likes

When Apple geared up to try the same thing, they intended to work with the people at NCMEC / the police to do the verifying.

They acknowledged that parents and relatives would get caught up in the net, and it would require investigation.
So IIRC, they would report to the police, but not block accounts by default. Then the police/NCMEC could give the all-clear.

Unfortunately, the “think of the children” argument does allow companies and governments to stamp on people’s rights, freedoms and accounts, even where there is no wrongdoing, and often accounts are not reinstated.

2 Likes

As I’m a little tech-aware, I’ve always kept nude pics of my own son to a minimum. Pretty sad that I have to do that, and it seems you can’t be too careful. Losing my Google storage (a biz account) would push me very easily to using SyncThing for sure.

This isn’t victim blaming, and it isn’t the father’s fault (unless he really is a pedo); however, you really shouldn’t take nude photographs of your child. They could very easily be obtained by a hacker or in a leak if you are dumb enough to sync everything with Google Drive.

Also, I find it weird in general that someone would take pictures of a child naked. I never had a picture taken of me naked as a kid, and neither did anyone I know. Even if there is a history of this stuff occurring in a non-sexual way, it can be at most 40 years old (or whatever), since that is about how long photography has been accessible enough to be a part of everyday life.

In my opinion, photographing small children at all should be a criminal offense, considering how they are completely unable to give informed consent and don’t really have any way to avoid being photographed. Of course this will mean we have to stop videotaping everything, another positive.

Yes, parents should get informed consent from their children. Can’t tell if trolling…

5 Likes

I have video and pics taken of me as a child. Under 5 years old. I was naked as was the norm in my family for children of that age in the privacy of our home. I’m 60 now… Times change and not always for the better.

4 Likes

Well, vinyl is seeing a comeback; maybe the same will happen for film?

Unrelated to this thread really, but I was recently thinking about how much of a faff it is to keep digital photographs safe from being lost due to a data crash, etc.

And then I thought about the physical family album that sits on our shelf, undisturbed for over 20 years. The photos in there look just as they did 20 years ago, and are in just as good condition.
Same goes for passwords: lots of password manager technology out there, and I’ve found myself re-discovering the “discreet notebook tucked neatly on the shelf” password management technology. It’s actually quite good!

The point being, just having physical copies of information does actually short-circuit quite a lot of problems that you otherwise get when trying to store things digitally, e.g. risk of loss, data mining and data indexing, etc.

2 Likes

Photography has not existed long enough to establish a tradition or precedent worth keeping. It would be different if this were a practice that extended back tens of thousands of years. Let’s also be clear: the commonplace nudity of children throughout history has also been accompanied by commonplace sexual abuse.

Children can’t give informed consent.

P.S. If you have any question about whether someone is trolling, I think it’s much better to message them / the mods directly, since the majority of the time a public accusation of bad faith is the easiest way to devolve a discussion. I don’t doubt you are saying that in good faith so I am being nice, but just be more conscious next time.

1 Like

Problem is, pure analog is hard to come by and, as a consequence of being part of hipster culture, pretty expensive. Most sane printing methods involve some sort of digital crossover before the image goes back onto paper/film.

I don’t think that whether there should be photographs of children or not is of much consequence to this discussion, as there are many photographs of children. Even if it ought not be accepted, it most certainly is.

This is the thing that bothers me the most. Some random Google employee looked at photos of this guy’s kid’s genitalia. Of all the possible invasions of privacy, that is the literal worst case. And yet not many people seem to be as upset about that as I think is necessary.

Google has no business injecting itself in the middle here. If it matches hashes, fine. But this is insane.
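On the hash-matching point, here’s a minimal, purely illustrative sketch of why that distinction matters. This is not Google’s actual pipeline; real systems use PhotoDNA-style perceptual hashes and a list maintained by NCMEC, and the simple average hash, the file name, and the empty hash set below are just stand-ins for the idea. Matching against known hashes can only flag material that is already in the list, so a brand-new medical photo of your own kid can never match anything. Catching “never-before-seen” content requires an ML classifier instead, and that is exactly where false positives like this one come from.

```python
# Illustrative sketch only: matching against a list of *known* hashes vs. ML
# classification of new content. "photo.jpg" and the empty hash set are
# hypothetical placeholders; real deployments use PhotoDNA-style hashes from
# an externally maintained list, not this toy average hash.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Downscale to an 8x8 grayscale image and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for i, value in enumerate(pixels):
        if value > mean:
            bits |= 1 << i
    return bits


def matches_known_list(path: str, known_hashes: set, max_distance: int = 5) -> bool:
    """Flag an image only if it is within a few bits of a hash already on the list."""
    h = average_hash(path)
    return any(bin(h ^ known).count("1") <= max_distance for known in known_hashes)


known_hashes = set()  # stand-in for the externally maintained list of known-bad hashes
print(matches_known_list("photo.jpg", known_hashes))  # a genuinely new photo never matches
```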

3 Likes

Poor random Google moderator, getting all desensitized by all this kind of crap.

Literally “someone else’s computer”

I suspect they are only doing what they are doing because they were forced to.

Honestly, I don’t think Google cares.
Their business model is to hoover up data, and then monetise it.
Part of that is uploading people’s data to The Cloud, sometimes by choice.
Once there, they might not even look at it, simply hold it to ransom when “free” storage runs out.

Because their system is to hoover up as much data as possible, some bad stuff gets swept up, and I presume they don’t want to be liable for hosting illicit material if a government can charge them with not taking reasonable precautions.

So they start using AI.

But AI is bad (a lot of the time, until it gets better).
They need to check samples for accuracy.

I don’t think it is right.
Nor do I think they will stop looking.
They might change who does the looking in the future, given public support/outrage.

And because they are a private company, they can terminate accounts unfairly, immediately, with no recourse.
There should be a law against that, or at the very least data should be held in escrow awaiting a legal trial.
But I suspect they would terminate on a false positive…

At least, that’s what this tin-foil hat thinks.

2 Likes

Slightly off topic, but I suspect that to the vast majority here this is depressing and saddening but not in any way shocking or odd. This is the exact reality that the “paranoid crazies online” have been warning about for years.

If the system exists it will be abused; maybe not maliciously, but it will completely destroy people’s lives when it gets it wrong and they are innocent.

But please… Think of the children… Who will one day be victimised by this same system that was built to protect them.

No one wants to think about the extent, reality and consequences of this. Everyone wants to protect their children, but when: now, for a little while, or for the rest of their lives? These things need to be thought about and talked about, and the uncomfortable realities have to be discussed, because by avoiding them and going the way we are, everyone is guilty.

4 Likes

I’ll be the voice of dissent here, but this is my take on it.

Just don’t upload your shit to other people’s servers. Whether it’s ‘syncing’ services, cloud storage, whatever… once it leaves your device, it’s out of your control.

We live in a brave new world, and while corporate marketing has abstracted everything away into The Cloud™, at the end of the day you’re sending your images/video to be stored on someone else’s spinning rust. That means doing a little bit of thinking about what the other party can/will do with it once it’s out of your hands. Does it suck that people who aren’t tech savvy (or are too naive to distrust Google/etc.) are the ones that will be bitten by this? Maybe, but that’s the reality we live in now.

This.

2 Likes

That’s definitely the answer, but at what cost?

You cannot expect every person who buys an Android phone to go through and disable the syncing, especially with the warnings and fear-baiting Google does when you turn features off.

You usually can’t just not get an Android phone, as it is either not really possible where you live due to cell bands, or a dumbphone just can’t do what you need nowadays.

This is something we have been led into and are now largely locked into. I know it’s not permanent, but people just don’t know how to manage this part of their lives. I mean, look at driving: most of them can barely manage that safely, so tech nuances are just out the window.

The solution, unfortunately, for everyone rather than just the specifically educated, is regulating this from ever happening, which I don’t like. But at this stage we can’t trust the user or the company. Everyone is at fault here for allowing it to happen and not making a bigger fuss over this to get it stopped, but I circle back to the “paranoid crazies online” not being believed and being turned against by the media, which is likely why we are here.

2 Likes

Agree, I’m not saying it’s a good answer. It’s just that this -

- is the part that sucks for a lot of other people. I think most of us here at the L1T forums know how to navigate around bullshit privacy statements that have been completely meaningless for the last decade, or to set up our own remote storage/etc., but the average consumer has no concept of what is/isn’t stored on their own storage and what’s stored on Google’s.

The general consumer is going to start seeing the results of their “So? I have nothing to hide.” mentality getting worse and worse into the future.

4 Likes

This.
For all the “just don’t”-arguments, it is increasingly difficult to function as a member of society with a “dumb-phone”. Bit of a trust me bro argument, so YMMV.

I am awaiting the day after privacy was shot in the head in broad daylight, on the town square no less.
And everyone will suddenly pull out the very real torches and pitchforks…


My take on the above article is more or less what has been stated:

  1. When you have no business looking at those pictures (you shouldn’t), and your doctor doesn’t either (unless asked to do so), then doing or facilitating exactly that should land any Big-Corp Inc.’s entire board of directors in jail.

  2. Stuff leaks, and the speed of outrage is WAY off the charts. People’s lives are ruined in seconds; almost Social Credit Score levels of “no job, no loan, no life” happen without anyone asking “was this done with malicious intent?”.
    And the “Oh, sorry bro” levels of apology will not magically undo the psychological horror of a letter from the local police department informing you about an investigation, and the loss of friends and family that your mugshot in the local newspaper may entail.

2 Likes

Or worse, getting off on access to that which is illegal for everyone else.

Who vets them, who hires them, who decides that this particular person is responsible enough to handle not just CSAM but potential false positives that could ruin a person’s life because they didn’t stop to think for another second before reporting?

It’s weird that we have designated people who are okay to view CSAM, something that is literally illegal for everyone else…


Addition: In a way, implementing this system, with false positives that are forced to be checked by a person, creates CSAM by the nature of what’s happening. As in this case: they are using his child to blackmail and harass him; they are abusing the power they hold over the pictures of his child and what those can do to him.

Ironically, in the act of trying to stop it, they create the very thing.

1 Like

Right?! Google is creating CSAM by taking private medical photos and distributing them to someone.

2 Likes