I’ve got several post drafts written in my Drafts app about this issue, but I’ve found myself struggling to express how I feel.
For context: Apple’s announcement from last week
The two most controversial changes Apple announced were a new Messages feature that uses machine learning on children’s accounts to detect whether they are sending or receiving sexually explicit material, and a new feature for all iCloud Photos users in which your devices will start checking all of your photos to see whether their hashes match the hashes of known child sexual abuse material (CSAM).
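To make the photo-matching idea concrete: a “hash” is a short fingerprint computed from a file, and matching means checking that fingerprint against a database of fingerprints of already-known images. Below is a minimal sketch of that idea in Swift, assuming a plain exact-match lookup and a hypothetical database of known hashes; Apple’s actual system is reported to use a perceptual hash (“NeuralHash”) plus private set intersection and a match threshold, so this is a conceptual stand-in, not their algorithm.

```swift
import Foundation
import CryptoKit

// Purely illustrative: exact-match lookup against a set of known hashes.
// Apple's announced system reportedly uses a perceptual hash ("NeuralHash")
// and private set intersection, not a simple SHA-256 lookup like this.

// Hypothetical database of hex-encoded hashes of known images.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a hex-encoded SHA-256 fingerprint of a file's bytes.
func sha256Hex(_ data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// "Matching" is just set membership of the fingerprint.
func matchesKnownMaterial(_ photoData: Data) -> Bool {
    knownHashes.contains(sha256Hex(photoData))
}
```

The distinction matters: an exact hash only matches byte-identical files, while a perceptual hash is designed to also match visually similar images, which is part of why false positives are a real concern.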
Children’s Messages ML Scanning
Simply put, children are entitled to privacy too, and the need for it is especially acute on their digital devices. I’ve quoted David Brooks on this before, and although his NYT columns vary in quality, on the point of privacy I believe he is correct:
Privacy is important to the development of full individuals because there has to be an interior zone within each person that other people don’t see. There has to be a zone where half-formed thoughts and delicate emotions can grow and evolve, without being exposed to the harsh glare of public judgment. There has to be a place where you can be free to develop ideas and convictions away from the pressure to conform. There has to be a spot where you are only yourself and can define yourself.
When children played together in decades past, we didn’t insist that they bring a tape recorder or other device with them to monitor for anything illicit they might say. Just because technology makes it easier to monitor them today doesn’t mean we should.
And that’s all assuming computers can do a decent job at this. We know they can’t! Trusting computers to make judgment calls about images is profoundly irresponsible with the technology we have. Machine learning produces some impressive demos, but it’s not that accurate, as anyone who has ever typed “damn you autocorrect” can attest (and autocorrect is acting on much simpler inputs). I have no problem using ML to let me search for pictures of sandwiches in my photo library, but when it comes to monitoring children’s messages for sexually explicit content, the stakes are a hell of a lot higher, and our tech doesn’t meet that bar.
It’s beyond naïve to trust automated systems to accurately make these judgment calls about the content our children are sending each other via Messages. Hell, we know that human moderation systems do a poor job. We know that moderation systems will systematically over-flag LGBT content as sexual even when it isn’t.
Protecting children is always a popular rationale among those looking to deploy mass surveillance onto people’s devices, because the idea of a predator sending materials to your child, or worse, coercing your child into sending them materials of themselves, is very unsettling.
But it’s even more unsettling to me that we are willing to go headfirst into deploying a dragnet that automatically monitors 100% of the messages of opted-in children and gives them stern warnings (warnings that, to LGBT children, could be terrifying) that things will be reported to their parents.
Combine that with the fact that, now that this system is built, a few tweaks could turn it into a tool for mass oppression in the wrong hands, and it’s downright irresponsible of Apple to deploy it, no matter how well-intentioned it was.
Photos Scanning for CSAM
Apple has come out of the gate emphasizing that this applies only to iCloud Photos and not to your Messages photos or anything else (as if that makes it much better), but for me this whole issue comes down to this: my photo library is a personal and private collection, akin to the photo albums I keep in my home. The fact that it gets replicated to the cloud is nothing more than an implementation detail, and that doesn’t make it any less private.
Apple seems to take the fact that your photos are going into the cloud as tacit consent for them to be analyzed for law enforcement, and instead of focusing on whether that’s right, they focus on the subject matter of what’s being analyzed, again because it’s hard to argue in favor of the proliferation of child porn.
Our devices, along with the data on them, are manifestations of our private selves. It’s true that when I carry my phone with me now, I’m also carrying an unprecedented amount of personal information about myself. But it is that precise vulnerability that makes it all the more essential that the privacy of the contents of my phone remain sacred.
I wouldn’t let a police officer constantly snoop around my house just because they were only allowed to look for one specific item, and it should be obvious why.
Just because technology makes it possible for a little robot to do the same thing to your private data at scale doesn’t make it any more acceptable.
Stop accepting the framing that the threat makes the tradeoff worth it. Our children deserve better than to be dehumanized by having their communications monitored by a robot, only to be told it’s for their own good. As citizens we deserve better from private corporations and from the law enforcement lobby that has no doubt pushed them hard to implement these things.
Privacy is a fundamental human right. There’s no fine print attached that says “except child porn.”