The idea of protecting young children, like other noble causes such as protecting national security, has become an excuse for doing many questionable things. Recently, Apple announced that it will put child sexual abuse prevention into action with new features in iOS 15, available later this year. There are two major features: one matches the hashes of all photos uploaded to iCloud against a database of known child sexual abuse pictures, and the other scans iMessage messages on all iPhones under parental control.
To Scan or Not to Scan
It is hard to recall how, before the digital age, people faced the question of whether personal information should be scanned. Back then, information was documented on paper, and no one shared personal information publicly unless necessary. A hundred years ago, if you wanted to scan someone’s personal information, you would have had to interrogate him or her in person. Fast forward to the present, and personal information is everywhere in digital form: the pictures one takes, personal notes kept in cloud storage, even important meeting details. For years, governments have been putting pressure on the big technology companies to gain access to the personal information they have gathered. Of course, if a government can scan all of this information easily, it becomes easier for it to hold on to its power over the whole country.

It is not rare to see tech companies that have already bent the knee and agreed to scanning performed by the government, so it was a quite unique case when I first heard that Apple refused to unlock a suspect’s iPhone for the federal government. Since then, Apple has been building a strong image as a protector of personal information in the market, a campaign that included a huge billboard reading “what happens on your iPhone stays on your iPhone”. The advertisement worked in some ways, and I could feel the trust in the Apple brand building up within the smartphone community. This time, however, Apple has failed its customers’ belief that it can protect their personal privacy no matter what.
It seems that this time, Apple wants to redefine not only the smartphone but also the language.
Recently, Apple announced that it will roll out a new release, iOS 15, with a feature that scans every photo uploaded to iCloud and checks whether it is related to child sexual abuse material. The algorithm computes a hash of the original photo and compares it against a database of known child sexual abuse pictures. A threshold keeps the false positive rate low: once the number of matching photos one user has uploaded to iCloud passes this threshold, the flagged photos are decrypted by Apple and a human reviewer is brought in for further inspection. In its announcement, Apple specifically points out that this is not a “scan”, but if it loops through all the information and works on it, it is a scan. It seems that this time, Apple wants to redefine not only the smartphone but also the language.
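To make the mechanism concrete, here is a minimal sketch of the threshold idea in Python. It is purely illustrative: Apple’s actual system reportedly uses a perceptual hash called NeuralHash together with a cryptographic private set intersection protocol, not the plain SHA-256 lookup below, and the threshold value and every name here are assumptions of mine.

```python
import hashlib

# Minimal sketch of per-account threshold matching (hypothetical).
# SHA-256 is only a stand-in: it matches exact bytes, whereas a
# perceptual hash like NeuralHash also matches near-duplicates.

MATCH_THRESHOLD = 30  # assumed value; the real threshold is Apple's secret

# Placeholder database of hashes of known abuse images (fake values).
KNOWN_BAD_HASHES = {
    "hash-of-known-image-1",
    "hash-of-known-image-2",
}

def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for the perceptual hash of a photo."""
    return hashlib.sha256(photo_bytes).hexdigest()

def account_crosses_threshold(uploaded_photos: list[bytes]) -> bool:
    """True once enough of one account's uploads match the database."""
    matches = sum(photo_hash(p) in KNOWN_BAD_HASHES for p in uploaded_photos)
    # Below the threshold nothing is decrypted or reviewed; above it,
    # the flagged photos are decrypted and escalated to a human reviewer.
    return matches >= MATCH_THRESHOLD
```

In Apple’s published design, the server is supposed to learn nothing about an account until the threshold is exceeded, which is what the threshold check in this sketch stands in for.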
I would bet my life that once the Chinese government knows there is a potential backdoor into the personal data on every iPhone, it will ask for it.
This feature raises many questions and concerns. For example, what would Apple do if the Chinese government wanted to add more than just child sexual abuse pictures to the database, such as a Hong Kong independence flag or a Winnie the Pooh image? Craig Federighi, Apple’s senior vice president of software engineering, has stated that Apple would refuse, but the Chinese government could still threaten to shut the door on Apple’s product sales in China in order to gain access to this information. I would bet my life that once the Chinese government knows there is a potential backdoor into the personal data on every iPhone, it will ask for it. Before this feature, they probably did not even know such a path existed. And besides China, there is always pressure from the U.S. government.

On top of the pressure from governments around the world, there is another issue with this feature. As we are all aware, child pornography has been used to frame and destroy people for ages, and this feature will make framing people easier. The usual way to frame someone for child sexual abuse is to plant child pornography on the victim’s digital devices, either a phone or a laptop, after which the police, who are useless most of the time, will presume the victim guilty fairly quickly. This is already a fairly cheap and effective service on the market, and I am afraid this new iOS 15 feature makes the procedure even easier: just snag a potential child porn picture, plant it on the victim’s iPhone, and Apple will finish the rest of the job for you.
The Boundaries of Children’s Privacy
I guess the question here is: do children get any privacy at all?
Besides the photo scanning feature, iOS 15 comes with another horrifying feature focused on children themselves. On every iPhone under parental control, Apple plans to scan the images sent through iMessage. If an image is deemed inappropriate, it is blocked out, with a button shown beneath it for further action. After tapping the button, Apple warns the user that the picture is not appropriate, and that if they insist on viewing it, their parents will receive it as well. The first time I saw this flow, many questions flooded into my head. I really wanted to see the picture and check whether it was a false positive, but the picture was just a block of grey. If I were a teenager and saw this on my phone, it would only draw my attention to tap and see what it even was.
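For what it’s worth, the decision flow described above might look something like the sketch below. None of these names are real Apple APIs; Apple has not published one for this feature, and the on-device classifier here is stubbed out entirely.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the iMessage parental-control flow.
# All names are illustrative; nothing here is a real Apple API.

@dataclass
class ChildAccount:
    under_parental_control: bool = True
    parent_contacts: list[str] = field(default_factory=list)

def looks_explicit(image_bytes: bytes) -> bool:
    """Stub for Apple's undisclosed on-device ML classifier."""
    return True  # placeholder: treat every image as flagged

def notify_parents(account: ChildAccount) -> None:
    for parent in account.parent_contacts:
        print(f"alert sent to {parent}")

def handle_incoming_image(image_bytes: bytes, account: ChildAccount,
                          child_taps_view_anyway: bool) -> str:
    if not account.under_parental_control or not looks_explicit(image_bytes):
        return "shown"            # normal delivery, no interception
    if not child_taps_view_anyway:
        return "blurred"          # the child sees only a grey block
    notify_parents(account)       # viewing anyway alerts the parents
    return "shown-and-reported"

# Example: a flagged image that the child insists on viewing.
kid = ChildAccount(parent_contacts=["mom", "dad"])
print(handle_incoming_image(b"...image bytes...", kid, child_taps_view_anyway=True))
```

Even in this toy version, the design choice stands out: the interception and the parental alert happen on the endpoint, inside an otherwise end-to-end encrypted channel.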
I guess the question here is: do children get any privacy at all? I definitely did not want my parents to know what was going on on my phone when I was young, and I am sure most young people feel the same. Still, I am sure this feature will make more parents consider buying iPhones for their children, which could be seen as something of a market strategy on Apple’s part. It is understandable that parents try to protect their children, or that the whole nation tries to protect its children, but isn’t this way too much? This is exactly how it works when the government wants all the information on our phones: to protect us. I hope the same excuse is not used on young and powerless children. After all, we can switch off iCloud uploads to escape the first feature, but young people cannot always afford their own phones.
Mass surveillance won’t protect ordinary people, and it surely won’t work as a parenting practice.
The discussion above reminds me of another issue with how we try to protect our young children. In many cases, we cut things off at a clear and specific point; that is, we decide what young people can and cannot do at a specific age. For example, people are considered adults at eighteen, and we can legally drink at twenty-one. Passing the age is a hard cutoff without any transition. It is as if people can suddenly take care of themselves after their eighteenth birthday, and can control their drinking after their twenty-first. Most of the time, this is simply not true. There are seventeen-year-olds who are more mature than some people at twenty-five, and people with real problems need our help regardless of their age. Something similar is happening with iPhone parental controls: instead of giving parents the power to see what messages their children are sending, wouldn’t it be better to teach those parents how to talk to their children? Mass surveillance won’t protect ordinary people, and it surely won’t work as a parenting practice.
There Is Always a Way to Bypass
As always, there is a loophole in every system, and this one is no exception. Against the feature that scans all photos uploaded to iCloud, a user who cares about privacy could switch to an alternative cloud provider; many articles comparing iCloud alternatives appeared after Apple published this news, and an easy search will do the job. Users who do not need iCloud at all, such as me (and yes, we non-iCloud iPhone users do exist), can simply turn off photo synchronization in the iCloud settings. I have always believed that everything uploaded to the internet is public to everyone, and that all online data encryption should be assumed compromised. However, it is far from certain that Apple will never scan everything on each iPhone regardless of whether the data is sent to the cloud. I guess the best and only way to react to that would be abandoning smartphones altogether.
I feel Apple is simply saying: we have done our duty to protect children, so leave us alone; none of the child porn passing through iPhones has anything to do with us.
As for young people, there are not many choices when it comes to their own phones. Lacking sources of income, they are very vulnerable when trying to protect their own privacy. It looks like unless they can afford their own phones, there is no way for them to avoid being scanned while messaging anyone over iMessage. However, there is another way to bypass this: not using iMessage. There are plenty of other end-to-end encrypted messaging apps, and some of them even destroy a message after the receiver reads it. I feel Apple is simply saying: we have done our duty to protect children, so leave us alone; none of the child porn passing through iPhones has anything to do with us.
Kik, Which Did Basically Nothing
Child sexual abuse has been a topic for as long as the internet has existed. The internet speeds up the transfer of all information, including dangerous content. Apple did something it believes is the right thing to do; other companies choose to do nothing. Kik, an online chat company whose users are mostly young adults and children, did nothing to control the child sexual abuse content thriving on its platform. Its group chat feature enables pedophiles to gather and share child pornography with one another; you can invite more people to your group and share with no issue. On top of this, there is basically no practical way to report child sexual abuse content on Kik and get a quick response. The reporting process takes forever, and the online support seems robotic. Even today, Kik has not sorted out a solution for child sexual abuse content on its platform, or perhaps cannot even be bothered to solve the issue at all.
Maybe the resulting policy is not perfect, but it should at least not harm ordinary people who have done nothing wrong.
This might seem like a paradox: when Kik does nothing, we get mad, and when Apple does something, we get mad too. So the question becomes: what exactly should we do to control child sexual abuse material online? The first thing we need to realize is how limited anyone’s ability to control anything is. Nothing can be fully controlled, and there is no universal solution for anything; no matter how harsh the law is, someone is willing to break it regardless. Does this sobering reality mean we should not try at all? No, not at all. We should try with this limitation in mind, and never try so hard that we hurt innocent people. What Apple is attempting would control child sexual abuse material in some way, but it will hurt innocent people without a doubt. Policy-making should never punish the bad guys by sacrificing the good guys. So maybe the resulting policy is not perfect, but it should at least not harm ordinary people who have done nothing wrong.
Privacy vs Convenience
The debate between privacy and convenience started the moment the internet was connected. Connecting to the internet means sharing individual data with whoever else is on the line. This data sharing might bring you convenience, since services know who you are and where you live, but most importantly, it exposes your privacy. Online services use personal data to serve their users better; a winery, for example, requires customers’ birthdays to comply with the law. Sometimes, however, websites try to know more than they should. Advertising companies in particular always try to learn as much as possible in order to push everyone to buy their products. Besides ad companies, there is another power trying to exploit individual privacy as well: the government.
There are two governments I have experience with: the US government and the Chinese government. After Edward Snowden, there is no further argument about whether, or how much, the US government abuses individuals’ personal data. Yes, it abuses it, and people in the US hate it. We try whatever we can to protect ourselves from being abused by the US government. The situation in China, however, is a totally different story. Fewer people in China really care about their own privacy, and even fewer understand what personal privacy is. Most importantly, Chinese people are too afraid to speak against the Chinese government even when they think the government is abusing its power.
There should always be choices for people, and there should be no discrimination against any choice made.
When the internet connected China with the rest of the world, the Chinese government gained a powerful tool for mass surveillance of its citizens. By combining its power with China’s tech giants, the government gains access to everything. The horror show is that the vast majority believe this is fine. Even when they have to share their true identity, and even proof of vaccination, with WeChat, they are fine with it. Notice that I use the words “have to”: people do not have a choice to share or not share; they are forced to share. In some cases last year during covid, elderly people lost access to public transportation because they did not have a smartphone to show the “green code” indicating they were covid-free.
The convenience is nice, but there is always a consequence to sharing personal data that we should be aware of. Most importantly, there should always be choices for people, and there should be no discrimination against any choice made by different people who believe in different things.