Apple’s privacy reputation is at risk with the changes it announced this week

A monorail carrying Google signage passes a billboard touting the privacy of the Apple iPhone during the Consumer Electronics Show (CES) 2019 in Las Vegas, Nevada, on Monday, January 7, 2019.

Bloomberg | Getty Images

Apple this week announced a system that will allow it to flag images of child exploitation uploaded to iCloud storage in the United States and report them to the authorities.

The move was welcomed by child protection advocates. John Clark, CEO of the National Center for Missing and Exploited Children – a nonprofit created by a mandate from Congress – called it a “game changer” in a statement.

But the new system, which is now being tested in the US, has also been loudly opposed by privacy advocates, who warn that it is a slippery slope and could be tweaked and further exploited to censor other kinds of content on people’s devices.

Apple is not alone in its effort to rid its cloud storage of illegal images of child exploitation. Other cloud services already do the same. Google has used hash technology since 2008 to identify illegal images on its services. Facebook said in 2019 that it removed 11.6 million pieces of content related to child nudity and child sexual exploitation in just three months.

Apple says its system is an improvement over standard industry approaches because it uses its control over hardware and sophisticated mathematics to learn as little as possible about the images on a person’s phone or cloud account, while still flagging illegal child pornography on its cloud servers and reporting it to the authorities. It doesn’t scan the actual images; it only compares hashes, the unique numbers that correspond to the image files.
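To illustrate the general idea only, here is a minimal sketch in Python of hash-based matching, using a plain cryptographic hash and a hypothetical blocklist. Apple’s actual system instead uses a perceptual “NeuralHash” derived from a database supplied by child-safety organizations, along with cryptographic protections such as blinded hashes and match thresholds, none of which are modeled here:

```python
import hashlib

# Hypothetical blocklist of SHA-256 hex digests for known illegal images.
# (In the real system the list comes from a child-safety organization and
# uses perceptual hashes, so that resized or re-encoded copies of an image
# still match; SHA-256 only matches byte-identical files.)
KNOWN_BAD_HASHES: set[str] = set()

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: str) -> bool:
    """Match a file against the blocklist by hash alone; the comparison
    step works on numbers, never on the picture itself."""
    return file_hash(path) in KNOWN_BAD_HASHES
```

The key property, and the basis of Apple’s privacy claim, is that the matching step compares only opaque numbers: nothing about a photo is revealed unless its hash already appears on the list.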

But privacy experts see the move as the start of a policy shift in which Apple could come under pressure from foreign governments to repurpose the system to suppress political speech, for example by asking Apple to flag photos of protests or political memes. Skeptics aren’t concerned about how the system works today, and they aren’t defending people who collect known images of child exploitation. They worry about how it might play out in the years to come.

Skeptics worry about how the system might develop

“Make no mistake: if you can scan for child porn today, you can scan for anything tomorrow,” tweeted NSA whistleblower Edward Snowden.

The Electronic Frontier Foundation (EFF), which has supported Apple’s encryption and privacy policies in the past, criticized the move in a blog post, calling it a “back door” – a system that gives governments a way to access encrypted data.

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed back door, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly scoped back door is still a back door,” the influential nonprofit wrote.

Apple’s new system has also been criticized by the company’s competitors, including Facebook subsidiary WhatsApp, which also uses end-to-end encryption for some of its messages and has been under pressure to give more access to people’s content to prevent child exploitation.

“Rather than focus on making it easier for people to report content that is shared with them, Apple created software that can scan all of the private photos on your phone – even photos you haven’t shared with anyone,” WhatsApp chief Will Cathcart tweeted on Friday. He said WhatsApp won’t adopt a similar system. “That’s not privacy.”

Privacy has become a central part of iPhone marketing. Apple has publicly documented the security architecture of its systems and is one of the most vocal defenders of end-to-end encryption, which means that not even Apple can read the contents of messages or other data stored that way on its servers.

Most notably, it went to court against the FBI in 2016 to protect the integrity of its encryption systems during the investigation of a mass shooting.

Apple has held to that stance even as law enforcement agencies around the world have pressured the company to weaken the encryption of iMessage and other software services like iCloud in order to investigate child exploitation or terrorism.

Apple sees it as a win-win

Apple sees the new system as part of its privacy tradition: a win-win situation in which it protects user privacy while eliminating illegal content. Apple also claims that the system cannot be used for other types of content.

But that’s also why privacy advocates see the new system as a betrayal. They feel they’ve lost an ally that built its computers to prevent – as much as possible – data from leaking to governments, Apple, and other companies. Now they see, as Snowden put it, a system that compares user photos against a “secret blacklist.”

That’s partly down to Apple’s own marketing. In 2019, during the CES electronics show in Las Vegas, it bought a giant billboard with the slogan “What happens on your iPhone stays on your iPhone.”

Apple CEO Tim Cook has spoken about the “chilling effect” of knowing that what’s on your device can be intercepted and reviewed by third parties. A lack of digital privacy, Cook said, could lead people to censor themselves even if they’ve done nothing wrong.

“In a world without digital privacy, even if you have done nothing wrong other than think differently, you begin to censor yourself,” Cook said in a 2019 commencement speech at Stanford University. “Not entirely at first. Just a little, bit by bit. To risk less, to hope less, to imagine less, to dare less, to create less, to try less, to talk less, to think less. The chilling effect of digital surveillance is profound, and it touches everything.”

Apple’s approach to privacy has been a success for the company. This year it introduced paid privacy services such as Private Relay, which hides users’ IP addresses and, with them, their locations.

Privacy is also part of the sales pitch as Apple ventures into lucrative new industries like personal finance, with its Goldman Sachs-powered credit card, and health care, with software that lets users download medical records to their iPhones.

But reputations can be destroyed quickly, especially when they seem to contradict earlier public positions. Privacy and security are complicated, and they aren’t exactly what marketing slogans convey. Critics of Apple’s new plan to fight child exploitation don’t see a better-engineered system that improves on what Google and Microsoft have been doing for years. Instead, they see a significant shift in policy from the company that said, “What happens on your iPhone stays on your iPhone.”
