Apple Caving on Hong Kong Shows the Limits of Security as a Sales Tool – PCMag

There's a saying that the biggest security vulnerability is located between the keyboard and the chair, highlighting human fallibility. It's true: we're easily tricked, and we're lazy as a rule. Human failings can bring down even perfect systems of security and privacy, which is why clear moral codes are required to protect those systems. When Apple agreed to remove the HKmap.live app from the App Store under pressure from the Chinese government in Beijing, it illustrated just how tenuous even the most robust security and privacy systems can be.

What is security and privacy without morality? It's just a selling point.

For those who missed the story, pro-democracy protestors in Hong Kong have been using an app called HKmap.live to warn other protestors about police moving through the city. Apple first approved the app, then banned it, claiming it was being used to perpetrate crimes. Given the escalating violence of an intense government crackdown, it's easy to see why protestors might view the app's availability as a matter of personal safety rather than convenience.

This reminded me how, not long ago, Apple squared off against the full force of the FBI and DOJ as the US government pushed for the company to grant it access to an iPhone belonging to the San Bernardino shooters. In that case, Apple refused. While the company had cooperated with law enforcement in the past, the request to essentially build a special backdoor into its operating system so that law enforcement could examine a device was more than Apple could bear.

Apple, along with a host of other companies, didn't budge on the issue. They even got support from former NSA types. In the end, Apple won out and the FBI ended up paying a third-party company a rumored one million dollars for a way into the phone.

It wasn't Apple's security practices, encryption systems, or engineering prowess that stood between investigators and the data within an iPhone. It was Apple's laudable willingness to stand by its stated beliefs and refuse to cooperate. The company could easily have stepped aside, but by choosing not to, it protected its devices and its users.

How could Beijing pressure Apple so effectively? NPR reports that Apple sold $52 billion worth of products in China last year. Maybe that has something to do with it.

Along with the code and the engineering that goes into protecting iOS, the App Store is the other mechanism Apple has for ensuring the safety and security of its users. Apple is able to extend security and privacy protections through its hardware and OS, but it's by managing the App Store that it has the biggest impact on users. If any app attempts to circumvent Apple's privacy protections, it can be removed. Conversely, Apple can also choose to keep apps available despite controversy. The App Store supports many encrypted messaging apps, whose data cannot be read by law enforcement or even Apple itself.

Unfortunately, the company has a more mixed record on this front.

Apple has used its ban hammer to protect its walled garden from apps that slurp personal information, deceive users, or are outright malicious. These actions have kept users safe and encouraged good behavior among developers.

The company has also made controversial decisions about which apps to ban. It has kicked out apps that too closely replicate functions of the iPhone, that track drone strikes, or that grant access to so-called "adult content." This last one has always struck me as particularly odd, considering that the best app for porn on an iPhone is Safari.

Now imagine that it wasn't a crowd-sourced map that Apple banned at the behest of a government, but Signal or another encrypted messenger, or the Tor app, or VPNs. (Actually, Apple has banned some of those apps in China before.) Those tools can also be used for bad things (in fact, that's always law enforcement's argument against them), but they also protect individuals from harm and afford them the privacy they desire.

I won't call Apple's decision to remove HKmap.live the company's first, or even its greatest, moral failing. There have been others before this, and there will likely be more to come. Apple is also not the only company to have failed in this way. Google was criticized for removing a game in which you play as a Hong Kong protestor, and various social media platforms are embroiled in controversy over how they present information to users and over the lengths they will go to appease the Chinese government in exchange for access to its markets. Perhaps we shouldn't be looking to for-profit corporations to fight our moral battles for us at all, but I digress.

What this sad drama does highlight is the tenuousness of privacy and security. A company can earn a sterling record of protecting its users and fostering exactly the kind of environment that makes people safer and allows them the freedom to speak their minds without fear of reprisal. Our connected devices, we're told by the companies that make them, aren't just products; they're supposed to make the world better. But even when a company, or an individual, uses all the right code and follows all the best practices, none of that matters without unwavering morals to back them up. It's deciding what is right, and using the code to enforce those decisions, that makes it all work.

I argued that the feds should let math be math. That's true as far as the mechanics go, but it's also a firm moral stance. Without the courage of your convictions, math is meaningless.
