Can code just be ‘disappeared’ from the internet?

With help from Mohar Chatterjee

The U.S. Treasury Department building. | Patrick Semansky/AP Photo

The U.S. Treasury's recent sanctions of Tornado Cash are opening important new fronts in the ever-evolving arms race between government regulators and the digital innovators trying to build a new world without them.

This week: Can an open-source piece of code really be deplatformed?

Cryptocurrency, and much of the open internet, is based on the idea that computer code is a shared public resource that can live more or less forever online. Bitcoin, to take the best-known example, is nothing but a bunch of servers running the same protocol and tracking the same list of transactions.

Tornado Cash, as most of the crypto world knows by now, is a mixer, a piece of software that obscures the origin of cryptocurrency. Worried about its use for money laundering, the Treasury Department has been trying to bar people from using it, including by sanctioning Tornado Cash itself.

The issue is that Tornado isn't a person, a country or a company, the typical subjects of Treasury's blacklists. As we addressed here in the wake of the sanctions, it's a self-executing piece of software, something without an owner, a legal residence or a bank account. It might even enjoy some constitutional protections.

Last week, we took a look at the First Amendment questions raised by the Tornado sanctions.

This week, we're seeing those questions put to the test in a way that raises the prospect of a broader controversy over the platforming of controversial code, just as the platforming of controversial social media content has become a hot-button political issue.

In response to Treasury's sanctions, GitHub, a platform for software developers, took down pages that were used to develop the tool. Because of the novelty of applying sanctions to open-source software, it is not clear whether GitHub's takedown was required by law, but, when in doubt, companies often err on the side of complying with the government's wishes.

While the U.S. now forbids its use, a portion of the Tornado code continues to exist on the decentralized Ethereum blockchain network, where removing it would be impractical. On GitHub, the full code had been available in an accessible form until it was yanked offline.

Except now it's not offline. Matthew Green, a computer science professor at Johns Hopkins University, announced this week that he has re-uploaded the source code to GitHub.

Green, who studies cryptography and teaches students about Tornado Cash's privacy features, tells Digital Future Daily that this is the first instance he's aware of in which notable open-source code has gone offline, and that it's worth worrying about.

"The idea that source code disappears from the internet is a really bad thing," he said. In his view, it's not unlike banning or destroying books: It's an act that diminishes the store of shared human knowledge.

Though Green has not yet been embroiled in any legal fights over his republication, he is already being represented by the Electronic Frontier Foundation, a veteran of fights with the government over code and expression, which asserts that Green's republication does not violate the Treasury Department's sanctions. The group argues that posting software code itself, for the purpose of study or improvement, amounts to speech.

Critics of the sanctions argue that they will lead private companies, eager to avoid irritating the government, to take pre-emptive steps against secrecy tools beyond what the law requires, even when those steps interfere with legitimate privacy applications or free expression.

A spokeswoman for GitHub, Sandra Dieron, did not address questions about whether the platform planned to take Green's post down. In a statement, she said, "We examine government sanctions thoroughly to be certain that users and customers are not impacted beyond what is required by law."

Green is no stranger to controversies over software and government power. In 2013, an administrator at Hopkins, which has close ties to the federal government, asked Green to take down a blog post discussing Edward Snowden's leak of National Security Agency material, then changed course and apologized after the takedown generated online outrage.

As the Biden administration steps up its focus on blockchain, and interest in new cryptographic methods continues to grow, Green predicts that fights over secrecy tools will multiply.

"This privacy stuff," he said, "is going to snowball."

Tesla thinks that driving is such a complicated, fast-moving task that it requires extremely powerful hardware to train machine learning models for it at scale. | Spencer Platt/Getty Images

Earlier this week, Tesla unveiled DOJO, its in-house supercomputer for machine learning.

The idea isn't to help Tesla build cars, but to drive them. The company thinks that driving is such a complicated, fast-moving task that it requires extremely powerful hardware to train machine learning models for it at scale. As top Tesla engineer and DOJO head Ganesh Venkataramanan put it: "real world data processing is only feasible through machine learning techniques."

And that, in turn, requires computers of a kind we haven't seen before.

For the technorati, DOJO is an incredible machine: a single training tile, or processing unit of this supercomputer, can reach 1 exaflop in computing speed, twice that of leading current supercomputers like the Japanese Fugaku. (An exaflop is one quintillion, or 10¹⁸, floating-point operations per second.)

The DOJO seems to be the result of Tesla's frustration with the lack of scalable computing systems that can handle the intense and unexpected challenges of truly self-driving cars, and it's a reminder that one of our most straightforward human activities is very hard to automate safely. – Mohar Chatterjee

Bias in AI systems is an increasingly important policy issue, but the real mechanics of how that bias creeps in, and how it works, are frustratingly difficult for non-experts to understand.

But not impossible. In June, Vox published a video explaining the popular new AI image generators like DALL-E 2. For the uninitiated, DALL-E 2 is a tool that lets you describe an image you want to see (any image) and will create sometimes beautiful, occasionally haunting and often disconcertingly photorealistic renderings of your prompt.

Starting at 5:58, the video does a particularly good job explaining how the model actually works, and the biases inherent in the process. Much of the bias stems from the data the model is trained on: in this case, hundreds of millions of random images scraped off the internet. But the latent space in which the model figures things out is so complex that it is nearly impossible for humans to perceive what, or even how, it's learning.

And if you've been watching the explosion of misinformation in politics over the past few years and you're worried that a tool like this could be misused, you're not the only one. – Mohar Chatterjee

Stay in touch with the whole team: Ben Schreckinger, Derek Robertson, Mohar Chatterjee, Konstantin Kakaes and Heidi Vogt. Follow us on Twitter @DigitalFuture.

Ben Schreckinger covers tech, finance and politics for POLITICO; he is an investor in cryptocurrency.

If you've had this newsletter forwarded to you, you can sign up here. And read our mission statement here.
