Facebook, AWS team up to produce open-source PyTorch AI libraries, grad student says he successfully used GPT-2 to write his homework

Roundup Hello El Reg readers. If you're stuck inside, and need some AI news to soothe your soul, here's our weekly machine-learning roundup.

Nvidia GTC virtual keynote coming to YouTube: Nvidia cancelled its annual GPU Technology Conference in Silicon Valley in March over the ongoing coronavirus pandemic. The keynote speech was promised to be screened virtually, and then that got canned, too. Now, it's back.

CEO Jensen Huang will present his talk on May 14 on YouTube at 0600 PT (1300 UTC). Yes, that's early for people on the US West Coast. And no, Jensen isn't doing it live at that hour: the video is prerecorded.

Still, graphics hardware and AI fans will probably want to keep an eye on the presentation. Huang is expected to unveil specs for a new GPU architecture, reportedly named the A100, said to be more powerful than the current Tesla V100 chips. You'll be able to watch the keynote when it comes out on Nvidia's YouTube channel, here.

Also, Nvidia has partnered up with academics at King's College London to release MONAI, an open-source AI framework for medical imaging.

The framework packages together tools to help researchers and medical practitioners process image data for computer vision models built with PyTorch, covering tasks like segmenting features in 3D scans or classifying objects in 2D images.
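
To give a flavour of the sort of pipeline MONAI is aimed at, here's a minimal sketch that loads a 3D scan, normalises it, and runs it through a segmentation network. This is illustrative only: it assumes the monai Python package, exact transform and network names vary between releases, and scan.nii.gz is a placeholder file.

```python
# A minimal sketch of a MONAI-style medical imaging pipeline (assumed API;
# names may differ between MONAI versions, and the input file is a placeholder).
import torch
from monai.transforms import Compose, LoadImage, EnsureChannelFirst, ScaleIntensity
from monai.networks.nets import UNet

# Preprocessing: read a NIfTI volume, add a channel axis, normalise intensities.
preprocess = Compose([
    LoadImage(image_only=True),
    EnsureChannelFirst(),
    ScaleIntensity(),
])

# A small 3D U-Net that separates background from a feature of interest.
net = UNet(
    spatial_dims=3,
    in_channels=1,
    out_channels=2,
    channels=(16, 32, 64, 128),
    strides=(2, 2, 2),
)

volume = preprocess("scan.nii.gz")              # shape: (1, D, H, W)
with torch.no_grad():
    logits = net(volume.unsqueeze(0).float())   # add a batch dimension
print(logits.shape)
```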

"Researchers need a flexible, powerful and composable framework that allows them to do innovative medical AI research, while providing the robustness, testing and documentation necessary for safe hospital deployment," said Jorge Cardoso, chief technology officer of the London Medical Imaging & AI Centre for Value-based Healthcare. "Such a tool was missing prior to Project MONAI."

You can play with MONAI on GitHub here, or read more about it here.

New PyTorch libraries for ML production: Speaking of PyTorch, Facebook and AWS have collaborated to release a couple of open-source goodies for deploying machine-learning models.

There are now two new libraries: TorchServe and TorchElastic. TorchServe provides tools to manage and perform inference with PyTorch models. It can be used in any cloud service, and you can find the instructions on how to install and use it here.
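
As a rough illustration of how the serving side looks once a model is up, here's a sketch that queries TorchServe's default REST inference endpoint from Python. The model name my_classifier and the input file are hypothetical, and the default port is assumed to be unchanged.

```python
# Query a (hypothetical) model registered with a running TorchServe instance
# via its default inference API; the model name and image file are placeholders.
import requests

with open("kitten.jpg", "rb") as f:
    resp = requests.post(
        "http://localhost:8080/predictions/my_classifier",
        data=f,
    )

print(resp.status_code)
print(resp.text)   # e.g. JSON with the model's predictions
```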

TorchElastic allows users to train large models over a cluster of compute nodes with Kubernetes. The training is distributed so that if some servers go down for maintenance or drop out due to network issues, the job isn't completely interrupted. It can be used on any cloud provider that supports Kubernetes. You can read how to use the library here.
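
For a sense of what a worker script looks like under TorchElastic, here's a minimal sketch of a distributed training loop. It assumes the elastic launcher has set the usual environment variables (RANK, WORLD_SIZE, MASTER_ADDR and so on); the model and data are stand-ins rather than anything from the library's own examples.

```python
# A minimal worker script of the kind an elastic launcher would start on each
# node. The launcher is assumed to supply RANK, WORLD_SIZE, MASTER_ADDR, etc.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # init_process_group picks up the launcher-provided env vars via its
    # default "env://" init method.
    dist.init_process_group(backend="gloo")   # use "nccl" on GPU nodes

    model = DDP(torch.nn.Linear(10, 2))       # placeholder model
    opt = torch.optim.SGD(model.parameters(), lr=0.01)

    for step in range(100):
        x = torch.randn(32, 10)               # placeholder batch
        loss = model(x).sum()
        opt.zero_grad()
        loss.backward()
        opt.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```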

"These libraries enable the community to efficiently productionize AI models at scale and push the state of the art on model exploration as model architectures continue to increase in size and complexity," Facebook said this week.

MIT stops working with blacklisted AI company: MIT has discontinued its five-year research collaboration with iFlyTek, a Chinese AI company the US government flagged as being involved in the ongoing persecution of Uyghur Muslims in China.

Academics at the American university made the decision to cut ties with the controversial company in February. iFlyTek was placed on the US Bureau of Industry and Security's Entity List alongside 27 other Chinese organizations; the list forbids American outfits from doing business with those on it without Uncle Sam's permission, and breaking the rules results in sanctions.

"We take very seriously concerns about national security and economic security threats from China and other countries, and human rights issues," said Maria Zuber, vice president of research at MIT, as first reported by Wired.

MIT entered a five-year deal with iFlyTek in 2018 to collaborate on AI research focused on human-computer interaction, speech recognition, and computer vision.

The relationship soured when it was revealed iFlyTek was helping the Chinese government build a mass automated voice recognition and monitoring system, according to the non-profit Human Rights Watch. That technology was sold to police bureaus in Xinjiang and Anhui; Xinjiang is home to the majority of China's Uyghur population.

OpenAI's GPT-2 writes university papers: A cheeky master's degree student admitted this week to using OpenAI's giant language model GPT-2 to help write his essays.

The graduate student, named only as Tiago, was interviewed by Futurism. We're told that although he passed his assignments using the machine-learning software, he said the achievement was down to failings within the business school rather than to the prowess of state-of-the-art AI technology.

In other words, his homework wasn't too rigorously marked at this particular unnamed business school, allowing him to pass off machine-generated write-ups of varying quality as his own work. And GPT-2's output does vary in quality, depending on how you use it.

"You couldn't write an essay on science that could be anywhere near convincing using the methods that I used," he said. "Many of the courses that I take in business school wouldn't make it possible as well.

"However, some particular courses are less information-dense, and so if you can manage to write a few pages with some kind of structure and some kind of argument, you can get through. Its not that great of an achievement, I would say, for GPT-2.

Thanks to the Talk to Transformer tool, anyone can use GPT-2 in a web browser. Tiago would feed opening sentences to the model, then copy and paste the machine-generated responses into his essays.
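
For anyone who'd rather script that workflow than use the web page, the same prompt-and-continue trick looks roughly like this with the Hugging Face transformers library. To be clear, this is our own sketch rather than anything Tiago ran, the prompt is invented, and it assumes a reasonably recent version of the library.

```python
# A rough sketch of the prompt-and-continue workflow using GPT-2 via the
# transformers library (our illustration; the prompt is made up).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The main driver of value creation in modern firms is"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; top-k and temperature keep it from looping too badly.
output = model.generate(
    **inputs,
    max_length=200,
    do_sample=True,
    top_k=50,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```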

GPT-2 is pretty convincing at first: it has a good grasp of grammar, and there is some level of coherency in its opening paragraphs when responding to a statement or question. Its output quality begins to fall apart, becoming incoherent or absurd, as it rambles in subsequent paragraphs. It also doesn't care about facts, which is why it won't be good as a collaborator for subjects such as history and science.
