Researchers open-source neural network with 176B parameters – SiliconANGLE News

A group of researchers today released Bloom, an advanced natural language processing model that features 176 billion parameters.

The researchers have made the code for Bloom available under an open-source license.

The project began last year as a collaboration between Hugging Face Inc., an artificial intelligence startup that recently raised $100 million from investors, and two supercomputing organizations in France. Hugging Face and its partners formed a research group called BigScience to lead the development of Bloom. More than 1,000 researchers from more than 70 countries participated in the effort.

Bloom supports 46 languages and 13 programming languages, BigScience researchers wrote in a blog post today. The AI can answer questions, summarize text, extract snippets of information from documents and perform a variety of other tasks. Bloom's versatility is partly a result of its scale: the model features 176 billion parameters.
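For developers who want to experiment with the model, a minimal sketch of what usage could look like through the Hugging Face "transformers" library is shown below. The "bigscience/bloom" model identifier and the example prompt are illustrative assumptions, and the full model is far too large to load on a typical workstation, so a smaller released checkpoint would be needed in practice.

```python
# Hypothetical usage sketch: loading the model through the Hugging Face
# "transformers" library, assuming it is published on the Hub under an
# identifier such as "bigscience/bloom".
from transformers import pipeline

# Note: the full 176B-parameter model requires far more memory than a
# typical machine; a smaller checkpoint would be substituted in practice.
generator = pipeline("text-generation", model="bigscience/bloom")

# Question answering and summarization are framed as text generation:
# the model simply continues whatever prompt it is given.
prompt = "Summarize in one sentence: BLOOM is an open multilingual language model."
result = generator(prompt, max_new_tokens=50)
print(result[0]["generated_text"])
```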

Parameters are the settings a neural network learns during training that determine how it goes about performing a computing task. The more such settings an AI system includes, the more advanced the tasks it's capable of performing. With 176 billion parameters, Bloom is one of the most sophisticated natural language processing models in the world.
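To make the notion of a parameter concrete, the sketch below counts the learned weights of a small toy network in PyTorch. The layer sizes are arbitrary examples and have nothing to do with Bloom's actual architecture.

```python
# Illustrative sketch of what "parameters" means in practice: the learned
# weights of a network. The layer sizes here are arbitrary, not Bloom's.
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
)

# Every weight and bias counts toward the parameter total.
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params:,} parameters")  # roughly 8.4 million for this toy model
```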

Bloom features more parameters than the advanced GPT-3 neural network that OpenAI LLC detailed in 2020. Like Bloom, GPT-3 is optimized for natural language processing use cases. It's also capable of performing other tasks such as generating software code.

BigScience researchers trained Bloom using the Jean Zay supercomputer near Paris. The supercomputer, which includes AI-optimized graphics cards from Nvidia Corp., has a top speed of more than 28 petaflops. One petaflop equals a quadrillion calculations per second.
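As a rough sense of scale, the back-of-envelope arithmetic below multiplies the machine's peak speed by the length of the training run. The 30% sustained-utilization figure is a hypothetical assumption, and the calculation treats the entire supercomputer's peak as available, which overstates the share actually allocated to the project.

```python
# Back-of-envelope arithmetic only: how much raw compute 117 days on a
# 28-petaflop machine represents. The utilization figure is an assumption
# for illustration, not a number reported by BigScience.
peak_flops_per_second = 28e15          # 28 petaflops = 28 quadrillion ops/s
seconds = 117 * 24 * 3600              # 117-day training run
assumed_utilization = 0.30             # hypothetical sustained fraction of peak

total_flops = peak_flops_per_second * seconds * assumed_utilization
print(f"~{total_flops:.2e} floating-point operations")  # on the order of 10^22
```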

"This is the culmination of a year of work involving over 1,000 researchers from 70+ countries and 250+ institutions, leading to a final run of 117 days (March 11 to July 6) of training," BigScience researchers detailed today. The development effort was supported by a compute grant worth an estimated €3 million from French research agencies CNRS and GENCI, they elaborated.

Alongside the code for Bloom, the BigScience research group open-sourced some of the technical data that was produced during the development process. Developers can run Bloom on their own hardware or access a hosted version of the AI through an application programming interface provided by BigScience.
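For the hosted route, a request to an inference endpoint might look like the sketch below. The URL follows the Hugging Face Inference API convention and, like the model identifier and the placeholder access token, is an assumption rather than a documented BigScience endpoint.

```python
# Sketch of calling a hosted inference endpoint over HTTP. The URL below
# follows the Hugging Face Inference API convention and is an assumption,
# as is the "bigscience/bloom" model identifier.
import requests

API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"
headers = {"Authorization": "Bearer <your-access-token>"}  # placeholder token

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Translate to French: The model is open source."},
)
print(response.json())
```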

In the future, the research group plans to develop a new version of Bloom with even more advanced capabilities. BigScience intends to add support for more languages and optimize the AI to make it easier to run on a company's own infrastructure. The group will also develop additional AI systems with more complex architectures than Bloom.
