New Audis To Use Supercomputer That Controls Almost Everything – Motor1

As the mad dash for technology and innovation among automakers surges forward, new cars are becoming even more complex. In an effort to simplify how advanced components like the powertrain, chassis, and safety systems work together, the next crop of Audi cars will have much bigger brains.

That may sound like an oversimplification, but that's what Audi is getting at here. Today, the company showed off plans to incorporate a much more sophisticated computer called the Integrated Vehicle Dynamics computer, which functions as the central facility for the car's dynamic systems. Audi's new central computer system is ten times more powerful than the one found in current models and will be able to control up to 90 different systems, depending on vehicle application.

The new vehicle dynamics computer will find its way into the entire Audi lineup. Everything from the compact A3 to the massive Q8 SUV will get the futuristic hardware, including the all-electric E-Tron. Audi says that this speaks to the dynamic computer's versatility in that it's designed to work with cars of all performance thresholds.

In Audi's electric cars, the computer will monitor and control important systems such as brake regeneration, which contributes up to 30 percent of the battery's potential range. In hot rods like the RS models, trick systems like anti-roll stabilization and active suspension will rely on the vehicle dynamics computer for power. This also marks the first time that the chassis and powertrain controls will be housed within one component, something that Audi says will bring a greater range of comfort and performance in its vehicles.

Audi didn't specify when the new dynamics computer would enter its product line, but an engineer mentioned that the component is ready for serialized production, which means it's happening soon. This doesn't mean much for the driver and how they interact with the car, but it's one of several recent announcements from Audi that make us even more excited for future models.

Japanese supercomputer ranked as world's most powerful system

A Japanese supercomputer built with technology from Arm Ltd, whose chip designs power most of the world's smartphones, has taken the top spot among the world's most powerful systems, displacing one powered by IBM chips.

The Fugaku supercomputer, a system jointly developed by Japanese research institute RIKEN and Fujitsu in Kobe, Japan, took the highest spot on the TOP500 list, a twice-yearly listing of the world's most powerful computers, its backers said on Monday. The chip technology comes from Arm, which is headquartered in the UK but owned by Japan's SoftBank.

The previous top-ranked system as of November 2019 was at Oak Ridge National Laboratory in the US with chips designed by IBM. Chips from Intel and IBM had dominated the top 10 rankings, with the lone exception of a system at the National Supercomputing Center in Wuxi, China, powered by a Chinese-designed chip.

Governments use supercomputers to simulate nuclear blasts for virtual weapons testing. They are also used for modeling climate systems and for biotechnology research. The Fugaku supercomputer will be used in such research as part of Japan's Society 5.0 technology program.

"I very much hope that Fugaku will show itself to be highly effective in real-world applications and will help to realize Society 5.0," Naoki Shinjo, corporate executive officer of Fujitsu, said in a statement.

The Arm-based system in Japan had, in November, taken the highest spot on TOP500's list of power-efficient supercomputers. Arm said the system also took the top spot on a list designed to closely resemble real-world computing tasks, known as the high-performance conjugate gradient (HPCG) benchmark.
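
The HPCG benchmark centers on the conjugate gradient method, an iterative solver for systems Ax = b with a symmetric positive-definite matrix A. As a rough illustration of the kernel it exercises, here is a minimal dense NumPy sketch; the real benchmark runs a far larger sparse, distributed variant.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Iteratively solve Ax = b for symmetric positive-definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Toy SPD system: A = M^T M + I is symmetric positive-definite
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 100))
A = M.T @ M + np.eye(100)
b = rng.standard_normal(100)
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b, atol=1e-6))  # True: the solver converged
```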

Top 10 Supercomputers | HowStuffWorks

If someone says "supercomputer," your mind may jump to Deep Blue, and you wouldn't be alone. IBM's silicon chess wizard defeated grandmaster Garry Kasparov in 1997, cementing it as one of the most famous computers of all time (some controversy around the win helped, too). For years, Deep Blue was the public face of supercomputers, but it's hardly the only all-powerful artificial thinker on the planet. In fact, IBM took Deep Blue apart shortly after the historic win! More recently, IBM made supercomputing history with Watson, which defeated "Jeopardy!" champions Ken Jennings and Brad Rutter in a special match.

Brilliant as they were, neither Deep Blue nor Watson would be able to match the computational muscle of the systems on the November 2013 TOP500 list. TOP500 calls itself a list of "the 500 most powerful commercially available computer systems known to us." The supercomputers on this list are a throwback to the early computers of the 1950s -- which took up entire rooms -- except modern computers are using racks upon racks of cutting-edge hardware to produce petaflops of processing power.

Your home computer probably runs on four processor cores. Most of today's supercomputers use hundreds of thousands of cores, and the top entry has more than 3 million.

TOP500 currently relies on the Linpack benchmark, which feeds a computer a series of linear equations to measure its processing performance, although an alternative testing method is in the works. The November 2013 list sees China's Tianhe-2 on top of the world. Every six months, TOP500 releases a list, and a few new computers rise into the ranks of the world's fastest. Here are the champions as of early 2014. Read on to see how they're putting their electronic mettle to work.
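
To illustrate the idea behind Linpack, though not the benchmark itself, the sketch below times a dense linear solve in NumPy and converts the elapsed time into an approximate flop rate using the standard (2/3)n^3 operation count for LU factorization.

```python
import time
import numpy as np

n = 2000
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)            # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

flops = (2 / 3) * n**3               # dominant operation count of the solve
print(f"residual: {np.linalg.norm(A @ x - b):.2e}")
print(f"~{flops / elapsed / 1e9:.1f} gigaflops on a {n}x{n} system")
```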

What are supercomputers currently used for? | HowStuffWorks

As we said, supercomputers were originally developed for code cracking, as well as ballistics. They were designed to make an enormous number of calculations at a time, which was a big improvement over, say, 20 mathematics graduate students in a room, hand-scratching operations.

In some ways, supercomputers are still used for those ends. In 2012, the National Nuclear Security Administration and Purdue University began using a network of supercomputers to simulate nuclear weapons capability. A whopping 100,000 machines are used for the testing [source: Appro].

But it's not just the military that's using supercomputers anymore. Whenever you check the weather app on your phone, the National Oceanic and Atmospheric Administration (NOAA) is using a supercomputer called the Weather and Climate Operational Supercomputing System to forecast weather, predict weather events, and track space and oceanic weather activity as well [source: IBM].

As of September 2012, the fastest computer in the world -- for now, anyway -- is IBM's Sequoia machine, which can perform 16.32 petaflops -- that is, more than 16,000 trillion operations per second. It's used for nuclear weapon security and to make large-scale molecular dynamics calculations [source: Walt].
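
For the arithmetic behind that figure: one petaflop is 10^15 floating-point operations per second, so 16.32 petaflops works out to 16,320 trillion operations per second.

```python
petaflops = 16.32
ops_per_second = petaflops * 10**15   # 1 petaflop/s = 10^15 operations per second
print(f"{ops_per_second / 10**12:,.0f} trillion operations per second")  # 16,320
```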

But supercomputers aren't just somber, intellectual machines. Some of them are used for fun and games -- literally. Consider World of Warcraft, the wildly popular online game. If a million people are playing WoW at a time, graphics and speed are of utmost importance. Enter the supercomputers, used to make the endless calculations that help the game go global.

Speaking of games, we can't forget Deep Blue, the supercomputer that beat chess champion Garry Kasparov in a six-game match in 1997. And then there's Watson, the IBM supercomputer that famously beat Ken Jennings in an intense game of "Jeopardy!" Currently, Watson is being used by a health insurer to predict patient diagnoses and treatments [source: Feldman]. A real jack of all trades, that Watson.

So, yes: We're still benefiting from supercomputers. We're using them when we play war video games and in actual war. They're helping us predict if we need to carry an umbrella to work or if we need to undergo an EKG. And as the calculations become faster, there's little end to the possibility of how we'll use supercomputers in the future.

Microsoft announces new supercomputer, lays out vision for …

"As we've learned more and more about what we need and the different limits of all the components that make up a supercomputer, we were really able to say, 'If we could design our dream system, what would it look like?'" said OpenAI CEO Sam Altman. "And then Microsoft was able to build it."

OpenAI's goal is not just to pursue research breakthroughs but also to engineer and develop powerful AI technologies that other people can use, Altman said. The supercomputer developed in partnership with Microsoft was designed to accelerate that cycle.

"We are seeing that larger-scale systems are an important component in training more powerful models," Altman said.

For customers who want to push their AI ambitions but who don't require a dedicated supercomputer, Azure AI provides access to powerful compute with the same set of AI accelerators and networks that also power the supercomputer. Microsoft is also making available the tools to train large AI models on these clusters in a distributed and optimized way.

At its Build conference, Microsoft announced that it would soon begin open sourcing its Microsoft Turing models, as well as recipes for training them in Azure Machine Learning. This will give developers access to the same family of powerful language models that the company has used to improve language understanding across its products.

It also unveiled a new version of DeepSpeed, an open source deep learning library for PyTorch that reduces the amount of computing power needed for large distributed model training. The update is significantly more efficient than the version released just three months ago and now allows people to train models more than 15 times larger and 10 times faster than they could without DeepSpeed on the same infrastructure.
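
Microsoft's post doesn't include code, but the general shape of a DeepSpeed training loop is public and simple. The sketch below shows roughly how a PyTorch model is handed to DeepSpeed; the config values are illustrative rather than the settings behind the announced speedups, and real jobs are normally launched with the deepspeed command-line launcher, which sets up the distributed environment.

```python
import torch
import deepspeed

model = torch.nn.Linear(1024, 1024)     # stand-in for a much larger network

ds_config = {                            # illustrative settings only
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},   # partition optimizer state across GPUs
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# DeepSpeed wraps the model and takes over optimizer, fp16, and partitioning
engine, optimizer, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

def train_step(batch, labels):
    loss = torch.nn.functional.mse_loss(engine(batch), labels)
    engine.backward(loss)   # handles loss scaling and gradient partitioning
    engine.step()           # optimizer step plus learning-rate schedule
```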

Along with the DeepSpeed announcement, Microsoft announced it has added support for distributed training to the ONNX Runtime. The ONNX Runtime is an open source library designed to enable models to be portable across hardware and operating systems. To date, the ONNX Runtime has focused on high-performance inferencing; today's update adds support for model training, as well as adding the optimizations from the DeepSpeed library, which enable performance improvements of up to 17 times over the current ONNX Runtime.
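
To make "high-performance inferencing" concrete, here is a minimal sketch of running an exported model with the ONNX Runtime Python API; the model path and input shape are placeholders for whatever network you have exported.

```python
import numpy as np
import onnxruntime as ort

# "model.onnx" is a placeholder for any exported network
session = ort.InferenceSession("model.onnx")

input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)  # e.g. one RGB image

outputs = session.run(None, {input_name: batch})  # None = fetch all outputs
print(outputs[0].shape)
```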

"We want to be able to build these very advanced AI technologies that ultimately can be easily used by people to help them get their work done and accomplish their goals more quickly," said Microsoft principal program manager Phil Waymouth. "These large models are going to be an enormous accelerant."

In self-supervised learning, AI models can learn from large amounts of unlabeled data. For example, models can learn deep nuances of language by absorbing large volumes of text and predicting missing words and sentences. Art by Craighton Berman.

Designing AI models that might one day understand the world more like people do starts with language, a critical component to understanding human intent, making sense of the vast amount of written knowledge in the world and communicating more effortlessly.

Neural network models that can process language, which are roughly inspired by our understanding of the human brain, aren't new. But these deep learning models are now far more sophisticated than earlier versions and are rapidly escalating in size.

A year ago, the largest models had 1 billion parameters, each loosely equivalent to a synaptic connection in the brain. The Microsoft Turing model for natural language generation now stands as the world's largest publicly available language AI model with 17 billion parameters.

This new class of models learns differently than supervised learning models that rely on meticulously labeled human-generated data to teach an AI system to recognize a cat or determine whether the answer to a question makes sense.

In what's known as self-supervised learning, these AI models can learn about language by examining billions of pages of publicly available documents on the internet: Wikipedia entries, self-published books, instruction manuals, history lessons, human resources guidelines. In something like a giant game of Mad Libs, words or sentences are removed, and the model has to predict the missing pieces based on the words around it.

As the model does this billions of times, it gets very good at perceiving how words relate to each other. This results in a rich understanding of grammar, concepts, contextual relationships and other building blocks of language. It also allows the same model to transfer lessons learned across many different language tasks, from document understanding to answering questions to creating conversational bots.
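
A toy version of that Mad Libs objective is easy to sketch. The snippet below performs only the masking step and prints the training target; a real model would score every vocabulary word for the masked slot and be trained to rank the original word highest.

```python
import random

sentence = "the model has to predict the missing pieces based on the words around it".split()

def mask_one(tokens, mask_token="[MASK]"):
    """Hide one word; the hidden word becomes the model's training target."""
    i = random.randrange(len(tokens))
    target = tokens[i]
    corrupted = tokens[:i] + [mask_token] + tokens[i + 1:]
    return corrupted, i, target

corrupted, position, target = mask_one(sentence)
print(" ".join(corrupted))
print(f"target at position {position}: {target!r}")
```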

"This has enabled things that were seemingly impossible with smaller models," said Luis Vargas, a Microsoft partner technical advisor who is spearheading the company's AI at Scale initiative.

The improvements are somewhat like jumping from an elementary reading level to a more sophisticated and nuanced understanding of language. But it's possible to improve accuracy even further by fine-tuning these large AI models on a more specific language task or exposing them to material that's specific to a particular industry or company.

"Because every organization is going to have its own vocabulary, people can now easily fine-tune that model to give it a graduate degree in understanding business, healthcare or legal domains," he said.

GE taps into US supercomputer to advance offshore wind – reNEWS

GE will be able to access one of the world's fastest supercomputers to help advance offshore wind power.

GE Research aerodynamics engineer Jing Li is leading a team that has been granted access to the Summit supercomputer at Oak Ridge National Laboratory (ORNL) in Tennessee, through the US Department of Energy's competitive Advanced Scientific Computing Research Leadership Computing Challenge programme.

The key focus of the supercomputing project will be to study coastal low-level jets, which produce a distinct wind velocity profile of potential importance to the design and operation of future wind turbines.

Using the Summit supercomputer system, the GE team will run simulations to study and inform new ways of controlling and operating offshore turbines to best optimise wind production.

Li said: "The Summit supercomputer will allow our GE team to run computations that would be otherwise impossible.

"This research could dramatically accelerate offshore wind power as the future of clean energy and our path to a more sustainable, safe environment."

As part of the project, the GE team will work closely with research teams at NREL and ORNL to advance the ExaWind platform.

ExaWind, one of the applications of the DoE's Exascale Computing Project, focuses on the development of computer software to simulate different wind farm and atmospheric flow physics.

These simulations provide crucial insights for engineers and scientists to better understand wind dynamics and their impact on wind farms.

Li said: "Scientists at NREL and ORNL are part of a broader team that have built up a tremendous catalogue of new software code and technical expertise with ExaWind, and we believe our project can discover critical new insights that support and validate this larger effort."

The ExaWind goal is to establish a virtual wind plant test bed that aids and accelerates the design and control of wind farms.

The Summit supercomputing system's computing power is equivalent to that of 70 million iPhone 11s, and it can help test and solve challenges in energy, artificial intelligence, human health and other research areas.

Li said: "We're now able to study wind patterns that span hundreds of metres in height across tens of kilometres of territory down to the resolution of airflow over individual turbine blades.

"You simply couldn't gather and run experiments on this volume and complexity of data without a supercomputer. These simulations allow us to characterise and understand poorly understood phenomena like coastal low-level jets in ways previously not possible."

Summit supercomputer to advance research on wind power for renewable energy – ZDNet

Scientists from GE are looking into the potential of offshore wind power to support renewable energy, using one of the world's fastest supercomputers to help progress the research.

The researchers have been granted access to the IBM Summit supercomputer at Oak Ridge National Laboratory (ORNL) in Tennessee through the US Department of Energy's (DOE) Advanced Scientific Computing Research Leadership Computing Challenge (ALCC) program.

The team plans to use supercomputer-driven simulations to conduct what project lead, GE Research aerodynamics engineer Jing Li, said would otherwise be infeasible research, aimed at improved efficiencies in offshore wind energy production.

"The Summit supercomputer will allow our GE team to run computations that would be otherwise impossible," Li said.

"This research could dramatically accelerate offshore wind power as the future of clean energy and our path to a more sustainable, safe environment."

GE said the main focus of the project will be to study coastal low-level jets, which it said produce a distinct wind velocity profile of potential importance to the design and operation of future wind turbines.

Using Summit, the GE team will run simulations to study and inform new ways of controlling and operating offshore turbines to best optimise wind production.

"We're now able to study wind patterns that span hundreds of meters in height across tens of kilometres of territory down to the resolution of airflow over individual turbine blades," Li added. "You simply couldn't gather and run experiments on this volume and complexity of data without a supercomputer. These simulations allow us to characterise and understand poorly understood phenomena like coastal low-level jets in ways previously not possible."

GE said the researchers will work closely with teams at the National Renewable Energy Laboratory (NREL) and ORNL to advance the ExaWind platform, which focuses on the development of computer software to simulate different wind farm and atmospheric flow physics.

The simulations, GE said, will allow the researchers to better understand wind dynamics and their impact on wind farms.

ExaWind is one of the applications of the DOE's Exascale Computing Project (ECP).

According to ECP director Doug Kothe, the goal is to establish a virtual wind plant test bed that aids and accelerates the design and control of wind farms and informs researchers' ability to predict the response of the farms to a given atmospheric condition.

"ExaWind's development efforts are building progressively from predictive petascale simulations of a single turbine to a multi-turbine array of turbines in complex terrain," Kothe added.

BSC Researchers Create Spin-Off Platform to Accelerate the Development of New Chemicals – HPCwire

Aug. 6, 2020 -- Two researchers from Barcelona Supercomputing Center (BSC), Mònica de Mier and Stephan Mohr, have created a new spin-off, Nextmol (Bytelab Solutions SL), which develops tools for atomistic simulation and data analysis to accelerate the design of new chemicals.

Using these tools, Nextmol characterizes the behavior of chemical molecules, predicts their performance and identifies the best candidate molecules to meet certain physicochemical properties, by means of the computer and without the need to synthesize the molecule.

"In this way, Nextmol shortens the path of innovation and makes chemical R&D more efficient compared to the traditional approach based exclusively on the laboratory," indicates Mònica de Mier, director of the company. De Mier adds that Nextmol's mission is to democratize the computational techniques in the chemical industry, accompanying it in its digital transformation, and thus contribute to its competitiveness.

Nextmol markets these tools through its software-as-a-service platform, which enables users to create the molecules and the system to be studied, build workflows with the sequence of calculations to be carried out, execute the calculations on a supercomputer and analyze the results. It is a collaborative, easy-to-use and cloud-based web platform, which enables computing resources to be immediately scaled to the volume of calculations required by users. "Thanks to the catalog of preconfigured workflows, the platform can be used by non-experts in computational chemistry," says Stephan Mohr, scientific director of Nextmol.

BSC is the main partner of this spin-off, which has been founded by researcher Stephan Mohr and by Mònica de Mier, previously head of business development in the CASE department of BSC. The team currently consists of five people. Nextmol also has the support of the Repsol Foundation through its startup acceleration program Entrepreneurs Fund. In addition, it has obtained the Startup Capital grant from ACCIÓ.

Nextmol leads the project BigDFT4CHEM that won the call for Spin-off Initiatives (SPI) launched by the EXDCI-2 consortium, has been one of the five startup finalists of the Entrepreneur Awards XXI organized by Caixabank and Enisa, has obtained the Seal of Excellence in the EIC Accelerator call, has been awarded the Tech Transfer Award granted by the European Network on High-performance Embedded Architecture and Compilation (HiPEAC) and has won a Eurolab4HPC Business Prototyping project.

About BSC

Barcelona Supercomputing Center-Centro Nacional de Supercomputación (BSC-CNS) is the national supercomputing centre in Spain. The center is specialised in high performance computing (HPC) and manages MareNostrum, one of the most powerful supercomputers in Europe, located in the Torre Girona chapel. BSC is involved in a number of projects to design and develop energy efficient and high performance chips, based on open architectures like RISC-V, for use within future exascale supercomputers and other high performance domains. The centre leads the pillar of the European Processor Initiative (EPI) creating a high performance accelerator based on RISC-V. More information: www.bsc.es

Source: Barcelona Supercomputing Center

Read the original here:

BSC Researchers Create Spin-Off Platform to Accelerate the Development of New Chemicals - HPCwire

Supercomputer study of mobility in Spain at the peak of COVID-19 using Facebook and Google data – Science Business

Researchers from the Barcelona Supercomputing Center have published a study based on mobility data from Google and Facebook at the peak of the COVID-19 pandemic, to demonstrate how this can be a sound source of information for epidemiological and socioeconomic analyses.

The data were collected between March 1 and June 27, from the phones of volunteers who had agreed to use tracking apps.

The findings show the Spanish population was closely following health guidelines and restrictions imposed by the government throughout the seven weeks of the study. Sunday was the day when mobility was at its lowest, which could point to it being the best day for lifting control measures. Meanwhile, Friday was the day when movement was most different from normal, suggesting extra support, and reminders of the need to adhere to control measures, is needed as the weekend begins.

The mobility data align with various announcements by the government on the state of the pandemic, the travel restrictions, and on the timing of easing lockdown measures, indicating analysis of tracking data could be used as a decision support tool to assess adherence and guide real time responses in future health crises.
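
The paper's exact pipeline isn't reproduced here, but the day-of-week comparison it describes amounts to a simple aggregation. As a hypothetical sketch, assuming a CSV with date and trips_index columns (the real Facebook and Google feeds use their own schemas):

```python
import pandas as pd

# Hypothetical schema: one row per region per day with a mobility index
df = pd.read_csv("mobility.csv", parse_dates=["date"])

df["weekday"] = df["date"].dt.day_name()
by_day = df.groupby("weekday")["trips_index"].mean().sort_values()

print(by_day)  # per the study, Sundays would show the lowest mean mobility
```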

Julia and PyCaret Latest Versions, arXiv on Kaggle, UK’s AI Supercomputer And More In This Week’s Top AI News – Analytics India Magazine

Every week, we at Analytics India Magazine aggregate the most important news stories that affect the AI/ML industry. Let's take a look at all the top news stories that took place recently. The following paragraphs summarise the news, and you can click on the hyperlinks for the full coverage.

This was one of the biggest news items of the week for all data scientists and ML enthusiasts. arXiv, the most comprehensive repository of research papers, recently announced that it is offering a free and open pipeline of its dataset, with all the relevant features like article titles, authors, categories, abstracts, full-text PDFs, and more. Now, with the machine-readable dataset of 1.7 million articles, the Kaggle community stands to benefit tremendously from this rich corpus of information.

The objective of the move is to promote developments in fields such as machine learning and artificial intelligence. arXiv hopes that Kaggle users can further drive the boundaries of this innovation using its knowledge base, and it can be a new outlet for the research community to collaborate on machine learning innovation. arXiv has functioned as the knowledge hub for public and research communities by providing open access to scholarly articles.
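
The Kaggle dump is distributed as JSON Lines, one article per line, so it can be scanned without loading all 1.7 million records into memory. In the sketch below, the filename and the cs.LG category filter are illustrative assumptions:

```python
import json

titles = []
# One JSON object per line; the exact filename is an assumption
with open("arxiv-metadata-oai-snapshot.json") as f:
    for line in f:
        paper = json.loads(line)
        if "cs.LG" in paper["categories"]:   # keep machine learning papers
            titles.append(paper["title"].strip())
        if len(titles) >= 10:                # stop after a small sample
            break

print("\n".join(titles))
```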

The India Meteorological Department (IMD) is aiming to use artificial intelligence in weather forecasting. The use of AI here is particularly focused on issuing nowcasts, which can help in almost real-time (3-6 hour) prediction of severe weather episodes, Director-General Mrutyunjay Mohapatra said last week. In this regard, IMD has invited research firms to evaluate how AI can add value to weather forecasting.

Weather forecasting has typically been done with physical models of the atmosphere, which are sensitive to perturbations and therefore become inaccurate over longer periods. Since machine learning methods are more robust to perturbations, researchers have been investigating their application in weather forecasting to produce more precise predictions over substantial periods of time. "Artificial intelligence helps in understanding past weather models, and this can make decision-making faster," Mohapatra said.

PyCaret, the open-source low-code machine learning library in Python, has come up with a new version, PyCaret 2.0. The design and simplicity of PyCaret are inspired by the emerging role of citizen data scientists: users who can perform both simple and moderately sophisticated analytical tasks that would previously have required more expertise.

The latest release aims to reduce the hypothesis to insights cycle time in an ML experiment and enables data scientists to perform end-to-end experiments quickly and efficiently. Some major updates in the new release of PyCaret include features like Logging back-end, Modular Automation, Command Line Interface (CLI), GPU enabled training, and Parallel Processing.
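
The low-code claim is concrete: in PyCaret 2.x, preparing an experiment and ranking candidate models takes a couple of calls. A minimal sketch using one of the library's bundled demo datasets:

```python
from pycaret.datasets import get_data
from pycaret.classification import setup, compare_models

data = get_data("juice")                           # small built-in demo dataset
setup(data=data, target="Purchase", silent=True)   # one call prepares the whole pipeline
best_model = compare_models()                      # trains and ranks many models
print(best_model)
```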

Global mobile device manufacturer and technology solutions company Nokia said it would set up a robotics lab at the Indian Institute of Science to drive research on 5G and emerging-technology use cases. The lab, hosted under the Nokia Center of Excellence for Networked Robotics, will serve as an interdisciplinary laboratory powering socially relevant use cases across areas like disaster and emergency management, farming, and manufacturing automation.

Apart from research activity, the lab will also promote engagement among ecosystem partners and startups in generating end-to-end use cases. This will also include Nokia student fellowships, which will be granted to select IISc students who engage in the advancement of innovative use cases.

Julia recently released a new version. The release introduces many new features and performance enhancements for users, including struct layout and allocation optimisations, multithreading API stabilisation and improvements, per-module optimisation levels, latency improvements, making the Pkg protocol the default, automated rr-based bug reports, and more.

It has also brought about some impressive algorithmic improvements for some popular cases such as generating normally-distributed double-precision floats.

In an important update relating to technology infrastructure, the Ministry of Electronics and Information Technology (MeitY) may soon launch a national policy framework for building data centres across India. Keeping in sync with the demands of India's burgeoning digital sector, the national data centre framework will make it easy for companies to establish the hardware necessary to support rising data workloads, and support business continuity.

The data centre policy framework will focus on the usage of renewable power, state-level subsidies in electricity costs for data centres, and easing other regulations for companies. According to a report, the national framework will boost the data centre industry in India and facilitate single-window clearance for approvals.

A new commission has been formed by Oxford University to advise world leaders on effective ways to use Artificial Intelligence (AI) and machine learning in public administration and governance.

The Oxford Commission on AI and Good Governance (OxCAIGG) will bring together academics, technology experts and policymakers to analyse the AI implementation and procurement challenges faced by governments around the world. Led by the Oxford Internet Institute, the Commission will make recommendations on how AI-related tools can be adapted and adopted by policymakers for good governance now and in the near future. Its report outlines four significant challenges relating to AI development and application that need to be overcome for AI to be put to work for good governance and leveraged as a force for good in government responses to the COVID-19 pandemic.

The University of Oxford has partnered with Atos to build the UK's largest AI-focused supercomputer. The AI supercomputer will be built on the Nvidia DGX SuperPOD architecture and comprises 63 nodes. The deal with Atos is worth £5 million ($6.5 million) and is funded by the Engineering and Physical Sciences Research Council (EPSRC) and the Joint Academic Data Science Endeavour, a consortium of 20 universities and the Turing Institute.

Known as JADE2, the AI supercomputer aims to build on the success of the current JADE facility, a national resource in the United Kingdom which provides advanced GPU computing capabilities to AI and machine learning experts.

Audi To Over-Complicate Cars With Supercomputers And Repair Costs Could Skyrocket – Top Speed

In all honesty, just about every automaker out there is making a run toward new technology and innovation. It makes our lives easier and safer, they say. And, sometimes it does. The fact that our cars can now automatically control torque distribution between wheels and braking force as needed to prevent the loss of control is amazing. But, you have to take the good with the bad, and the bad in this case is that replacing electronics when they fail is expensive, especially on newer cars.

But, with everything separated into somewhat individual units, a single failure doesn't necessarily mean your car is undrivable. Audi's new Integrated Vehicle Dynamics Computer, on the other hand, could change all that.

Audi's Integrated Vehicle Dynamics Computer is far more sophisticated than anything we have in cars in 2020, even when you look at the most advanced cars like the Tesla Model S or Porsche Taycan. The IVDC in future Audis will serve as a central facility or hub for all the car's dynamic systems, from safety features like automatic braking and stability control to engine management and door lock control.

Audi claims that its new IVDC is ten times more powerful than the computers found in current models and will be able to control up to 90 different systems.

I bet you didn't know that your car had 90 different controllable systems built into it, did you?

In just a short time from now, Audi's new IVDC will land in every car in the brand's lineup, from the compact A3 all the way up to the Q8 SUV, and even its entire offering of EVs.

To give you an example of some of the things the IVDC will control, important systems like torque vectoring and brake regeneration will be on the priority list in electric cars. Performance cars with the RS badge will see it control anti-roll stabilization, active suspension, and engine control.

In short, the IVDC will mark the very first time in automotive history that the chassis and powertrain are controlled by the same computer.
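
Audi has published no details of the IVDC's software interface, so the following is purely a hypothetical sketch of the architectural idea, a single hub arbitrating between chassis and powertrain subsystems; every name and threshold in it is invented.

```python
class CentralDynamicsComputer:
    """Hypothetical sketch: one hub dispatching to vehicle subsystems."""

    def __init__(self):
        self.subsystems = {}   # name -> handler callable

    def register(self, name, handler):
        self.subsystems[name] = handler

    def update(self, sensor_data):
        # A single control loop can trade off chassis vs. powertrain goals
        commands = {}
        if sensor_data["lateral_g"] > 0.8:
            commands["active_suspension"] = "stiffen"
            commands["torque_vectoring"] = "outer_wheel_bias"
        if sensor_data["brake_pedal"] and sensor_data["battery_soc"] < 0.95:
            commands["brake_regen"] = "max_recuperation"
        for name, command in commands.items():
            self.subsystems[name](command)

hub = CentralDynamicsComputer()
for system in ("active_suspension", "torque_vectoring", "brake_regen"):
    hub.register(system, lambda cmd, s=system: print(f"{s}: {cmd}"))

hub.update({"lateral_g": 0.9, "brake_pedal": True, "battery_soc": 0.6})
```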

It's a big step forward, and Audi claims that it will bring a greater range of performance and comfort to its vehicles, but that's only the good side of things.

All of this sounds good in theory, but as a mechanic, I can't help but think about repair costs. Replacing certain control modules on cars today can already be very expensive, so the thought of having everything housed in one unit is concerning. A single failure of the IVDC can render your new car inoperable and, to top it off, the company has you over a barrel once your warranty has passed. Should that IVDC experience any type of failure, you may have no choice but to replace it or be stuck with a car you can't drive, potentially one that you're still making payments on. With this being proprietary and new technology, there won't be an aftermarket offering for some time to come, and since it's a must-have, Audi will either be able to charge you a small fortune for replacement or push you to trade in and buy a new car.

I like the idea in theory, and maybe it'll work out well, but as an all-new technology, there will be flaws, and until those are ironed out, things could be very dicey. Fortunately, all cars equipped will have some kind of warranty as a bit of a safety shield, but in the end, replacement down the road will still end up being a lot more expensive than replacing one of many stand-alone control units in the event of a random failure.

Supercomputer COVID-19 insights, ionic spiderwebs, the whiteness of AI – TechCrunch – Best gaming pro

Zoom lets you virtually meet with friends, family members, and colleagues. With a few clicks, the people you want to see pop up on your screen, each of whom you can collaborate with or even just chat.

Remote workers have long used Zoom, but now it's becoming known for connecting students, families, and offices. If you have never used Zoom before, here are a few tips and tricks to make using Zoom a lot easier.

If your workspace at home isn't quite so tidy, you can apply a background to virtually clean up the real clutter. It's like being a news anchor sitting in front of a green screen. Zoom's selection of landscapes can add a bit of whimsy to your meetings too.

Executives who need to portray a more professional look when doing video calls with clients can even add their company's logo for use as a background, making it look like you're having the call in an actual conference room at work.

The feature is simple to use, and the camera on your handset or computer can apply the background even if you're not sitting in front of a green screen. Here's how to get started:

Step 1: Launch Zoom on your computer.

Step 2: Go to the cog button in the upper right-hand corner of your display to launch the Settings menu.

Step 3: Select Virtual Background in the left menu pane.

Step 4: After you do that, you can choose from a variety of built-in backgrounds, like a scene from the beach, a view of San Francisco's iconic Golden Gate Bridge, or even the Aurora Borealis. A live preview will show how you'll look in front of the background.

Step 5: To choose your own custom background, click on the + icon next to Choose Virtual Background. The option will let you upload your own custom video or photo for use as a virtual background. If you have a video of an aquarium, you can conduct your meeting in front of what would appear to be a live fish tank, for instance. If you need inspiration for some fun animated backgrounds to use, Lightricks, the maker of the popular photo editing app Facetune, has uploaded some content to Dropbox that you can use.

Using a live background will reveal some artifacts around the edges, which can look uneven if you're moving around a lot during video calls. Additionally, virtual backgrounds shouldn't be used if you're planning on demonstrating or pointing to things with your hands, as hands get canceled out with the use of virtual backgrounds. The background will appear smoother if you're sitting in front of a green screen.

Step 6: If your video looks a little weird, you should try adjusting the brightness in the video settings. It's a relatively new feature in Zoom that can help your backgrounds look a little more natural.

From years of experience using a computer, we all know the popular keyboard shortcuts for copy, paste, and undo. Zoom has its own set of shortcuts that help you quickly mute and unmute the microphone, start or stop your camera, and more. There are a number of shortcuts you can enable and use, but here are some of the most popular shortcuts to get you started in Zoom.

The shortcut commands are listed for Windows PC users. Mac users will want to substitute the Apple or Command key for the Alt key above.

A full list of the keyboard shortcuts can be found by navigating to the Zoom settings menu and choosing Keyboard Shortcuts in the left pane.

You may not be a newscaster on the evening news, but you'll still want to look your best during your virtual meetings with colleagues. A feature borrowed from the beauty mode of the selfie cameras on many popular Android smartphones, Zoom's Touch Up My Appearance helps to smooth out your skin, remove the dark bags under your eyes, and help you look your best.

Best of all, the results look fairly natural, so you don't appear as an over-sharpened blob of pixels when viewed on the screens of fellow collaborators.

Step 1: Go to Zoom's settings menu.

Step 2: Click on the Video option in the left panel.

Step 3: Under My Video, select the option for Touch Up My Appearance.

Pro tip: If you have the internet bandwidth at home, improve your stream by also selecting Enable HD under the Video option. Additionally, if you're meeting with a large team, you can expand the gallery view of your videoconferencing session by choosing Display Up to 49 Participants Per Screen in Gallery View. By default, Zoom only shows 25 participants per screen, but you can view more if you select this option and have fast enough internet at home.

If your business subscribes to a more advanced Zoom plan with cloud recording, you can record your meeting's audio to the cloud. Zoom's AI will help transcribe your meeting, complete with timestamps, and save the transcript as a .vtt text file. The meeting notes can be edited, if needed, for accuracy.

When you review your meeting video, there's even an option to display the transcription directly within the video, making it look like closed captioning.

Step 1: Open the Zoom web portal and sign in.

Step 2: Click on the Recordings tab on the left-hand side and choose Cloud Recordings. You'll need a premium Zoom account to use this feature, so you may have to check with your IT administrator or manager to see if your business is a subscriber.

Step 3: Enable Audio Transcript under Cloud Recordings, and save your changes.

Step 4: When you start a meeting, you'll want to hit the Record button and choose Record to the Cloud.

Step 5: After the conclusion of the meeting, you'll receive an email alerting you that the transcript is ready.

Although its possible youll not have entry to a projector, you may nonetheless make shows and present assembly attendees whats in your display. You can begin a display sharing by hitting the Alt+S keyboard shortcut, which is able to lower out of your webcam feed to displaying to all assembly contributors what youre seeing in your display.

That is helpful when youre attempting to indicate an essential graphics, for instance, in a gathering, or need to show a PowerPoint deck. Should youre the speaker, you may allow extra settings to offer an much more polished presentation.

Step 1: From the Zoom Settings menu, navigate to Share Display screen.

Step 2: Allow Enter Full Display screen When a Participant Shares a Display screen.

Step three: If you wish to present a video feed of the speaker together with the shared display, you can too allow Aspect-by-Aspect Mode.

Step four: Make sure you choose the Silence System Notifications When Sharing Desktop. This manner, your presentation is not going to get interrupted by all of your alerts and system chimes. Additionally, professional presenters may additionally need to allow Home windows 10s Focus Help characteristic, which is able to assist conceal notification sounds and pop-ups from the system or your electronic mail and calendar apps.

Should youre collaborating on a mission, you can too use a digital whiteboard to work collectively. When youre in a gathering, you may click on on the Display screen Share button after which select Whiteboard. After that, annotation instruments will seem, permitting you to attract and plan concepts and tasks with collaborators. Whiteboard periods will be saved as separate photographs or compiled right into a single PDF. Within the Whiteboard management, hit the Save button, and you may select the format you want. Enter an electronic mail deal with to share the saved whiteboard.

Be certain to have what youre attempting to current queued up and able to go earlier than the assembly begins. This manner, you may leap into the PowerPoint presentation, Excel spreadsheet, or net doc instantly whenever you hit the display share button.

And when youre something like me and have a cluttered desktop, use Home windows 10s Desktop characteristic to launch Zoom in a brand new, clear, clutter-free desktop area. This manner, youre not revealing any confidential recordsdata you could have saved in your desktop, and also youll additionally look extra skilled within the course of.

https://www.youtube.com/watch?v=XhZW3iyXV9U

While impromptu meetings are sometimes needed for big changes, if you're working on long-term projects, it can be helpful to reserve some virtual time for regular Zoom check-ins.

If you have regular project status meetings, you can schedule them in advance. This way, everyone will have a set time and virtual place to convene on a weekly basis.

Step 1: From the Zoom home screen, click Schedule.

Step 2: Fill in the meeting details, including time, topic, and duration. You can also set this as a Recurring meeting.

Step 3: If it's a confidential meeting, assign a meeting password to be shared with participants.

Step 4: If you're creating the meeting, you may also want to explore the Advanced Options. There, you can Enable Waiting Room and Enable Join Before Host to allow attendees to join the meeting early. An important option to select is Mute Participants on Entry, to minimize distracting background noise.

You can also download the Chrome browser extension and Microsoft Outlook extension to schedule meetings without having to launch the Zoom app.

Useful for managing larger groups, breakout rooms let you split a team into smaller ones to tackle different areas of a big project simultaneously. For example, if you're working on a new product launch, you can schedule a product launch meeting and then create a breakout room for the marketing team to come up with a marketing plan, another for the finance team to run models and make sales projections, and a third room for sales to come up with a sales strategy.

Up to 50 separate breakout rooms can be created, and the host can pre-assign attendees to specific rooms. A total of 200 participants can join across all the rooms, and each room will function like a stand-alone Zoom meeting. Cloud recording, however, is limited to the main room. Like some of the more advanced Zoom features, your organization will need to be a subscriber to a paid plan to have this feature.

Step 1: Sign in to the Zoom web portal.

Step 2: Click on Settings.

Step 3: Under the Meeting tab, make sure the Breakout Room option is enabled. You may also want to choose the Allow Host to Assign Participants to Breakout Rooms When Scheduling option as well, to give your meeting host the ability to pre-assign you to specific room(s) before the meeting begins.

In a Zoom session, the host will have access to the Breakout Rooms in their videoconferencing menu. The host will be able to jump between the different rooms to talk to the participants.

Whichever advanced features you enable in Zoom, remember to always follow common best practices for videoconferencing to ensure all your team meetings run smoothly.

Zoom also lets you change settings so that you always enter a meeting with a muted mic. This is good protocol and polite to other users, since entering with a live mic can cause sudden noises or bursts of static that can be disruptive.

Step 1: Select the arrow next to your mic button and choose Audio Settings from the pop-up menu.

Step 2: In the Audio Settings window that opens, scroll to the bottom, where you will see several options that can be enabled or disabled. Make sure the option to Mute microphone when joining a meeting is enabled.

A typical free Zoom meeting can hold up to 100 participants. However, for certain organizations or situations this may not be enough: If your web conferencing group is larger than 100 and you still want to use Zoom for your meetings, then you'll need to increase the meeting cap. There are several ways you can do this, depending on your circumstances and how much you want to pay.

Use a school Zoom account: Various schools and universities have institutional Zoom accounts designed to facilitate online classes. These can usually hold up to 300 participants. If you are a student or teacher at a school that offers these accounts, you may be able to use one for your planned meeting.

Upgrade to a small business account: These accounts cost $20 per month, per host, but one of the features they add is upgrading your cap to 300 participants.

Upgrade to 500 participants with the Large Meeting add-on: This costs $50 per month, per host, but you can greatly increase the number of participants that can join. Browse all add-ons in the Billing section of your Zoom account after you sign in. You can also upgrade to the Enterprise account tier to unlock this option.

Upgrade to 1,000 participants if necessary: An alternate add-on lets you have meetings with up to 1,000 people for particularly large events. This increases the price to $90 per month, per host.

Is your future filled with Zoom meetings? If they're becoming a regular feature of your work life and aren't going away any time soon, you should consider customizing a workspace specifically for web-conference meetings.

Zoom has a product to help with that: the DTEN ME from Zoom for Home. Available to pre-order for $599 in the United States, it's a super-thin, 27-inch computer monitor that can be set up anywhere and comes with a smart camera array, eight microphones to pick up sound as accurately as possible, and a touchscreen with built-in whiteboarding and annotation capabilities that are built to work seamlessly with Zoom.

You may also want to check out our guide on how to work from home for more ideas about setting up the right remote workstation.

Every Superman Movie Climax, Ranked From Worst To Best – Screen Rant

After debuting in the pages of the comic books more than 80 years ago, Superman has been perhaps the most iconic superhero of all time. With that level of popularity, of course, came movies, from the classic films starring Christopher Reeve right up to the more recent films in the DCEU.

There is a thrill to seeing a hero like Superman take on the bad guys and save the day. The climaxes of these live-action films have thrilled audiences of all ages, while some of them have been a bit disappointing. Looking back on each of the live-action films featuring Superman, here are the climaxes, ranked from worst to best.

Superman III is often criticized for being too cartoonish and filled with slapstick humor. When looking at the film's climax, it's hard to argue with that criticism. As an evil corporate villain is trying to use a supercomputer to carry out disasters around the world that will benefit him financially, Superman flies in to save the day.

The sequence is filled with ridiculous moments, like the villain playing a video game that fires rockets at Superman. There is also a moment when the computer inexplicably turns one of the villains into a cyborg which Superman then has to fight. It was a strange way to end an already strange movie.

Christopher Reeve gives his final performance as Superman in this fourth adventure. After giving such an iconic performance in the role, it's a shame he didn't have a better movie to end on.

Superman IV: The Quest For Peace centers around Superman's attempts to rid the world of nuclear weapons, but Lex Luthor succeeds in creating a new villain named Nuclear Man. The movie is little more than Superman and Nuclear Man chasing each other around the world and getting into dull fights. Even when they take their battle to the moon, it's not very exciting, largely due to the boring villain.

Fans had been waiting for years to see the Justice League team up in a live-action film. Though Justice League has some fun moments in it, the fact that it is overall a rather forgettable film is a massive disappointment for the DCEU.

Even the climactic battle against the villainous Steppenwolf fails to deliver much fun. There are some good character moments and action sequences, and the return of Superman is pretty thrilling. But once Superman shows up, the fight is pretty much over and it makes you wonder why the team was even necessary in the first place.

The long-awaited face-off between Batman and Superman was another big letdown for the DCEU, with many fans put off by the dark and over-plotted story. However, the movie did deliver some exciting action, including the fight between the two heroes and Batman's raid of a criminal warehouse.

The climactic battle against Doomsday is, sadly, not as thrilling as those other sequences. There is some fun seeing Batman, Superman, and Wonder Woman team up, with Wonder Woman easily stealing the show. But the last-minute addition of Doomsday feels tacked on, as does Superman's death at the end of the fight.

After a long hiatus from any big-screen adventures, Superman Returns sought to, well, return the character to his heroic roots in the continuation of the original Richard Donner films. Some fans found this new entry too reliant on nostalgia, but it does effectively show Superman at his most heroic.

After Lex Luthor creates a new island made out of Kryptonite, Superman is weakened. But with the new expanding land threatening others, Superman uses all of his strength to fly the island into space before crashing back down to Earth. It is a moment that reminds everyone of the kind of hero Superman is: one who will do whatever it takes to save the day.

Though not all of the Christopher Reeve sequels were successful, Superman II is considered by many to be an equal or possibly greater film than the original. The film sees Superman meet his match as three Kryptonian criminals, led by General Zod, arrive on Earth and challenge the son of Jor-El.

After losing his power for a short time, Superman returns, ready to take on the challengers. The exciting fight begins in the skies over Metropolis before heading to the Fortress of Solitude. Though the action might not be as fast-paced as modern superhero movies, it's a fun and clever battle with villains who are Superman's equals.

Superman took on Zod once again in Zack Snyder's Man of Steel. Zod comes to Earth seeking out Superman as an ally. He hopes to transform the planet into a new Kryptonian home, but it means the death of the human race. Obviously, Superman will not stand for that.

The massive destruction of the battle drew criticism, but it feels like it carries consequences in the later films. Snyder does shape some pretty thrilling fight sequences, and the moment when Superman is forced to kill Zod, while controversial, was an interesting new path to take the iconic hero.

While there are plenty of fans of Snyder's version of Superman, many feel that this darker version was missing the fundamental heroic component of his character. He is not just a godlike being; he is capable of saving lives and is a beacon of good around the world.

The original Superman movie really embraced this aspect of the character, especially in the climax. After Luthor generates an earthquake that threatens the entire West Coast, Superman flies around saving school buses from bridges and trains filled with passengers. And when he finds the quake killed Lois Lane, he flies around the world as fast as he can and reverses time to save her. It's ridiculous, but it's the kind of feel-good heroics the character has always embodied.

More here:

Every Superman Movie Climax, Ranked From Worst To Best - Screen Rant

Atos Partners with University of Oxford on Largest AI Supercomputer in the UK – HPCwire

PARIS and LONDON, Aug. 4, 2020 – Atos has signed a four-year contract worth £5 million with the University of Oxford to deliver a new, state-of-the-art deep learning supercomputer built on the NVIDIA DGX SuperPOD architecture, which will enable UK academics and industry to drive forward scientific discoveries and innovation in machine learning and artificial intelligence, as part of the JADE2 project.

The largest AI-focused supercomputer in the UK, with over 500 NVIDIA GPUs, the high-performance system is funded by the Engineering and Physical Sciences Research Council (EPSRC) and aims to build on the success of the current JADE1 (Joint Academic Data Science Endeavour) facility, a national resource providing advanced GPU computing facilities to world-leading AI and machine learning experts from a consortium of eight UK universities and the Alan Turing Institute.

The JADE2 supercomputer is built on NVIDIA DGX systems and uses the DDN A3I storage solution. It will more than triple the capacity of the original JADE machine and provide increased computing capabilities to a wider consortium of over twenty universities and the Turing Institute, helping to meet the level of demand for AI-focused facilities created as a result of the success of the JADE resource.

Professor Wes Armour at the University of Oxford said: "The successful delivery of JADE has created more demand among UK researchers and industry for powerful computing facilities which can accommodate high-end, data-intensive AI workloads. Building on the success of the JADE collaboration with Atos, and by significantly expanding the JADE consortium's computing capacity, the new deep learning supercomputer supplied by Atos will allow us to meet this demand and help many institutions to make some potentially ground-breaking discoveries. It will cement JADE's status as the de facto national computing facility for academic AI research."

Alison Kennedy, Director, STFC Hartree Centre, said: "We are pleased that the delivery and hosting of this cutting-edge JADE2 hardware and the resultant increase in capability will support the development and expansion of research computing skills across industry and academia. This aligns directly with the Hartree Centre's mission and we are delighted to continue the collaboration with Atos and Oxford University, which builds upon the previous success of JADE."

Agnès Boudot, Senior Vice President, Head of HPC & Quantum at Atos, concluded: "We are proud to be working with the University of Oxford on the delivery of JADE2, which will provide researchers and industry with more computing power to enable new scientific breakthroughs and innovation in machine learning and AI. We believe this high-performance system, coupled with our expertise, will help the UK to address key AI and machine learning challenges, while supporting the UK's ambition to be a world leader in these areas."

The DGX SuperPOD system will comprise a cluster of 63 DGX nodes with 504 NVIDIA V100 Tensor Core GPUs in total, interconnected with NVIDIA Mellanox InfiniBand networking and fed by DDN's AI400 storage, making it the largest such system in the UK.

The system is to be hosted at the STFC Hartree Centre in Daresbury, near Warrington, UK.

1 JADE: Joint Academic Data Science Endeavour. This proposal, led by the University of Oxford, with support from the Alan Turing Institute (ATI) and 22 universities, is the national GPU system supporting multidisciplinary science with a focus on machine learning and molecular dynamics.

About Atos

Atos is a global leader in digital transformation with 110,000 employees in 73 countries and annual revenue of €12 billion. European number one in Cloud, Cybersecurity and High-Performance Computing, the Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos|Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index.

Source: Atos

Read this article:

Atos Partners with University of Oxford on Largest AI Supercomputer in the UK - HPCwire

Break it down: A new way to address common computing problem – Washington University in St. Louis Newsroom

In this era of big data, some problems in scientific computing are so large, so complex and contain so much information that attempting to solve them would be too big a task for most computers.

Now, researchers at the McKelvey School of Engineering at Washington University in St. Louis have developed a new algorithm for solving a common class of problem known as linear inverse problems by breaking them down into smaller tasks, each of which can be solved in parallel on standard computers.

The research, from the lab of Jr-Shin Li, professor in the Preston M. Green Department of Electrical & Systems Engineering, was published July 30 in the journal Scientific Reports.

In addition to providing a framework for solving this class of problems, the approach, called Parallel Residual Projection (PRP), also delivers enhanced security and mitigates privacy concerns.

Linear inverse problems are those that take observational data and try to find a model that describes it. In their simplest form, they may look familiar: 2x+y = 1, x-y = 3. Many a high school student has solved for x and y without the help of a supercomputer.
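To make that concrete, here is a minimal sketch (ours, not from the paper) of the same two-equation system posed and solved in matrix form, which is all a linear inverse problem amounts to at this scale:

```python
import numpy as np

# The system 2x + y = 1, x - y = 3 written as A @ [x, y] = b.
A = np.array([[2.0,  1.0],
              [1.0, -1.0]])
b = np.array([1.0, 3.0])

x, y = np.linalg.solve(A, b)
print(x, y)  # x = 4/3, y = -5/3
```

The difficulty the researchers address begins when A has millions of rows and columns rather than two.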

And as more researchers in different fields collect increasing amounts of data in order to gain deeper insights, these equations continue to grow in size and complexity.

"We developed a computational framework to solve for the case when there are thousands or millions of such equations and variables," Li said.

The project was conceived while the team was working on research problems from other fields involving big data; Li's lab had been collaborating with a biologist researching the network of neurons that deal with the sleep-wake cycle.

In the context of network inference, looking at a network of neurons, the inverse problem looks like this, said Vignesh Narayanan, a research associate in Li's lab:

"Given the data recorded from a bunch of neurons, what is the model that describes how these neurons are connected with each other?"

"In an earlier work from our lab, we showed that this inference problem can be formulated as a linear inverse problem," Narayanan said.

"If the system has a few hundred nodes (in this case, the nodes are the neurons), the matrix which describes the interaction among neurons could be millions by millions; that's huge."
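As a toy illustration of that formulation (our sketch, assuming simple linear dynamics x_{t+1} = W x_t, a stand-in rather than the lab's actual biological model), the connectivity matrix W can be recovered as the solution of a linear inverse problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 5, 20  # five "neurons", twenty time samples (toy scale)

# Hypothetical ground-truth connectivity and simulated recordings.
W_true = rng.normal(scale=0.3, size=(n, n))
X = np.zeros((n, T))
X[:, 0] = rng.normal(size=n)
for t in range(T - 1):
    X[:, t + 1] = W_true @ X[:, t]

# Inferring W is a linear inverse problem: find W with X1 = W @ X0.
X0, X1 = X[:, :-1], X[:, 1:]
W_est = X1 @ np.linalg.pinv(X0)
print(np.max(np.abs(W_est - W_true)))  # tiny: W is recovered
```

At toy scale the recovery is exact; the trouble starts when n reaches the hundreds or more and W alone outgrows the machine.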

"Storing this matrix itself exceeds the memory of a common desktop," said Wei Miao, a PhD student in Li's lab.

Add to that the fact that such complex systems are often dynamic, as is our understanding of them. "Say we already have a solution, but now I want to consider interaction of some additional cells," Miao said. "Instead of starting a new problem and solving it from scratch, PRP adds flexibility and scalability. You can manipulate the problem any way you want."

Even if you do happen to have a supercomputer, Miao said, "there is still a chance that by breaking down the big problem, you can solve it faster."

In addition to breaking down a complex problem and solving the pieces in parallel on different machines, the computational framework also, importantly, consolidates the results and computes an accurate solution to the initial problem.
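A hedged sketch of the idea (an averaged block-projection scheme written in the spirit of PRP, not a reproduction of the paper's exact algorithm): each block of equations independently projects the current iterate onto its own solution set, and the partial corrections are then consolidated.

```python
import numpy as np

def block_residual_projection(A, b, n_blocks=4, iters=2000, tol=1e-12):
    """Illustrative block-parallel residual projection for A @ x = b.

    Each row block computes the minimum-norm correction that makes its
    own equations exact (the embarrassingly parallel part), and the
    corrections are consolidated by averaging. A stand-in for the
    published PRP method, not the method itself.
    """
    m, n = A.shape
    x = np.zeros(n)
    blocks = np.array_split(np.arange(m), n_blocks)
    for _ in range(iters):
        corrections = []
        for idx in blocks:  # each pass could run on a separate machine
            Ai, bi = A[idx], b[idx]
            ri = bi - Ai @ x  # this block's residual
            corrections.append(
                Ai.T @ np.linalg.lstsq(Ai @ Ai.T, ri, rcond=None)[0])
        x = x + np.mean(corrections, axis=0)  # consolidate partial results
        if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
            break
    return x

# Toy check on a consistent 40-equation, 20-unknown system.
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 20))
x_true = rng.normal(size=20)
print(np.linalg.norm(block_residual_projection(A, A @ x_true) - x_true))
```

Because each worker only ever sees its own rows of A, no single machine needs the whole problem in memory, which is also where the privacy benefit discussed below comes from.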

An unintentional benefit of PRP is enhanced data security and privacy. When credit card companies use algorithms to research fraud, or a hospital wants to analyze its massive database, "no one wants to give all of that access to one individual," Narayanan said.

"This was an extra benefit that we didn't even strive for," Narayanan said.

Read more:

Break it down: A new way to address common computing problem - Washington University in St. Louis Newsroom

Five Movies Worth Watching About the Threat of Nuclear War – Council on Foreign Relations

Yesterday marked the seventy-fifth anniversary of the bombing of Hiroshima. This Sunday marks the seventy-fifth anniversary of the bombing of Nagasaki. Thankfully, nuclear weapons have never been used since. However, nuclear war remains an ever-present danger. Nine countries currently possess nuclear weapons, and several others potentially aspire to acquire them. The Bulletin of the Atomic Scientists has its famed Doomsday Clock set to 100 seconds before midnight, the closest it has ever been to the twelve o'clock hour. So it seems appropriate this week to recommend films that remind us of the threat we prefer to forget.

We won't rehash all the rules we follow in making recommendations. Just remember that we recommend a movie only once. So Dr. Strangelove isn't on this week's list because it was on our list of five foreign-policy satires worth watching.

Here are our five recommendations, plus bonus picks from two colleagues.

Crimson Tide (1995). The submarine USS Alabama heads to sea after rebels seize nuclear missile sites in eastern Russia. While on patrol, the submarine receives part of a command that may, or may not, be an order to launch its nuclear missiles against the rebel-controlled sites. The Alabama's captain and its executive officer (XO) clash over whether to proceed with the attack or wait for additional information. Convinced that the combat-tested captain, played by Gene Hackman, is too eager to act, the cautious, young XO, played by Denzel Washington, stages a mutiny. The fate of the world hangs in the balance. Director Tony Scott leverages the submarine's close quarters to highlight the isolation and tension of the crew. Watching Crimson Tide is even more anxiety-inducing if you know that the plot isn't far-fetched. During the Cuban missile crisis, commanders aboard a Soviet submarine clashed over whether to fire their vessel's nuclear-armed torpedo after they lost contact with Moscow. Perhaps, as Washington's character contends, in the nuclear world "the true enemy is war itself." You can watch Crimson Tide on Amazon Prime, Google Play, or YouTube.

Miracle Mile (1988). Miracle Mile begins as a standard romance: Harry (Anthony Edwards) meets Julie (Mare Winningham) at a café, and it's love at first sight. The cheery start turns sour when Harry accidentally receives a panicked phone call warning of an imminent nuclear attack. He has one hour to find Julie and flee to safety. Miracle Mile was released at the tail end of the Cold War, when duck-and-cover drills had faded from U.S. schools but latent fears of a nuclear attack persisted. Director Steve De Jarnatt heightens the audience's anxiety by never cutting away from Harry's perspective: "You are Harry Washello," De Jarnatt said. These days, a warning about an incoming nuclear attack is less likely to come through a pay phone than through our smart phones, as people in Hawaii learned two years ago. Thankfully, that early-morning emergency warning was a mistake. You can watch Miracle Mile on Amazon Prime or iTunes.

On the Beach (1959). How would you spend your last days if you knew the world was ending? This is the question that drives On the Beach, an adaptation of Nevil Shute's 1957 best-selling novel. A nuclear war has devastated the Northern Hemisphere, leaving Australia as the world's only safe haven. As deadly radioactive fallout steadily drifts toward the continent, many survivors resign themselves to their fate. But when a radio signal is detected coming from the rubble of the west coast of the United States, the nuclear submarine USS Sawfish heads off to discover its source, and a possible reason for hope. Looking to stress the film's universal theme and capitalize on the star power of its cast, which includes Gregory Peck, Ava Gardner, and Fred Astaire, director Stanley Kramer scheduled the movie's premiere on the same day in eighteen different cities across the world, including Moscow. You can find On the Beach on the Digital Archive or iTunes.

Thirteen Days (2000). For thirteen days in 1962, the world stood at the brink of nuclear war as the United States and Soviet Union faced off in the Cuban missile crisis. Thirteen Days dramatizes the view from inside the Kennedy administration after the discovery that the Soviets were deploying medium-range nuclear missiles just ninety miles off the U.S. coast. We know how the crisis ends. Still, Bruce Greenwood as President John F. Kennedy and Kevin Costner as advisor Kenneth O'Donnell convey the intense anxiety of those thirteen days as they desperately search for a peaceful solution while trying to maintain the upper hand against the Soviets. Director Roger Donaldson takes a few liberties with history, but University of Virginia professor Philip Zelikow observed that the film is "accurate where it counts." You can watch Thirteen Days on Amazon Prime, Google Play, or YouTube.

WarGames (1983). A young Matthew Broderick plays teenager David Lightman, a technology whiz who nearly sparks World War III when he hacks into a computer network and begins playing an interactive game. Ominously titled "Global Thermonuclear War," it is actually a program running on a U.S. military supercomputer. The program thinks David is the Soviet Union preparing to launch a nuclear attack, so it tries to strike first. Director John Badham uses lighthearted humor to explore the deadly serious concept of mutual assured destruction as the teen tries to stop the United States from retaliating against a false threat. WarGames premiered the heroic hacker archetype and inspired a generation of budding techies. After seeing the movie at Camp David, President Ronald Reagan grew concerned that a similar hack could happen in real life. That led him to issue the first national security directive on computer security. You can watch WarGames on Amazon Prime, Google Play, or YouTube.

For bonus picks this week, we turned to CFR's two visiting Stanton Nuclear Security Fellows, Jooeun Kim and Joseph Torigian. Jooeun, who specializes in nuclear nonproliferation in East Asia, recommends:

Fail Safe (1964). When a communications system error sends U.S. bombers with nuclear payloads to attack Moscow, the president of the United States (Henry Fonda) scrambles to prevent doomsday. Fail Safe's showing at the box office was hurt by the release just ten months earlier of Dr. Strangelove, which had a similar plot but with a satiric edge. (Stanley Kubrick, Dr. Strangelove's director, sued the producers of Fail Safe for plagiarism and won an agreement that delayed its release.) Director Sidney Lumet distinguishes his drama, however, with the humanity of its characters and an arguably smarter take on how a nuclear war might start. Jooeun says: "Fail Safe is a somber reminder of how machines can fail and cause nuclear accidents. It also reminds us how rational decision-making by the commander-in-chief is so important in stopping a nuclear war and gaining trust even from an adversary. The depiction of the political scientist in the movie reminds me personally that scholarly and theoretical contributions to nuclear nonproliferation are important, but in the war room it is practical application that matters most." You can find Fail Safe on Amazon Prime, Google Play, or YouTube.

Joseph, who works on Chinese and Russian foreign policy, suggests:

Colossus: The Forbin Project (1970). Hoping to avoid any human error that could lead to war, the United States entrusts its nuclear arsenal to a new supercomputer: Colossus. In appropriate science-fiction fashion, the artificial intelligence (AI) soon outsmarts its creators and connects to a similar electronic brain built in the Soviet Union. The AIs threaten both countries with nuclear war if scientists interfere in their pursuit of world domination, all in the name of peace. The film is based on the 1967 novel Colossus by D. F. Jones and directed by Joseph Sargent. Joseph says: "Although less well-known than Dr. Strangelove, Colossus: The Forbin Project is another classic satire on nuclear war with a similarly negative view of humanity's future. While Dr. Strangelove captured the essence of mutually assured destruction, the focus in Colossus on out-of-control AI makes it an especially prescient warning for today's world." You can watch Colossus: The Forbin Project on Hoopla, Vimeo, or Xfinity Stream.

Next week we will recommend World War II films worth watching.

Check out our other foreign-policy movie recommendations:

Originally posted here:

Five Movies Worth Watching About the Threat of Nuclear War - Council on Foreign Relations

GE Research uses summit supercomputer for study on wind power – Windtech International

Originally posted here:

GE Research uses summit supercomputer for study on wind power - Windtech International

Research: A Survey of Numerical Methods Utilizing Mixed Precision Arithmetic – HPCwire

In recent years, hardware vendors have started designing low precision special function units in response to the machine learning community's demand for high compute power in low precision formats. Server-line products, too, increasingly feature low-precision special function units, such as the Nvidia tensor cores in Oak Ridge National Laboratory's Summit supercomputer, which provide more than an order of magnitude higher performance than what is available in IEEE double precision.

At the same time, the gap between compute power on the one hand and memory bandwidth on the other keeps increasing, making data access and communication prohibitively expensive compared to arithmetic operations. Given the choice between ignoring the hardware trends and continuing down the traditional path, or adjusting the software stack to the changing hardware designs, the Department of Energy's Exascale Computing Project decided on the aggressive step of building a multiprecision focus effort to take on the challenge of designing and engineering novel algorithms that exploit the compute power available in low precision and adjust the communication format to application-specific needs.

To start the multiprecision focus effort, we have surveyed the numerical linear algebra community and summarized all existing multiprecision knowledge, expertise, and software capabilities in this landscape analysis report. We also include current efforts and preliminary results that may not yet be considered mature technology, but have the potential to grow into production quality within the multiprecision focus effort. As we expect the reader to be familiar with the basics of numerical linear algebra, we refrain from providing a detailed background on the algorithms themselves; instead, we focus on how mixed- and multiprecision technology can help to improve the performance of these methods, and we present highlights of applications significantly outperforming traditional fixed precision methods.

This report covers low precision BLAS operations and the use of mixed precision in solving systems of linear equations, least squares problems, and eigenvalue computations. These are demonstrated with dense and sparse matrix computations, and with direct and iterative methods. The ideas presented try to exploit low precision computations for the bulk of the compute time and then use mathematical techniques to enhance the accuracy of the solution, bringing it to full precision accuracy with less time to solution.

On modern architectures, 32-bit operations often run at least twice as fast as 64-bit operations. There are two reasons for this. First, the execution rate of 32-bit floating point arithmetic is usually twice that of 64-bit floating point arithmetic on most modern processors. Second, the number of bytes moved through the memory system is halved. It may even be possible to carry out the computation in still lower precision, say 16-bit operations.

One approach to exploiting the compute power in low precision is motivated by the observation that, in many cases, a single precision solution of a problem can be refined to the point where double precision accuracy is achieved. The refinement can be accomplished, for instance, by means of Newton's algorithm (see Equation (1)), which computes a zero of a function f(x) according to the iterative formula:

x_{n+1} = x_n - f(x_n) / f'(x_n)    (1)

In general, we would compute the starting point and f(x) in single precision arithmetic, and the refinement process would be carried out in double precision arithmetic. If the refinement process is cheaper than the initial computation of the solution, then double precision accuracy can be achieved at nearly the same speed as single precision accuracy.
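As a hedged toy instance of Equation (1) (our example, not one from the survey), a single-precision square root can serve as the cheap starting point, with the Newton steps for f(x) = x^2 - a run in double precision:

```python
import numpy as np

def sqrt_refined(a, steps=3):
    # Cheap single-precision seed: roughly 7 correct digits.
    x = np.float64(np.sqrt(np.float32(a)))
    # Double-precision Newton refinement, Equation (1) with
    # f(x) = x**2 - a and f'(x) = 2x; convergence is quadratic,
    # so a couple of steps reach full double precision.
    for _ in range(steps):
        x = x - (x * x - a) / (2.0 * x)
    return x

print(sqrt_refined(2.0) - np.sqrt(2.0))  # ~0.0
```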

Stunning results can be achieved. Figure 1 compares the solution of a general system of linear equations using a dense solver on an Nvidia V100 GPU, performing the factorization in 64-, 32-, and 16-bit floating point arithmetic and then using refinement techniques to bring the 32- and 16-bit solutions up to the accuracy achieved with the 64-bit factorization.
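The linear-system analogue can be sketched on the CPU as follows (our stand-in using float32/float64; the Figure 1 experiments use GPU tensor cores and 16-bit formats, which this sketch does not reproduce). The expensive O(n^3) factorization is done once in low precision, and cheap O(n^2) refinement sweeps in double precision recover the remaining accuracy:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def solve_refined(A, b, max_sweeps=10):
    # Bulk of the work: LU factorization in single precision.
    lu, piv = lu_factor(A.astype(np.float32))
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(max_sweeps):
        r = b - A @ x  # residual computed in double precision
        if np.linalg.norm(r) <= 1e-14 * np.linalg.norm(b):
            break
        # Reuse the cheap low-precision factors to correct x.
        x += lu_solve((lu, piv), r.astype(np.float32)).astype(np.float64)
    return x

# Toy check on a well-conditioned system.
rng = np.random.default_rng(2)
A = rng.normal(size=(500, 500)) + 500 * np.eye(500)
b = rng.normal(size=500)
print(np.linalg.norm(A @ solve_refined(A, b) - b))  # near double-precision level
```

For ill-conditioned systems the refinement loop may stagnate, which is exactly the kind of trade-off the survey's methods are designed to manage.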

The survey report presents much more detail on the methods and approaches using these techniques; see https://www.icl.utk.edu/files/publications/2020/icl-utk-1392-2020.pdf.

Author Bio: Hartwig Anzt

Hartwig Anzt is a Helmholtz Young Investigator Group leader at the Steinbuch Centre for Computing at the Karlsruhe Institute of Technology (KIT). He obtained his PhD in Mathematics at the Karlsruhe Institute of Technology and afterwards joined Jack Dongarra's Innovative Computing Lab at the University of Tennessee in 2013. Since 2015 he has also held a Senior Research Scientist position at the University of Tennessee. Hartwig Anzt has a strong background in numerical mathematics and specializes in iterative methods and preconditioning techniques for next generation hardware architectures. His Helmholtz group on Fixed-Point Methods for Numerics at Exascale (FiNE) is funded until 2022. Hartwig Anzt has a long track record of high-quality software development. He is author of the MAGMA-sparse open source software package, managing lead and developer of the Ginkgo numerical linear algebra library, and part of the US Exascale Computing Project delivering production-ready numerical linear algebra libraries.

Author Bio: Jack Dongarra

Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his PhD in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), is a Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and is an Adjunct Professor in the Computer Science Department at Rice University.

See original here:

Research: A Survey of Numerical Methods Utilizing Mixed Precision Arithmetic - HPCwire

Atos signs £5m supercomputing deal to support Oxford University-led AI research push – ComputerWeekly.com

Oxford University is expanding its artificial intelligence (AI)-focused supercomputing capabilities with the help of Atos, after signing a four-year, £5m contract with the French IT services company.

The deal will see Atos provide the university with access to a deep-learning supercomputer, based on Nvidia's DGX SuperPOD architecture, which will be used by the UK academic community to accelerate research into AI, machine learning and molecular dynamics.

The setup, once deployed, will be the largest AI-focused supercomputer in the UK, it is claimed, with more than 500 Nvidia-based graphics processing units (GPUs).

The deployment is being funded by the Engineering and Physical Sciences Research Council (EPSRC), and will be the second supercomputer of its kind to be deployed under the university's ongoing Joint Academic Data Science Endeavour (JADE) project.

The first phase of the JADE project saw the university pool resources with a consortium of eight other universities, and the data science research-focused organisation, the Alan Turing Institute, to create a supercomputing facility.

The second phase of the project is being supported by an additional 14 universities, and is designed to build on the success of the first by tripling the amount of supercomputing capacity made available to these organisations.

"The JADE facility has provided a nucleus around which a national consortium of AI researchers has formed, making it the de facto national compute facility for AI research," said a posting on the EPSRC website. "By providing a much-needed shared resource to these communities, JADE has also delivered an outstanding level of world-leading science."

"JADE2 will build upon these successes by providing increased computational capabilities to these communities and delivering a stronger, more robust service to address the lessons learned from the initial service," the posting added.

Wes Armour, a professor at the university, said the success of the first phase of JADE had led to increased demand from UK researchers for access to supercomputing capabilities, which is why JADE2 is needed.

"Building on the success of the JADE collaboration with Atos, and by significantly expanding the JADE consortium's computing capacity, the new deep learning supercomputer supplied by Atos will allow us to meet this demand and help many institutions to make some potentially ground-breaking discoveries," said Armour.

"It will cement JADE's status as the de facto national computing facility for academic AI research."

Agnès Boudot, senior vice-president and head of high-performance computing and quantum at Atos, said the system, which will be hosted at a specialist facility in Daresbury, Warrington, will support the UK in its bid to become a world leader in the field of AI and machine learning.

"We are proud to be working with the University of Oxford on the delivery of JADE2, which will provide researchers and industry with more computing power to enable new scientific breakthroughs and innovation in machine learning and AI," said Boudot.

"We believe this high-performance system, coupled with our expertise, will help the UK to address key AI and machine learning challenges, while supporting the UK's ambition to be a world leader in these areas."

Read the original:

Atos signs £5m supercomputing deal to support Oxford University-led AI research push - ComputerWeekly.com
