Why tech didn't save us from covid-19 – MIT Technology Review

Posted: June 21, 2020 at 1:53 pm

In the US, manufacturing jobs dropped by almost a third between 2000 and 2010 and have barely recovered since. Manufacturing productivity has been particularly poor in recent years (chart 5). What has been lost is not only jobs but also the knowledge embedded in a strong manufacturing base, and with it the ability to create new products and find advanced and flexible ways of making them. Over the years, the country ceded to China and other countries the expertise in competitively making many things, including solar panels and advanced batteries, and, it now turns out, swabs and diagnostic tests too.

No country should aim to make everything, says Fuchs, but the US needs to develop the capacity to identify the technologies, as well as the physical and human resources, that are critical for national, economic, and health security, and to invest strategically in those technologies and assets.

Regardless of where products are made, Fuchs says, manufacturers need more coordination and flexibility in global supply chains, in part so they aren't tied to a few sources of production. That quickly became evident in the pandemic; for example, US mask makers scrambled to procure the limited supply of melt-blown fiber required to make the N95 masks that protect against the virus.

The problem was made worse because manufacturers keep inventories razor-thin to save money, often relying on timely shipments from a sole provider. "The great lesson from the pandemic," says Suzanne Berger, a political scientist at MIT and an expert on advanced manufacturing, "is how we traded resilience for low-cost and just-in-time production."

Berger says the government should encourage a more flexible manufacturing sector and support domestic production by investing in workforce training, basic and applied research, and facilities like the advanced manufacturing institutes that were created in the early 2010s to provide companies with access to the latest production technologies. "We need to support manufacturing not only [to make] critical products like masks and respirators but to recognize that the connection between manufacturing and innovation is critical for productivity growth and, out of increases in productivity, for economic growth," she says.

The good news is that the US has had this discussion during previous crises. The playbook exists.

In June 1940, Vannevar Bush, then the director of the Carnegie Institution for Science in Washington, DC, went to the White House to meet President Franklin D. Roosevelt. The war was under way in Europe, and Roosevelt knew the US would soon be drawn into it. As Simon Johnson and Jonathan Gruber, both economists at MIT, write in their recent book Jump-Starting America, the country was woefully unprepared, barely able to make a tank.

Bush presented the president with a plan to gear up the war effort, led by scientists and engineers. That gave rise to the National Defense Research Committee (NDRC); during the war, Bush directed some 30,000 people, including 6,000 scientists, to steer the country's technological development.

The inventions that resulted are well known, from radar to the atomic bomb. But as Johnson and Gruber write, the investment in science and engineering continued well after the war ended. "The major, and now mostly forgotten, lesson of the post-1945 period is that modern private enterprise proves much more effective when government provides strong underlying support for basic and applied science and for the commercialization of the resulting innovations," they write.

"A similar push to ramp up government investment in science and technology is clearly what we need now," says Johnson. It could have immediate payoffs both in technologies crucial to handling the current crisis, such as tests and vaccines, and in new jobs and economic revival. Many of the jobs created will be for scientists, Johnson acknowledges, but many will also go to trained technicians and others whose work is needed to build and maintain an enlarged scientific infrastructure.

This matters especially, he says, because with an administration that is pulling back from globalization and with consumer spending weak, innovation will be one of the few options for driving economic growth. "Scientific investment needs to be a strategic priority again," says Johnson. "We've lost that. It has become a residual. That's got to stop."

Johnson is not alone. In the middle of May, a bipartisan group of congressmen proposed what they called the Endless Frontier Act to expand funding for "the discovery, creation, and commercialization of technology fields of the future." They argued that the US was inadequately prepared for covid-19 and that the pandemic exposed the consequences of a long-term failure to invest in scientific research. The legislators called for $100 billion over five years to support a technology directorate that would fund AI, robotics, automation, advanced manufacturing, and other critical technologies.

Around the same time, a pair of economists, Northwestern's Ben Jones and MIT's Pierre Azoulay, published an article in Science calling for a massive government-led Pandemic R&D Program to fund and coordinate work in everything from vaccines to materials science. The potential economic and health benefits are so large, Jones argues, that even huge investments to accelerate vaccine development and other technologies will pay for themselves.

"Vannevar Bush's approach during the war tells us it's possible, though the funding needs to be substantial," says Jones. But increased funding is just part of what is required, he says. The initiative will need a central authority like Bush's NDRC to identify a varied portfolio of new technologies to support, a function that is missing from current efforts to tackle covid-19.

The thing to note about all these proposals is that they are aimed at both short- and long-term problems: they are calling for an immediate ramp-up of public investment in technology, but also for a bigger government role in guiding the direction of technologists' work. The key will be to spend at least some of the cash in the gigantic US fiscal stimulus bills not just on juicing the economy but on reviving innovation in neglected sectors like advanced manufacturing and boosting the development of promising areas like AI. "We're going to be spending a great deal of money, so can we use this in a productive way? Without diminishing the enormous suffering that has happened, can we use this as a wake-up call?" asks Harvard's Henderson.

"Historically, it has been done a bunch of times," she says. Besides the World War II effort, examples include Sematech, the 1980s consortium that revived the ailing US semiconductor industry in the face of Japan's increasing dominance by sharing technological innovations and boosting investment in the sector.

Can we do it again? Henderson says she is hopeful, though not necessarily optimistic.

The test of the country's innovation system will be whether over the coming months it can invent vaccines, treatments, and tests, and then produce them at the massive scale needed to defeat covid-19. "The problem hasn't gone away," says CMU's Fuchs. The global pandemic will be a fact of life for the next 15 months, 30 months, and it offers an incredible opportunity for us to rethink the resiliency of our supply chains, our domestic manufacturing capacity, and the innovation around it.

It will also take some rethinking of how the US uses AI and other new technologies to address urgent problems. But for that to happen, the government has to take on a leading role in directing innovation to meet the public's most pressing needs. That doesn't sound like the government the US has now.
