What Really Is Cryptocurrency? – TheStreet

Courtesy of Matt Sauer, MWSWNB Investments:

Is it a derivative, an asset or a currency? Look at its effect on the efficient frontier. Confidence is the key in any currency.

The advent of crypto assets and specifically cryptocurrency has garnered a significant amount of attention from investors. Is this the dawning of a new asset class or tulip bulbs? While there is diverse opinion on the subject, we observe cryptocurrency from three vantage points:

Asset Allocation

Modern Portfolio Theory (MPT) constitutes the prevailing wisdom on asset selection: forecasted returns for each asset, together with its correlations to the others, determine the mix of the allocation. Risk enters through the volatility of returns and is used to set the weights of the assets in the portfolio. Introducing uncorrelated assets has been labeled a free lunch because returns are no longer driven solely by correlated risk. Investors have watched the traditional asset classes of bonds, stocks and cash grow more correlated, especially since the correction of the early 2000s.
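
To make the mechanics concrete, here is a minimal sketch of the mean-variance calculation MPT rests on; the three assets, their expected returns, volatilities, correlations and weights below are invented for illustration, not drawn from the article:

    import numpy as np

    expected_returns = np.array([0.06, 0.03, 0.01])  # stocks, bonds, cash (assumed figures)
    vols = np.array([0.15, 0.05, 0.01])              # annualized volatilities (assumed)
    corr = np.array([[1.0, 0.3, 0.0],                # pairwise correlations (assumed)
                     [0.3, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
    cov = np.outer(vols, vols) * corr                # covariance matrix

    weights = np.array([0.60, 0.35, 0.05])           # one candidate allocation
    port_return = weights @ expected_returns
    port_vol = np.sqrt(weights @ cov @ weights)
    print(f"expected return {port_return:.2%}, volatility {port_vol:.2%}")

Raising the off-diagonal entries of corr raises port_vol for the same weights, which is why increasingly correlated stocks, bonds and cash erode the free lunch.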

The goal of asset allocation is to generate a return pattern for different risk profiles that minimizes drawdowns. The more correlated the assets, the harder that goal is to achieve. The factor MPT leaves out is how path dependent asset returns are. Additionally, because financial asset prices are closer to lognormally than normally distributed, the tails are fatter than a normal distribution implies.

One of the fallacies in asset allocation is the belief that investors are arbitraging time, so short-term volatility is not a concern. This ignores the path dependency of returns, because geometric averages are non-ergodic. The difference in return patterns is labeled the volatility tax, which is simply the gap between the arithmetic and geometric averages. While the arithmetic average is simply the average of the returns, the geometric return is built from the logarithms of the period-to-period price changes. A logarithm is a concave function, so the bigger the loss, the steeper the curve and the greater the penalty paid by the portfolio.
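
A short worked example makes the volatility tax visible; the two return paths below are invented so that both have the same 5% arithmetic average:

    import numpy as np

    smooth = np.array([0.05, 0.05, 0.05, 0.05])    # calm path
    choppy = np.array([0.35, -0.25, 0.30, -0.20])  # same arithmetic mean, wild swings

    for name, r in (("smooth", smooth), ("choppy", choppy)):
        arith = r.mean()
        geom = np.exp(np.log1p(r).mean()) - 1      # compound (geometric) growth per period
        print(f"{name}: arithmetic {arith:.2%}, geometric {geom:.2%}, "
              f"volatility tax {arith - geom:.2%}")

Both paths average 5% arithmetically, but the choppy path compounds at roughly 1.3% per period, a volatility tax of nearly four percentage points.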

So how does an investor mitigate the volatility tax? We will explore the costs of hedging versus the introduction of non-traditional assets into the portfolio, with the objective of maximizing the compound annual growth rate.

Crisis Proofing

A recent academic study (Harvey et al. 2019) analyzed the goal of crisis-proofing portfolios. Purchasing rolling S&P 500 puts was costly, at a 7.4% annualized cost (thus wiping out the expected performance), while the strategy of utilizing 10-year US Treasuries proved unreliable. The best solution found was a long/short quality-minus-junk strategy built on Fama and French's work. This leaves a long-only portfolio manager with few options to avoid the impact of swift downturns on geometric averages.

Introducing crypto assets as a low-cost, non-correlated asset will provide a different return pattern. After the original transaction costs, a carrying cost of 50-100 basis points per year on the asset adds only 1-2 basis points (2% x 50-100 bps) to the overall cost of the portfolio for a 2% position. As a diversifying asset it is the cheapest way to own the largest sigma of returns. This strategy does not require dynamic hedging by either rolling puts or simply buying the dip; both of those strategies increase the size of drawdowns, and rolling options becomes more expensive as implied volatility rises during swift downturns.

Research has established that cryptocurrency's risk-return tradeoff is distinct from those of stocks, currencies and macroeconomic factors (Liu and Tsyvinski 2018). Additionally, factors specific to cryptocurrency markets (momentum and investor attention) drove returns.
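
As an illustration of the momentum factor referenced there, here is a toy backtest on synthetic prices; the weekly horizon, four-week lookback and random returns are assumptions for the sketch, not Liu and Tsyvinski's specification:

    import numpy as np

    rng = np.random.default_rng(0)
    n_assets, n_weeks = 5, 104
    returns = rng.normal(0.002, 0.05, size=(n_weeks, n_assets))  # synthetic weekly returns

    lookback = 4                                   # form the signal on the past 4 weeks
    pnl = []
    for t in range(lookback, n_weeks):
        trailing = (1 + returns[t - lookback:t]).prod(axis=0) - 1
        winner = int(trailing.argmax())            # hold the recent best performer
        pnl.append(returns[t, winner])             # realize that asset's next return
    print(f"toy momentum strategy, mean weekly return: {np.mean(pnl):.3%}")

On independent random returns like these the signal has no real edge; the paper's claim is that on actual cryptocurrency series it does.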

The cost of dynamically hedging a portfolio is too expensive, so the best alternative for the portfolio manager is to introduce assets that exhibit anti-fragility. Investments that benefit from chaos are extremely valuable to a portfolio manager, and once cryptocurrency exhibits these characteristics a first time, no portfolio manager will sit on the sidelines the second time.

Portfolio managers must acknowledge that cryptocurrency does not act as a traditional hedge, but it does raise the efficient frontier. This becomes more important if bond prices become more correlated with stock prices and flatten the efficient frontier.

We suggest a new version of MPT with a z-axis defined by the purchasing power of the portfolio in a basket of currencies. The efficient frontier becomes a plane that is advantaged by owning diversified currencies. This is another building block in crisis-proofing portfolios, as the debasement of the primary currency can be a much greater tax than the volatility.
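
A minimal sketch of how that z-axis could be scored, with an invented currency basket and hypothetical one-year exchange-rate moves (none of these figures come from the article):

    # Purchasing power of a USD portfolio measured against a currency basket.
    portfolio_usd = 1_000_000
    basket_weights = {"EUR": 0.4, "JPY": 0.3, "GBP": 0.3}    # assumed basket
    usd_move = {"EUR": -0.08, "JPY": -0.05, "GBP": -0.10}    # USD return vs each (assumed)

    # A negative move means the dollar bought less of that currency over the year.
    drag = sum(w * usd_move[c] for c, w in basket_weights.items())
    purchasing_power = portfolio_usd * (1 + drag)
    print(f"basket-adjusted value: ${purchasing_power:,.0f} ({drag:.1%} debasement drag)")

A portfolio that was flat in dollar terms still lost 7.7% of its purchasing power against this basket, which is exactly the tax the proposed z-axis is meant to expose.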

Asset Class

Critics of cryptocurrency have stated that its growth is based not on investment factors but on derivative factors such as money laundering. These attempts to undermine an emerging asset class are not new and have always been part of the growth.

The recent volatility of prices may suggest to some pundits that cryptocurrency is not an asset class but a speculative game. This is the expected reaction to volatility: common stocks were treated with the same apprehension following the 1929-32 price movement. Lawrence Chamberlain's 1931 claim in Investment and Speculation was that only bonds could be bought for investment. Obviously, that claim was a product of market movements at the time rather than an attempt to protect wealth against inflation in the long run, but that is the point. Today's fears over volatility are tomorrow's missed returns.

High-yield bonds have long been known by their pejorative title of junk bonds. Investors were lambasted in 1990 for being so foolhardy as to have invested in an asset class that was down 8.46% when high-grade bonds were up 8.96%. There was a media frenzy over the death of junk bonds as an asset class. However, 1991 was a rebound year: junk was up 43.23% versus 16% for high-grade bonds. Back-to-back 18% years in 1992-93 restored junk to asset-class status, and it has been "high yield" ever since.

Speculators are deemed to trade strictly on greed and fear. Invoking Keynes's beauty-contest analogy, they evaluate what the other speculators will do and attempt to do it first. The coordination game they are playing involves looking for the Schelling point, the path of least resistance in the short run. Momentum and investor attention have therefore driven the price: investors trade short-term momentum from media coverage and have used investor attention to create liquidity both up and down. As the asset class matures, the Schelling point shifts from media opinion to focal-point investors, allowing institutional investors to coordinate without communication. As allocations to the new asset class are made independently, market participants play the coordination game around their own Schelling point.

Asset classes become relevant because of recent superior performance whether it is quality stocks, gold or high-grade bonds. Institutionalization occurs after the speculator fallout as investors slowly migrate to the investment.

Currency

Currencies are the domain of countries because of taxing capability, regulation and the ability to affect supply. Whether cryptocurrency is a true currency has been an open question. The growth in government debts and obligations globally has not been perceived by the financial markets as an issue, yet the risks are mounting. The dollar is no longer backed by gold or any liquid asset other than taxing ability. Bretton Woods is only a memory.

There is no risk that an investor in United States Treasury bonds will not be repaid principal. The question is what the purchasing power of that principal will be in terms of a basket of other currencies, oil and gold.

What are the experts thinking and writing about government policies and the effects on purchasing power?

International Monetary Fund viewpoint

A recent IMF working paper espoused the following thoughts on debt: "Sovereign debt is a Janus-faced asset class. In the best of times it relaxes the domestic constraint on savings, smooths consumption, and finances investment. Investors see it as a safe haven, as delivering alpha, and as a means of portfolio diversification. In the worst of times it is associated with debt overhangs, banking collapses, exchange-rate crises and inflationary explosions. Investors see it as unenforceable, illiquid and prone to messy debt workouts."

Currently, global debt is about $244 trillion, roughly 318% of GDP. In January 2019, the International Monetary Fund warned governments to rein in debt and build buffers against risks. If a government can borrow at very low rates or in some cases negative rates, the path of least resistance is to increase the money supply and borrow more to keep the world economy rolling.

United States Congress viewpoint

Congressman Brad Sherman has proposed that cryptocurrency be banned because it poses a threat to the dollar's role as a reserve currency and because international oil settlement takes place in dollars. As unworkable as the policy he suggests may be, it betrays a hint of fear among those in the United States responsible for debasing the dollar.

Our viewpoint

Developing countries have gold and currency reserves to underpin their currencies. Developed countries use sovereign stability to create demand for their currency, then exploit that demand by increasing the amount of currency to maximize domestic asset values. As governments find it more difficult to sustain the asset values necessary for tax receipts, the debasement will continue. The global lack of money-supply velocity has had governments pushing on a string since the last asset crisis of 2009. The next crisis will be led by asset-price downturns but will morph into a crisis of currency value as governments try to flood the market with paper money to support asset prices. In that case, the assets that rise in value will be those not tied to a currency, portable, and not subject to debasement.

What exactly is Cryptocurrency?

Cryptocurrency's strength is that it can solve several fundamental problems in the global marketplace. Like matter with the transient properties of a solid, a liquid and a gas, crypto assets can function as a derivative, an asset and a currency. The investment holding that reconstitutes Modern Portfolio Theory and adds a z-axis of currency risk also allows for a solution that becomes the new efficient frontier as it matures into an anti-fragile asset class. Inflation protection is the tangible outcome of rethinking risk as purchasing power across all currencies rather than the loss of it in one.

--Matt Sauer, MWSWNB Investments.


BLOG: How to capitalise on the Artificial Intelligence theme – Your Money

Robotics and Artificial Intelligence (AI) are expected to disrupt numerous sectors and industries. But how can investors capitalise on this theme?

Artificial Intelligence, robotics and automation are all themes which are becoming more prevalent within today's society, and for investors they certainly have a lot of potential. We do not yet fully understand, and are unable to predict, the true impact of these technological advancements, yet the speed at which business and operational transformation is taking place via the implementation of these digital technologies is staggering.

Artificial intelligence (AI) is a branch of computer science which is allowing companies to move to a new standard of analysing data, helping them to garner more value from their assets, both physical and digital. By utilising rapidly growing datasets, businesses are able to drive innovation, increase efficiency and harness this data to generate societal and corporate profits.

Robotics has been around for some time, with UNIMATE being the first robot used on a production line in 1962. Today's examples include welding robots in factories, order-picking robots in goods warehouses and even surgical robots used to improve patients' clinical outcomes through minimally invasive surgery. Additionally, as automation has allowed companies to use software to perform administrative tasks, robots now handle digital signatures, auto-filling of online forms and employee analytics. The automation of manufacturing processes has also allowed for greater efficiency and reduced costs.

The intent to embrace these technologies already exists and is growing. In Morgan Stanley's Q3 2019 CIO Survey, artificial intelligence and machine learning implementation was listed as the second-highest priority IT spend for companies, preceded only by cloud computing. Traditional business models are certainly being disrupted. The benefits of these new and ever-improving technologies will expand well beyond just technology stocks; they will influence and drive change and disruption through numerous sectors and industries.

The investment case for these themes is clear for anyone to see. However, identifying the correct investments to exploit these substantial opportunities and putting them together in an efficient way is somewhat trickier. Below are a number of actively managed funds which look to capitalise on these increasingly important and impactful themes:

The fund is a particularly distinctive offering, giving investors not only a chance to access companies benefiting from, or set to benefit from, AI, but also exposure to an investment process that uses AI itself. Its proprietary AI platform is used to identify companies whose economic value is directly affected by AI.

As more and more companies engage with AI, this fund is well positioned to provide strong exposure to long-duration secular investment growth, with the potential for very strong returns. The fund is well diversified and doesn't rely solely on a high allocation to the US and tech stocks; however, investors will need to accept a higher level of overall risk.

The team managing and contributing to the investment process is thought to be the largest dedicated technology investing team in Europe. Their expertise and experience help them identify companies standing to benefit from, and capture the growth created by, these long-term transformational themes.

The fund gives great exposure to companies enabling and involved in robotics, automation, AI and materials science. In doing so it has delivered annualised returns of over 15.5% since its inception in late 2017, double that of both the benchmark and sector.

The fund mainly invests in companies contributing to, or profiting from, developments in robotics and enabling technologies. Pictet is arguably the leading thematic investing firm in Europe, and its pedigree within this space speaks for itself. On a three-year basis, this fund has generated the highest excess return over its respective benchmark of any of Pictet's funds, demonstrating the potential of this particular investment opportunity.

The team believe the robotics sector is set to grow significantly faster than the broader economy over the coming years due to the ability of robotics to increase productivity, reduce costs and help solve challenges such as an increasingly elderly population.

Tom Rosser is an investment research analyst at The Share Centre


NIH harnesses AI for COVID-19 diagnosis, treatment, and monitoring – National Institutes of Health

News Release

Wednesday, August 5, 2020

Collaborative network to enlist medical imaging and clinical data sciences to reveal unique features of COVID-19.

The National Institutes of Health has launched the Medical Imaging and Data Resource Center (MIDRC), an ambitious effort that will harness the power of artificial intelligence and medical imaging to fight COVID-19. The multi-institutional collaboration, led by the National Institute of Biomedical Imaging and Bioengineering (NIBIB), part of NIH, will create new tools that physicians can use for early detection and personalized therapies for COVID-19 patients.

"This program is particularly exciting because it will give us new ways to rapidly turn scientific findings into practical imaging tools that benefit COVID-19 patients," said Bruce J. Tromberg, Ph.D., NIBIB Director. "It unites leaders in medical imaging and artificial intelligence from academia, professional societies, industry, and government to take on this important challenge."

The features of infected lungs and hearts seen on medical images can help assess disease severity, predict response to treatment, and improve patient outcomes. However, a major challenge is to rapidly and accurately identify these signatures and evaluate this information in combination with many other clinical symptoms and tests. The MIDRC goals are to lead the development and implementation of new diagnostics, including machine learning algorithms, that will allow rapid and accurate assessment of disease status and help physicians optimize patient treatment.

"This effort will gather a large repository of COVID-19 chest images," explained Guoying Liu, Ph.D., the NIBIB scientific program lead on this effort, "allowing researchers to evaluate both lung and cardiac tissue data, ask critical research questions, and develop predictive COVID-19 imaging signatures that can be delivered to healthcare providers."

Maryellen L. Giger, PhD, the A.N. Pritzker Professor of Radiology, Committee on Medical Physics at the University of Chicago, is leading the effort, which includes co-investigators Etta Pisano, MD, and Michael Tilkin, MS, from the American College of Radiology (ACR); Curtis Langlotz, MD, PhD, and Adam Flanders, MD, representing the Radiological Society of North America (RSNA); and Paul Kinahan, PhD, from the American Association of Physicists in Medicine (AAPM).

"This major initiative responds to the international imaging community's expressed unmet need for a secure technological network to enable the development and ethical application of artificial intelligence to make the best medical decisions for COVID-19 patients," added Krishna Kandarpa, M.D., Ph.D., director of research sciences and strategic directions at NIBIB. "Eventually, the approaches developed could benefit other conditions as well."

The MIDRC will facilitate rapid and flexible collection, analysis, and dissemination of imaging and associated clinical data. Collaboration among the ACR, RSNA, and AAPM is based on each organization's unique and complementary expertise within the medical imaging community, and each organization's dedication to imaging data quality, security, access, and sustainability.

About the National Institute of Biomedical Imaging and Bioengineering (NIBIB): NIBIB's mission is to improve health by leading the development and accelerating the application of biomedical technologies. The Institute is committed to integrating engineering and physical science with biology and medicine to advance our understanding of disease and its prevention, detection, diagnosis, and treatment. NIBIB supports emerging technology research and development within its internal laboratories and through grants, collaborations, and training. More information is available at the NIBIB website: https://www.nibib.nih.gov.

About the National Institutes of Health (NIH):NIH, the nation's medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.

NIH...Turning Discovery Into Health

###


Artificial Intelligence (AI) in the Freight Transportation Industry Market – Global Industry Growth Analysis, Size, Share, Trends, and Forecast 2020 …

Global Artificial Intelligence (AI) in the Freight Transportation Industry Market 2020 report focuses on the major drivers and restraints for the global key players. It also provides analysis of the market share, segmentation, revenue forecasts and geographic regions of the market.

The Artificial Intelligence (AI) in the Freight Transportation Industry market research study is an extensive evaluation of this industry vertical. It includes substantial information such as the current status of the market over the projected timeframe. The basic development trends that characterize this marketplace over the forecast period are provided in the report, alongside vital pointers such as regional industry layout characteristics and numerous other industry policies.

Request a sample Report of Artificial Intelligence (AI) in the Freight Transportation Industry Market at: https://www.marketstudyreport.com/request-a-sample/2833612?utm_source=Algosonline.com&utm_medium=AN

The Artificial Intelligence (AI) in the Freight Transportation Industry market research report covers the myriad pros and cons of the enterprise products. Pointers such as the impact of the current market scenario on investors are provided. The study also enumerates enterprise competition trends, together with an in-depth analysis of downstream buyers as well as raw materials.

Unveiling a brief of the competitive scope of Artificial Intelligence (AI) in the Freight Transportation Industry market:

Unveiling a brief of the regional scope of Artificial Intelligence (AI) in the Freight Transportation Industry market:

Ask for Discount on Artificial Intelligence (AI) in the Freight Transportation Industry Market Report at: https://www.marketstudyreport.com/check-for-discount/2833612?utm_source=Algosonline.com&utm_medium=AN

Unveiling key takeaways from the Artificial Intelligence (AI) in the Freight Transportation Industry market report:

For More Details On this Report: https://www.marketstudyreport.com/reports/covid-19-outbreak-global-artificial-intelligence-ai-in-the-freight-transportation-industry-market-report-development-trends-threats-opportunities-and-competitive-landscape-in-2020

Related Reports:

1. COVID-19 Outbreak-Global Liposomes Drug Delivery Industry Market Report-Development Trends, Threats, Opportunities and Competitive Landscape in 2020. Read more: https://www.marketstudyreport.com/reports/covid-19-outbreak-global-liposomes-drug-delivery-industry-market-report-development-trends-threats-opportunities-and-competitive-landscape-in-2020

2. COVID-19 Outbreak-Global Radio Frequency (RF) Cable Industry Market Report-Development Trends, Threats, Opportunities and Competitive Landscape in 2020. Read more: https://www.marketstudyreport.com/reports/covid-19-outbreak-global-radio-frequency-rf-cable-industry-market-report-development-trends-threats-opportunities-and-competitive-landscape-in-2020

Related Report : https://www.marketwatch.com/press-release/Automated-Parcel-Delivery-Terminals-Market-2020-08-06

Contact Us:
Corporate Sales, Market Study Report LLC
Phone: 1-302-273-0910
Toll Free: 1-866-764-2150
Email: [emailprotected]


Open Source Software Market Report 2020 Industry Size, Share, Growth, Trends, Sales Revenue, Business Strategies, Key Countries Analysis by Leading…

According to Orbis Research industry statistics, the global Open Source Software market will register a CAGR of about xx% by 2026. This market research report offers a comprehensive analysis of the market's growth based on end-users and geography.

The study report offers a comprehensive analysis of the Open Source Software market size across the globe, including regional and country-level market size analysis, CAGR estimation of industry growth over the forecast period, revenue, key drivers, competitive background and sales analysis of the players. Along with that, the report explains the major challenges and risks to be faced in the forecast period.

Request a sample of this report @ https://www.orbisresearch.com/contacts/request-sample/4499249?utm_source=golden

The research report on the global Open Source Software market helps clients understand the structure of the market by identifying its various segments, such as product type, end user, competitive landscape and key regions. Further, the report helps users analyze trends in each sub-segment of the global Open Source Software industry, and to take a long-term view of the industry through these key segments.

This report focuses on the global Open Source Software status, future forecast, growth opportunity, key market and key players. The study objectives are to present the Open Source Software development in North America, Europe, China, Japan, Southeast Asia, India and Central & South America.

The key players covered in this study:

Intel, Epson, IBM, Transcend, Oracle, Acquia, OpenText, Alfresco, Astaro, RethinkDB, Canonical, ClearCenter, Cleversafe, Compiere, Continuent

Browse the complete report @ https://www.orbisresearch.com/reports/index/global-open-source-software-market-size-status-and-forecast-2020-2026?utm_source=golden

Furthermore, the report on the global Open Source Software market offers an in-depth analysis of market size at the regional and country level worldwide. Geographical analysis is another critically important part of the research study of the global Open Source Software market.

Market segment by Type, the product can be split into

Shareware, Bundled Software, BSD (Berkeley Source Distribution), Advanced Driver Assistance Systems (ADAS)

Market segment by Application, split into

BMForum, phpBB, PHPWind

The study objectives of this report are:
To analyze the global Open Source Software status, future forecast, growth opportunity, key markets and key players.
To present the Open Source Software development in North America, Europe, China, Japan, Southeast Asia, India and Central & South America.
To strategically profile the key players and comprehensively analyze their development plans and strategies.
To define, describe and forecast the market by type, market and key regions.

In this study, the years considered to estimate the market size of Open Source Software are as follows:
History Year: 2015-2019
Base Year: 2019
Estimated Year: 2020
Forecast Year: 2020 to 2026

For the data information by region, company, type and application, 2019 is considered as the base year. Whenever data information was unavailable for the base year, the prior year has been considered.

Table of Contents:

Chapter One: Report Overview
1.1 Study Scope
1.2 Key Market Segments
1.3 Players Covered: Ranking by Open Source Software Revenue
1.4 Market Analysis by Type
1.4.1 Global Open Source Software Market Size Growth Rate by Type: 2020 VS 2026
1.4.2 Aviation Logistics
1.4.3 Maritime Logistics
1.4.4 Land Logistics
1.5 Market by Application
1.5.1 Global Open Source Software Market Share by Application: 2020 VS 2026
1.5.2 For Personal
1.5.3 For Business
1.5.4 For Government
1.6 Coronavirus Disease 2019 (Covid-19): Open Source Software Industry Impact
1.6.1 How the Covid-19 Pandemic is Affecting the Open Source Software Industry
1.6.1.1 Open Source Software Business Impact Assessment - Covid-19
1.6.1.2 Supply Chain Challenges
1.6.1.3 COVID-19's Impact On Crude Oil and Refined Products
1.6.2 Market Trends and Open Source Software Potential Opportunities in the COVID-19 Landscape
1.6.3 Measures / Proposal against Covid-19
1.6.3.1 Government Measures to Combat Covid-19 Impact
1.6.3.2 Proposal for Open Source Software Players to Combat Covid-19 Impact
1.7 Study Objectives
1.8 Years Considered

Chapter Two: Global Growth Trends by Regions
2.1 Open Source Software Market Perspective (2015-2026)
2.2 Open Source Software Growth Trends by Regions
2.2.1 Open Source Software Market Size by Regions: 2015 VS 2020 VS 2026
2.2.2 Open Source Software Historic Market Share by Regions (2015-2020)
2.2.3 Open Source Software Forecasted Market Size by Regions (2021-2026)
2.3 Industry Trends and Growth Strategy
2.3.1 Market Top Trends
2.3.2 Market Drivers
2.3.3 Market Challenges
2.3.4 Porter's Five Forces Analysis
2.3.5 Open Source Software Market Growth Strategy
2.3.6 Primary Interviews with Key Open Source Software Players (Opinion Leaders)

Chapter Three: Competition Landscape by Key Players
3.1 Global Top Open Source Software Players by Market Size
3.1.1 Global Top Open Source Software Players by Revenue (2015-2020)
3.1.2 Global Open Source Software Revenue Market Share by Players (2015-2020)
3.1.3 Global Open Source Software Market Share by Company Type (Tier 1, Tier 2 and Tier 3)
3.2 Global Open Source Software Market Concentration Ratio
3.2.1 Global Open Source Software Market Concentration Ratio (CR5 and HHI)
3.2.2 Global Top 10 and Top 5 Companies by Open Source Software Revenue in 2019
3.3 Open Source Software Key Players Head Office and Area Served
3.4 Key Players Open Source Software Product Solution and Service
3.5 Date of Entry into Open Source Software Market
3.6 Mergers & Acquisitions, Expansion Plans
...continued.

Enquire before buying this report @ https://www.orbisresearch.com/contacts/enquiry-before-buying/4499249?utm_source=golden

About Us:

Orbis Research (orbisresearch.com) is a single point of aid for all your market research requirements. We have a vast database of reports from leading publishers and authors across the globe. We specialize in delivering customized reports as per the requirements of our clients. We have complete information about our publishers and hence are sure about the accuracy of the industries and verticals of their specialization. This helps our clients map their needs, and we produce the perfect required market research study for our clients.

Contact Us:


A Radioactive Plague: The secrecy and censorship surrounding civilian deaths from World War II – Milwaukee Independent

The atomic bombing of Hiroshima and Nagasaki 75 years ago is one of the most studied events in modern history. And yet significant aspects of that bombing are still not well known.

I published a social history of U.S. censorship in the aftermath of the bombings, Radiation Secrecy and Censorship after Hiroshima and Nagasaki, on which this piece is based. The material was drawn from a dozen different manuscript collections in archives around the U.S.

I found that military and civilian officials in the U.S. sought to contain information about the effects of radiation from the blasts, which helps explain the persistent gaps in the public's understanding of radiation from the bombings.

Heavy handed

Although everything related to the effects of the Hiroshima and Nagasaki bombs was defined at the time as a military secret, U.S. officials treated the three main effects (blast, fire, and radiation) very differently. They publicized and celebrated the powerful blast but worked to suppress information about the bombs' radiation.

A month later the world learned a few details about that radiation: some type of "atomic plague" related to the atomic bomb was causing death and illness in the two bombed cities. But for years radiation remained the least publicized and least understood of the atomic bomb effects.

To this day we have no fully accepted accounting of the atomic bomb deaths in both cities; it has remained highly contested because of the politics surrounding the bombing, because of problems with the wartime Japanese census, and, importantly, because of the complexity of defining what constituted radiation-caused deaths over decades.

In my research, I found U.S. officials controlled information about radiation from the atomic bombs dropped over Japan by censoring newspapers, by silencing outspoken individuals, by limiting circulation of the earliest official medical reports, by fomenting deliberately reassuring publicity campaigns, and by outright lies and denial.

The censorship of the Japanese began quickly. As soon as Japanese physicians and scientists reached Hiroshima after the bombing, they collected evidence and studied the mysterious symptoms in the ill and dying. American officials confiscated Japanese reports, medical case notes, biopsy slides, medical photographs, and films and sent them to the U.S., where much remained classified for years, some for decades.

Historians note the irony of American Occupation officials claiming to bring a new freedom of the press to Japan, but censoring what the Japanese said in print about the atomic bombs. One month after the war ended, Occupation authorities restricted public criticism of the U.S. actions in Japan and denied any radiation aftereffects from exposure to the nuclear bombs.

In the U.S., too, newspapers omitted or obscured anything about radiation or ongoing radioactivity. Military officials encouraged editors to continue some kind of wartime censorship, especially about the bombs' radiation. Four official U.S. investigating teams sent to Japan in the months immediately after the surrender wrote reports about the biomedical effects of the two atomic bombs. Several of the reports minimized the radiation effects, and all were classified secret or top secret, so the circulation of the majority of their information remained constrained for years.

Traditional combat bomb

The censorship has several explanations. Even Manhattan Project scientists had only theoretical calculations about what to expect about the bombs radiation. As scientists studied the complex effects in the next years, the U.S. government classified information from Japan as well as related radiation information from medical research and the atomic bomb tests at the Nevada Test Site.

American officials wanted reassurance that Allied troops landing in Japan would not be endangered by any remaining radiation. Based on pre-bomb calculations, U.S. officials did not think that U.S. troops would be endangered by exposure to residual radiation, but the concept of radiological weapons, and the uncertainty around it, created fear.

An additional explanation for the censorship of information pertaining to radiation is that U.S. officials did not want the new weapon to be associated with radiological or chemical warfare, both of which were expanding in scope and funding after the war. Those associated with the atomic bomb wanted it to be viewed as a powerful but regular military weapon, a traditional combat bomb.

The results of the radiation censorship campaign have been hard to pin down both because of the nature of the silencing itself (including its incompleteness), and because knowledge leaked into public awareness in many ways and forms.

Historian Richard Miller observes that, "In the long run, the radiation from the bomb was more significant than the blast or thermal effects." Yet for years that radiation remained the least publicized and least understood of the atomic bomb effects.

Legacy of secrecy

Censorship about the radiation deaths and sickness from the atomic bombs in Japan was never, of course, entirely successful. American magazines featured fictional stories about cities ravaged by radiation. John Hersey's searing account, Hiroshima, became a bestseller in 1946, just as the summer's Crossroads atomic bomb tests in the Pacific received massive publicity, including reports about the disastrous radioactive spray that contaminated eighty of the Navy's unmanned test vessels.

Campaigns from governmental officials as well as military, scientific and industrial leaders sought to ease the public's fears with the alluring promises of miraculous medical cures and cheap energy from commercial nuclear power.

Historians have described the American public's reactions to Hiroshima as "muted ambivalence" and "psychic numbing." Historian John Dower observes that although Americans demonstrated a long-term cyclical interest in what happened beneath the mushroom cloud, the nation's more persistent response to Hiroshima and Nagasaki has been "the averted gaze."

Secrecy, extraordinary levels of classification, lies, denial, and deception became the chief legacy of the initial impulse to censor radiation information from the Hiroshima and Nagasaki bombs.


Institute for Pure and Applied Mathematics awarded $25M renewal from NSF – UCLA Newsroom

UCLA's Institute for Pure and Applied Mathematics, through which mathematicians work collaboratively with a broad range of scholars of science and technology to transform the world through math, has received a five-year, $25 million funding renewal from the National Science Foundation, effective Sept. 1.

The new award represents the latest investment by the NSF, which has helped to support IPAM's innovative multidisciplinary programs, workshops and other research activities since the institute's founding in 2000.

"The continued NSF funding will enable IPAM to further its mission of creating inclusive new scientific communities and to bring the full range of mathematical techniques to bear on the great scientific challenges of our time," said Dimitri Shlyakhtenko, IPAM's director and a UCLA professor of mathematics. "We will be able to continue to sponsor programs that bring together researchers from different scientific disciplines or from different areas of mathematics, with the goal of sparking interdisciplinary collaboration that continues long after the IPAM program ends."

"Mathematics has become increasingly central to science and technology, with applications in areas as diverse as search engines, cryptography, medical imaging, data science and artificial intelligence, to name a few," Shlyakhtenko said. "Future developments, from sustainable energy production to autonomous vehicles and quantum computers, will require further mathematical innovation as well as the application of existing mathematics."

IPAM's goal is to foster interactions between mathematicians and doctors, engineers, physical scientists, social scientists and humanists that enable such technological and social progress. In the near future, for example, IPAM will be partnering with the new NSF Quantum Leap Challenge Institute for Present and Future Quantum Computation, which was launched in July with a five-year, $25 million award to UC Berkeley, UCLA and other universities.

Over its two decades of existence, IPAM has helped to stimulate mathematical developments that advance national health, prosperity and welfare through a variety of programs and partnerships that address scientific and societal challenges. Its workshops, conferences and longer-term programs, which last up to three months, bring in thousands of visitors annually from academia, government and industry.

IPAM also helps to train new generations of interdisciplinary mathematicians and scientists and places a particular emphasis on the inclusion of women and underrepresented minorities in the mathematics community.

In addition to the IPAM funding, the NSF recently announced five-year awards to five other mathematical sciences research institutes.

"The influence of mathematical sciences on our daily lives is all around us and far-reaching," said Juan Meza, director of the NSF Division of Mathematical Sciences. "The investment in these institutes enables interdisciplinary connections across fields of science, with impacts across sectors of computing, engineering and health."


Eight trends accelerating the age of commercial-ready quantum computing – TechCrunch

Ethan Batraski, Contributor

Ethan Batraski is a partner at Venrock, where he invests across sectors with a particular focus on hard engineering problems such as developer infrastructure, advanced computing and space.

Every major technology breakthrough of our era has gone through a similar cycle in pursuit of turning fiction to reality.

It starts in the stages of scientific discovery: the pursuit of principle against a theory, a recursive process of hypothesis and experiment. Success at the proof-of-principle stage graduates the effort to a tractable engineering problem, where the path to a systemized, reproducible, predictable system is generally known and de-risked. Lastly, once the system is engineered to its performance requirements, focus shifts to repeatable manufacturing and scale, simplifying designs for production.

Since being theorized by Richard Feynman and Yuri Manin, quantum computing has been thought to be in a perpetual state of scientific discovery, occasionally reaching proof of principle on a particular architecture or approach but never able to overcome the engineering challenges to move forward.

That's until now. In the last 12 months, we have seen several meaningful breakthroughs from academia, venture-backed companies and industry that look to have cleared the remaining hurdles along the scientific discovery curve, moving quantum computing from science fiction that has always been five to seven years away to a tractable engineering problem, ready to solve meaningful problems in the real world.

Companies such as Atom Computing* leveraging neutral atoms for wireless qubit control, Honeywell's trapped-ion approach, and Google's superconducting metals have demonstrated first-ever results, setting the stage for the first commercial generation of working quantum computers.

While early and noisy, these systems, even at just the 40-80 error-corrected-qubit range, may be able to deliver capabilities that surpass those of classical computers, accelerating our ability to perform better in areas such as thermodynamic predictions, chemical reactions, resource optimization and financial predictions.

As a number of key technology and ecosystem breakthroughs begin to converge, the next 12-18 months will be nothing short of a watershed moment for quantum computing.

Here are eight emerging trends and predictions that will accelerate quantum computing readiness for the commercial market in 2021 and beyond:

1. Dark horses of QC emerge: 2020 will be the year of dark horses in the QC race. These new entrants will demonstrate dominant architectures with 100-200 individually controlled and maintained qubits at 99.9% fidelities, with millisecond-to-second coherence times, representing 2x-3x improvements in qubit power, fidelity and coherence. These dark horses, many venture-backed, will finally prove that resources and capital are not the sole catalysts for a technological breakthrough in quantum computing.


OODAcast: Bradley Rotter On The Future Of Work, CryptoCurrencies, Quantum Computing and Leadership – OODA Loop

Bradley Rotter is a visionary investor who has pioneered investments in many new alternative asset classes, including being an early backer of hedge funds in 1982 while speculating on the Chicago Mercantile Exchange. He was also an early investor in Bitcoin and other cryptocurrency ecosystems, and at a dinner with OODA CEO Matt Devost in 2012 he predicted that Bitcoin would exceed the price of gold.

Bradley moved to San Francisco in the mid-1980s to be close to the technology fountainhead of the Bay Area. In 1995 he was famously quoted as saying "this internet thing is going to be big," and this vision guided his investments in several successful technology companies.

Bradley has made numerous VC and PE investments, with a particular focus on internet and technology and spanning from hedge funds to satellites.

This wide-ranging conversation hits on multiple high-tech topics, including quantum computing, cryptocurrencies and data analytics.

Podcast Version:

Additional Reading:

Quantum Computing Sensemaking

Is Quantum Computing Ushering in an Era of No More Secrets?


A Quintillion Calculations a Second: DOE Calculating the Benefits of Exascale and Quantum Computers – SciTechDaily

By U.S. Department of Energy, August 6, 2020

To keep qubits used in quantum computers cold enough so scientists can study them, DOE's Lawrence Berkeley National Laboratory uses a sophisticated cooling system. Credit: Image courtesy of Thor Swift, Lawrence Berkeley National Laboratory

A quintillion calculations a second. That's a one with 18 zeros after it. It's the speed at which an exascale supercomputer will process information. The Department of Energy (DOE) is preparing for the first exascale computer to be deployed in 2021. Two more will follow soon after. Yet quantum computers may be able to complete more complex calculations even faster than these up-and-coming exascale computers. But the two technologies complement each other much more than they compete.

It's going to be a while before quantum computers are ready to tackle major scientific research questions. While quantum researchers and scientists in other areas are collaborating to design quantum computers to be as effective as possible once they're ready, that's still a long way off. Scientists are figuring out how to build qubits for quantum computers, the very foundation of the technology. They're establishing the most fundamental quantum algorithms that they need to do simple calculations. The hardware and algorithms need to be far enough along for coders to develop operating systems and software to do scientific research. Currently, we're at the same point in quantum computing that scientists in the 1950s were with computers that ran on vacuum tubes. Most of us regularly carry computers in our pockets now, but it took decades to get to this level of accessibility.
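
For a sense of what that "very foundation" looks like, here is a one-qubit simulation in plain linear algebra; this is a textbook illustration, not code from any DOE program:

    import numpy as np

    zero = np.array([1, 0], dtype=complex)          # the |0> basis state
    hadamard = np.array([[1, 1],
                         [1, -1]]) / np.sqrt(2)     # gate that creates superposition

    state = hadamard @ zero                         # equal superposition of |0> and |1>
    probabilities = np.abs(state) ** 2              # Born rule: measurement probabilities
    print(probabilities)                            # [0.5 0.5]

Simulating n qubits this way requires vectors of length 2^n, which is exactly why classical machines bog down and real quantum hardware becomes attractive.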

In contrast, exascale computers will be ready next year. When they launch, they'll already be five times faster than our fastest computer, Summit, at Oak Ridge National Laboratory's Leadership Computing Facility, a DOE Office of Science user facility. Right away, they'll be able to tackle major challenges in modeling Earth systems, analyzing genes, tracking barriers to fusion, and more. These powerful machines will allow scientists to include more variables in their equations and improve models' accuracy. As long as we can find new ways to improve conventional computers, we'll do it.

Once quantum computers are ready for prime time, researchers will still need conventional computers. They'll each meet different needs.

DOE is designing its exascale computers to be exceptionally good at running scientific simulations as well as machine learning and artificial intelligence programs. These will help us make the next big advances in research. At our user facilities, which are producing increasingly large amounts of data, these computers will be able to analyze that data in real time.

Quantum computers, on the other hand, will be perfect for modeling the interactions of electrons and nuclei that are the constituents of atoms. As these interactions are the foundation for chemistry and materials science, these computers could be incredibly useful. Applications include modeling fundamental chemical reactions, understanding superconductivity, and designing materials from the atom level up. Quantum computers could potentially reduce the time it takes to run these simulations from billions of years to a few minutes. Another intriguing possibility is connecting quantum computers with a quantum internet network. This quantum internet, coupled with the classical internet, could have a profound impact on science, national security, and industry.

Just as the same scientist may use both a particle accelerator and an electron microscope depending on what they need to do, conventional and quantum computing will each have different roles to play. Scientists supported by the DOE are looking forward to refining the tools that both will provide for research in the future.

