The Culture Wars Aren't Real. The People They Hurt Are. – BuzzFeed News

Earlier this month, British media once again platformed a talking point pushed by anti-trans activists. The Sunday Times, among other outlets, "reported" on a backlash to a hypothetical scenario in which a sex offender might choose to identify as a woman. That an imaginary notion was elevated into a news item speaks to the entrenched anti-transness of British media.

But then Harry Potter author J.K. Rowling tweeted the article, further elevating the narrative; she became a trending topic on Twitter, promoting a link between transness and sexual violence. US media covered her tweet as a controversy that implied there might be a real threat posed by trans self-determination. It might be too obvious to state, but there isn't. In fact, there is actually an ongoing epidemic of violence against trans women, and no such pattern of trans women committing violence against cis women or anyone else.

This was, however, the latest point of panic in a wave reacting to the so-called transgender tipping point of visibility. And Rowling in particular has chosen to make herself the face of this backlash.

Just last year, she publicly targeted clinics where trans youth receive lifesaving gender-affirming treatment, turning trans bodily self-determination into a story about the supposedly threatened safety of cis children. She has also mocked evolving public health language that includes trans men and nonbinary people, hijacking that recognition to create a story about the supposed erasure of cis women.

Like fellow billionaire Peter Thiel, she has even reportedly deployed her money and power to try to silence criticism. And yet all the while her anti-trans campaigning is generally characterized by the media as controversial views: not part of an explicit agenda, but an ongoing human interest mystery chronicled as a perplexing personal evolution.

Rowling's status as a celebrity billionaire affords her extra protection and the benefit of the doubt while also helping to amplify her talking points. But it's specifically because she speaks as a white woman with concerns about the safety of women and children that her anti-trans framing is accepted on Twitter and treated by the media at large as worthy of debate.

Never explicitly framed as a misinformation agent who might merit deplatforming, Rowling is a symptom of the current media ecosystem, in which disinformation about minority identities is accepted as legitimate controversy.

This scenario comes into play whenever powerful people, institutions, or political organizations raise public concerns about the protection of majority groups, especially white women and children.

In fact, two of the biggest, seemingly unrelated, culture war stories this year were propelled by a similar reframing of misinformation as legitimate debate. These so-called controversies concerned trans people, especially trans girls and women, and the teaching of history, the latter known as the critical race theory debate.

Both were part of political backlashes that came in response to increased visibility for minority groups: increased representation of trans people in media and public debate about gender and the Black Lives Matter protests in the wake of the murder of George Floyd.

Framed largely by right-wing activists and think tanks as human interest issues about fairness in sports and classrooms, they circulated into national legacy media including publications like USA Today, CBS, The Atlantic, and the New York Times through first-person opinion pieces by mothers of cis athletes raising fears about trans inclusion or human interest reports featuring on-the-ground stories of white moms airing complaints about supposed radical ideas being introduced in schools.

Whatever the content of the reporting or articles, in platforming these issues through the concerns of cis and white people, mainstream media helped distort what constitutes legitimate perspectives for coverage, and in doing so sidelined the actual difficulties experienced by marginalized communities, including Black and trans youth.

Ultimately, this kind of coverage raises deeper questions about news organizations and who decides the perspective of culture war journalism.

There's a long history in the US of setting the terms of debate by centering media narratives around the well-being of white women and children. It's usually associated with anti-Black and anti-gay right-wing activism and can be traced back to anti-school-integration campaigns in the '60s, through "save the children" anti-gay campaigns in the '70s, and even the coverage questioning how children would fare under marriages between same-sex couples in the aughts.

Right-wing activists used similar framing to introduce the so-called controversy over critical race theory. Attempts to eradicate histories of race in the US are nothing new. As recently as 2011, activists attempted to ban ethnic studies and Mexican American studies curricula in Arizona. But ethnic studies simply doesn't have the polarizing or concerning ring necessary to stoke a national panic about existing curricular offerings like studying civil rights leader Martin Luther King Jr.

The term critical race theory was perfect for right-wing campaigns, though, because, as one activist told the New Yorker this summer, to most Americans it connotes "hostile, academic, divisive, race-obsessed, poisonous, elitist, anti-American." So a long-term campaign to dismantle any talk of race and history in schools was rebranded as a crusade against critical race theory, even though that term actually refers to a graduate-level theory about the intersections of law, culture, and structural racism that has nothing to do with history lessons in elementary classrooms.


The idea of the country as race-obsessed and race discourse as destructively divisive was already percolating in the wake of Black Lives Matter protests, especially after George Floyd's murder.

Outlets like the New York Times and the Atlantic dedicated valuable resources to reporting on the supposed excesses of anti-racism. These "nuance" stories by white journalists included one about a Black father's school board campaign against anti-racism. In the Times, there was a story about a Black student who made a supposedly false accusation of racial profiling at Smith College. Even attempts at self-reflection centered whiteness and painted anti-racism as an elitist concern; a story about the Times' own newsroom racism was used to highlight how privileged white high schoolers now felt entitled to call out racism.

Right-wing think tanks, like the Manhattan Institute, promoted that precise notion online and in legacy media to activate parents into believing anti-racism was out of control. Quotes from concerned moms further stoked these fears: "They are making my son feel like a racist because of the pigmentation of his skin."

The idea of talking about race wasn't necessarily new to many Black and brown parents, for whom discussing the realities of inequality and existing in a white world isn't optional in the same way. Yet outlets including CBS and the Atlantic picked up that framing too, feeding into the sense that radically new ideas were suddenly being introduced, with headlines like "When the culture war comes for the kids" and "How young is too young to teach kids about race?" (The latter headline was changed after a backlash.)

As the November elections neared, news stories about suburban or small-town parents battling over school curricula started popping up as well. The framing of these battles through reported human interest stories, rather than, say, misinformation explainers, suggested that these were newsworthy grassroots issues that spoke to broad parental fears rather than a vocal minority stoking social media disinformation.

To some degree, the stories discredited the panic about race education in schools by pointing out the organizations and dark money groups (like the Judicial Crisis Network) who helped fund these campaigns and including voices of supporters of existing curriculums. But they still promoted the idea that these battles represented two equal sides of inflamed national feelings, rather than a strategically invented controversy and well-funded top-down disinformation campaigns.

Ultimately, the timing and framing of these stories about race and education highlight that they were not deemed newsworthy because of concerns of the community members at the center, Black parents and youth, or the massive ongoing inequality around race and class that still permeates public schools. Instead, they helped reframe debate to center white parental anxieties.

In many ways, this same scenario, misinformation platformed as debate, has been playing out in the coverage of trans people, long before J.K. Rowling seized the moment as a major anti-trans voice. Newsrooms lacking in trans journalists had been framing trans existence through concerns that trans people were incapable of deciding their bodily self-determination on their own.

This type of clueless question about how young is too young for children to know their gender, coming from outside the trans community, was epitomized by a now-infamous 2018 Atlantic cover story. It explicitly addressed imagined anxious white parents with the (misgendering) headline, "Your child says she's trans... she's 13."

The story, and its cis panic about trans identity as some kind of trend among teens, was later completely debunked by other news outlets. Since then, gender historians have shown the long history of trans children, and studies have confirmed that trans children are just as certain about their gender as cis teens. The Atlantic never officially apologized for the story's framing, including misgendering and outing the cover model. (The writer, however, has since been placed on watch lists for anti-LGBTQ journalism.)


In the real world, not all trans people are white and middle class, and many have little access to healthcare even if they can find an affirming clinic, especially when most insurance companies refuse to cover such care. Trans people struggle not with identity itself, but with an anti-trans world that restricts access to resources for transition and features gatekeepers who set rules and timelines on cis terms. And unsurprisingly, November's elections saw right-wing activists promoting a new wave of bills blocking access to healthcare for trans people. As GLAAD pointed out, that Atlantic cover story was used in a legal brief filed by seven state attorneys general in a federal lawsuit seeking to roll back existing healthcare access for trans people.

This year, right-wing activists expanded their concern to sports. And it wasn't an accident they set that arena as a location to invent debate. Like classrooms, sports are imagined by white Americans as a neutral space of meritocracy, and right-wing think tanks purposely promoted that setting for human interest stories about fairness.

Publications including USA Today and the Economist took the bait, uncritically platforming first-person pieces by white mothers and white athletes airing out concerns about maybe having to compete against trans girls. The misinformation spread by cis athletes about hormonal or strength differences was ultimately debunked.

But real questions about meritocracy, including around race and class inequality, did not even get folded into these chronicles, revealing that narratives were about the protection of supposedly endangered young white women. This becomes clearer when considering that the surveillance regarding testosterone levels has primarily targeted cis Black women athletes.

Given the minority status of trans people in society at large, it's unsurprising that trans athletes never even materialized in most states where the bills were being pushed. Yet even positive human interest pieces about trans athletes were reactive ones in which trans humanity was rendered visible only in terms of the wave of cis fears.

As with the CRT coverage, the focus on questions about youth transition or sports isn't actually about the struggles of trans people at all, which include disproportionately high rates of housing insecurity and under- or unemployment.

The sidelining of actual trans issues in order to debate imaginary fears does, however, speak to broader systemic problems with media and the way that trans people circulate as objects of coverage for cis people rather than subjects of their own reality. Even media attempts to cover anti-trans activism have turned into debates between cis women about transness through controversies about trans-exclusionary radical feminists (TERFs).

That moniker itself, recently used to describe Rowling, platforms anti-trans activists within the context of feminism and has lent legitimacy to their efforts, portraying bigotry as some kind of newfangled intellectual exercise over the meaning of feminism or queer community. In fact, anti-transness is part of a long history of class and racial exclusions in feminism, both in media and, most importantly, in the real world, where trans identity has been made into a scapegoat for anger about inequality more broadly.

It's unquestionable that the CRT and trans debates have been pushed into the media by right-wing activism and conservative politicians through strategic waves of anti-CRT and anti-trans bills. They're even timed to purposely inflame conservatives and rally the base for elections.

But at this point, it's too easy to see anti-trans and anti-Black concern-mongering as just an issue of right-wing misinformation. After all, these framings are accepted for coverage via the editorial judgments of majority white and cis newsrooms.

So-called culture war issues are where the media allows itself maximal editorializing on behalf of cis white anxieties and fears about a changing world. But the terms for what becomes a culture war story are not decided by the public. Instead, they are decided in newsrooms that don't mirror reality but certainly help shape it.

American newsrooms are even whiter than the country as a whole, and it's in that context of media echo chambers that critical race theory is repackaged as controversial. Most Americans believe the history of slavery should be taught, for instance. And after the 2021 November election, polls showed that even the idea that critical race theory drove elections was overstated.

Similarly, trans rights are actually not controversial in the US population at large. But trans journalists are woefully underrepresented in newsrooms. It's predictable that cis journalists talking to each other about transness results in stories that home in on and magnify cis debates about trans identity. This dynamic sidelines the potential richness of good faith exchanges within the trans community about the complexity of existing in a cis world.

Current thinking about misinformation is focused on anti-science or partisan campaigns that exist in the social media ether. But there are other important questions, like the way the media feeds into misinformation by platforming sources that reframe debate outside the terms of the communities these debates actually affect.


Partisanship is still the favored term in journalism for talking about media balance. But considering editorial judgment through partisanship simply recreates existing power imbalances by focusing on issues about race, class, and gender only if theyre legible through the lens of Republican vs. Democrat. It would mean something quite different if corporate media held itself accountable to the communities it covers rather than political parties.

Categorizing questions about ethical coverage through partisanship issues also helps ignore uncomfortable realities about news capitalism, like the fact that newsrooms need to make a profit and stories are often packaged for advertisers and imagined white readers.

Financial incentives are a major reason why it's hard to wean media off engaging with misinformative framings to capture cis and white readers, who still constitute a majority of the public. After all, these panicked stories feed engagement for Twitter, Facebook, legacy media, and new venture capitalist corporate platforms like Substack.

It's not an accident that in all the race and trans backlash stories, class is invoked not to call out how white middle- and upper-class perspectives shape newsrooms (including through media CEOs). Instead, it is invoked to imply that anti-racism or trans rights are somehow an elitist concern. This framing takes pressure off the publications themselves to engage with these issues as a labor concern in their own newsrooms. But divorcing stories about class and identity from the real world and existing power structures is a distortion. Framing and context shouldn't only be dictated by cis white fears and concerns.

Still, there have been some changes by newsrooms around the framing of stories to acknowledge power imbalances in the real world. The Verge has updated its policies for giving big tech companies anonymity as background sources for articles. Some news organizations are questioning the uncritical use of police sources when ascertaining the truth of events. Cis and white concerned parents might be less obviously identifiable as problematic sources, but its a powerful category of people due for a similar reckoning.

Tellingly, after a backlash to the white framing of its "how young is too young" CRT story, CBS changed the headline not to, say, "White Parents Are Finally Having to Grapple With Questions Others Routinely Do." Instead, it was replaced with a nonclickbait-y mouthful: "Documentary explores debate over how and when race should be taught in schools."

That shift of the framing to debate is the customary way mainstream media dodges any pressure about taking sides. But platforming both sides implies we live in an already equal world. We don't. And that's a fact.


Jack Dorsey Goes Bananas Against VCs and the Centralization of Web3 – BeInCrypto

Former Twitter CEO Jack Dorsey said on Twitter that VCs were gaining all the benefits of Web3 and that it was another form of centralization. Many notable names fought back, dismissing the criticism.

Former Twitter CEO and co-founder Jack Dorsey launched a series of disparaging tweets against venture capitalists and corporations on Dec 21. Dorsey stepped down as Twitter CEO early in December 2021, sparking much discussion about the move.

Dorsey's latest tweets, made over the course of two days, remarked that the problem was with VCs, not end users. He said that retail investors and users don't own Web3, but that VCs and their limited partners do. He called it ultimately a centralized entity with a different label and said that users should know what [they're] getting into.

The overall sentiment of the tweets was that all the benefits of Web3 would be going towards VCs and not the public as it was originally envisioned to. VCs like Balaji Srinivasan and a partner at VC a16z fought back, disagreeing with the view.

Srinivasan said that Twitter itself has become a slave to corporate and political incentives, which led to deplatforming and censorship. A16z partner Chris Dixon also responded with criticism to the tweet, but Dorsey retorted by calling a16z "a fund determined to be a media empire that can't be ignored... not Gandhi."

Even Elon Musk replied to the tweet, saying he couldn't find Web3. Dorsey responded by quipping that it's somewhere between a and z. Musk has also criticized Web3 in the past.

Dorsey also rebutted Srinivasan's statement on Twitter, saying that the platform had begun as a corporation. He elaborated that Web3 had corporate incentives but was hiding them under decentralization.

While Dorsey is clearly opposed to how VCs have taken to Web3, he is all for decentralization. The entrepreneur has on multiple occasions shown support for bitcoin and decentralized initiatives.

Dorsey's departure from Twitter brought about some speculation as to why it happened. Some believe it had to do with Twitter's increasing censorship. Indeed, after Dorsey left, Twitter changed its privacy policy and began implementing bans.

It's hard to say why Dorsey left, but it's clear that he was not happy with how things were generally proceeding in the tech space. Censorship has become a hot topic in 2021 and will remain a point of contention in the future.

Dorsey is now leading the initiatives of Block, formerly known as Square. The company has made multiple forays into the cryptocurrency space. He announced in Nov. 2021 that there would be a new product for at-home bitcoin mining, in addition to asking for help in building an open, decentralized exchange for bitcoin.



BlockbusterDAO: A New Experiment in Harnessing the Power of Tokenized Nostalgia – The Tokenist


Many believe GameStop was saved from bankruptcy by retail trader-fueled nostalgia. Can the same work for the iconic Blockbuster franchise, but in a new tokenized DAO form?

Payment integration is not the only thing FinTech revolutionized. By pushing the envelope of convenience, FinTech sparked the growth of crowdfunding campaigns. Last year, in North America alone, crowdfunding funds increased by 33%, generating $17.2 billion. On average, such campaigns manage to raise about $29k.

However, as we near the end of 2021, we are seeing the evolution of crowdfunding through decentralization. More specifically, tokenized DAOs (decentralized autonomous organizations) are ending the year with a bang.

After all, DAOs are resistant to deplatforming while providing a transparent form of staking and voting through tokens. Whatever its odds of success, BlockbusterDAO shows another avenue made possible through smart contracts. In this case, the DAO aims to revitalize the spirit of the long-lost video-rental store chain, Blockbuster.

BlockbusterDAO is both a Twitter handle, having joined the platform this month, and a name for a new DAO. Its stated goal in a recent Twitter thread from December 25th is to raise enough funds to buy the Blockbuster brand from the current owner, Dish Network, a satellite TV provider with a 15.6% market share in the US.

From the thread, it is clear that BlockbusterDAO wants to buy the brand to pursue a set of objectives laid out by its organizers.

We have seen variations of this approach in play before. ConstitutionDAO, a collective of politically minded crypto enthusiasts, attempted to buy a rare copy of the U.S. Constitution at Sotheby's after having raised an impressive $27 million within a week of its launch. However, even though the auction house estimated the sale price at between $15 million and $20 million, Ken Griffin outbid the DAO at $43 million.

This may be a double-edged sword for such crowdfunding campaigns. They rely on public traction, but that same publicity also brings unknown bidders out of the woodwork. Nostalgia for days long gone is a big factor as well, with public sentiment toward the brand having shifted from negative to positive.

Moreover, a documentary came out on Netflix in 2020, titled The Last Blockbuster, depicting the last remaining Blockbuster retail store in Bend, Oregon. It is safe to say this further increased the nostalgia factor, and the Blockbuster price tag with it. In Q3 2021, Dish Network reported total revenue of $4.45 billion, so $5 million for the brand would constitute roughly 0.1% of the company's quarterly revenue.
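As a quick back-of-the-envelope check of that proportion (a minimal Python sketch using only the two figures quoted above, not financial analysis):

    # Rough sanity check of the percentage quoted above (illustrative only).
    brand_ask = 5_000_000            # BlockbusterDAO's reported target for the brand, in USD
    dish_q3_revenue = 4_450_000_000  # Dish Network's reported Q3 2021 revenue, in USD

    print(f"{brand_ask / dish_q3_revenue:.2%}")  # ~0.11% of a single quarter's revenue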

BlockbusterDAO seems to be aware of this problem, proposing to begin an awareness and PR campaign to build pressure on Dish to sell.

However, it was also the case with ConstitutionDAO that it managed to raise well above its initial threshold, overshooting it by 137.42%. If that happens again, fortunes may turn out differently.

Unfortunately, ETH gas fees are so severe and volatile that they alone amounted to $800,000 to $1 million in refund expenditure, according to figures pulled by Richard Chen. Whatever happens, the DAO funding collection and refund mechanisms clearly work. Even if BlockbusterDAO fails in its original mission, we are seeing a new trend emerge in real time.

Outside of costly ETH gas fees, DAO funding provides a non-mediated guarantee of refunds thanks to smart contracts. In turn, this creates public confidence and trust.

Likewise, both BlockbusterDAO and ConstitutionDAO have made good headway in informing the public about what is possible with smart contract blockchains. These are all tailwinds for future overfunded projects with secondary goals that may turn into Big DeFi juggernauts.


The story of Blockbuster is eerily similar to GameStop's. Launched in Dallas in 1985, Blockbuster's in-store video rental business grew to a massive network of 9,000 outlets across the country. Unfortunately, it couldn't last long beyond its peak in 2004, when Blockbuster scored $5.9 billion in revenue.

The mass adoption of broadband internet, cheap storage, and on-demand online video all conspired to erode Blockbuster's business model. This was further exacerbated by the elimination of late fees, which cost around $200 million, alongside the unsuccessful launch of Blockbuster Online. Moreover, the emerging Netflix had no brick-and-mortar baggage.

Lastly, the CEO of Blockbuster, John Antioco, made a critical mistake. He could have bought Netflix in its early stage for a mere $50 million, according to Netflix co-founder Marc Randolph in his book That Will Never Work. This was just after the dot-com bubble crash, so Netflix was in dire straits, still having to rely on an unprofitable DVD-by-mail rental service.

Fast forward to Blockbuster's bankruptcy in September 2010, and there were no revenue-generating avenues left to explore.




What is Web 3.0 and why it is being called next generation internet? – Business Standard

The current version of the world wide web, or Web 2.0, is characterised by social media platforms, which allow greater proliferation of user-generated content. This is a far cry from Web 1.0, which was all static and non-interactive: an entirely top-down approach towards information dissemination.

Right now, a handful of big tech companies, namely Twitter, Facebook (now Meta), Google, Apple, Microsoft and Amazon, control how our data will be used and where it will be stored and processed. Their algorithms decide the information that we consume, which has left alarm bells ringing.

Now, Web 3.0, with its crypto, blockchain and metaverse use cases, is being touted as a movement that will wrest back control of the internet from these big tech companies. Instead of our data residing with centralised organisations, as it does today, Web 3.0 would see it residing on blockchain networks and thus being owned by users themselves.

It could be as simple as a user based in India and another based in the US having a business meeting inside a virtual reality metaverse such as Decentraland, which is built on the Ethereum blockchain. They could then complete their planned business deal using their crypto wallets linked to their metaverse accounts. And that's that.

Facebook realises that this is the future of the internet, hence its rebranding to Meta. Such is the craze around the metaverse that people and organisations are spending millions of dollars to buy land that only exists inside these virtual worlds. But it makes business sense: in a future when people are going to wear their VR headsets and meet inside these virtual worlds for social gatherings, music concerts and art auctions, you need land there for advertising and events. However, as there are proponents, so there are sceptics as well.

Twitter co-founder and former CEO Jack Dorsey has denounced the much-hyped decentralised nature of Web 3.0.



What is Quantum Computing? | IBM

Let's look at an example that shows how quantum computers can succeed where classical computers fail:

A supercomputer might be great at difficult tasks like sorting through a big database of protein sequences. But it will struggle to see the subtle patterns in that data that determine how those proteins behave.

Proteins are long strings of amino acids that become useful biological machines when they fold into complex shapes. Figuring out how proteins will fold is a problem with important implications for biology and medicine.

A classical supercomputer might try to fold a protein with brute force, leveraging its many processors to check every possible way of bending the chemical chain before arriving at an answer. But as the protein sequences get longer and more complex, the supercomputer stalls. A chain of 100 amino acids could theoretically fold in any one of many trillions of ways. No computer has the working memory to handle all the possible combinations of individual folds.
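To make that combinatorial explosion concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes, purely for illustration, that each link between neighboring amino acids can settle into just three stable orientations (a classic Levinthal-style simplification, not a real biophysical model):

    # Toy estimate of the folding search space for a 100-amino-acid chain.
    # Assumption (illustrative only): each of the 99 links can adopt ~3 stable orientations.
    n_residues = 100
    orientations_per_link = 3

    conformations = orientations_per_link ** (n_residues - 1)
    print(f"{conformations:.2e}")  # ~1.72e+47 possible conformations

    # Even checking a billion conformations per second, sequential enumeration would take
    # vastly longer than the age of the universe -- which is why brute force stalls
    # as chains grow longer.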

Quantum algorithms take a new approach to these sorts of complex problems -- creating multidimensional spaces where the patterns linking individual data points emerge. In the case of a protein folding problem, that pattern might be the combination of folds requiring the least energy to produce. That combination of folds is the solution to the problem.

Classical computers cannot create these computational spaces, so they cannot find these patterns. In the case of proteins, there are already early quantum algorithms that can find folding patterns in entirely new, more efficient ways, without the laborious checking procedures of classical computers. As quantum hardware scales and these algorithms advance, they could tackle protein folding problems too complex for any supercomputer.



What Is Quantum Computing? | NVIDIA Blog

Twenty-seven years before Steve Jobs unveiled a computer you could put in your pocket, physicist Paul Benioff published a paper showing it was theoretically possible to build a much more powerful system you could hide in a thimble: a quantum computer.

Named for the subatomic physics it aimed to harness, the concept Benioff described in 1980 still fuels research today, including efforts to build the next big thing in computing: a system that could make a PC look, in some ways, as quaint as an abacus.

Richard Feynman, a Nobel Prize winner whose wit-laced lectures brought physics to a broad audience, helped establish the field, sketching out how such systems could simulate quirky quantum phenomena more efficiently than traditional computers.

Quantum computing is a sophisticated approach to making parallel calculations, using the physics that governs subatomic particles to replace the more simplistic transistors in todays computers.

Quantum computers calculate using qubits, computing units that can be on, off, or any value in between, instead of the bits in traditional computers that are either on or off, one or zero. The qubit's ability to live in this in-between state, called superposition, adds a powerful capability to the computing equation, making quantum computers superior for some kinds of math.

Using qubits, quantum computers could buzz through calculations that would take classical computers a loooong time if they could even finish them.

For example, today's computers use eight bits to represent any number between 0 and 255. Thanks to features like superposition, a quantum computer can use eight qubits to represent every number between 0 and 255, simultaneously.
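A minimal state-vector sketch in plain NumPy (not any real quantum SDK) illustrates that claim: putting each of eight qubits through a Hadamard gate leaves the register in an equal superposition over all 256 basis states, one amplitude per value from 0 to 255.

    import numpy as np

    # Hadamard gate: sends |0> to an equal mix of |0> and |1>.
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

    state = np.array([1.0])            # empty register
    for _ in range(8):                 # append a qubit in |0>, apply H, join via tensor product
        state = np.kron(state, H @ np.array([1.0, 0.0]))

    print(state.shape)                 # (256,) -- one amplitude per value 0..255
    print(np.allclose(state, 1 / 16))  # True: every amplitude equals 1/sqrt(256)

(Measuring the register still returns just one 8-bit value; quantum algorithms are designed so that the useful answer is the one most likely to be read out.)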

It's a feature like parallelism in computing: All possibilities are computed at once rather than sequentially, providing tremendous speedups.

So, while a classical computer steps through long division calculations one at a time to factor a humongous number, a quantum computer can get the answer in a single step. Boom!

That means quantum computers could reshape whole fields, like cryptography, that are based on factoring what are today impossibly large numbers.

That could be just the start. Some experts believe quantum computers will bust through limits that now hinder simulations in chemistry, materials science and anything involving worlds built on the nano-sized bricks of quantum mechanics.

Quantum computers could even extend the life of semiconductors by helping engineers create more refined simulations of the quantum effects they're starting to find in today's smallest transistors.

Indeed, experts say quantum computers ultimately won't replace classical computers; they'll complement them. And some predict quantum computers will be used as accelerators much as GPUs accelerate today's computers.

Don't expect to build your own quantum computer like a DIY PC with parts scavenged from discount bins at the local electronics shop.

The handful of systems operating today typically require refrigeration that creates working environments just north of absolute zero. They need that computing arctic to handle the fragile quantum states that power these systems.

In a sign of how hard constructing a quantum computer can be, one prototype suspends an atom between two lasers to create a qubit. Try that in your home workshop!

Quantum computing takes nano-Herculean muscles to create something called entanglement. That's when two or more qubits exist in a single quantum state, a condition sometimes measured by electromagnetic waves just a millimeter wide.

Crank up that wave with a hair too much energy and you lose entanglement or superposition, or both. The result is a noisy state called decoherence, the equivalent in quantum computing of the blue screen of death.

A handful of companies such as Alibaba, Google, Honeywell, IBM, IonQ and Xanadu operate early versions of quantum computers today.

Today they provide tens of qubits. But qubits can be noisy, making them sometimes unreliable. To tackle real-world problems reliably, systems need tens or hundreds of thousands of qubits.

Experts believe it could be a couple decades before we get to a high-fidelity era when quantum computers are truly useful.

Predictions of when we will reach so-called quantum computing supremacy (the time when quantum computers execute tasks classical ones can't) are a matter of lively debate in the industry.

The good news is the world of AI and machine learning put a spotlight on accelerators like GPUs, which can perform many of the types of operations quantum computers would calculate with qubits.

So, classical computers are already finding ways to host quantum simulations with GPUs today. For example, NVIDIA ran a leading-edge quantum simulation on Selene, our in-house AI supercomputer.

NVIDIA announced in the GTC keynote the cuQuantum SDK to speed quantum circuit simulations running on GPUs. Early work suggests cuQuantum will be able to deliver orders of magnitude speedups.

The SDK takes an agnostic approach, providing a choice of tools users can pick to best fit their approach. For example, the state vector method provides high-fidelity results, but its memory requirements grow exponentially with the number of qubits.

That creates a practical limit of roughly 50 qubits on today's largest classical supercomputers. Nevertheless, we've seen great results using cuQuantum to accelerate quantum circuit simulations that use this method.
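That roughly-50-qubit ceiling follows from simple arithmetic; a short sketch, assuming 16 bytes per complex128 amplitude:

    # Memory needed for a full n-qubit state vector at 16 bytes per amplitude.
    def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
        return (2 ** n_qubits) * bytes_per_amplitude

    for n in (30, 40, 50):
        print(f"{n} qubits: {state_vector_bytes(n) / 2**30:,.0f} GiB")
    # 30 qubits:         16 GiB
    # 40 qubits:     16,384 GiB (16 TiB)
    # 50 qubits: 16,777,216 GiB (16 PiB) -- on the order of the total memory of the largest classical systems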

Researchers from the Jülich Supercomputing Centre will provide a deep dive on their work with the state vector method in session E31941 at GTC (free with registration).

A newer approach, tensor network simulations, uses less memory and more computation to perform similar work.
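The contrast with the state vector method can be sketched in a few lines of NumPy (a toy illustration of the tensor-network idea only, not the cuQuantum API): gates are kept as small tensors and the network is contracted with einsum in a chosen order, rather than ever materialising one exponentially large vector.

    import numpy as np

    # Tiny circuit as a tensor network: H on qubit 0, then CNOT with qubit 0 as control.
    q0 = np.array([1.0, 0.0])                      # |0>
    q1 = np.array([1.0, 0.0])                      # |0>
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

    CNOT = np.zeros((2, 2, 2, 2))                  # CNOT[in_ctrl, in_tgt, out_ctrl, out_tgt]
    for c in (0, 1):
        for t in (0, 1):
            CNOT[c, t, c, c ^ t] = 1.0

    # Contract the network: sum over the internal indices i, j, k.
    state = np.einsum('ji,i,k,jkab->ab', H, q0, q1, CNOT)
    print(np.round(state.reshape(-1), 3))          # [0.707 0. 0. 0.707] -- a Bell state

For large circuits, the contraction order rather than the qubit count alone determines memory and compute cost, which is what makes this route attractive for GPU acceleration.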

Using this method, NVIDIA and Caltech accelerated a state-of-the-art quantum circuit simulator with cuQuantum running on NVIDIA A100 Tensor Core GPUs. It generated a sample from a full-circuit simulation of the Google Sycamore circuit in 9.3 minutes on Selene, a task that 18 months ago experts thought would take days using millions of CPU cores.

"Using the Cotengra/Quimb packages, NVIDIA's newly announced cuQuantum SDK, and the Selene supercomputer, we've generated a sample of the Sycamore quantum circuit at depth m=20 in record time: less than 10 minutes," said Johnnie Gray, a research scientist at Caltech.

"This sets the benchmark for quantum circuit simulation performance and will help advance the field of quantum computing by improving our ability to verify the behavior of quantum circuits," said Garnet Chan, a chemistry professor at Caltech whose lab hosted the work.

NVIDIA expects the performance gains and ease of use of cuQuantum will make it a foundational element in every quantum computing framework and simulator at the cutting edge of this research.



What Is Quantum Computing? – Data Center Knowledge

In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on a microchip had doubled every year since their invention while the costs were cut in half, a phenomenon that became known as Moore's Law.

More than 50 years of chip innovation have allowed transistors to get smaller and smaller until the point where it's no longer physically possible to reduce the size of transistors any further. As a result, improvements in computing are slowing down and new ways to process information will need to be found if we want to continue to reap the benefits from a rapid growth in computing.

Enter quantum computing: a radical new technology that could have a profound effect on all our lives. It has, for example, the potential to transform medicine and revolutionize the fields of artificial intelligence and cybersecurity.

But what exactly is quantum computing and how does it vary from the computers we use today? In short, it is fundamentally different. Today's computers operate using bits, which are best thought of as tiny switches that can either be in the off position (zero) or in the on position (one). Ultimately, all of today's digital data, whether that's a website or app you visit or an image you download, comprises millions of bits made up of ones and zeroes.

However, instead of bits, a quantum computer uses what's known as a qubit. The power of these qubits is their ability to scale exponentially so that a two-qubit machine allows for four calculations simultaneously, a three-qubit machine allows for eight calculations, and a four-qubit machine performs 16 simultaneous calculations.

According to Wired magazine, the difference between a traditional supercomputer and a quantum computer can best be explained by comparing the approaches that they might take in getting out of a maze. For example, a traditional computer will try every route in turn, ruling out each one until it finds the right one, whereas a quantum computer will go down every route at the same time. "It can hold uncertainty in its head," claims Wired.

Rather than having a clear position, unmeasured quantum states occur in a mixed 'superposition', similar to a coin spinning through the air before it lands in your hand.

While a single qubit can't do much, quantum mechanics has another phenomenon called entanglement, which allows qubits to be set up in a way so that their individual probabilities are affected by the other qubits in the system. For example, a quantum computer with two entangled qubits is a bit like tossing two coins at the same time: while they're in the air, every possible combination of heads and tails can be represented at once. The more qubits that are entangled together, the more combinations of information can be simultaneously represented.
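Here is a minimal sketch of that two-coin picture, again using a plain NumPy state vector rather than any quantum hardware or SDK: repeatedly "measuring" a two-qubit entangled (Bell) state only ever returns outcomes where the two qubits agree.

    import numpy as np

    # Bell state (|00> + |11>) / sqrt(2): two maximally entangled qubits.
    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    probabilities = np.abs(bell) ** 2        # [0.5, 0, 0, 0.5]

    rng = np.random.default_rng(42)
    samples = rng.choice(["00", "01", "10", "11"], size=12, p=probabilities)
    print(samples)   # only "00" and "11" appear: the two "coins" always land the same way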

Building a quantum computer is not without its problems. Not only does it have to hold an object in a superposition state long enough to carry out various processes on it, but the technology is also extremely sensitive to noise and environmental effects. Quantum chips must be kept colder than outer space to create superpositions, and information only remains quantum for so long before it is lost.

Nevertheless, researchers have predicted that quantum computers could help tackle certain types of problems, especially those involving a daunting number of variables and potential outcomes, like simulations or optimization questions. For example, they could be used to improve the software of self-driving cars, predict financial markets or model chemical reactions. Some scientists even believe quantum simulations could help find a breakthrough in beating diseases like Alzheimer's.

Cryptography will be one key application. Currently, encryption systems rely on the difficulty of breaking down large numbers into prime numbers, a process called factoring. Whereas this is a slow process for classical computers, for quantum computers it could be carried out very easily. As a result, all of our data could be put at risk if a quantum computer fell into the wrong hands. However, one way data could be protected is with quantum encryption keys, which could not be copied or hacked.
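To see why factoring stumps classical machines, here is a hedged sketch (simple trial division, nothing like a production algorithm): its cost grows roughly with the square root of the number, which becomes hopeless for the hundreds-of-digits integers used in real encryption, whereas Shor's algorithm on a large, error-corrected quantum computer would factor such numbers in polynomial time.

    # Naive classical factoring by trial division -- fine for toy numbers,
    # hopeless for RSA-sized integers with hundreds of digits.
    def trial_division(n):
        factors, d = [], 2
        while d * d <= n:
            while n % d == 0:
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)
        return factors

    print(trial_division(2021))   # [43, 47]
    # Work scales roughly with sqrt(n): add digits and the cost explodes,
    # which is the asymmetry today's public-key encryption depends on.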

There's no question that quantum computing could be a revolutionary technology. And while the prospect of a quantum notebook or mobile phone looks a very long way off, it's likely that quantum computers will be widespread in academic and industrial settings, at least for certain applications, within the next three to five years.


Quantum computing now has an out-of-this-world problem: Cosmic rays – ZDNet

A new academic paper reveals a worrisome tendency for cosmic rays to disrupt quantum computer processors in a way that may be nearly impossible for current error correction techniques to reliably counteract.

One of the biggest obstacles faced by quantum computers is dealing with error correction. Traditionally, this has been most commonly handled by grouping together multiple qubits, the quantum equivalent of traditional computing's bits, into a sort of committee within quantum processing units. Rather than the system relying on a single qubit, which may or may not be correct, it instead relies on the consensus provided by an entire group of qubits. This strips away erroneous outliers and greatly reduces the error rate to a point where it's extremely unlikely that it will interfere with an ongoing processing job.

Unfortunately, in a very sci-fi-sounding turn of events, it appears that an unseen enemy from outer space may be threatening the viability of this error-correcting technology.

Cosmic rays are invisible, microscopic particle beams that constantly bombard the Earth from sources as far away as other galaxies. They typically collide harmlessly with the planet's atmosphere as well as objects within it. In fact, you'll likely be hit by several of them while reading this article. Luckily, for our peace of mind, they generally go completely unnoticed and do absolutely no harm before continuing on their cosmic journey. Unfortunately for quantum computing developers, it appears that quantum processors may be far, far more sensitive to these typically unnoticeable intruders than they realized.

In a paper published in Nature Physics and covered by Ars Technica, it's been revealed that one of these typically harmless rays could cause a major problem when it hits an operating quantum CPU. According to the findings of several researchers working at Google Quantum AI, a cosmic ray strike on an operating quantum computer core can result in the formation of a quasiparticle called a phonon.

These phonons have the capacity to disrupt operations by inverting the quantum state of not only a single qubit, but an entire entangled set of qubits as they proliferate across the processor. This means a strike could distribute errors across an entire qubit set, essentially nullifying the protection provided by the committee-like error correction mentioned above.

In an experiment detailed within the paper, Google researchers tested a set of 26 qubits that were known to be amongst their most reliable. This set was then left in an idle state for 100 microseconds. While idling, reliable qubits should generally remain in their current state. To use a traditional, binary computing analogy, a 1 should remain a 1, a 0 should remain a 0.

On average, the 26-qubit set in question displayed an error rate of about 4 qubits that erroneously flipped their state within the 100 microsecond test period. This is well within the built-in error correction's ability to compensate by relying on the remaining majority of 22 qubits. However, during confirmed cosmic ray strikes, 24 of the 26 qubits were found to have erroneously flipped to the opposite state. This result is well beyond traditional error correction's ability to compensate for. Such an outcome would place the entire group in error and could throw the entire processing job's continuity into question.
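A rough simulation makes the contrast concrete (a sketch under stated assumptions: independent flips at roughly the observed 4-in-26 rate versus a correlated burst flipping 24 of 26, with simple majority voting standing in for real error-correcting codes):

    import numpy as np

    rng = np.random.default_rng(7)
    n_qubits, trials = 26, 10_000

    # Independent errors at ~4-in-26: the majority vote almost always recovers the right answer.
    independent = rng.random((trials, n_qubits)) < 4 / 26
    print(np.mean(independent.sum(axis=1) > n_qubits // 2))   # ~0.0

    # Correlated burst (cosmic-ray style): 24 of 26 flip together, so the "committee" itself is wrong.
    burst = np.zeros((trials, n_qubits), dtype=bool)
    burst[:, :24] = True
    print(np.mean(burst.sum(axis=1) > n_qubits // 2))         # 1.0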

Cosmic ray interference is nothing new. As Ars noted, they can also interact with traditional CPUs by messing with the electrical charges they rely on to complete their logic operations. However, the unique and still-developing structure of quantum processors makes them far more prone to such interference, with Google's research indicating that a cosmic ray-induced error happens as often as every 10 seconds. This means the hours-long processing jobs most quantum CPUs are being tasked with could include hundreds, if not thousands of errors littered throughout their results.
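Taking that reported rate at face value, a quick tally of how many strikes an hours-long job would absorb:

    # One cosmic-ray-induced error roughly every 10 seconds (the rate reported above).
    seconds_per_event = 10
    for hours in (1, 3, 8):
        print(f"{hours}-hour job: ~{hours * 3600 // seconds_per_event} events")
    # 1-hour job: ~360, 3-hour job: ~1080, 8-hour job: ~2880 -- hundreds to thousands of disruptions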

Making matters worse is the fact that the processor these researchers used for their testing was rather small. As processing demands increase, so too must the size of the quantum processor. But the larger the processor, the more surface area there is to potentially suffer a cosmic ray collision. It appears the threat of forced errors is only going to become more dire as quantum CPUs continue making their way towards practical applications.

Unfortunately, there is no practical way to reliably block these problematic, intergalactic travelers. They are moving at almost the speed of light, after all. However, as pointed out by Ars Technica, some clever workarounds have already been developed to help devices like astronomical imaging equipment cope with cosmic ray interference. While the paper does not specifically explore the viability of these potential solutions, they do seem to indicate the problem of cosmic ray interference is a surmountable one.


To build the quantum internet, UChicago engineer teaches atoms how to remember – UChicago News

When the quantum internet arrives, researchers predict it will shift the computing landscape on a scale unseen in decades. In their estimation, it will make hacking a thing of the past. It will secure global power grids and voting systems. It will enable nearly limitless computing power and allow users to securely send information across vast distances.

But for Tian Zhong, assistant professor at the Pritzker School of Molecular Engineering (PME) at the University of Chicago, the most tantalizing benefits of the quantum internet have yet to be imagined.

Zhong is a quantum engineer working to create this new global network. In his mind, the full impact of the quantum internet may only be realized after it's been built. To understand his work and why the United States is spending $625 million on the new technology, it helps to consider the science behind it: quantum mechanics.

Quantum mechanics is a theory created to explain fundamental properties of matter, particularly on the subatomic scale. Its roots trace back to the late 19th and early 20th century, when scientists tried to explain the unusual nature of light, which behaves as both a wave and a particle. In the hundred years since then, physicists have learned a great deal, particularly concerning the strange behavior of subatomic particles.

They've learned, for example, that some subatomic particles have the ability to be in two states at the same time, a principle called superposition. Another such principle is entanglement, which is the ability of two particles to communicate instantaneously despite being separated by hundreds of miles.

Over time, scientists have found ways to manipulate those principles, entangling particles at will or controlling an electron's spin. That new control allows researchers to encode, send, and process information using subatomic particles, laying the foundations of quantum computing and the quantum internet.

At the moment, both technologies are still hampered by certain physical limitations (quantum computers, for example, need to be kept in giant sub-zero freezers), but researchers like Zhong are optimistic those limitations will be resolved in the near future.

"We're at a juncture where this is no longer science fiction," Zhong said. "More and more, it's looking like this technology will emerge from laboratories any day, ready to be adopted by society."

Zhong's research focuses on the hardware needed to make the quantum internet a reality, things like quantum chips that encrypt and decrypt quantum information, and quantum repeaters that relay information across network lines. To create that hardware, Zhong and his team work on the subatomic scale, using individual atoms to hold information and single photons to transmit it through optic cables.

Zhong's current work centers on finding ways to fight against quantum decoherence, which is when information stored on a quantum system degrades to the point that it's no longer retrievable. Decoherence is an especially difficult obstacle to overcome because quantum states are extremely sensitive, and any outside force, be it heat, light, radiation, or vibration, can easily destroy them.

Most researchers address decoherence by keeping quantum computers at a temperature near absolute zero. But the instant any quantum state is transmitted outside the freezer, say on a network line, it begins to break down within a few microseconds, severely limiting the potential for expansive interconnectivity.


Smart Internet Lab will deliver Quantum Data Centre of the Future – ITP.net

The University of Bristol's pioneering Smart Internet Lab will work with industry partners to develop the first blueprint for a quantum data centre, as part of UKRI's £170 million Commercialising Quantum Technologies Challenge.

Quantum technologies, in the form of quantum computing and communications, promise to provide solutions to some of the world's most challenging problems. However, to date, very little has been understood from a systems perspective about how to integrate them with existing data centres.

The Quantum Data Centre of the Future project will commence in early 2022, bringing experts in classical data centres and networking together with experts in quantum computing and quantum communications, to develop the first blueprint for a quantum data centre.

The project will leverage the significant research strengths of the University of Bristol's High Performance Networks Group in classical data centres, quantum Internet and quantum networking.

Professor Reza Nejabati, Head of the High Performance Networks Research Group in the Smart Internet Lab, said: "This is a truly exciting initiative. Adapting quantum computing and network systems to work in a data centre setting will require significant acts of invention and creativity.

"This will bring a more practical light to the field of quantum technologies so they can benefit businesses and support the emergence of new types of quantum computing algorithms and applications that will benefit from them far into the future."

Professor Dimitra Simeonidou, Director of the Smart Internet Lab, added: "In collaboration with the project partners, we aim to design, develop and demonstrate a solution for integrating a quantum computer in a classical data centre as well as providing remote quantum secure access to quantum computers at scale and in a data centre setting.

"Quantum computers and communications systems are often described in isolation, but this misses the possibility for near-term value to be created with quantum/classical hybrid systems. In this project, we will be investigating system-level solutions for optical metro quantum networks supporting remote access to quantum computing.

"We are really excited to work with leading industrial and academic partners to connect and integrate our city-scale test-bed to a remote quantum-accelerated data centre and demonstrate its use for future industrial applications."
