Daily Archives: January 21, 2021

Quantum Computing Market Breaking New Ground and Touching New Levels in the Upcoming Year, by D-Wave Systems, Google, IBM, Intel, Microsoft KSU | The…

Posted: January 21, 2021 at 3:32 pm

Quantum computing exploits the ability of subatomic particles to exist in more than one state at any given time. Because of this behavior, certain computations can be performed faster and with lower power consumption than on traditional computers. Traditional computers encode information in bits that take the value 1 or 0; these values act as on/off switches that ultimately drive computer functions. Quantum computing instead uses quantum bits, or qubits, which can represent more information than a single 1 or 0. It rests on two key principles of quantum physics: superposition and entanglement.
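As a toy illustration of those two principles (a sketch of my own in Python/NumPy, not part of the report), a single qubit can be modeled as a two-element complex state vector and an entangled pair as a four-element one:

```python
import numpy as np

# Computational basis states of a single qubit: |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Superposition: the |+> state is an equal mix of |0> and |1>; a measurement
# yields 0 or 1, each with probability |amplitude|^2 = 0.5.
plus = (ket0 + ket1) / np.sqrt(2)
print("measurement probabilities:", np.abs(plus) ** 2)

# Entanglement: the Bell state (|00> + |11>) / sqrt(2) cannot be factored into
# two independent single-qubit states; measuring one qubit fixes the other.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
print("Bell state amplitudes:", bell.round(3))
```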

The global Quantum Computing market is expected to expand at a significant CAGR of +24% during the forecast period (2021-2027).
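For readers who want a feel for what a +24% CAGR implies over that window, here is a small arithmetic sketch (my own, using a normalized starting value rather than any figure from the report):

```python
# Illustrative only: compound a normalized 2021 market size at 24% per year
# over the 2021-2027 forecast window. The base value is notional, not a
# figure disclosed in the report.
base_value = 1.0           # normalized market size in 2021
cagr = 0.24                # +24% compound annual growth rate
years = 2027 - 2021        # six compounding steps across the window

growth_multiple = base_value * (1 + cagr) ** years
print(f"Growth multiple after {years} years at {cagr:.0%} CAGR: {growth_multiple:.2f}x")
# -> roughly 3.6x the 2021 market size by 2027
```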

The report, titled Global Quantum Computing Market, defines and briefs readers on the market's products, applications, and specifications. The research lists key companies operating in the global market and highlights the key changing trends those companies have adopted to maintain their dominance. Using SWOT analysis and Porter's Five Forces analysis, the report covers the strengths, weaknesses, opportunities, and threats of the key companies. All leading players in this global market are profiled with details such as product types, business overview, sales, manufacturing base, competitors, applications, and specifications.

Get Sample Copy (Including FULL TOC, Graphs and Tables) of this report @:

https://www.a2zmarketresearch.com/sample?reportId=705

Note: In order to provide a more accurate market forecast, all our reports will be updated before delivery to account for the impact of COVID-19.

Top Key Vendors of this Market are:

D-Wave Systems, Google, IBM, Intel, Microsoft, 1QB Information Technologies, Anyon Systems, Cambridge Quantum Computing, ID Quantique, IonQ, QbitLogic, QC Ware, Quantum Circuits, Qubitekk, QxBranch, Rigetti Computing.

Various factors responsible for the market's growth trajectory are studied at length in the report. In addition, the report lists the restraints that pose a threat to the global Quantum Computing market. It also gauges the bargaining power of suppliers and buyers, the threat from new entrants and product substitutes, and the degree of competition prevailing in the market. The influence of the latest government guidelines is also analyzed in detail. Finally, the report studies the Quantum Computing market's trajectory over the forecast period.

The report provides insights on the following pointers:

Market Penetration: Comprehensive information on the product portfolios of the top players in the Quantum Computing market.

Product Development/Innovation: Detailed insights on the upcoming technologies, R&D activities, and product launches in the market.

Competitive Assessment: In-depth assessment of the market strategies, geographic and business segments of the leading players in the market.

Market Development: Comprehensive information about emerging markets. This report analyzes the market for various segments across geographies.

Market Diversification: Exhaustive information about new products, untapped geographies, recent developments, and investments in the Quantum Computing market.

Get up to 30% Discount on this Premium Report @:

https://www.a2zmarketresearch.com/discount?reportId=705

Regions Covered in the Global Quantum Computing Market Report 2021: The Middle East and Africa (GCC Countries and Egypt); North America (the United States, Mexico, and Canada); South America (Brazil, etc.); Europe (Turkey, Germany, Russia, UK, Italy, France, etc.); Asia-Pacific (Vietnam, China, Malaysia, Japan, Philippines, Korea, Thailand, India, Indonesia, and Australia).

The cost analysis of the Global Quantum Computing Market has been performed while keeping in view manufacturing expenses, labor costs, raw materials and their market concentration rates, suppliers, and price trends. Other factors such as the supply chain, downstream buyers, and sourcing strategy have been assessed to provide a complete and in-depth view of the market. Buyers of the report will also be exposed to a study on market positioning, with factors such as target client, brand strategy, and price strategy taken into consideration.

Reasons for buying this report:

Table of Contents

Global Quantum Computing Market Research Report 2021-2027

Chapter 1 Quantum Computing Market Overview

Chapter 2 Global Economic Impact on Industry

Chapter 3 Global Market Competition by Manufacturers

Chapter 4 Global Production, Revenue (Value) by Region

Chapter 5 Global Supply (Production), Consumption, Export, Import by Regions

Chapter 6 Global Production, Revenue (Value), Price Trend by Type

Chapter 7 Global Market Analysis by Application

Chapter 8 Manufacturing Cost Analysis

Chapter 9 Industrial Chain, Sourcing Strategy and Downstream Buyers

Chapter 10 Marketing Strategy Analysis, Distributors/Traders

Chapter 11 Market Effect Factors Analysis

Chapter 12 Global Quantum Computing Market Forecast

Buy Exclusive Report @:

https://www.a2zmarketresearch.com/buy?reportId=705

If you have any special requirements, please let us know and we will offer you the report as you want.

About A2Z Market Research:

The A2Z Market Research library provides syndication reports from market researchers around the world. Ready-to-buy syndication Market research studies will help you find the most relevant business intelligence.

Our research analysts provide business insights and market research reports for large and small businesses.

The company helps clients build business policies and grow in their market area. A2Z Market Research offers not only industry reports dealing with telecommunications, healthcare, pharmaceuticals, financial services, energy, technology, real estate, logistics, F&B, media, etc., but also company data, country profiles, trends, and information and analysis on the sector of your interest.

Contact Us:

Roger Smith

1887 WHITNEY MESA DR HENDERSON, NV 89014

sales@a2zmarketresearch.com

+1 775 237 4147

Visit link:

Quantum Computing Market Breaking New Ground and Touching New Levels in the Upcoming Year, by D-Wave Systems, Google, IBM, Intel, Microsoft KSU | The...

Posted in Quantum Computing

Mind the (skills) gap: Cybersecurity talent pool must expand to take advantage of quantum computing opportunities – The Daily Swig

Posted: at 3:32 pm

Experts at the CES 2021 conference stress importance of security education

The second age of quantum computing is poised to bring a wealth of new opportunities to the cybersecurity industry, but in order to take full advantage of these benefits, the skills gap must be closed.

This was the takeaway of a discussion between two cybersecurity experts at the CES 2021 virtual conference last week.

Pete Totrorici, director of Joint Information Warfare at the Department of Defense (DoD) Joint Artificial Intelligence (AI) Center, joined Vikram Sharma, CEO of QuintessenceLabs, during a talk titled "AI and quantum cyber disruption".

Quantum computing is in its second age, according to Sharma, meaning that the cybersecurity industry will soon start to witness the improvements in encryption, AI, and other areas that have long been promised by the technology.

BACKGROUND Quantum leap forward in cryptography could make niche technology mainstream

"Quantum-era cybersecurity will wield the power to detect and deflect quantum-era cyber-attacks before they cause harm," a report from IBM reads.

"It is the technology of our time, indeed," commented Sharma, who is based in Canberra, Australia.

QuintessenceLabs is looking at the application of advanced quantum technologies within the cybersecurity sphere, says Sharma, in particular in the realm of data protection.

Governments and large organizations have also invested in the quantum space in recent years, with the US, UK, and India all providing funding for research.

The Joint AI Center was founded in 2018 and was launched to drive the Department of Defense's adoption of artificial intelligence, said Totrorici.

A subdivision of the US Armed Forces, the center is responsible for exploring the use of AI and AI-enhanced communication for use in real-world combat situations.

"Specifically, we're trying to identify how we employ AI solutions that will have a mission impact," he said.

"Across the department, our day-to-day comprises everything from development strategy, policy, product development, industry engagement, and other outreach activities, but if I need to identify what I think is my most significant challenge today, it's understanding the department's varied needs."

CES took place virtually in 2021 due to the coronavirus pandemic

In order to meet these needs, Totrorici said that relationships between the center, academia, industry, and government need to be established.

"There was a time when the DoD would go it alone, [however] those days are long gone."

"If we're going to solve problems like AI employment or quantum development, [it] is going to require partnerships," he said.

Totrorici and Sharma both agreed that while the future is certainly in quantum computing, the ever-widening cyber skills gap needs to be addressed to take advantage of its potential.

Indeed, these partnerships cannot be formed if there aren't enough experts in the field.

Totrorici said: "Forefront in the mind of the DoD nowadays is: how do we cultivate and retain talent?

"I still think the United States does a great job of growing and building talent. Now the question becomes: will we retain that talent, how do we leverage that time going forward, and where are we building it?"

YOU MAY ALSO LIKE Quantum encryption the devil is in the implementation

The (ISC)2 2020 Workforce Study (PDF) found that the current cybersecurity industry needs to grow by 89% in order to effectively protect against cyber threats.

Of the companies surveyed, the study also revealed that 64% currently have some shortage of dedicated cybersecurity staff.

"Here in Australia we've recently established what's called the Sydney Quantum Academy, and that is an overarching group that sits across four leading institutions that are doing some cutting-edge work in quantum in the country," said Sharma.

"One of the aims of that academy is to produce quantum-skilled folks broadly, but also looking specifically at the quantum cybersecurity area.

"So certainly, some small initiatives [have] kicked off, but I think there's a big gap there that will need to be filled as we move forward."

READ MORE Infosec pro Vandana Verma on improving diversity and helping to grow the Indian security community

Here is the original post:

Mind the (skills) gap: Cybersecurity talent pool must expand to take advantage of quantum computing opportunities - The Daily Swig

Posted in Quantum Computing

Securing the DNS in a Post-Quantum World: New DNSSEC Algorithms on the Horizon – CircleID

Posted: at 3:32 pm

This is the fourth in a multi-part series on cryptography and the Domain Name System (DNS).

One of the "key" questions cryptographers have been asking for the past decade or more is what to do about the potential future development of a large-scale quantum computer.

If theory holds, a quantum computer could break established public-key algorithms including RSA and elliptic curve cryptography (ECC), building on Peter Shor's groundbreaking result from 1994.

This prospect has motivated research into new so-called "post-quantum" algorithms that are less vulnerable to quantum computing advances. These algorithms, once standardized, may well be added into the Domain Name System Security Extensions (DNSSEC) thus also adding another dimension to a cryptographer's perspective on the DNS.

(Caveat: Once again, the concepts I'm discussing in this post are topics we're studying in our long-term research program as we evaluate potential future applications of technology. They do not necessarily represent Verisign's plans or position on possible new products or services.)

The National Institute of Standards and Technology (NIST) started a Post-Quantum Cryptography project in 2016 to "specify one or more additional unclassified, publicly disclosed digital signature, public-key encryption, and key-establishment algorithms that are capable of protecting sensitive government information well into the foreseeable future, including after the advent of quantum computers."

Security protocols that NIST is targeting for these algorithms, according to its 2019 status report (Section 2.2.1), include: "Transport Layer Security (TLS), Secure Shell (SSH), Internet Key Exchange (IKE), Internet Protocol Security (IPsec), and Domain Name System Security Extensions (DNSSEC)."

The project is now in its third round, with seven finalists, including three digital signature algorithms, and eight alternates.

NIST's project timeline anticipates that the draft standards for the new post-quantum algorithms will be available between 2022 and 2024.

It will likely take several additional years for standards bodies such as the Internet Engineering Task Force (IETF) to incorporate the new algorithms into security protocols. Broad deployments of the upgraded protocols will likely take several years more.

Post-quantum algorithms can therefore be considered a long-term issue, not a near-term one. However, as with other long-term research, it's appropriate to draw attention to factors that need to be taken into account well ahead of time.

The three candidate digital signature algorithms in NIST's third round have one common characteristic: all of them have a key size or signature size (or both) that is much larger than for current algorithms.

Key and signature sizes are important operational considerations for DNSSEC because most of the DNS traffic exchanged with authoritative data servers is sent and received via the User Datagram Protocol (UDP), which has a limited response size.

Response size concerns were evident during the expansion of the root zone signing key (ZSK) from 1024-bit to 2048-bit RSA in 2016, and in the rollover of the root key signing key (KSK) in 2018. In the latter case, although the signature and key sizes didn't change, total response size was still an issue because responses during the rollover sometimes carried as many as four keys rather than the usual two.

Thanks to careful design and implementation, response sizes during these transitions generally stayed within typical UDP limits. Equally important, response sizes also appeared to have stayed within the Maximum Transmission Unit (MTU) of most networks involved, thereby also avoiding the risk of packet fragmentation. (You can check how well your network handles various DNSSEC response sizes with this tool developed by Verisign Labs.)
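For a quick local check of the same kind (my own sketch using the dnspython library, not the Verisign Labs tool referenced above), one can send a query with the DO bit set and report the wire-format size of whatever comes back; the query name and server address below are only example choices, and dnspython must be installed first (`pip install dnspython`):

```python
import dns.flags
import dns.message
import dns.query
import dns.rdatatype

# Example query: ask a root server for an A record with the "DNSSEC OK" (DO)
# bit set and a 4096-byte EDNS0 buffer, the way a validating resolver would.
qname = "example.com."        # any name of interest
server = "199.7.83.42"        # l.root-servers.net; any resolver or authoritative server works

query = dns.message.make_query(qname, dns.rdatatype.A,
                               want_dnssec=True, use_edns=0, payload=4096)
response = dns.query.udp(query, server, timeout=5)

# The wire-format length is what matters for UDP limits and MTU/fragmentation.
print("response size (bytes):", len(response.to_wire()))
print("truncated (TC bit set):", bool(response.flags & dns.flags.TC))
```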

The larger sizes associated with certain post-quantum algorithms do not appear to be a significant issue either for TLS, according to one benchmarking study, or for public-key infrastructures, according to another report. However, a recently published study of post-quantum algorithms and DNSSEC observes that "DNSSEC is particularly challenging to transition" to the new algorithms.

Verisign Labs offers the following observations about DNSSEC-related queries that may help researchers to model DNSSEC impact:

A typical resolver that implements both DNSSEC validation and qname minimization will send a combination of queries to Verisign's root and top-level domain (TLD) servers.

Because the resolver is a validating resolver, these queries will all have the "DNSSEC OK" bit set, indicating that the resolver wants the DNSSEC signatures on the records.

The content of typical responses by Verisign's root and TLD servers to these queries is given in Table 1 below. (In the table, <SLD>.<TLD> are the final two labels of a domain name of interest, including the TLD and the second-level domain (SLD); record types involved include A, Name Server (NS), and DNSKEY.)

For an A or NS query, the typical response, when the domain of interest exists, includes a referral to another name server. If the domain supports DNSSEC, the response also includes a set of Delegation Signer (DS) records providing the hashes of each of the referred zone's KSKs, the next link in the DNSSEC trust chain. When the domain of interest doesn't exist, the response includes one or more Next Secure (NSEC) or Next Secure 3 (NSEC3) records.

Researchers can estimate the effect of post-quantum algorithms on response size by replacing the sizes of the various RSA keys and signatures with those for their post-quantum counterparts. As discussed above, it is important to keep in mind that the number of keys returned may be larger during key rollovers.

Most of the queries from qname-minimizing, validating resolvers to the root and TLD name servers will be for A or NS records (the choice depends on the implementation of qname minimization, and has recently trended toward A). The signature size for a post-quantum algorithm, which affects all DNSSEC-related responses, will therefore generally have a much larger impact on average response size than will the key size, which affects only the DNSKEY responses.
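As a rough illustration of that kind of estimate (my own sketch, not from the original post), approximate key and signature sizes can be plugged into a crude response-size model; the byte counts below, including the lattice-scheme figures, are placeholders chosen for illustration rather than measured values:

```python
# Back-of-envelope sketch: how much larger might typical DNSSEC responses get
# if RSA keys/signatures were replaced by a larger post-quantum scheme?
# All byte counts are approximate placeholders, not measurements.
SIZES = {
    # scheme: (public key bytes, signature bytes)
    "rsa-2048":   (260, 256),     # ~2048-bit RSA DNSKEY RDATA and RRSIG signature
    "pq-lattice": (1312, 2420),   # ballpark for a Dilithium-class candidate (placeholder)
}

def response_size(base, n_keys, n_sigs, scheme):
    """Crude model: fixed records/overhead plus keys and signatures."""
    key, sig = SIZES[scheme]
    return base + n_keys * key + n_sigs * sig

for scheme in SIZES:
    # Referral: DS RRset signed once, no DNSKEYs in the answer.
    referral = response_size(base=200, n_keys=0, n_sigs=1, scheme=scheme)
    # DNSKEY response: two keys (KSK + ZSK) plus one signature; key rollovers
    # can roughly double the number of keys carried.
    dnskey = response_size(base=100, n_keys=2, n_sigs=1, scheme=scheme)
    print(f"{scheme:10s}  referral ~{referral:5d} B   DNSKEY ~{dnskey:5d} B")
```

Even this toy model shows why signature size dominates average growth: referrals, which make up most root and TLD traffic, carry signatures but no keys.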

Post-quantum algorithms are among the newest developments in cryptography. They add another dimension to a cryptographer's perspective on the DNS because of the possibility that these algorithms, or other variants, may be added to DNSSEC in the long term.

In my next post, I'll make the case for why the oldest post-quantum algorithm, hash-based signatures, could be a particularly good match for DNSSEC. I'll also share the results of some research at Verisign Labs into how the large signature sizes of hash-based signatures could potentially be overcome.

Read the previous posts in this six-part blog series:

See original here:

Securing the DNS in a Post-Quantum World: New DNSSEC Algorithms on the Horizon - CircleID

Posted in Quantum Computing

With bioengineered food labels showing up in stores, here’s what you should know about GMOs – Genetic Literacy Project

Posted: at 3:31 pm

While "genetic engineering" is the term typically used by scientists, you will start seeing the "bioengineered" label on some of the GMO foods we eat in the United States.

Humans have used traditional ways to modify crops and animals to suit their needs and tastes for more than 10,000 years. Cross-breeding, selective breeding and mutation breeding are examples of traditional ways to make these changes. These breeding methods often involve mixing all the genes from two different sources. They are used to create common crops like modern corn varieties and seedless watermelon.

Modern technology now allows scientists to use genetic engineering to take just a specific beneficial gene, like insect resistance or drought tolerance, and transfer it into a plant. The reasons for genetic modification today are similar to what they were thousands of years ago: higher crop yields, less crop loss, longer storage life, better appearance, better nutrition, or some combination of these traits.

Since GMO foods were introduced in the 1990s, research has shown that they are just as safe as non-GMO foods. Since then, the U.S. Food and Drug Administration (FDA), U.S. Environmental Protection Agency (EPA), and U.S. Department of Agriculture (USDA) have worked together to ensure that crops produced through genetic engineering are safe for people, animals, and the environment.


Link:
With bioengineered food labels showing up in stores, here's what you should know about GMOs - Genetic Literacy Project

Posted in Genetic Engineering

Livestock Producers on Level Playing Field Thanks to MOU Between USDA and FDA – Pork Magazine

Posted: at 3:31 pm

A Memorandum of Understanding (MOU) has been finalized regarding regulation of certain animals developed using genetic engineering. USDA announced the MOU with the Food and Drug Administration that outlines responsibilities regarding genetically engineered animals that are intended for agricultural purposes such as human food, fiber and labor.

This MOU complements USDA's issuance of an Advance Notice of Proposed Rulemaking (ANPR) on the Movement of Animals Modified or Developed by Genetic Engineering on December 28, 2020.

"Today's Memorandum of Understanding clears a path to bring our regulatory framework into the 21st century, putting American producers on a level playing field with their competitors around the world. In the past, regulations stifled innovation, causing American businesses to play catch-up and cede market share," said U.S. Secretary of Agriculture Sonny Perdue in a release. "America has the safest and most affordable food supply in the entire world thanks to the innovation of our farmers, ranchers and producers. Establishing a new, transparent, risk- and science-based regulatory framework would ensure this continues to be the case."

"The terms of the MOU support USDA's ANPR outlining a contemplated regulatory framework that would apply to certain animals (cattle, sheep, goats, swine, horses, mules or other equines, catfish, and poultry) developed using genetic engineering intended for agricultural purposes," USDA explains. "Under this framework, USDA would safeguard animal and human health by overseeing pre-market reviews through post-market food safety monitoring for certain farm animals modified or developed using genetic engineering that are intended for human food."

The National Pork Producers Council (NPPC) applauded the MOU signed between the USDA and the FDA, giving USDA primary regulatory jurisdiction over the development of gene-edited livestock.

"NPPC has been calling for this decision for more than three years to ensure that U.S. agriculture maintains its competitive edge globally. We look forward to working with the Biden administration to implement a technology that has the potential to improve animal health, further reduce agriculture's environmental footprint and improve production efficiency," NPPC said in a statement.

The MOU also allows for the transition of portions of FDA's pre-existing animal biotechnology regulatory oversight to USDA. "USDA would continue to coordinate closely with FDA to fulfill oversight responsibilities and provide the appropriate regulatory environment, ensuring the safety of products derived from new technologies and fostering innovation at the same time," the release said.

FDA would continue its review of intentional genomic alterations intended for any purpose other than agricultural use, such as biopharma and non-heritable genomic alteration, and the regulation of dairy products, table and shell eggs, certain meat products and animal feed derived from animals developed using genetic engineering.

Read More:

FDA Stalls U.S. Gene-Edited Livestock Efforts

USDA Oversight of Gene-Edited Livestock: A Seismic Shift for Agriculture

Follow this link:
Livestock Producers on Level Playing Field Thanks to MOU Between USDA and FDA - Pork Magazine

Posted in Genetic Engineering

Foodborne diseases kill thousands of Americans each year. Tracing food with genetically engineered spores could help. – The Counter

Posted: at 3:31 pm

Bhuyan said that his company has plans to win over both farmers and consumers, and that many industries have already signed on to use the technology.

"The traction we have gained spans multiple countries and supply chains ranging from meat, dairy, cannabis, coffee, leafy greens, and a range of non-organic items," he said. It's unclear which companies Aanika is currently working with, as only a handful of partnerships, including a collaboration with the diamond company De Beers Group, have been disclosed publicly.

A normal food recall can affect a dozen farms, some of which were not actually responsible for the outbreak. Rather than recalling produce from all those growers, Aanika's spores could be used to pinpoint the outbreak's source, which would have the additional benefit of reducing the number of claims an insurance company would have to pay out. This year, Aanika will work directly with agricultural insurers, offering up to $10 million in guarantees to protect insurers against loss in case claims submitted by farmers are not reduced as a result of using Aanika's spores. The move was made public in a blog post that the company published on Medium in December, and it could offer enough of a financial incentive to get more large-scale traction with farmers.

As for consumer concerns, Bhuyan said that, even if spores did end up on your dinner plate, the average person poops out thousands of bacterial spores every day, a claim supported by a recent study, "so we don't think this will be a problem."

Visit link:
Foodborne diseases kill thousands of Americans each year. Tracing food with genetically engineered spores could help. - The Counter

Posted in Genetic Engineering

Advarra Announces New Gene Therapy Ready Site Network – PRNewswire

Posted: at 3:31 pm

This will address an accelerating gene therapy market that is expected to grow globally by 16.6 percent from 2020 to 2027.

"The Gene Therapy Ready network demonstrates our commitment to empowering sites and supporting our industry partners as they pursue advanced genetic engineering to find cures for the world's most pressing health conditions," said Scott Uebele, President and Chief Research Services Officer at Advarra. "Our commitment to efficient study activation is unwavering, and this is another example of how Advarra bringslife sciences companies,CROs, research sites, investigators,andacademiatogether at the intersection of safety,compliance,technology, and collaboration."

All Gene Therapy Ready sites stand ready to help industry sponsors conduct clinical trials that advance cures, develop vaccines, and find treatments for rare disease. By placing clinical trials with a Gene Therapy Ready site, research sponsors can save significant time during study startup.

"This innovative network is truly the first of its kind. We constantly look for ways to support our sponsors in rapidly starting trials in a safe, compliant, and quality manner. With the Gene Therapy Ready network, we can improve study startup times by a month or more, potentially placing cures in the hands of patients faster," said James Riddle, Vice President of Research Services and Strategic Consulting at Advarra. "The Gene Therapy Ready site network charts a course to success by providing our sponsor clients with a clear choice for IBC review services."

About Advarra

Advarra advances the way clinical research is conducted: bringing life sciences companies, CROs, research sites, investigators, and academia together at the intersection of safety, technology, and collaboration. With trusted IRB and IBC review solutions, innovative technologies, experienced consultants, and deep-seated connections across the industry, Advarra provides integrated solutions that safeguard trial participants, empower clinical sites, ensure compliance, and optimize research performance. Advarra is advancing clinical trials to make them safer, smarter, and faster. For more information, visit advarra.com.

SOURCE Advarra


See more here:
Advarra Announces New Gene Therapy Ready Site Network - PRNewswire

Posted in Genetic Engineering

Designer baby book trilogy explores the moral dilemmas humans may soon create – Big Think

Posted: at 3:31 pm

Imagine it's 2045. You start hearing rumors from your well-heeled friends about a mysterious corporation based on an undisclosed island that's offering an unprecedented service: the ability to genetically design your baby.

The baby will have some of your genetics, and some genetics from a sperm or egg donor, selected by you. But the rest of your child's genetic profile will be engineered by science. These changes will make it impossible for your child to develop genetic diseases. They'll also allow you to customize your child for dozens of traits, including intelligence level, emotional disposition, sexual orientation, height, skin tone, hair color, and eye color, to name a few.

This raises unsettling philosophical questions for some customers. "When does my child stop being my child?" they ask the corporate representatives. These wary customers are reminded of how risky it is to reproduce the old-fashioned way. The Better Genetics Corporation's motto sums it up: "Only God plays dice; humans don't have to."

This is the world described in a new science-fiction series by Eugene Clark titled "Genetic Pressure", which explores the moral and scientific implications of a future in which designer babies are becoming a major industry. The first book begins with the story of Rachel, a renowned horse breeder who befriends a billionaire client, and soon gets the funding to visit the tropical island on which the Better Genetics Corporation is headquartered.

There, corporate executives walk her through the process of designing a baby, an experience that feels like an uncanny mix between visiting a doctor and designing a luxury car. The series is told from multiple perspectives, serving as a deep dive into a complex moral web that today's scientists may already be weaving.

Case in point: In 2018, Chinese scientist He Jiankui announced that he had helped create the world's first genetically engineered babies. Using the gene-editing tool CRISPR on embryos, He Jiankui modified a gene called CCR5, which enables HIV to enter and infect immune system cells. His goal was to engineer children that were immune to the virus.

It's unclear whether he succeeded. But what's certain is that the experiment shocked the international scientific community, which generally agreed that it's unethical to conduct gene-editing procedures on humans, given that scientists don't yet fully understand the consequences.

"This experiment is monstrous," Julian Savulescu, a professor of practical ethics at the University of Oxford, told The Guardian. "The embryos were healthy. No known diseases. Gene editing itself is experimental and is still associated with off-target mutations, capable of causing genetic problems early and later in life, including the development of cancer."

Importantly, He Jiankui wasn't treating a disease, but rather genetically engineering babies to prevent the future contraction of a virus. These kinds of changes are heritable, meaning the experiment could have major downstream effects on future generations. So, too, would a designer-baby industry, even if scientists can do it safely.

With major implications on inequality, discrimination, sexuality, and our conceptions of life, the introduction of designer babies would create a labyrinth of philosophical dilemmas that society is only beginning to explore.

One question the "Genetic Pressure" series explores: What would tribalism and discrimination look like in a world with designer babies? As designer babies grow up, they could be noticeably different from other people, potentially being smarter, more attractive, and healthier. This could breed resentment between the groups, as it does in the series.

"[Designer babies] slowly find that 'everyone else,' and even their own parents, becomes less and less tolerable," author Eugene Clark told Big Think. "Meanwhile, everyone else slowly feels threatened by the designer babies."

For example, one character in the series who was born a designer baby faces discrimination and harassment from "normal people": they call her "soulless" and say she was "made in a factory," a "consumer product."

Would such divisions emerge in the real world? The answer may depend on who's able to afford designer baby services. If it's only the ultra-wealthy, then it's easy to imagine how being a designer baby could be seen by society as a kind of hyper-privilege, which designer babies would have to reckon with.

Even if people from all socioeconomic backgrounds can someday afford designer babies, people born designer babies may struggle with tough existential questions: Can they ever take full credit for things they achieve, or were they born with an unfair advantage? To what extent should they spend their lives helping the less fortunate?

Sexuality presents another set of thorny questions. If a designer baby industry someday allows people to optimize humans for attractiveness, designer babies could grow up to find themselves surrounded by ultra-attractive people. That may not sound like a big problem.

But consider that, if designer babies someday become the standard way to have children, there'd necessarily be a years-long gap in which only some people are having designer babies. Meanwhile, the rest of society would be having children the old-fashioned way. So, in terms of attractiveness, society could see increasingly apparent disparities in physical appearances between the two groups. "Normal people" could begin to seem increasingly ugly.

But ultra-attractive people who were born designer babies could face problems, too. One could be the loss of body image.

When designer babies grow up in the "Genetic Pressure" series, men look like all the other men, and women look like all the other women. This homogeneity of physical appearance occurs because parents of designer babies start following trends, all choosing similar traits for their children: tall, athletic build, olive skin, etc.

Sure, facial traits remain relatively unique, but everyone's more or less equally attractive. And this causes strange changes to sexual preferences.

"In a society of sexual equals, they start looking for other differentiators," he said, noting that violet-colored eyes become a rare trait that genetically engineered humans find especially attractive in the series.

But what about sexual relationships between genetically engineered humans and "normal" people? In the "Genetic Pressure" series, many "normal" people want to have kids with (or at least have sex with) genetically engineered humans. But a minority of engineered humans oppose breeding with "normal" people, and this leads to an ideology that considers engineered humans to be racially supreme.

On a policy level, there are many open questions about how governments might legislate a world with designer babies. But it's not totally new territory, considering the West's dark history of eugenics experiments.

In the 20th century, the U.S. conducted multiple eugenics programs, including immigration restrictions based on genetic inferiority and forced sterilizations. In 1927, for example, the Supreme Court ruled that forcibly sterilizing the mentally handicapped didn't violate the Constitution. Supreme Court Justice Oliver Wendell Holmes wrote that "three generations of imbeciles are enough."

After the Holocaust, eugenics programs became increasingly taboo and regulated in the U.S. (though some states continued forced sterilizations into the 1970s). In recent years, some policymakers and scientists have expressed concerns about how gene-editing technologies could reanimate the eugenics nightmares of the 20th century.

Currently, the U.S. doesn't explicitly ban human germline genetic editing on the federal level, but a combination of laws effectively render it illegal to implant a genetically modified embryo. Part of the reason is that scientists still aren't sure of the unintended consequences of new gene-editing technologies.

But there are also concerns that these technologies could usher in a new era of eugenics. After all, the function of a designer baby industry, like the one in the "Genetic Pressure" series, wouldn't necessarily be limited to eliminating genetic diseases; it could also work to increase the occurrence of "desirable" traits.

If the industry did that, it'd effectively signal that the opposites of those traits are undesirable. As the International Bioethics Committee wrote, this would "jeopardize the inherent and therefore equal dignity of all human beings and renew eugenics, disguised as the fulfillment of the wish for a better, improved life."

"Genetic Pressure Volume I: Baby Steps" by Eugene Clark is available now.

Read the rest here:
Designer baby book trilogy explores the moral dilemmas humans may soon create - Big Think

Posted in Genetic Engineering

Risk assessment of GE plants in the EU: Taking a look at the ‘dark side of the moon’ EUbusiness.com | EU news, business and politics – EUbusiness

Posted: at 3:31 pm

21 January 2021, by Testbiotech; last modified 21 January 2021

Testbiotech has published a new report providing evidence that the European Food Safety Authority (EFSA) is intentionally keeping significant risks related to genetically engineered (GE) plants 'in the dark'.


While EFSA is aware that the data compiled by industry are insufficient to demonstrate the safety of the plants, it has nevertheless failed to take action to solve the problems. On the contrary, the authority has for years defended assumptions even if they are in contradiction to the facts. In addition, EFSA is intentionally trying to distract awareness away from the 'dark' sides of its risk assessment.

During the first 20 years of its existence, EFSA published more than 100 opinions on the risk assessment of GE crops, but was nevertheless unable to present sufficiently robust criteria and methods. The report published today reveals major gaps in risk assessment which can no longer be disputed. Moreover, the report also shows that specific areas of risk assessment are intentionally ignored.

"We need science more than ever to stop dangers such as climate change and pandemics. Science also has to be impartial, transparent and reliable when it comes to assessment of risky technologies and their profitable products. However, in the case of genetically engineered plants, the trade interests of industry are given priority when it comes to decision-making in the face of uncertainties," Christoph Then states for Testbiotech, an institute which is independent of the interests of biotech industry. Testbiotech has for more than ten years analysed the risks of genetically engineered organisms with a view to protecting health and the environment.

Testbiotech is accusing EFSA of a systematic failure to request sufficiently reliable data from industry. These problems concern, for example, field trials with genetically engineered herbicide-resistant plants that are sprayed with much lower rates of herbicide applications compared to current agricultural practice. Furthermore, the regions in which the field trials are carried out do not represent the bioclimatic conditions under which the GE plants are to be cultivated.

For the assessment of insecticidal Bt toxins produced in the plants, EFSA accepts experiments with toxins produced by bacteria. However, the toxins produced in the plants must be assumed to be much more toxic, since plant constituents can multiply their toxicity. Furthermore, most of the approved GE plants carry a combination of (several) Bt toxins and (several) herbicide resistances. Nevertheless, EFSA does not request any empirical data on the mixed toxicity or immunogenicity of the compounds present in the harvest.

At the same time, in regard to the potential spread of GE plants, EFSA makes assumptions that are outdated and therefore underestimates the actual risks. From a legal point of view, it also seems to be questionable that EFSA, in a self-assigned task, adopted a new guidance in 2015 allowing it to evade legally binding EU Commission standards in field trial assessments.

In conclusion, evidence has been provided to show that the genetic engineering of food plants has layers of complexity that go far beyond what can be assessed by current standards of risk assessment. The safety of the plants is claimed on the basis of approval processes that only consider the risks that are easiest to assess.

The Testbiotech analysis of the work of EFSA is also based on the outcomes of the RAGES (Risk Assessment of genetically engineered organisms in the EU and Switzerland) project. The RAGES project started in 2016 and ended in 2020; the outcomes were subsequently assessed by EFSA in June 2020.

Testbiotech is now urging the EU Commission to take action because the political responsibility for setting the standards in the risk assessment of GE organisms lies with the Commission.

What are the consequences of genetic engineering for humans and the environment? From a critical point of view, Testbiotech provides information and scientific expertise on the risks associated with these technologies, completely independent of the biotech industry.

View post:
Risk assessment of GE plants in the EU: Taking a look at the 'dark side of the moon' EUbusiness.com | EU news, business and politics - EUbusiness

Posted in Genetic Engineering

New computational method detects disrupted pathways in cancer – UB Now: News and views for UB faculty and staff – University at Buffalo Reporter

Posted: at 3:31 pm

Cancer is a notoriously complex disease, in part because it may be caused by mutations among hundreds or even thousands of genes. In addition, most cancers exhibit an extraordinary amount of variation among genetic mutations, even between patients with the same types of cancers.

Consequently, cancer researchers have chosen to study interactions among groups of genes in certain biological pathways that are disrupted.

When genes in certain pathways are frequently mutated or disrupted, that pathway may play a critical role in the initiation or development of cancer. But unraveling the molecular mechanisms underlying those disruptions is extremely complex.

Now, UB researchers have developed a new, statistically more powerful method called FDRnet that can more effectively detect key functional pathways in cancer using genomics data generated by next-generation sequencing technology.

Published in Nature Computational Science on Jan. 14, the new method has the potential to give biologists more precise data with which to zero in on therapeutic targets.

"Using the new method, we can find biological pathways in which genes are significantly mutated or disrupted," explains Yijun Sun, associate professor of bioinformatics in the Department of Microbiology and Immunology, Jacobs School of Medicine and Biomedical Sciences at UB and the corresponding author. "It addresses some key challenges in molecular pathway analysis in cancer studies. Once the tumor biologists obtain this information, they can use it to verify our findings, and from there develop new cancer treatments."

By overcoming the limitations of existing approaches, FDRnet can facilitate the detection of key functional pathways in cancer and other genetic diseases, he says.

When Sun and his co-authors tested FDRnet on simulation data and on breast cancer and B-cell lymphoma data, they found that FDRnet was able to detect which subnetworks or pathways are significantly perturbed in these cancers, potentially leading tumor biologists to identify new therapeutic targets.

Co-authors with Sun are Le Yang and Runpu Chen, both doctoral students in the Department of Computer Science and Engineering, School of Engineering and Applied Sciences, and Steven Goodison of the Department of Health Sciences Research at the Mayo Clinic.

The research was funded by the National Institutes of Health.

Read the rest here:
New computational method detects disrupted pathways in cancer - UB Now: News and views for UB faculty and staff - University at Buffalo Reporter

Posted in Genetic Engineering