6 Advantages and Disadvantages of Human Genetic Engineering

Human genetic engineering refers to the practice of adding new DNA to a person to give them traits they would not naturally have. The idea has drawn intense attention, making it one of the most hotly debated topics around the world. Here are its main advantages and disadvantages:

1. It can eliminate diseases. Though it may seem impossible now, this technology could take many diseases out of the equation. By detecting and removing harmful genes inherited from parents, the next generation would be healthier. Genetic mutations caused by environmental mutagens could also be corrected, keeping mutations under control and making the human body less susceptible to disease.

2. It helps the pharmaceutical industry to advance. Aside from paving the way for xenotransplantation, the process of transplanting living tissues or organs across species through biotechnology, genetic engineering also serves as an aid to genetics, enabling the pharmaceutical industry to develop high-grade products that can help fight health conditions.

3. It has the potential to increase the human life span. As human genetic engineering could make certain diseases a thing of the past, it can allow for a fuller and healthier life, not to mention a longer one. Some research suggests it could extend the human life span to anywhere between 100 and 150 years by slowing the aging process through alteration of the genome of a healthy individual. The technology could also pinpoint the desirable traits of one person and integrate them into the DNA of others.

1. It is surrounded by moral issues. The first question many people ask about genetic engineering is whether it is morally right. Many religious believers see the technology as playing God and expressly forbid it being performed on their children. Aside from the religious arguments, there are ethical objections: some opponents believe that diseases exist for a reason, and that while many conditions should be treated, eliminating illness entirely could leave us facing the problem of overpopulation.

2. It can limit genetic diversity. Diversity is vital to every species and to the ecosystem as a whole, and human genetic engineering could have a detrimental effect on people's genetic diversity.

3. It poses possibly irreversible effects and consequences. Even among scientists and researchers, genetic engineering is believed to carry irreversible side effects, especially where heritable genes are modified. The process involves the use of viral vectors to carry functional genes into the human body, and these vectors may themselves leave side effects. Moreover, placing functional genes in the genome still does not have fully predictable results: they can displace other important genes rather than the mutated ones, causing other health conditions to develop.

Genetic engineering is one of the most controversial topics of today, and keeping yourself informed about all its aspects can help you form a well-informed opinion on the matter.

Listening for the Public Voice – Slate Magazine

On Aug. 3, the scientific article in Nature finally gave us some facts about the much-hyped experiments that involved editing the genomes of human embryos at the Center for Embryonic Cell and Gene Therapy at Oregon Health and Science University. The story had broken in late July in Technology Review, spurring profuse hand-wringing and discussion. But until we saw the scientific paper, it was not clear what cells and methods were used, what genes were edited, or what the results were.

Now we know more, and while the paper demonstrates the possibility of genome editing of human embryos, it raises more questions than it answers. It is a useful demonstration of technical promise, though not an immediate prelude to the birth of a genome-edited baby. But the process by which the news emerged is also an ominous harbinger of the discombobulated way the debate about genetically altering human embryos is likely to unfold. We need open, vigorous debate that captures the many, often contradictory, moral views of Americans. Yet what we are likely to get is piecemeal, fragmented stories of breakthroughs with incomplete details, more sober publication in science journals that appear later, news commentary that lasts a few days, and very little systematic effort to think through what policy should be.

The science underlying this news cycle about human genome editing builds on a technique first developed six years ago by studying how bacteria alter DNA. CRISPR genome editing is the most recent, and most promising, way to introduce changes into DNA. It is faster, easier, and cheaper than previous methods and should eventually be more precise and controllable, which is why it may one day be available for clinical use in people.
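
To make the targeting idea behind that precision concrete, here is a minimal sketch in Python, with invented sequences: the Cas9 enzyme is steered by a roughly 20-nucleotide guide sequence and cuts only where that guide is immediately followed by a short PAM motif, which for the commonly used Cas9 is NGG (any base, then two Gs).

```python
import re

def find_cas9_sites(genome: str, guide: str) -> list[int]:
    """Return start positions where `guide` is immediately followed by an
    NGG PAM motif. A toy model of Cas9 target recognition: a real search
    would also scan the reverse strand and tolerate some mismatches."""
    pattern = guide + "[ACGT]GG"  # guide, then PAM: any base followed by GG
    return [m.start() for m in re.finditer(pattern, genome)]

# Invented sequences: a 20-nt guide that appears once, followed by "TGG"
genome = "TTACGGATCCGATTACAGGCTAAGGCCTAATGGAATC"
guide = "GATTACAGGCTAAGGCCTAA"
print(find_cas9_sites(genome, guide))  # [10]: one valid cut site
```

The PAM requirement is what keeps the enzyme from cutting everywhere the guide loosely matches, which is part of why CRISPR is more controllable than earlier editing methods.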

Though headlines about the study discussed designer babies, researchers prefer to emphasize how these techniques could help stop devastating genetic disorders. The Oregon experiments with human embryo cells corrected disease-associated DNA variants associated with heart muscle wasting that can cause heart failure. The treated embryos were alive for only a few days and were never intended to become a human baby. They were, however, human embryos deliberately created for the research.

U.S. guidance in this area is sparse and reflects the lack of societal consensus. In 1994, when the federal government was contemplating funding for research involving human embryos, the NIH Embryo Research Panel concluded that just this kind of experiment was ethically appropriate. But within hours of that report's release, then-President Bill Clinton announced he did not agree with creating embryos in order to do research on them.

The United States currently has just two policies relevant to genomic editing of human embryos. The first blocks federal funding: On April 28, 2015, Francis Collins, director of the National Institutes of Health, stated, "NIH will not fund any use of gene-editing technologies in human embryos." This is not embedded in statute or formal executive order, but members of Congress are fully aware of it and it is, in effect, a federal policy. NIH can (and does) fund genome editing of nonembryonic cells that might be used to treat cancer and for other possible therapeutic purposes, but not embryonic cells that would have their effect by creating humans with germline alterations.

Second, Congress has prohibited the Food and Drug Administration from reviewing research in which a human embryo is "intentionally created or modified to include a heritable genetic modification." This language comes from a rider to FDA's annual appropriations. Yet use of human embryonic cells for treatment should be subject to FDA regulation. So this language in effect means alterations of embryonic cells cannot be done in the United States if there is any intent to treat a human being, including implantation of an altered embryo into a woman's uterus. This will remain true so long as the rider is included in FDA's annual appropriations. The federal government thus has two relevant policies, both of which take federal agencies out of the action: One removes NIH funding, and the other precludes FDA oversight of genome-edited human embryos.

This leaves privately funded research that has no direct therapeutic purpose, such as with the Oregon experiments. The funding came from OHSU itself; South Korean Basic Research Funds; the municipal government of Shenzhen, China; and several private philanthropies (Chapman, Mathers, Helmsley, and Moxie). The research complies with recommendations to study the basic cellular processes of genome editing, keeping an eye on possible future clinical use but only so long as the work does not attempt to create a human pregnancy.

By coincidence, on the same day the Nature paper came out, the American Journal of Human Genetics also published a thoughtful 10-page position statement about germline genome editing from the American Society of Human Genetics, endorsed by many other genetic and reproductive medicine organizations from all over the world. It reviews the recommendations of the National Academies of Sciences, Engineering, and Medicine and of several international and U.S.-based organizations and commissions, and makes several recommendations of its own, concluding that it is inappropriate to perform germline gene editing that culminates in human pregnancy, but also that there is no reason to prohibit in vitro germline genome editing on human embryos and gametes, with appropriate oversight and consent from donors, to facilitate research on possible future clinical applications. Indeed, the statement argues for public funding. Finally, it urges research to proceed only with compelling medical rationale, strong oversight, and a transparent public process to solicit and incorporate stakeholder input.

So is there a problem here? It is truly wonderful that medical and scientific organizations have addressed genome editing. It is, however, far from sufficient. Reports and scientific consensus statements inform the policy debate but cannot resolve it. All of the reports on genome editing call for robust public debate, but the simple fact is that embryo research has proven highly divisive and resistant to consensus, and it is far from clear how to know when there is enough thoughtful deliberation to make policy choices. It's significant that none of the reports have emerged from a process that embodied such engagement. The Catholic Church, evangelical Christians, and concerned civic action groups who view embryo research as immoral are not likely to turn to the National Academies of Sciences, Engineering, and Medicine, the American Society of Human Genetics, the Hinxton Group, the Nuffield Council on Bioethics, or other scientific and medical organizations for their primary counsel. They may well listen to scientists, but religious and moral doctrine will get greater weight. Yet religious groups highly critical of embryo research are part of the political system, and whether we embrace this sort of genome editing in the United States is a political question, not a purely technical one.

Addressing the political questions will be extremely difficult. The U.S. government is poorly positioned to mediate the policy debate in a way that recognizes and addresses our complex moral pluralism. NIH and FDA are two of the most crucial agencies, but current policies remove them from line authority, and with good reason, given that engaging in this debate could actually endanger the agencies' other vital missions. International consensus about genome editing of human embryos remains no more likely than about embryo research in general: Some countries ban it while others actively promote and fund it. Private foundations don't have the mandate or incentive to mediate political debate about a controversial technology that rouses the politics of abortion. What private philanthropic organization would willingly take on such a thankless and politically perilous task, and what organization would be credible to the full range of constituencies?

So who can carry out the public engagement that everyone seems to agree we need? The likely answer is no one. This problem occurs with all debate about fraught scientific and technical innovations, but it's particularly acute when it touches on highly ossified abortion politics.

The debate about genomic editing of human embryos is unlikely to follow the recommendations for systematic forethought proposed by illustrious research bodies and reports. Given the reactions we've seen to human embryonic stem-cell research in the past two decades, we have ample reason for pessimism. Rather, debate is more likely to progress by reaction to events as researchers make news, often with the same lack of information we lived with for the last week of July, based on incomplete media accounts and quotes from disparate experts who lacked access to the details. Most of the debate will be quote-to-quote combat in the public media, leavened by news and analysis in scientific and medical journals, but surrounded by controversy in religious and political media. It is not what anyone designing a system would want. But the recommendations for robust public engagement and debate feel a bit vacuous and vague, aspirations untethered to a concrete framework.

Our divisive political system seems fated to make decisions about genomic editing of human embryos mainly amidst conflict, with experts dueling in the public media rather than through a thoughtful and well-informed debate conducted in a credible framework. As the furor over the Oregon experiments begins to dissipate, we await the event that will cause the next flare-up. And so it will continue, skipping from news cycle to news cycle.

History shows that sometimes technical advances settle the issues, at least for most people and in defined contexts. Furor about in vitro fertilization after Louise Brown, the first test tube baby, was born in 1978 gave way to acceptance as grateful parents gave birth to more and more healthy babies and welcomed them into their families. Initial revulsion at heart transplants gave way in the face of success. Anger about prospects for human embryonic stem-cell research might similarly attenuate if practical applications emerge.

Such historical examples show precisely why reflective deliberation remains essential, despite its unlikely success. Momentum tends to carry the research forward. Yet at times we should stop, learn more, and decide actively rather than passively whether to proceed, when, how, and with what outcomes in mind. In the case of genome editing of human embryos, however, it seems likely that technology will make the next move.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.

Scientist John Shine honoured for discovery that formed basis of genetic engineering – The Guardian

Prof John Shine in 2015. Shine discovered a sequence of DNA, now called the Shine-Dalgarno sequence, which allows cells to produce proteins, the basis for how all our cells operate. Photograph: Mal Fairclough/AAP

A man whose discovery was essential for the development of genetic engineering, and who used that technology to create several therapies now helping many thousands of people, says receiving a Queen's Birthday honour is "a great recognition from the community of the value of scientific research".

John Shine started his career by discovering a sequence of DNA, now called the Shine-Dalgarno sequence, as part of his PhD in the mid-1970s.

That sequence, though only a minuscule stretch of DNA, allows cells to produce proteins, the basis for how all our cells operate.

The discovery was essential for genetic engineering, spawned an entire biotech industry, and has now been used to produce therapies that have helped millions of people. In his own work, Shine used those techniques to clone human insulin and growth hormone for the first time.

Other scientists honoured on Monday included astronomer Ken Freeman, who founded the field of galactic archaeology, and ethnobotanist Beth Gott.

Shine, who was appointed a Companion of the Order of Australia today, told the Guardian he has been "unusually lucky" in his career to have been able to see discoveries he made in basic science translated into real therapies and commercialised.

"My PhD was really esoteric research," he said, referring to his discovery of the Shine-Dalgarno sequence. "But then I went over to San Francisco when gene cloning was just beginning. Right place, right time."

Shine had discovered how to clone the human gene that produces insulin, but to make that useful, it needed to be inserted into another organism that could be farmed, in this case bacteria, which would be grown in large vats.

"But if you want to put [the gene] into bacteria to make human insulin, you needed to trick the bacteria into thinking the gene was one of its own," he said.

It turned out Shine's earlier discovery of the Shine-Dalgarno sequence was essential for making that final leap. "Although the genetic code is the same in animals and bacteria, the regulatory code was very different. That's where the Shine-Dalgarno sequence comes in," Shine said.

He needed to find the bacteria's version of the Shine-Dalgarno sequence, and put that on either side of the human insulin gene, inside the bacteria.

"You needed to put the right Shine-Dalgarno sequence just in front, in the right place, of the insulin gene to make the bacteria produce human insulin."
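
The placement Shine describes can be sketched as simple string assembly. This is purely a toy illustration: AGGAGG is the textbook E. coli Shine-Dalgarno consensus and a spacing of roughly 5-9 bases before the ATG start codon is typical, but the spacer and the "insulin-like" fragment below are invented stand-ins, not the real construct.

```python
SHINE_DALGARNO = "AGGAGG"  # canonical E. coli Shine-Dalgarno consensus
SPACER = "ACAGCT"          # invented spacer; ~5-9 nt before ATG is typical
START_CODON = "ATG"

def make_construct(coding_sequence: str) -> str:
    """Place a ribosome binding site (the SD sequence) just upstream of a
    gene so bacterial ribosomes will recognise and translate it."""
    if not coding_sequence.startswith(START_CODON):
        raise ValueError("coding sequence must begin with the ATG start codon")
    return SHINE_DALGARNO + SPACER + coding_sequence

insulin_like = "ATGTTTGTGAACCAACACCTG"  # invented stand-in, not real insulin DNA
construct = make_construct(insulin_like)
print(construct)  # AGGAGGACAGCTATGTTTGTGAACCAACACCTG
```

The point of the sketch is the ordering: the bacterial ribosome binding signal must sit a few bases upstream of the human gene's start codon, which is exactly the "right place, just in front" that Shine describes.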

The fact that both problems were so closely related was mostly an accident, Shine says.

But throughout the rest of his career, Shine continued to be involved in the translation of his discoveries in esoteric science, all the way through to commercialisation.

Since stepping down in 2011 as head of the Garvan Institute, one of Australia's top medical research institutes, Shine has been the chair of the biotech giant CSL, one of Australasia's largest companies.

"So I've come full circle," Shine said. "CSL ... in more recent years, we're moving into genetic engineering and we've released several genetically modified proteins for haemophilia that are changing the lives of thousands of people around the world."

"I've been very lucky to be able to go through the basic research in my career, and now see a lot of these real health care products come to fruition and improve the lives of thousands of people. It's wonderful when you can have all the excitement of research but also the satisfaction of seeing something very good coming out of it."

It is not the first time Shine has been recognised publicly for his work. In 2010 he won the prime minister's prize for science, something his brother Rick Shine won in 2016.

"Apart from the obvious personal honour, it's a demonstration that the community does appreciate the benefits that come from research," Shine said. "The wellbeing of any society is intimately linked to good healthcare."

Another winner of the prime minister's science prize, astronomer Ken Freeman, was appointed a Companion of the Order of Australia for his founding contributions to the field of galactic archaeology and his teaching work at the Australian National University's Mount Stromlo Observatory.

Honours were also awarded to Royal Melbourne Hospital's Peter Grahame Colman (AM), for his work in endocrinology and diabetes research; aeronautical engineer Graeme Bird (AO), the former department head at the University of Sydney and a NASA consultant for 40 years; and Peter Klinken (AC), the chief scientist of Western Australia.

Ethnobotanist Beth Gott was made a Member of the Order of Australia for her work studying native plants and their use by Indigenous people. Gott founded Monash University's Aboriginal education garden in 1986 and has assembled databases of native plants in south-eastern Australia.

A paper she wrote in 2005 for the Journal of Biogeography found Indigenous fire-farming was crucial to the growth of plant tubers in south-eastern Australia, allowing them to make up half of the local people's diet.

Benefits of Human Genetic Engineering – Popular Issues

QUESTION: What are the benefits of human genetic engineering?

ANSWER:

The benefits of human genetic engineering can be found in the headlines nearly every day. With the successful cloning of mammals and the completion of the Human Genome Project, scientists all over the world are aggressively researching the many different facets of human genetic engineering. These continuing breakthroughs have allowed science to more deeply understand DNA and its role in medicine, pharmacology, reproductive technology, and countless other fields.

The most promising benefit of human genetic engineering is gene therapy. Gene therapy is the medical treatment of a disease by repairing or replacing defective genes or introducing therapeutic genes to fight the disease. Over the past ten years, certain autoimmune diseases and heart disease have been treated with gene therapy. Many diseases, such as Huntington's disease, ALS (Lou Gehrig's disease), and cystic fibrosis, are caused by a defective gene. The hope is that soon, through genetic engineering, a cure can be found for these diseases by inserting a corrected gene, modifying the defective gene, or even performing genetic surgery. Eventually the hope is to completely eliminate certain genetic diseases as well as treat non-genetic diseases with an appropriate gene therapy.

Currently, many pregnant women elect to have their fetuses screened for genetic defects. The results of these screenings can allow the parents and their physician to prepare for the arrival of a child who may have special needs before, during, and after delivery. One possible future benefit of human genetic engineering is that, with gene therapy, a fetus with a genetic defect could be treated and even cured before it is born. There is also current research into gene therapy for embryos before they are implanted into the mother through in-vitro fertilization.

Another benefit of genetic engineering is the creation of pharmaceutical products that are superior to their predecessors. These new pharmaceuticals are created by cloning certain genes. Currently on the market are bio-engineered insulin (which was previously obtained from pigs or cows) and human growth hormone (which in the past was obtained from cadavers), as well as bio-engineered hormones and blood-clotting factors. The hope for the future is to be able to create plants or fruits that contain a certain drug by manipulating their genes in the laboratory.

The field of human genetic engineering is growing and changing at a tremendous pace. With these changes come several benefits and risks. These benefits and risks must be weighed in light of their moral, spiritual, legal, and ethical perspectives. The potential power of human genetic engineering comes with great responsibility.

Farmington Medical Startup Targets Hearing Loss With New Drugs – Hartford Courant

Researchers have established a startup business that could restore hearing that people have lost to construction, traffic, jet planes and even rock concerts.

Frequency Therapeutics, based in Farmington and Woburn, Mass., is developing drugs that would activate certain cells, stimulating the regrowth of hair cells in the inner ear to counter "chronic noise-induced hearing loss."

"The evolution of our hearing was not meant to hang out on subway platforms or put on earbuds or go to U2 concerts," said David Lucchino, chief executive officer of Frequency Therapeutics. "There's a disconnect between the evolution of hearing and the industrialized world we live in."

The company is part of the University of Connecticut's business incubation program, which aims to provide support to new business startups, and has received $32 million in financing. It is researching technology to develop a gel that would be injected in the middle ear between the eardrum and oval window in a doctor's office procedure of about 30 minutes.

The intent is to recreate sensory hair cells, as many as 15,000 in each ear, that act as antennae in converting sound into signals understood by the brain. Or as Lucchino says, how to "biologically hot-wire the inner ear" to help it regenerate itself.

The human ear is incapable of spontaneously restoring lost or damaged hair cells, making hearing loss permanent.

Jeff Karp, who co-founded Frequency Therapeutics in 2015, said "druggable tissue regeneration" has a broad platform, with hearing loss a first application.

Activating the body's progenitor cells, known as descendants of stem cells that can form one or more kinds of cells in regenerating tissue, also could be applied to treating skin disorders or reversing vision problems. By activating the progenitor cells, Frequency Therapeutics can prod disease modification without the complexity of genetic engineering.

Birds and amphibians, such as frogs, regenerate their hearing, which is critical for their survival. That observation prompted researchers to ask if the same can be done for humans, said Karp, an associate professor of medicine at Harvard Medical School's teaching affiliate, Brigham and Women's Hospital in Boston. "We knew the biology existed."

Lucchino said researchers looking to establish companies that will draw investment money consider ways to have the "biggest impact helping people." Finding a successful treatment for hearing loss would benefit a large market: About 36 million people in the U.S. are affected, researchers say.

The World Health Organization estimates that 1.1 billion young people are at risk for hearing loss from recreational noise. About 360 million people worldwide, or 5 percent of the global population, have disabling hearing loss. Of that, 32 million are children.

Hearing loss caused by prolonged exposure to excessive noise can be due to heavy construction or military training, but common loud noises like subways, concerts and the use of headphones can have a significant impact on hearing.

Genetic causes, complications at birth, certain infectious diseases, chronic ear infections, the use of particular drugs, exposure to excessive noise and aging also are blamed for hearing loss, according to the World Health Organization.

The next step for Frequency Therapeutics is for researchers to move their work to the clinic, expected in the next year to 18 months, and "show this drug actually works," Lucchino said.

Genome Engineering Market Is Gaining Momentum with the Introduction of the Latest Technological Developments – Digital Journal

Transparency Market Research Report Added "Genome Engineering Market - Global Industry Analysis, Size, Share, Growth, Trends, and Forecast 2015 - 2023"

This press release was originally distributed by SBWire

Albany, NY -- (SBWIRE) -- 05/09/2017 -- Due to the presence of a small number of leading international players and few regional players, the competitive rivalry in the global genome engineering market is expected to remain moderate, reports Transparency Market Research in a new study. The market is dominated by three leading companies, Thermo Fisher Scientific, Inc., Sigma-Aldrich Corporation, and Sangamo Biosciences, Inc. These companies together accounted for 73.5% of the overall global revenue in 2014.

The emergence of small regional players in the genome engineering market has impelled the leading companies to focus on innovative product development. They are also focused on the development of differentiated products to maintain their lead in the global as well as regional markets. To produce innovative and technologically advanced products, companies are entering into agreements with research institutes and laboratories and investing in research and development projects.

Increasing Investments by Pharmaceutical and Biotechnology Companies to Boost Adoption of Genome Engineered Techniques

With the emergence of new trends in the treatment of genetic diseases, pharmaceutical and biotechnology firms have realized the need for advanced gene editing technologies for detecting genetic anomalies. Leading firms are focusing on the mutation of cells to curb cell and genetic diseases. According to a TMR analyst, "To gain technology relating to gene editing, pharmaceutical companies are either investing in the ongoing projects of medical organizations or entering into a collaboration with them."

For instance, in October 2015, in order to develop treatments for human genetic disorders, Vertex Pharmaceuticals, a cystic fibrosis drug maker, entered into an agreement with CRISPR Therapeutics, a gene-editing tech company.

The increasing funding by governments and non-government organizations for genome research and technological advancements along with investments made by biotechnology and pharmaceutical companies is expected to boost the worldwide adoption of genome engineered techniques.

Rising Ethical Concerns Regarding Genetic Engineering to Hinder Industry Growth

Genetic engineering has been a topic of debate for years, ever since human germline alteration for medical purposes was deemed unethical by several social, health, and religious organizations. The U.S. National Institutes of Health has prohibited funding for genetic engineering of human embryos, arguing against its need as it leads to complications in human genes. Several social organizations have argued that altering animal genes is likely to affect the genetic makeup of the coming generations of the animal, along with reducing the lifespan of an individual genetically engineered animal.

Along with ethical issues, the strict regulatory framework governing approval to genetically modify a plant, human, or animal genome is likely to impede the growth of the global genome engineering market.

Use of Genome Engineering Technologies for Wide Range of Applications to Provide Lucrative Opportunities to Vendors

Genome engineering technologies are used for a wide range of applications, such as crop improvement, where targeted gene replacement through homologous recombination holds huge potential. The companies operating in the global genome engineering market are focusing on capitalizing on the opportunities arising from the use of genome engineering techniques across these applications. They are modifying existing technologies to meet the required standards of the various application segments and to gain advanced genome engineering capabilities.

North America is expected to lead the global genome engineering market with revenues amounting to US$3.68 bn by the end of 2023. Cell line engineering is likely to maintain its lead among the applications segments with a revenue of US$3.32 bn by 2023.

With positive factors in dominance, the global genome engineering market is estimated to rise at a CAGR of 14.2% between 2015 and 2023. The global genome engineering market was valued at US$2.30 bn in 2015 and is estimated to touch a valuation of over US$7.21 bn by 2023.
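
For readers who want to check the arithmetic, the compound-growth relation behind a CAGR figure is end value = start value x (1 + rate) ^ years. Using only the numbers quoted above, compounding US$2.30 bn at 14.2% for the eight years 2015-2023 gives about US$6.65 bn, while the growth implied by the US$2.30 bn and US$7.21 bn figures is closer to 15.4% per year; the gap presumably reflects rounding or a different base year in the underlying TMR report.

```python
def compound(start: float, rate: float, years: int) -> float:
    """Project a value forward at a constant annual growth rate (CAGR)."""
    return start * (1 + rate) ** years

def implied_cagr(start: float, end: float, years: int) -> float:
    """Constant annual rate that turns `start` into `end` over `years`."""
    return (end / start) ** (1 / years) - 1

years = 2023 - 2015
print(round(compound(2.30, 0.142, years), 2))           # value implied by the quoted CAGR
print(round(implied_cagr(2.30, 7.21, years) * 100, 1))  # rate implied by the quoted values
```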

The review is based on the findings of a TMR report titled "Genome Engineering Market: Global Industry Analysis, Size, Share, Growth, Trends, and Forecast 2015 - 2023."

About Transparency Market Research Transparency Market Research (TMR) is a global market intelligence company providing business information reports and services. The company's exclusive blend of quantitative forecasting and trend analysis provides forward-looking insight for thousands of decision makers. TMR's experienced team of analysts, researchers, and consultants use proprietary data sources and various tools and techniques to gather and analyze information. Our business offerings represent the latest and the most reliable information indispensable for businesses to sustain a competitive edge.

Contact Us

Transparency Market Research
State Tower, 90 State Street, Suite 700
Albany, NY 12207
United States
Tel: +1-518-618-1030
USA - Canada Toll Free: 866-552-3453
Email: sales@transparencymarketresearch.com
Website: http://www.transparencymarketresearch.com

For more information on this press release visit: http://www.sbwire.com/press-releases/genome-engineering-market/release-803570.htm

Read more here:
Genome Engineering Market Is Gaining Momentum with the Introduction of the Latest Technological Developments - Digital Journal

Amazing genetics – The News International

With the world population expected to reach nine billion by 2050, and with limited cultivable area on our planet, there is an increasing probability of droughts and mass famines in many countries.

Pakistan will be among the countries most seriously affected by global warming. The spectacular advances in genomics in the last few decades offer a beacon of hope. The development of genetically engineered crops will give increased yields, offer better nutrition and provide resistance to diseases.

All the hereditary information in plants or animals is contained in their genes. Think of a tiny microscopic necklace (DNA) with many millions or billions of four different types of molecules, known as nucleotides, arranged along it. It is the sequence in which these nucleotides are arranged that determines everything about living organisms, such as the types and qualities of fruits that plants bear, the colour of our eyes, and the structure of our hearts or brains. The order in which these molecular beads are arranged is known as the genetic code. The first such code in humans to be unravelled was that of Prof Jim Watson, in 2007. It cost about a million dollars and took years to accomplish. With faster sequencing machines now available, this can be done within a week at a cost of about $1,500 today.

A remarkable breakthrough has now been made by scientists at Imperial College London. They have developed a microchip that allows sequencing to be done at an incredible speed: the entire human genome of 3.16 billion nucleotides can be read and deciphered within minutes. The device in which the chip is incorporated reads the small changes in current as the molecular necklace passes through it. It is being scaled up so that it can read the sequence at a speed of 10 million molecules per second (compared to present machines, which read about 10 molecules per second).
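A quick back-of-envelope calculation, using only the figures quoted in this article, shows why the scaled-up chip would read a genome "within minutes" while the older machines could not:

```python
# Sanity check of the throughput figures quoted above: time to read one
# human genome (3.16 billion nucleotides) at the two stated speeds.

GENOME_SIZE = 3.16e9  # nucleotides, as given in the article

fast_rate = 10_000_000  # molecules per second (the scaled-up chip)
slow_rate = 10          # molecules per second (the older machines)

fast_seconds = GENOME_SIZE / fast_rate                    # a few hundred seconds
slow_years = GENOME_SIZE / slow_rate / (3600 * 24 * 365)  # roughly a decade

print(f"Chip: {fast_seconds / 60:.1f} minutes")
print(f"Older machine: about {slow_years:.0f} years")
```

At the stated rates, the chip finishes in about five minutes, consistent with the article's claim, while a 10-molecule-per-second machine would need on the order of ten years for a single pass.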

Another amazing development has been the identification of "crime genes" in hardened criminals. The gene restricts the formation of the serotonin 2B receptor, and so affects the part of the brain responsible for restraint and for foresight of the consequences of one's actions, increasing the predisposition to violence. However, not all people carrying the gene are violent; other psychological causes may also be responsible for violent behaviour.

A few years ago, researchers at King's College London identified certain genes that are responsible for the ageing process in human beings. They found that these genes are switched off and on by certain external factors, such as diet and the environment, and may hold the keys to a longer and healthier life. The four key genes that affected the rate of healthy ageing and potential longevity were related to cholesterol, lung function and maternal longevity.

A research group at ETH Zurich discovered that when certain ageing genes are altered, the healthy lifespan of laboratory animals can be extended significantly. Efforts to achieve something similar in human beings are under way, and many scientists believe that our children may be able to live up to the age of 120 years. In 2016, the US Food and Drug Administration (FDA) approved an anti-ageing drug trial; this was the first time the FDA recognised ageing as a drug target.

Over 200 million people are afflicted with malaria each year, and nearly 800,000 deaths are recorded annually. Over 90 percent of these deaths, mostly of children, occur in sub-Saharan Africa. An exciting approach to tackling this disease is to develop genetically modified mosquitoes that can bring down the population of the harmful female variety. Anthony James, working at the University of California, Irvine, has developed a genetically modified variety in which only the females are affected: the modification prevents them from flying. The larvae hatch on water, but the females cannot fly, and therefore die.

This approach of genetic genocide may ultimately help to reduce the populations of malaria-causing mosquitoes and save millions of lives. The advances made in the rapid sequencing of the human genome are leading to a greater understanding of the genetic causes of many human diseases. A whole new area of personalised medicine is also under rapid development. This will allow drugs to be tailored according to individual genetic make-up of different groups of populations.

An excellent centre for genetic engineering has now been established in Pakistan. The Jamil-ur-Rahman Centre for Genome Research, built from my personal donation and named after my father, is located in the International Centre for Chemical and Biological Sciences (ICCBS) in Karachi and is emerging as a centre of excellence. It is equipped with state-of-the-art gene sequencing facilities, the best in the country, and is now deeply involved in health and agricultural research under the able leadership of the dynamic director of the ICCBS, Prof Iqbal Choudhary.

The rapid advances in genome sequencing technologies are opening up a whole new era of medicine. We need to develop our own research base to develop new genetically engineered varieties of food crops rather than relying on seeds imported from the West. This will also reduce the danger of us becoming completely dependent on foreign masters. Control the food chain within a country and you can control that country. This must not be allowed to happen in Pakistan. We need to invest massively in developing salt-tolerant and drought-resistant varieties of different crops through natural selection or through genetic engineering before we are engulfed by the challenges of famine and drought that surely lie ahead. Science must come to the rescue.

Countries that are investing in such advances are earning billions of dollars. For Pakistan to emerge from the shackles of poverty, we need to invest in science, technology and innovation. We also need to establish strong linkages between research and industry and agriculture. But the development budget of the Ministry of Science and Technology in Pakistan (only about Rs1.8 billion) is extremely low. Our investment in education is also low, at a little over two percent of our GDP, ranking us among the bottom nine countries of the world.

We must realise that in order to develop, we must invest in top quality schools, colleges and universities so that we can transition to a strong knowledge-based economy. It is time to change direction and invest in our real wealth, our children, so that we too can stand with dignity in the comity of nations.

The writer is chairman of UN ESCAP Committee on Science Technology & Innovation and former chairman of the HEC. Email: [emailprotected]


Coronavirus: Scientists tackle the theories on how it started – Sky News

Scientists have analysed the entirety of the novel coronavirus' genomic sequence to assess claims that it may have been made in a laboratory or been otherwise engineered.

The coronavirus outbreak first emerged in the Chinese city of Wuhan last December and has caused an international pandemic, infecting more than 198,000 people and leading to over 7,900 deaths.

International blame around the COVID-19 pandemic has incited conspiracy theories about its origin.

Without evidence, Zhao Lijian, a spokesperson for China's foreign ministry, suggested on Twitter that the virus could have been brought to Wuhan by the US army.

While he may have been insincerely provocative in response to American officials describing the outbreak as the "Wuhan virus", stressing its beginnings in China, he received thousands of retweets.

Rumours linking the virus to the Wuhan Institute of Virology - based on geographic proximity, and without any endorsement from qualified epidemiologists - have also circulated.


Shortly after the epidemic began, Chinese scientists sequenced the genome of the virus and made the data publicly available for researchers worldwide.

Even the integrity of these scientists and medical professionals has been called into question by conspiracy theorists, prompting an international coalition of scientists to sign a joint letter of support for them and their work, published in medical journal The Lancet.

The value of the genomic sequence could prove vital for those developing a vaccine, but it also contains key details revealing how the virus evolved.

New analysis by researchers at the Scripps Research Institute in the US, together with colleagues in the UK and Australia, found that the virus has proved so infectious because it developed a near-perfect mechanism for binding to human cells.

This mechanism is so sophisticated in its adaptations that it must have evolved rather than been genetically engineered, the researchers say in their paper, titled "COVID-19 coronavirus epidemic has a natural origin", published in the journal Nature Medicine.

Dr Josie Golding, the epidemics lead at the Wellcome Trust in the UK, described the paper as "crucially important to bring an evidence-based view to the rumours that have been circulating about the origins of the virus causing COVID-19".

"They conclude that the virus is the product of natural evolution, ending any speculation about deliberate genetic engineering," Dr Golding added.

So how do they know? Among the most effective parts of the virus are its spike proteins, molecules on the outside of the virus which it uses to grab hold of, and then penetrate, the outer walls of human and animal cells.

There are two key features of the novel coronavirus' spike proteins which point strongly to natural evolution.

The first is what's called the receptor-binding domain (RBD) which they describe as "a kind of grappling hook that grips on to host cells", while the second is known as the cleavage site, "a molecular can opener that allows the virus to crack open and enter host cells".

If researchers were actually going to design a virus to harm humans, it would have been constructed from the backbone of a virus already known to cause illness, the researchers said.

However, the coronavirus backbone is radically different from those already known to affect humans, and is in fact most similar to viruses found in bats and pangolins.

"These two features of the virus, the mutations in the RBD portion of the spike protein and its distinct backbone, rules out laboratory manipulation as a potential origin for [the coronavirus]," said Dr Kristian Andersen, corresponding author on the paper.

Another study of the genome, by researchers at the Wuhan Institute of Virology, reported that the virus was 96% identical to a coronavirus found in bats, one of the many animals sold at a Wuhan seafood market where it is suspected the virus jumped to humans.

However the new research was unable to determine whether the virus evolved into its current pathogenic state in a non-human host before jumping to a human, or if it evolved into that state after making the jump.


There’s much to learn from China’s mobilisation in the face of crisis – Morning Star Online

CHINA has built an isolation hospital for coronavirus sufferers in six days. In the spirit of socialist emulation, the team building the Wuhan facility aimed to beat the seven-day record set in Beijing during the 2003 Sars emergency.

As this weekend's weather events have shown, Britain has yet to put in place effective flood controls.

China is a big economy and can mobilise very considerable human and material resources. But Britain is, by comparison, a mature economy, the fifth-largest in the world, and has yet to lay the first sleeper of its second high-speed railway.

The Chinese have offered to build this disputed bit of infrastructure by the middle of the decade. It might be a good idea to suggest they take the contract to build a decent system of flood defences first, then have a crack at building HS2 top-down from the north and upgrade the regional rail system while they are at it.

Two crises, two systems.

It is impossible for the lay person, from outside the charmed circles of experts, to make informed decisions about the feasibility or the costs of infrastructure projects of this scale.

That is why governments have to take leadership responsibility, make the people charged with these tasks accountable and keep a sharp eye on the costs and commercial advantages that accrue for the people and enterprises involved.

The starting point for any project at the scale required to modernise Britain's rail network (and, for that matter, our coastal and flood defence systems) is the human and social needs, measured against the environmental and social costs of not getting on with it.

By the same token, the starting point for any response to a medical emergency of the kind presented by this new mutation of the coronavirus must be the public health priorities that it raises.

It is hard to imagine that Britain, relying on an increasingly privatised health system and a civil engineering sector dominated by large, larcenous and frequently failing firms, in many cases owned and managed by dynasties of extremely reactionary hue, would be able to conceive of such projects, let alone carry them out to such a tight timetable.

The reason China can do this is not due to any inherent characteristics of the Chinese people, any unexplained genetic predisposition, but simply the nature of the actually existing social system.

China can mobilise these enormous social forces, can direct these immense human resources, and gain the enthusiastic human engagement of the necessary millions precisely because the commanding heights of the economy and the decisive levers of power are, in essence, socially owned and directed.

This is not to say that every aspect of the way China goes about things would go down particularly well in Britain. That being said, there is a fairly substantial constituency of opinion who might give very serious consideration to implementing the Chinese policy of shooting corrupt bankers.

And the practice of imposing powerful sanctions, including long prison sentences, on political and government officials, or the managers of enterprises, guilty of negligence and corruption in their public roles is something that would represent a sea change in the way Britain deals with these matters.

Capitalism, as a system for running complex modern economies and managing advanced and modern states, continually demonstrates its redundancy.

We have powerful examples of different ways of doing things. Britain needs to find its own way to ensure that the system of ownership and control corresponds to the real needs of our people and the harmonious and productive development of our economy. It is demonstrably clear that this is not capitalism.


Ten books on thinking about thinking – Moneyweb.co.za

With Christmas behind us and the new year upon us, there may be some time to find a new read.

So here are a few books I will read, or at least start. What attracted me to these books is how they approach thinking about thinking: each tries to tease out why our general understanding of a subject is so often wrong; they explore better cognitive frameworks that could help us comprehend issues more clearly; and they consider unique perspectives in securities trading, national security, genetics and artificial intelligence.

On to the reading:

No. 1. Behave: The Biology of Humans at Our Best and Worst by Robert M. Sapolsky.

The professor of biology and neurological sciences at Stanford University (and a MacArthur Fellowship winner in 1987) takes a deep look at the most basic question of human behaviour: why do we do the things we do?

He probes the things that influence and determine behaviour: neurology, endocrinology, structural development of the nervous system, culture, ecology and the millions of years of evolution. Why we do what we do turns out to be even more complicated than you might have imagined.

No. 2. The Mosquito: A Human History of Our Deadliest Predator by Timothy C. Winegard.

Forget sharks, terrorists or guns: mosquitoes have killed more people than all other factors in history combined. Of the 108 billion humans who have ever lived, almost half, 52 billion, have died from mosquito-borne illnesses. For 190 million years, the mosquito has been waging a war against the rest of the planet, and for all of that history we have been fighting a mostly losing battle.

This has long been one of my very favourite topics; I am thrilled there is finally a book dedicated to it.

No. 3. The Man Who Solved the Market: How Jim Simons Launched the Quant Revolution by Gregory Zuckerman.

This is my nominee for finance book of the year: I read it, reviewed it and interviewed the author for Masters in Business. All that's left is to reread it slowly and deliberately, with no purpose other than to enjoy the tale of how one brilliant man saw the markets in a different way from everyone else.

No. 4. Hacking Darwin: Genetic Engineering and the Future of Humanity by Jamie Metzl.

What will happen to children, lifespans, and the plant and animal worlds when humans begin to retool the world's genetic code? Metzl tackles the risks and potential rewards of tinkering with the determinants of life as if they're just another piece of software.

No. 5. Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do by Jennifer L. Eberhardt.

Investors know that unconscious bias is at work all the time, undermining our goals. What we may not realize is how bias infects our visual perception, attention, memory and actions. The author suggests solutions to managing our biases, but I remain skeptical we can get past our own error-prone nature.

No. 6. Range: Why Generalists Triumph in a Specialised World by David Epstein.

Among top performers, specialization is the exception, not the rule. That's the startling conclusion of Epstein, a journalist with Sports Illustrated and ProPublica. Considering some of the world's most successful athletes, artists, inventors, scientists and business people, he found that it was the generalists who excelled, not the specialists.

No. 7. The Spy and the Traitor: The Greatest Espionage Story of the Cold War by Ben Macintyre.

What colleagues, institutions and competitors do you trust? How do counterintelligence and disinformation affect how we make decisions? These issues are explored in this nonfiction tale of the three-way Cold War game of espionage between the US, the UK and the Soviet Union.

No. 8. Trick Mirror: Reflections on Self-Delusion by Jia Tolentino.

Tolentino looks at the basic building blocks of social media and how we use it to deceive not so much others as ourselves. This series of essays tracks, among other things, the evolution of the internet from a band of enthusiastic geeks and hackers to the trolls and agents of agitprop that have taken over.

No. 9. Talking to Strangers: What We Should Know about the People We Don't Know by Malcolm Gladwell.

Communication breakdown is the focus of this tour of errors, miscommunication and lies. One of our era's most engaging storytellers, Gladwell roams from Fidel Castro to Bernie Madoff and lots of folks in between. His big premise: the default condition of our species is to assume others tell the truth. This makes all of us vulnerable to the deceptions of politicians, salespeople and con artists.

No. 10. Prediction Machines: The Simple Economics of Artificial Intelligence, by Ajay Agrawal, Joshua Gans and Avi Goldfarb.

What happens if we rethink the concept of artificial intelligence as a drop in the cost of prediction? That is the question tackled by the three authors of this book, all economists at the University of Toronto's Rotman School of Management. Their conclusion is that AI, instead of complicating human affairs, may improve decision-making.

2019 Bloomberg L.P.


Do we need a Greta Thunberg in healthcare? – ModernHealthcare.com

After a week at the World Economic Forum in Davos this month, I'm convinced the globe faces two existential threats that demand disruption of our businesses, our policies and indeed our lifestyles: climate change and health assurance for all.

Both will require disruption in our way of thinking, creative partnerships with entities that have not worked together to create new ecosystems, as well as artificial intelligence and other new technologies that may be game-changers if constructed properly.

Just as climate change cannot be solved by the energy industry alone, health assurance cannot be solved by the healthcare delivery industry alone. At Davos, I served as a distinguished fellow of the World Economic Forum, charged with developing equitable and sustainable business models for the transition to a digital economy, what World Economic Forum founder Klaus Schwab called the "fourth industrial revolution."

The fourth industrial revolution can be defined in a pretty nonthreatening way as the blurring of boundaries between the digital, physical and biological worlds, as a fusion of advances in artificial intelligence, robotics, the internet of things, genetic engineering, quantum computing, 5G and the kitchen sink of exciting new technologies that will blossom in the next decade.

But the impact of a digital economy on healthcare will be immense. Which is why at Davos I advocated that we talk less about the technology of self-driving cars, and more about self-healing humans.

After presenting at 10 sessions at the forum, here's my framework for equitable and sustainable models of change:

Start with ethics. Trust is more important than technology. Ethics must be injected into product development at the very earliest stage, when values are being assessed. Don't wait until a product is ready for market and then ask marketing to make it trustworthy.

Reach across industry. We talk and plan in silos, but health assurance only comes when our industry talks with those involved in food, transportation, education, policy and the creation of jobs.

There's no such thing as "non-disruptive" disruption. By definition, disruption will be painful to those who don't want to think differently as new ecosystems are built.

Data is not the new gold, but intellectual property is. We must understand how intellectual property is derived from the personal data of our patients, and ensure there are bright lines for enhanced consent in the use of this data.

Never forget the human in the middle. As online meets offline, the excitement tends to focus on the technology. But what's exciting is focusing on humans: on new roles for clinicians in the online-meets-offline world, and on new services for patients. And on what I call health assurance: constructing a system where the primary goal is a healthy and happy life for all.

As in the climate change crisis, it's the humans who will create the revolution. There is, in fact, an army of Greta Thunbergs in healthcare. You see them in the patient empowerment movement. You see them among our students, who are deeply concerned about social justice in a world that doesn't reimburse for it. You see them in the calls for gender and racial equality. And you can find them in the frontiers of the digital health movement.

I've long believed that if you want to see the future, find good people who are uncomfortable with the status quo. Our job is to go to places like Davos and advocate for those people, for a system that can be brilliant when caring for the sick but also enhances health assurance across all boundaries.


Getting the Most from Biotech: Precision Engineering and Partnership – BioSpace

"We are Earth's Tech Support," declared Randall Kirk, Executive Chairman of the Board of Directors and former CEO of Intrexon. Intrexon is one of the biggest developers of synthetic biology (or engineering biology) applications in therapeutics, agriculture and chemicals. Kirk gave a keynote speech at Synbio Markets on synthetic biology's struggle to break into mainstream markets and its revolutionary new approach to industrial biotech in the food, pharmaceuticals, chemicals and materials sectors.

Before these new technologies can save the world, they need to be accepted and get to market. Companies must overcome the usual hurdles in finding investment and meeting regulatory requirements. They must find compatible scale-up partners and face new challenges in communicating the benefits and safety of their novel technology to society.

Partnerships for Success

Collaborations are beginning to blossom in synthetic biology. The field is often likened to the silicon chip industry. In its infancy, a single company would design, build and use their own chips. Now, companies outsource the design, building, testing and manufacture of chips along a structured value chain thanks to standardization of parts and uniformity in the field. This took years to achieve. Synthetic biology companies are currently developing their own unique tools to perform new feats in engineering biotechnology. Standardization is the dream and, to achieve this, companies must work together to break into the market.

A striking partnership at the conference was that of AMSilk and Airbus. The airline industry has a problem: it must increase fuel efficiency by reducing the weight of its aircraft without compromising on safety. Composite materials are an alternative to hefty sheet metals, and AMSilk produces a durable but lightweight material: synthetic spider silk. "AMSilk is interesting for its energy absorption, which is important for safety of the aircraft," Detlev Konigorski of Airbus explains. This partnership could help Airbus develop safe new materials while reducing the carbon footprint of the airline industry.

One of the kings of collaboration is Ginkgo Bioworks. Ginkgo uses several automated platforms to speed up and precisely carry out genetic manipulation, growth and testing of cells. To build its analytical power, it collaborated with Berkeley Lights, whose technology allows functional screening of thousands of cells simultaneously, increasing throughput.

Ginkgo has used this actively in its healthcare collaborations, such as a recent team-up with Synlogic, a microbiome therapeutics company developing living medicines. Ginkgo used its platform to increase the potency of Synlogic's E. coli-based drug in non-human primates in less than a year. Ginkgo CCO Matt McKnight wants to build on these partnerships by partnering with early-stage companies; they recently announced a $350 million platform to build companies using Ginkgo's foundries. He foresees more partnerships in the synthetic biology space in future: "I think we shouldn't have full-stack engineering biology companies. In any discipline, we don't see this. People work together."

Chemicals giant BASF is also interested in partnering with synthetic biology companies. Markus Pompejus, Vice President for Innovation and Scouting, addressed the conference in Berlin, citing the company's wide range of products. "In principle, many products could be produced with biotech methods. Synbio is a research topic, but biotech is the application," Pompejus says.

Partnering may be off-putting for early-stage companies who want to maximize ownership of their company, and the topic came up repeatedly at Synbio Markets. "Where do you draw the line? Where do you co-develop with customers, or should you do it more yourself?" asks session chair James Hallinan of Cambridge Consultants, an expert engineering firm.

"Depends where you are," says Alexandre Zanghellini of protein design company Arzeda. "The later you partner, the more value you capture. You certainly want to keep the process proprietary until the point where it can be scaled, then partner with marketing, scale-up and development partners."

Talking Tech and Selling Solutions

Synthetic biology exists at the nexus of biology and nearly every other field. It's less a field of study and more a precision engineering approach to traditional biotechnology using standardized tools and platforms. Kirk argued in his speech that humanity has been using synthetic biology for thousands of years, citing crop breeding as an example of humans precisely selecting and breeding desirable traits to engineer better strains of corn, for example. Now our role in the world has changed.

"We've been doing it for 12,000 years, and we've been doing it without thinking of the consequences. Synthetic biology allows us tremendous specificity and the potential to solve world problems by targeting individual species," he said.

How does this help synthetic biology products access new and existing markets? "Every process has biology in it," McKnight says. Inscripta's CCO Jason Gammack thinks the solution lies in getting a few tangible products to lead the way. "We need to make the products tangible. In the US we're in hyperdrive mode. Two years ago, there was very little. Now, Impossible Foods is in Burger King," says Gammack. Gary Lin of Purple Orange Ventures thinks we need to raise the profile of synthetic biology among the public, adding: "One of the hard challenges: we need policymakers and government funding to support this. The amount of capital that has gone into this space is a drop in the ocean."

The issue spills over into the regulation of gene-edited technologies, especially in Europe. "We recently had a debate on CRISPR plants," says Nadine Bongaerts-Duportet of Science Matters and Hello Tomorrow. European Union regulations say CRISPR-edited crops are defined as genetically modified (GM), while those edited by radiation exposure are not. Bongaerts adds: "The difference between UV exposure and CRISPR [as gene-editing methods], everybody understands the regulations don't make sense. How do you, with a positive message, make sure everyone gets it?" All the panelists agree that building trust is key.

The "trust us, we're scientists" approach doesn't work because people don't understand the technology, according to Gammack. "I would fault all the synbio community," says Kirk. "We look at polling data on GMO attitudes; I thought healthcare would be the first area [accepted]. In terms of polling, people have the greatest acceptance of insect disease vectors," he says, citing Intrexon's Oxitec and their GM mosquito as an example.

"The messaging, particularly around GM and especially here in Europe, is a minefield. From our perspective, we need to be mindful of potential roadblocks," says Lin. "GM in food is the most difficult to grapple with. Part of the process is creating awareness of what the food process looks like." Transparency and openness about the technology are a major factor in getting this technology to market. Monsanto's Flavr Savr tomato disaster is still fresh in people's minds. Public acceptance of this technology is a must before the market can be broken into reliably.

"We need to understand the emotions and backgrounds of the people we talk to, to link our advancements to the incentives they care about. We should not over-hype, because if you can be critical and open about it, people will trust you," says Bongaerts.

Read the original post:
Getting the Most from Biotech: Precision Engineering and Partnership - BioSpace

Genome Editing Services, World Markets to 2030: Focus on CRISPR – The Most Popular Genome Manipulation Technology Tool – P&T Community

DUBLIN, Nov. 28, 2019 /PRNewswire/ -- The "Genome Editing Services Market-Focus on CRISPR 2019-2030" report has been added to ResearchAndMarkets.com's offering.

This report features an extensive study of the current landscape of CRISPR-based genome editing service providers. The study presents an in-depth analysis, highlighting the capabilities of various stakeholders engaged in this domain, across different geographical regions.

Currently, there is an evident increase in demand for complex biological therapies (including regenerative medicine products), which has created an urgent need for robust genome editing techniques. The biopharmaceutical pipeline includes close to 500 gene therapies, several of which are being developed based on the CRISPR technology.

Recently, in July 2019, the first in vivo clinical trial of a CRISPR-based therapy was initiated. However, successful gene manipulation efforts involve complex experimental protocols and advanced, molecular biology-centered infrastructure. Therefore, many biopharmaceutical researchers and developers prefer to outsource such operations to capable contract service providers.

Consequently, the genome editing contract services market was established and has grown to become an indispensable segment of the modern healthcare industry, offering a range of services, such as gRNA design and construction, cell line development (involving gene knockout, gene knockin, tagging and others) and transgenic animal model generation (such as knockout mice). Additionally, there are several players focused on developing advanced technology platforms that are intended to improve/augment existing gene editing tools, especially the CRISPR-based genome editing processes.
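To make the first service category above concrete, the core of gRNA design for the common SpCas9 enzyme is a scan for 20-nucleotide protospacers sitting immediately 5' of an NGG PAM motif. The sketch below is a simplified illustration only, not any provider's actual pipeline; production design tools additionally score off-target risk, GC content, and secondary structure.

```python
import re

def find_spcas9_guides(seq: str) -> list[dict]:
    """Scan a DNA sequence for candidate SpCas9 target sites.

    SpCas9 cuts ~3 bp upstream of an 'NGG' PAM; the guide RNA matches
    the 20 nt immediately 5' of the PAM. This toy version only finds
    sites on the given strand and does no quality scoring.
    """
    seq = seq.upper()
    guides = []
    # Zero-width lookahead so overlapping candidate sites are all reported.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        guides.append({
            "guide": m.group(1),          # 20-nt protospacer
            "pam": m.group(2),            # NGG PAM
            "cut_site": m.start() + 17,   # approx. blunt cut, 3 bp before PAM
        })
    return guides

example = "TTTACGGATCGATCGATTACGGATCGATCGAGGCGTACG"
for g in find_spcas9_guides(example):
    print(g["guide"], g["pam"], g["cut_site"])
```

A real service would run this kind of scan across both strands of the target locus and rank the hits before synthesizing the gRNA.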

Given the rising interest in personalized medicine, a number of strategic investors are presently willing to back genetic engineering-focused initiatives. Prevalent trends indicate that the market for CRISPR-based genome editing services is likely to grow at a significant pace in the foreseeable future.

Report Scope

One of the key objectives of the report was to evaluate the current opportunity and the future potential of CRISPR-based genome editing services market. We have provided an informed estimate of the likely evolution of the market in the short to mid-term and long term, for the period 2019-2030.

In addition, we have segmented the future opportunity across [A] type of services offered (gRNA construction, cell line engineering and animal model generation), [B] type of cell line used (mammalian, microbial, insect and others) and [C] different geographical regions (North America, Europe, Asia Pacific and rest of the world).

To account for the uncertainties associated with the CRISPR-based genome editing services market and to add robustness to our model, we have provided three forecast scenarios, portraying the conservative, base and optimistic tracks of the market's evolution.
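A three-track forecast of the kind described above can be mimicked with a simple compound-annual-growth model, one growth rate per scenario. The base market size and CAGRs below are invented placeholders for illustration; the report's actual figures are not public in this excerpt.

```python
def project_market(base_size_musd: float, cagr: float,
                   start: int, end: int) -> dict[int, float]:
    """Project market size (in USD millions) under a constant CAGR."""
    return {year: round(base_size_musd * (1 + cagr) ** (year - start), 1)
            for year in range(start, end + 1)}

# Hypothetical inputs only: a $100M market in 2019 with three growth tracks.
scenarios = {
    "conservative": 0.10,
    "base":         0.15,
    "optimistic":   0.20,
}
for name, cagr in scenarios.items():
    series = project_market(100.0, cagr, 2019, 2030)
    print(f"{name}: 2030 size = {series[2030]} $M")
```

Forecast models like this are usually the starting point; analysts then layer on adoption curves and region- or segment-specific adjustments.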

The research, analysis and insights presented in this report are backed by a deep understanding of key insights generated from both secondary and primary research. All actual figures have been sourced and analyzed from publicly available information forums and primary research discussions. Financial figures mentioned in this report are in USD, unless otherwise specified.

Key Topics Covered

1. PREFACE
1.1. Scope of the Report
1.2. Research Methodology
1.3. Chapter Outlines

2. EXECUTIVE SUMMARY

3. INTRODUCTION
3.1. Context and Background
3.2. Overview of Genome Editing
3.3. History of Genome Editing
3.4. Applications of Genome Editing
3.5. Genome Editing Techniques
3.5.1. Mutagenesis
3.5.2. Conventional Homologous Recombination
3.5.3. Single Stranded Oligo DNA Nucleotides Homologous Recombination
3.5.4. Homing Endonuclease Systems (Adeno-Associated Virus System)
3.5.5. Protein-based Nuclease Systems
3.5.5.1. Meganucleases
3.5.5.2. Zinc Finger Nucleases
3.5.5.3. Transcription Activator-like Effector Nucleases
3.5.6. DNA Guided Systems
3.5.6.1. Peptide Nucleic Acids
3.5.6.2. Triplex Forming Oligonucleotides
3.5.6.3. Structure Guided Endonucleases
3.5.7. RNA Guided Systems
3.5.7.1. CRISPR-Cas9
3.5.7.2. Targetrons
3.6. CRISPR-based Genome Editing
3.6.1. Role of CRISPR-Cas in Adaptive Immunity in Bacteria
3.6.2. Key CRISPR-Cas Systems
3.6.3. Components of the CRISPR-Cas System
3.6.4. Protocol for CRISPR-based Genome Editing
3.7. Applications of CRISPR
3.7.1. Development of Therapeutic Interventions
3.7.2. Augmentation of Artificial Fertilization Techniques
3.7.3. Development of Genetically Modified Organisms
3.7.4. Production of Biofuels
3.7.5. Other Bioengineering Applications
3.8. Key Challenges and Future Perspectives

4. CRISPR-BASED GENOME EDITING SERVICE PROVIDERS: CURRENT MARKET LANDSCAPE
4.1. Chapter Overview
4.2. CRISPR-based Genome Editing Service Providers: Overall Market Landscape
4.2.3. Analysis by Type of Service Offering
4.2.4. Analysis by Type of gRNA Format
4.2.5. Analysis by Type of Endonuclease
4.2.6. Analysis by Type of Cas9 Format
4.2.7. Analysis by Type of Cell Line Engineering Offering
4.2.8. Analysis by Type of Animal Model Generation Offering
4.2.9. Analysis by Availability of CRISPR Libraries
4.2.10. Analysis by Year of Establishment
4.2.11. Analysis by Company Size
4.2.12. Analysis by Geographical Location
4.2.13. Logo Landscape: Distribution by Company Size and Location of Headquarters

5. COMPANY COMPETITIVENESS ANALYSIS
5.1. Chapter Overview
5.2. Methodology
5.3. Assumptions and Key Parameters
5.4. CRISPR-based Genome Editing Service Providers: Competitive Landscape
5.4.1. Small-sized Companies
5.4.2. Mid-sized Companies
5.4.3. Large Companies

6. COMPANY PROFILES
6.1. Chapter Overview
6.2. Applied StemCell
6.2.1. Company Overview
6.2.2. Service Portfolio
6.2.3. Recent Developments and Future Outlook
6.3. BioCat
6.4. Biotools
6.5. Charles River Laboratories
6.6. Cobo Scientific
6.7. Creative Biogene
6.8. Cyagen Biosciences
6.9. GeneCopoeia
6.10. Horizon Discovery
6.11. NemaMetrix
6.12. Synbio Technologies
6.13. Thermo Fisher Scientific

7. PATENT ANALYSIS
7.1. Chapter Overview
7.2. Scope and Methodology
7.3. CRISPR-based Genome Editing: Patent Analysis
7.3.1. Analysis by Application Year and Publication Year
7.3.2. Analysis by Geography
7.3.3. Analysis by CPC Symbols
7.3.4. Emerging Focus Areas
7.3.5. Leading Players: Analysis by Number of Patents
7.4. CRISPR-based Genome Editing: Patent Benchmarking Analysis
7.4.1. Analysis by Patent Characteristics
7.5. Patent Valuation Analysis

8. ACADEMIC GRANT ANALYSIS
8.1. Chapter Overview
8.2. Scope and Methodology
8.3. Grants Awarded by the National Institutes of Health for CRISPR-based
8.3.1. Year-wise Trend of Grant Award
8.3.2. Analysis by Amount Awarded
8.3.3. Analysis by Administering Institutes
8.3.4. Analysis by Support Period
8.3.5. Analysis by Funding Mechanism
8.3.6. Analysis by Type of Grant Application
8.3.7. Analysis by Grant Activity
8.3.8. Analysis by Recipient Organization
8.3.9. Regional Distribution of Grant Recipient Organization
8.3.10. Prominent Project Leaders: Analysis by Number of Grants
8.3.11. Emerging Focus Areas
8.3.12. Grant Attractiveness Analysis

9. CASE STUDY: ADVANCED CRISPR-BASED TECHNOLOGIES/SYSTEMS AND TOOLS
9.1. Chapter Overview
9.2. CRISPR-based Technology Providers
9.2.1. Analysis by Year of Establishment and Company Size
9.2.2. Analysis by Geographical Location and Company Expertise
9.2.3. Analysis by Focus Area
9.2.4. Key Technology Providers: Company Snapshots
9.2.4.1. APSIS Therapeutics
9.2.4.2. Beam Therapeutics
9.2.4.3. CRISPR Therapeutics
9.2.4.4. Editas Medicine
9.2.4.5. Intellia Therapeutics
9.2.4.6. Jenthera Therapeutics
9.2.4.7. KSQ Therapeutics
9.2.4.8. Locus Biosciences
9.2.4.9. Refuge Biotechnologies
9.2.4.10. Repare Therapeutics
9.2.4.11. SNIPR BIOME
9.2.5. Key Technology Providers: Summary of Venture Capital Investments
9.3. List of CRISPR Kit Providers
9.4. List of CRISPR Design Tool Providers

10. POTENTIAL STRATEGIC PARTNERS
10.1. Chapter Overview
10.2. Scope and Methodology
10.3. Potential Strategic Partners for Genome Editing Service Providers
10.3.1. Key Industry Partners
10.3.1.1. Most Likely Partners
10.3.1.2. Likely Partners
10.3.1.3. Less Likely Partners
10.3.2. Key Non-Industry/Academic Partners
10.3.2.1. Most Likely Partners
10.3.2.2. Likely Partners
10.3.2.3. Less Likely Partners

11. MARKET FORECAST
11.1. Chapter Overview
11.2. Forecast Methodology and Key Assumptions
11.3. Overall CRISPR-based Genome Editing Services Market, 2019-2030
11.4. CRISPR-based Genome Editing Services Market: Distribution by Regions, 2019-2030
11.4.1. CRISPR-based Genome Editing Services Market in North America, 2019-2030
11.4.2. CRISPR-based Genome Editing Services Market in Europe, 2019-2030
11.4.3. CRISPR-based Genome Editing Services Market in Asia Pacific, 2019-2030
11.4.4. CRISPR-based Genome Editing Services Market in Rest of the World, 2019-2030
11.5. CRISPR-based Genome Editing Services Market: Distribution by Type of Services, 2019-2030
11.5.1. CRISPR-based Genome Editing Services Market for gRNA Construction, 2019-2030
11.5.2. CRISPR-based Genome Editing Services Market for Cell Line Engineering, 2019-2030
11.5.3. CRISPR-based Genome Editing Services Market for Animal Model Generation, 2019-2030
11.6. CRISPR-based Genome Editing Services Market: Distribution by Type of Cell Line, 2019-2030
11.6.1. CRISPR-based Genome Editing Services Market for Mammalian Cell Lines, 2019-2030
11.6.2. CRISPR-based Genome Editing Services Market for Microbial Cell Lines, 2019-2030
11.6.3. CRISPR-based Genome Editing Services Market for Other Cell Lines, 2019-2030

12. SWOT ANALYSIS
12.1. Chapter Overview
12.2. SWOT Analysis
12.2.1. Strengths
12.2.2. Weaknesses
12.2.3. Opportunities
12.2.4. Threats
12.2.5. Concluding Remarks

13. EXECUTIVE INSIGHTS

14. APPENDIX 1: TABULATED DATA

15. APPENDIX 2: LIST OF COMPANIES AND ORGANIZATIONS

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/78rwbq


View original content: http://www.prnewswire.com/news-releases/genome-editing-services-world-markets-to-2030-focus-on-crispr---the-most-popular-genome-manipulation-technology-tool-300966100.html

SOURCE Research and Markets

View original post here:
Genome Editing Services, World Markets to 2030: Focus on CRISPR - The Most Popular Genome Manipulation Technology Tool - P&T Community

The Big Science and Environment Stories of the Decade – KQED

The 2010s saw breakthroughs in medical science and spectacular discoveries in space and physics. For Californians, it was also the decade that climate change arrived in our front yards in the form of serial cataclysmic fire seasons.

During the decade, scientists refined the regimen of HIV/AIDS medication, made life-saving advances in the treatment of cancer, and invented an entirely new gene-editing technology, with the hope of one day curing diseases before they begin.

NASA's New Horizons probe captured the first close-up images of Pluto, and the world caught its first glimpse, albeit a bit blurry, of a black hole. Our understanding of exoplanets exploded: the Kepler Space Telescope and the TESS satellite found thousands of new planets outside our solar system, and researchers began to comprehend what those worlds might actually look like.

As the decade closes, the KQED Science team has created a sort of mixtape of the major trends, significant moments and noteworthy discoveries, with an eye toward California and the Bay Area.

Do you want the good news or the bad news first? Well, let's get it out of the way ...

Wildfires Create Havoc

The changing climate is leading to longer dry periods in California, which, the Environmental Protection Agency reported in 2016, is at least three degrees warmer than it was at the beginning of the industrial era.

Climate change, combined with a century of wildfire suppression and denser populations in areas perilously close to fire-prone wilderness, has created the worst fire seasons on record. Four of the five biggest California wildfires have occurred since 2012, together burning over 1.2 million acres.

Late on Oct. 8, 2017, hot, dry winds downed power lines, carrying sparks and flaming embers long distances to ignite multiple fires. The Tubbs Fire and other North Bay blazes scorched large areas of Sonoma and Napa counties, claiming 44 lives and destroying over 8,000 buildings.

The following summer, during the Carr Fire, a "fire tornado" exploded into the outskirts of Redding, devastating everything in its path. The blaze killed eight people and destroyed 1,000 homes.

But the worst was yet to come. In November, the Camp Fire nearly wiped out the town of Paradise and surrounding communities. It was the deadliest wildfire in California history, killing 86 people, destroying almost 14,000 homes, and costing more money than any natural disaster in the world that year. Across wide swaths of the state, smoke from the fire rendered the air unhealthy to breathe, inundating the Bay Area for almost two weeks so that the region registered its worst air quality on record.

As far as global warming goes, the outlook is not good, whether it relates to fires or to other natural disasters. The 2010s included the hottest year (2017) and the hottest month (July 2019) on record, and the 10 years that make up the decade will almost certainly set a new temperature mark as well, according to the U.N., based on millions of global measurements taken over the last 170 years.

This summer, our series Living With Wildfire: California Reimagined asked some big questions about how the state can, in our warming world, learn to survive more frequent and ferocious conflagrations. Are some fire-prone areas now too dangerous to accommodate new housing? How can towns prepare for mass evacuations? And how can neighborhoods make themselves fire-resistant? Are Californians willing to suffer the inconvenience and financial cost of protecting the state from extreme wildfires? Perhaps, but it will mean big changes in how we think and live. -- Danielle Venton

Rise of Renewables

As Californians began to experience climate change in the form of hotter days and more destructive fires, state policies to mitigate global warming began to pay dividends. California's investor-owned utilities shattered renewable energy targets mandated by the state, and California reduced its overall emissions of greenhouse gases below the 1990 level, two years ahead of schedule.

These climate policies, in a state with the world's fifth largest economy, helped spur a rapid decline in the cost of renewable energy around the U.S. This past decade, the cost of wind energy fell by 57%, utility-scale solar power by 86%, and battery energy storage by 76%. In 2019, for the first time, power generation in the U.S. from renewable energy surpassed power produced from coal.

Those are big successes, but California has a lot of work to do over the next 10 years if the state is going to meet its 2045 goal of net-zero emissions, also called carbon neutrality. California is way behind in meeting this ambitious objective, in part because emissions from the transportation sector are soaring, due to Californians driving more miles in larger, gas-guzzling trucks and SUVs.

The state is trying to reverse this trend by incentivizing fuel-efficient cars and setting a target of 5 million electric vehicles traversing California roads by 2030. But meeting that goal is going to be tough, with sales of EVs currently standing at only a fraction of that total.

Meanwhile, frustrated by the lack of progress in the fight against climate change, young people have taken to the streets over the last couple of years. The Sunrise Movement, Youth vs. Apocalypse and other Bay Area advocacy groups participated in global climate strikes protesting the failure of government, finance, industry and other institutions to address climate change. -- Kevin Stark

Medical Advances

The decade saw major advances in the treatment of HIV and cancer.

Over the last 10 years, scientists have perfected antiretroviral drugs, taken daily in a single pill by people who are HIV-positive. These drugs allow HIV patients to live relatively free of sickness, a far cry from the first decade of the epidemic, when the diagnosis was tantamount to a death sentence. No longer highly toxic, antiretrovirals now work so well they can lower a patient's viral load to undetectable levels, making the virus untransmittable from one person to another. Another daily pill, called PrEP, can be used as a prophylactic against HIV exposure by people who are still free of the virus. Such major strides in treatment and prevention are why scientists are optimistic HIV will be eradicated altogether within the next decade.

For some types of cancer, a treatment called immunotherapy has drastically improved survival and cure rates. For example, stage 4 melanoma, which doesn't respond to radiation or chemotherapy, used to mean certain death, with patients surviving less than a year on average. But over the last decade, instead of burning or poisoning cancer cells to stop the disease, new medicines have unleashed the body's natural defenses.

Normally the immune system recognizes disease-causing organisms, but cancer cells go undetected as harmful. New drugs, as well as genetic engineering techniques, make them visible and ripe for attack. Think of it like affixing a flag with the message "kill me" on cells that previously operated with impunity. Pancreatic, breast and prostate cancer, among other types, do not currently respond to immunotherapy, but scientists foresee a day when the treatment could be the primary weapon against an array of cancers.

There may also be a day when doctors can eliminate genetic diseases altogether. A tool called CRISPR acts as a molecular scalpel that can make precise changes to the genetic mutations giving rise to disease. Scientists hope to one day cure genetic conditions like blindness or sickle cell anemia before they even start, though tinkering with our DNA raises all kinds of ethical questions about "playing God." -- Lesley McClurg

See the original post here:
The Big Science and Environment Stories of the Decade - KQED

Who’s Afraid of Roundup? – American Council on Science and Health

By Geoffrey Kabat

Originally published as Kabat, Geoffrey. "Who's Afraid of Roundup?" Issues in Science and Technology 36, no. 1 (Fall 2019): 64-73. Reprinted with permission.

In May 2019, a California jury awarded $2 billion to a husband and wife who claimed that the weed-killer Roundup caused their non-Hodgkin's lymphoma. The defendant in the suit was Bayer AG, which had recently acquired Monsanto, Roundup's manufacturer.

Crucial in determining the judgment was Alameda County Superior Court judge Winifred Smith's denial of a request by Bayer's lawyers to share with the jury the US Environmental Protection Agency's recent determination that the active ingredient in Roundup, glyphosate, is not carcinogenic and poses no risk to public health when used as directed. "What is the relevance?" the judge is reported to have asked.

Instead, the judge allowed the plaintiffs' lawyers to base their case on the International Agency for Research on Cancer's (IARC) 2015 determination that glyphosate is a "probable carcinogen." Deprived of the opportunity to hear any countervailing evidence, the jury found for the plaintiffs.

This was the third Roundup trial, following other cases in which a total of $158 million was awarded to the plaintiffs. At present there are over 18,000 lawsuits pending against Bayer in the United States based on claims that exposure to Roundup was responsible for the plaintiffs' cancers.

The stakes are not limited to Bayer and those involved in the lawsuits. They extend to farmers, the agricultural sector of every country, and consumers worldwide who depend on affordable food. And even beyond these impacts, what is at stake is society's ability to rely on the best scientific evidence on questions that are entangled with competing interests and deeply held worldviews.

Roundup, the world's most widely used herbicide, has been in use for 45 years. By targeting a key enzyme present in all plants, it can kill a wide variety of weeds. Farmers value it because it enables them to manage weeds more easily and more effectively than other products, and because it reduces the need for tillage, thus improving soil conservation. Roundup also has low toxicity compared with products it has replaced, such as atrazine and alachlor (both of which are banned in Europe). A successful campaign to ban Roundup would worsen soil quality and deny farmers a crucial tool for controlling weeds, confronting them with the choice between returning to more harmful herbicides or experiencing major reductions in agricultural productivity for many crops.

THE SOLE DISSENTING VOICE

In view of the prominence given IARC in the legal proceedings, it is noteworthy that the agency stands alone in its conclusion that glyphosate poses a carcinogenic risk. The US Environmental Protection Agency's recent assessment is only the latest in a succession of reports from national regulatory agencies, as well as international bodies, that support the safety of glyphosate. These include Health Canada, the European Food Safety Authority (EFSA), the European Chemicals Agency, Germany's Federal Institute for Risk Assessment, and the Food and Agriculture Organization of the United Nations, as well as health and regulatory agencies of France, Australia, New Zealand, Japan, and Brazil.

How has a chemical that has been exhaustively reviewed by regulatory agencies all over the world and repeatedly found to be safe become a vehicle for a torrent of lawsuits?

To answer this question, the place to start is IARC, which in March 2015 classified glyphosate as a "probable carcinogen" based primarily on what it termed "sufficient evidence" in rodent studies. However, revelations by the Reuters journalist Kate Kelland, and documents made public in the Monsanto lawsuits, paint a different picture from that presented by IARC regarding the agency's process in initiating and producing the report and its conclusions.

Unlike virtually all other agencies, IARC engages in hazard assessment rather than risk assessment. This means that IARC considers any scientific evidence of possible carcinogenicity, no matter how difficult to interpret or how irrelevant to actual human exposure. In doing so, the agency ignores a cornerstone of toxicology: the dose makes the poison. The agency's approach fails to distinguish between exposures as they occur in the real world and far-fetched, improbable scenarios, and this in turn leads to an upward skewing of evaluations in terms of risk. (Unsurprisingly, then, of roughly 500 agents and chemicals evaluated by IARC, only one, caprolactam, a chemical used in the manufacture of synthetic textiles, was found unlikely to be carcinogenic.) The problems with the IARC glyphosate classification, however, cannot be explained primarily by the distinction between hazard and risk evaluation.


First, IARC based its probable carcinogen assessment primarily on the results of studies in rodents, because the agency considered the human evidence limited. However, independent analysis (by a former statistician at the US National Cancer Institute, Robert Tarone) of the rodent studies relied on by IARC showed no consistent or robust evidence of increased tumor yields in exposed animals. The IARC Working Group that conducted the assessment selected a few positive results in one sex and used an inappropriate statistical test to declare some tumor increases significant. Comparable inverse associations, some statistically significant, were ignored.

Second, IARC was aware of the availability of relevant results regarding non-Hodgkins lymphoma (NHL) from the large National Cancer Institute-funded Agricultural Health Study (AHS), a prospective study of 54,000 pesticide applicators in Iowa and North Carolina. Although only very early results for glyphosate and NHL from the study had been published when the IARC Working Group met to evaluate glyphosate, a senior investigator on the AHS served as chair of the group. This scientist would have been aware that updated results from the AHS showed no significant increases for NHL from glyphosate exposure.

IARC argues that these results were not included in its 2015 assessment of glyphosate due to its rule that excludes unpublished findings. However, if the objective was to produce a valid assessment of glyphosate, this explanation is inadequate. The characteristics and methods of the AHS were widely known, and the details of the statistical methods used in the analysis of NHL had been described in a 2014 paper. Given that the existence of high-quality results from a large, carefully designed prospective study (precisely the type of human evidence that regulators most value) was known to at least one member of the Working Group, IARC's decision to proceed with the report but ignore the existence of the AHS results requires a more forthcoming explanation. Indeed, when the results for glyphosate and cancer incidence in the AHS were finally published in the Journal of the National Cancer Institute in 2018, the paper reported no significant increases for more than 20 solid or lymphopoietic malignancies, including NHL and several NHL subtypes.

Third, in the past two years other improprieties in IARCs glyphosate assessment have come to light. Kate Kelland, the health reporter for Reuters, examined drafts of the chapter of the monograph devoted to animal studies and found that early drafts more accurately summarized the evidence, whereas later drafts progressively emphasized findings that appeared to indicate a positive association.

Finally, the role played by Christopher Portier in the glyphosate assessment has become apparent in transcripts from litigation involving Monsanto. Portier, an American scientist who had worked for the federal government, chaired an IARC committee that prioritized glyphosate as an agent to be evaluated, and subsequently served as an invited specialist on the Working Group that evaluated glyphosate. Although IARC is hyper-alert to conflicts of interest involving industry, the agency seems not to be concerned about anti-industry bias. Two weeks after publication of the IARC report, Portier signed a lucrative contract to act as a litigation consultant with a law firm (Lundy, Lundy, Soileau, and South) engaged in bringing lawsuits against Monsanto for Roundup exposure.

In sum, IARC's classification of glyphosate diverged from the conclusions of other agencies worldwide, and the divergence resulted from a flawed assessment of the scientific evidence by the IARC Working Group.

A POWERFUL COUNTER-NARRATIVE

How can a respected scientific agency and its supporters take such a different view of the safety of Roundup/glyphosate from the mainstream?

Although glyphosate spraying has been practiced since 1974, its use has increased almost 15-fold globally since the 1996 introduction of Roundup Ready genetically engineered, glyphosate-tolerant crops. As a result, use of Roundup and cultivation of genetically modified foods have become indissolubly linked, not just in agricultural practice but in public debates about genetically modified organisms (GMOs). Indeed, a powerful alliance of groups that oppose agricultural biotechnology has entered the fray concerning the carcinogenicity of glyphosate. These groups are anti-GMO, anti-pesticide, and anti-Big Ag, favoring instead "natural" farming and organic foods. One prominent organization is US Right to Know (USRTK), funded by the Organic Consumers Association, which advocates for organic agriculture while opposing genetic engineering (as well as, it might be noted, vaccines). USRTK and similar groups, including GM Watch, the Environmental Working Group, Greenpeace, and many others, ignore the enormous body of evidence that demonstrates the benefits of genetic engineering of crops, for example through improved tolerance to drought, increased resistance to pests, and enhanced nutrient content (as in the case of Golden Rice). Now the low-toxicity pesticides that enhance the value of GM crops are in the crosshairs as well. To these groups, IARC represents the sole agency that has not been corrupted by making compromises with industry.

Anti-GM agriculture groups have been waging an all-out campaign on their websites and in social media attacking journalists, scientists, and agricultural experts who defend modern farming and criticize IARC, alleging that they sow misinformation, ignore evidence of risks, and are compromised by conflicts of interest. Their targets have included academic experts Nina Fedoroff of Penn State, Kevin Folta of the University of Florida, Drew Kershen of the University of Oklahoma, Alison Van Eenennaam of the University of California, Davis, and many others (including myself). To counter the activist anti-GMO, anti-pesticide organizations, groups such as the Genetic Literacy Project, the American Council on Science and Health, and the Cornell Alliance for Science see their mission as trying to explain the science and its implications on these contested topics to the public.

What distinguishes the two sides is that the latter groups pay more attention to the quality of the scientific evidence and are interested in gene editing, development of more resilient crop varieties, strategies for reducing pesticide use, and other advances that have the potential to feed more people with fewer chemical inputs using less land. In contrast, the former groups tend simply to assert that there are serious risks associated with genetic engineering of plants and animals and with pesticides, and to tar all who disagree as being associated with agrichemical companies and their front organizations. They don't have to point to any substantive evidence of the implied risks or cover-ups. They don't have to distinguish between solid studies and those that are questionable. All that's needed is to assert that the figures they single out are part of a sinister and corrupt network featuring, as USRTK says, secret financial arrangements and close collaborations between corporations, their PR firms, and supposedly independent academics who promote corporate interests.

In addition, both American and European activists have been lobbying bureaucrats and politicians in the European Union to have glyphosate banned. Christopher Portier and Carey Gillam, a spokesperson for USRTK, have testified before the European Parliament in support of a ban. The European Union provides fertile soil for activists opposed to modern agricultural practices because it has enshrined the precautionary principle as part of its regulatory framework. As explained in a 2017 European Commission document, the precautionary principle allows that regulatory intervention may still be legitimate, even if the supporting evidence is incomplete or speculative and the economic costs of regulation are high. IARC, by declaring glyphosate a probable carcinogen, provides groups such as USRTK the authoritative scientific cover they need to pursue their campaign against Bayer and in support of a glyphosate ban. In California, the IARC findings allow the state to list glyphosate as a carcinogen under its 1986 Safe Drinking Water and Toxic Enforcement Act, better known as Proposition 65, and thus provide an apparent scientific basis for litigation.

More broadly, IARC's flawed assessment both relies on and lends apparent scientific credibility to a variety of powerful beliefs and biases that infect the public discussion of environmental exposures to chemicals such as glyphosate. By bracketing out much of what is known about the causes of cancer and by focusing people's attention solely on what are trace environmental residues, activist organizations reinforce these beliefs and biases, which seem prevalent enough to merit being labeled memes. From my own work, and building on decades of research by decision scientists such as Paul Slovic, Cass Sunstein, Daniel Gardner, and Peter Sandman, I identify at least four such memes.

In the case of glyphosate, 40 years of science demonstrating the safety of the chemical is quite consistent and is supported not only by industry-affiliated scientists but by independent scientists, including agricultural experts, toxicologists, and regulatory officials who are familiar with pesticide use, as evidenced by the fact that so many regulatory bodies worldwide are in agreement. Why, then, are the attacks on glyphosate in courtrooms and governments succeeding? Part of the explanation of course is that the widely shared memes I cite allow advocacy groups and others skeptical of GM crops and agrochemicals to discount the body of science documenting glyphosate's safety and focus entirely on the IARC assessment.

SCIENCE DIVIDED

The more interesting and difficult question is why a substantial number of scientists appear to support the IARC assessment. Indeed, a November 2015 letter to the European Commissioner for Health and Food Safety signed by 96 scientists attacked the European Food Safety Authority's determination that glyphosate was not carcinogenic, and supported IARC's contrary determination.

But Bernhard Url, the head of EFSA, in an address to his organization, provided a different perspective: People that have not contributed to the work, that have not seen the evidence most likely, that have not had the time to go into the detail, that are not in the process, have signed a letter of support [for a ban on glyphosate]. Sorry to say that, for me, with this you leave the domain of science, you enter into the domain of lobbying and campaigning. And this is not the way EFSA goes.

It's possible to understand why scientists without direct and deep expertise on a specific subject might weigh in through such a letter: scientists are human too. Scientists who have worked with IARC appear to feel a strong loyalty to the institution and rally to its defense, often without appearing to know the details of the substantive criticisms that have been made by outside scientists. But if loyalty to IARC and alignment with its mission can explain the support of IARC's broad base, it is still necessary to explain how the IARC leadership that organized and oversaw the glyphosate review can defend their position. Here, it is difficult to escape the conclusion that there are bigger issues at stake than the narrow interpretation of the evidence regarding glyphosate.


My own belief is that an extreme precautionary approach to evaluating risks is at the root of both the recent conduct of the IARC program to identify human carcinogens and that of IARC-associated epidemiologists who are, it seems, often willing to give weight to evidence from weak observational studies and from other types of studies that appear to point to a risk. It must also be said that being in a position to make authoritative pronouncements about risks that are of public concern is not a negligible source of influence and career advancement. Because of their political or professional stake in the issue, scientists may find particularly credible and draw attention to certain studies that purport to show an association, while ignoring other higher-quality studies. For example, an expert providing testimony for the plaintiffs in one of the Monsanto cases cited crude case-control studies of glyphosate as evidence that exposure is associated with increased risk of NHL, while ignoring the higher-quality findings of the Agricultural Health Study. A recent paper in the journal Mutation Research combined five small case-control studies with the much larger AHS in a meta-analysis, and, by selecting only the highest of five risk estimates from the AHS, the authors asserted that exposure to glyphosate increased the risk of NHL by 41%. If they had included the other estimates, there likely would have been no elevated risk. One could give many more examples of this kind of selective approach to the evidence.
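To see why picking a single estimate from the dominant study matters so much, here is a minimal sketch of a fixed-effect inverse-variance meta-analysis. All of the numbers are purely illustrative (none come from the actual studies): the point is that the large study carries most of the statistical weight, so the choice of which of its estimates to include largely determines the pooled result.

```python
import math

# Hypothetical log relative-risk estimates and standard errors for five
# small case-control studies (illustrative only, not the real data).
case_control = [(math.log(1.8), 0.35), (math.log(1.5), 0.40),
                (math.log(2.0), 0.45), (math.log(1.3), 0.30),
                (math.log(1.6), 0.50)]

# A single large cohort study that reported several estimates, from
# lowest to highest (again, invented numbers with a small standard error).
cohort_estimates = [(math.log(0.95), 0.12), (math.log(1.00), 0.12),
                    (math.log(1.12), 0.12)]

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooled relative risk."""
    weights = [1.0 / se**2 for _, se in studies]
    log_rr = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
    return math.exp(log_rr)

# Pool using only the cohort's highest estimate vs. only its lowest:
high = pooled_rr(case_control + [cohort_estimates[-1]])
low = pooled_rr(case_control + [cohort_estimates[0]])
print(high, low)
```

Because the large study's small standard error gives it the biggest weight, swapping in its highest estimate pulls the pooled relative risk noticeably upward, even though the five small studies are unchanged.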

Of course, other scientists may have biases that push in the other direction, sometimes indeed because their interests or sympathies lie with industry, or with farmers. But that's why scientists representing a variety of institutional perspectives need to be included in any process to assess small environmental risks in large populations using complex statistical tools. And failure to have such representation sets IARC apart from the many other environmental risk assessment bodies that have concluded that glyphosate does not pose a cancer risk.

FACEBOOK SCIENCE IN ACTION

For years, IARC has positioned itself as the voice of independent scientific authority on the carcinogenicity of physical, chemical, and biological agents. When specific assessments of IARC have been questioned or criticized by qualified scientists, the agency's default response has been to assert its preeminent position and its authority, rather than to address the specific substantive criticisms or engage in a discussion of the evidence on its merits. In addition, IARC and its defenders typically argue that any criticism must be motivated by conflicts of interest and subservience to industry. For example, an article published in 2015 in the journal Environmental Health Perspectives titled "IARC Monographs: 40 Years of Evaluating Carcinogenic Hazards to Humans," signed by 124 authors, sought to win the public debate by insinuating that critics of IARC have venal motives. Yet the article consistently failed to address legitimate specific points raised by critics.

This pattern of refusing to engage in a discussion of the evidence for its classifications goes back more than 10 years. In the most recent publications of IARC supporters addressing the glyphosate issue, the authors restate yet again IARC's conscientious approach to its mission, focus on alleged questionable behavior by Monsanto, and imply that IARC's critics have conflicts of interest. However, they continue to avoid discussing the evidence and ignore the fact that all other regulatory agencies have found glyphosate to be safe and noncarcinogenic. Nor, with the exception of acknowledging Portier's becoming a litigation consultant immediately after publication of the glyphosate assessment, do they acknowledge any of the other irregularities pertaining to the glyphosate report.

IARC's supporters in the scientific community consistently paint a picture of selfless scientists motivated by protecting public health pitted against powerful corporations aided by compliant scientists and politicians. Quite intentionally, this Manichean picture leaves no room for a discussion of the scientific evidence on its merits. You are either for IARC and science and public health, or you are okay with corporations assaulting public health because they don't care if people get cancer so long as they get profits. There is no middle ground. What needs emphasizing, however, is that the effect of IARC's strategy is to transform a debate about science and evidence into a crusade for moral and political purity against which there can be no defense. In this highly polarized climate, those who see things differently may be reluctant to speak out.

The memes that shape people's (including some scientists') views on complex issues of risk coalesce, and reinforce and amplify each other, contributing to what the Nobel Prize-winning behavioral psychologist Daniel Kahneman terms an "availability cascade": a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. This cycle, he adds, is sometimes sped along deliberately by "availability entrepreneurs," individuals or organizations who work to ensure a continuous flow of worrying news. The danger is increasingly exaggerated as the media compete for attention-grabbing headlines. Scientists and others who try to dampen the increasing fear and revulsion attract little attention, most of it hostile: anyone who claims that the danger is overstated is suspected of association with a heinous cover-up.

Under such circumstances, positive studies or assessments, such as IARC's assessment of glyphosate, reinforce the prevailing beliefs and fears, while negative studies or assessments, such as those of the other government organizations that do not reveal a cancer risk, fail to find a receptive audience. The availability cascade has in turn led to the juggernaut of litigation cases against Monsanto/Bayer, each one enacted as a morality play in which a plaintiff with a rare, poorly understood cancer is pitted against a powerful corporation.

In this case, the availability entrepreneurs include IARC itself, along with some scientists, advocates, plaintiffs' lawyers, and nongovernmental organizations with an agenda (and, it should be said, with their own set of financial interests, such as funding from the organic foods industry and manufacturers of green environmental products). Collectively, they spin the evidence for their purposes. The result is what EFSA's Bernhard Url has termed the "Facebook age of science." As he put it: You have a scientific assessment, you put it on Facebook, and you count how many people like it. For [EFSA], this is no way forward. We produce a scientific opinion, we stand for it, but we cannot take into account whether it will be liked or not.

The glyphosate controversy may be the most glaring example of Facebook science, but it should come as no surprise that the same factors that are at work here are at work in many other areas, whether electromagnetic fields, cell phone radiation, so-called endocrine disrupting chemicals, numerous aspects of diet, cosmetic talc, GMOs, vaccines, nuclear power, or climate change.

Today's highly interconnected world faces serious problems that are in large part the result of the unprecedented progress that has been made over the past 150 years in science, technology, public health, and nutrition. These problems include, among others, the emergence of new pandemic virus strains and increasing antibiotic resistance; degradation of the environment, leading to loss of habitat and loss of species diversity; the challenge of producing adequate food for a growing population; and the pressing need to transition to a realistic energy policy as part of a response to a changing climate. These challenges will not be met by appeasing activists who seem to believe that the world would be better off today without many of the scientific and technological advances of the past, who exaggerate the risk associated with those advances by misrepresenting the scientific evidence, and who have nothing to offer but the simplistic and moralistic narratives of Facebook science.

I realize that complex issues of risk and the environment create a near-impenetrable thicket of uncertainties, values, interests, and competing experts' views of the evidence. But sometimes the clear weight of evidence coupled with a dose of common sense is enough to show what's right, even if that means going against the tide of popular outrage. Glyphosate is a boon to agriculture and humanity. Let's refocus the energy and resources now spent demonizing this useful and valuable chemical onto problems that really matter.

Geoffrey Kabat serves on the ACSH Board of Science Advisors and is a cancer epidemiologist and the author of Getting Risk Right: Understanding the Science of Elusive Health Risks.

See the original post here:
Who's Afraid of Roundup? - American Council on Science and Health

New Clues as to Why Mutations in the MYH9 Gene Cause a Broad Spectrum of Disorders in Humans – Newswise


Researchers used in vivo imaging to watch how cells move and generate forces inside living tissues; the study sheds new light on how motor proteins generate these forces and how genetic factors alter them to result in disease.

Newswise, New York, NY, October 28, 2019: Myosins are motor proteins that convert chemical energy into mechanical work, generating force and movement. Myosin II generates forces that are essential to drive cell movements and cell shape changes that generate tissue structure. While researchers know that mutations in the genes that encode nonmuscle myosin II lead to diseases, including severe congenital defects as well as blood platelet dysfunction, nephritis, and deafness in adults, they do not fully understand the mechanisms that translate altered myosin activity into specific changes in tissue organization and physiology.

A team of researchers led by Karen Kasza, Clare Boothe Luce Assistant Professor of Mechanical Engineering, used the Drosophila embryo to model human disease mutations that affect myosin motor activity. Through in vivo imaging and biophysical analysis, they demonstrated that engineering human MYH9-related disease mutations into Drosophila myosin II produces motors with altered organization and dynamics that fail to drive rapid cell movements, resulting in defects in epithelial morphogenesis. The study, the first to demonstrate that these mutations result in slower cell movements in vivo, was published October 15, 2019, in PNAS.

"It's not currently possible to watch what happens at the cell level when these genes are mutated in humans, and it's still really difficult to do this in mammalian model organisms like mice," says Kasza, the study's lead author who began the research as a postdoctoral fellow at the Sloan Kettering Institute and continued it when she joined Columbia Engineering in 2016.

Because there are so many similarities between the myosin II protein in humans and in fruit flies, Kasza's approach was to start by tackling how to watch the effects of myosin II mutations in fruit flies. Her group engineered the human disease mutations into fruit fly myosin and then observed how this affected the behaviors of the proteins, cells, and tissues in the organism.

They used high-resolution confocal fluorescence imaging to take movies of the process, together with biophysical approaches such as laser ablation, or laser nano-dissection, to measure the forces generated by the mutated myosin II motor proteins in vivo.

Kasza found that, while the mutated myosin II motor proteins actually went to the proper places inside cells and were able to generate force, the fine-scale organization of the myosin proteins and the speed of their movement inside cells were different than for the normal wild-type myosin protein. The team saw slower movements of cells within tissues that brought about abnormalities in embryo shape during development.

"By watching how cells move and generate forces inside living tissues, we've uncovered new clues as to why mutations in the MYH9 gene cause a broad spectrum of disorders in humans," Kasza observes. "Our work sheds new light on how motor proteins generate forces inside living tissues and on how genetic factors alter these forces to result in disease. This mechanistic understanding will help us better understand these diseases and could lead to new diagnostic or therapeutic strategies down the road."

The researchers are now working on new approaches to very precisely manipulate the forces generated by myosin motors inside living cells and tissues. These new tools will help the team to uncover how mechanical forces influence biochemical processes that control cell movements and cell fate. These studies will be essential to better understanding how dysregulation of mechanical forces contributes to disease.

About the Study

The study is titled Cellular defects resulting from disease-related myosin II mutations in Drosophila.

Authors are: Karen E. Kasza (1, 2), Sara Supriyatno (1), and Jennifer A. Zallen (1). (1) Howard Hughes Medical Institute and Developmental Biology Program, Sloan Kettering Institute; (2) Department of Mechanical Engineering, Columbia Engineering.

The study was supported by NIH/NIGMS R01 grant GM102803 to JAZ. KEK holds a Career Award at the Scientific Interface from the Burroughs Wellcome Fund, a Clare Boothe Luce Professorship, and a Packard Fellowship. JAZ is an investigator of the Howard Hughes Medical Institute.

The authors declare no financial or other conflicts of interest.

###

LINKS:

Paper: https://doi.org/10.1073/pnas.1909227116
DOI: 10.1073/pnas.1909227116
https://www.pnas.org/
http://engineering.columbia.edu/
https://engineering.columbia.edu/faculty/karen-kasza
https://me.columbia.edu/

###

Columbia Engineering
Columbia Engineering, based in New York City, is one of the top engineering schools in the U.S. and one of the oldest in the nation. Also known as The Fu Foundation School of Engineering and Applied Science, the School expands knowledge and advances technology through the pioneering research of its more than 220 faculty, while educating undergraduate and graduate students in a collaborative environment to become leaders informed by a firm foundation in engineering. The School's faculty are at the center of the University's cross-disciplinary research, contributing to the Data Science Institute, Earth Institute, Zuckerman Mind Brain Behavior Institute, Precision Medicine Initiative, and the Columbia Nano Initiative. Guided by its strategic vision, Columbia Engineering for Humanity, the School aims to translate ideas into innovations that foster a sustainable, healthy, secure, connected, and creative humanity.

Read more:
New Clues as to Why Mutations in the MYH9 Gene Cause a Broad Spectrum of Disorders in Humans - Newswise

Genome editing – Wikipedia

Genome editing, or genome engineering, is a type of genetic engineering in which DNA is inserted, deleted, modified or replaced in the genome of a living organism. Unlike early genetic engineering techniques that randomly insert genetic material into a host genome, genome editing targets the insertions to site-specific locations.

In 2018, the common methods for such editing use engineered nucleases, or "molecular scissors". These nucleases create site-specific double-strand breaks (DSBs) at desired locations in the genome. The induced double-strand breaks are repaired through nonhomologous end-joining (NHEJ) or homologous recombination (HR), resulting in targeted mutations ('edits').

As of 2015 four families of engineered nucleases were used: meganucleases, zinc finger nucleases (ZFNs), transcription activator-like effector-based nucleases (TALEN), and the clustered regularly interspaced short palindromic repeats (CRISPR/Cas9) system.[1][2][3][4] Nine genome editors were available as of 2017.[5]

Genome editing with engineered nucleases, i.e., all three major classes of these enzymes (zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs) and engineered meganucleases), was selected by Nature Methods as the 2011 Method of the Year.[6] The CRISPR-Cas system was selected by Science as 2015 Breakthrough of the Year.[7]

Genetic engineering as a method of introducing new genetic elements into organisms has been around since the 1970s. One drawback of this technology has been the random nature with which the DNA is inserted into the host's genome. This can impair or alter other genes within the organism. Methods were sought that targeted the inserted genes to specific sites within an organism's genome. As well as reducing off-target effects, this also enabled the editing of specific sequences within a genome. This could be used for research purposes, by targeting mutations to specific genes, and in gene therapy. By inserting a functional gene into an organism and targeting it to replace the defective one, it could be possible to cure certain genetic diseases.

Early methods to target genes to certain sites within a genome (called gene targeting) relied on homologous recombination (HR).[8] By creating DNA constructs that contain a template that matches the targeted genome sequence it is possible that the HR processes within the cell will insert the construct at the desired location. Using this method on embryonic stem cells led to the development of transgenic mice with targeted genes knocked out. It has also been possible to knock in genes or alter gene expression patterns.[9] In recognition of their discovery of how homologous recombination can be used to introduce genetic modifications in mice through embryonic stem cells, Mario Capecchi, Martin Evans and Oliver Smithies were awarded the 2007 Nobel Prize for Physiology or Medicine.[10]

If a vital gene is knocked out, it can prove lethal to the organism. In order to study the function of such genes, site-specific recombinases (SSR) were used. The two most common types are the Cre-LoxP and Flp-FRT systems. Cre recombinase is an enzyme that removes DNA through site-specific recombination between binding sequences known as Lox-P sites. The Flp-FRT system operates in a similar way, with the Flp recombinase recognising FRT sequences. By crossing an organism containing the recombinase sites flanking the gene of interest with an organism that expresses the SSR under control of tissue-specific promoters, it is possible to knock out or switch on genes only in certain cells. These techniques were also used to remove marker genes from transgenic animals. Further modifications of these systems allowed researchers to induce recombination only under certain conditions, allowing genes to be knocked out or expressed at desired times or stages of development.[9]

Genome editing relies on the concept of DNA double-strand break (DSB) repair mechanics. There are two major pathways that repair DSBs: non-homologous end joining (NHEJ) and homology directed repair (HDR). NHEJ uses a variety of enzymes to directly join the DNA ends, while the more accurate HDR uses a homologous sequence as a template for regeneration of missing DNA sequences at the break point. This can be exploited by creating a vector with the desired genetic elements within a sequence that is homologous to the flanking sequences of a DSB. This will result in the desired change being inserted at the site of the DSB. While HDR-based gene editing is similar to homologous recombination-based gene targeting, the rate of recombination is increased by at least three orders of magnitude.[11]
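The two repair pathways can be caricatured in a few lines of code. This is a toy string model, not a biological simulation: NHEJ is represented as rejoining the cut ends after losing a few random bases (producing a small deletion), while HDR is represented as copying a supplied template across the break, which can introduce a precise edit.

```python
import random

def cut(seq, site):
    """Return the two ends of a blunt double-strand break at index `site`."""
    return seq[:site], seq[site:]

def repair_nhej(left, right, rng):
    """Error-prone end joining: model as losing 0-3 bases from each end."""
    return left[:len(left) - rng.randint(0, 3)] + right[rng.randint(0, 3):]

def repair_hdr(left, right, template):
    """Homology-directed repair: in this toy model the homologous template
    simply restores (or precisely rewrites) the broken locus."""
    return template

rng = random.Random(0)
genome = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA"
left, right = cut(genome, 18)

edited_nhej = repair_nhej(left, right, rng)
# HDR with a template carrying a deliberate 3-base insertion at the cut site:
edited_hdr = repair_hdr(left, right, genome[:18] + "TTT" + genome[18:])
print(edited_nhej)
print(edited_hdr)
```

In this sketch NHEJ can only disrupt (its output is never longer than the input), while HDR yields exactly the designed sequence, mirroring why HDR is used for precise knock-ins and NHEJ for knockouts.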

The key to genome editing is creating a DSB at a specific point within the genome. Commonly used restriction enzymes are effective at cutting DNA, but generally recognize and cut at multiple sites. To overcome this challenge and create site-specific DSBs, four distinct classes of nucleases have been discovered and bioengineered to date. These are the zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), meganucleases, and the clustered regularly interspaced short palindromic repeats (CRISPR/Cas9) system.

Meganucleases, discovered in the late 1980s, are enzymes in the endonuclease family which are characterized by their capacity to recognize and cut large DNA sequences (from 14 to 40 base pairs).[12] The most widespread and best known meganucleases are the proteins in the LAGLIDADG family, which owe their name to a conserved amino acid sequence.

Meganucleases, found commonly in microbial species, have the unique property of having very long recognition sequences (>14 bp), thus making them naturally very specific.[13][14] However, there is virtually no chance of finding the exact meganuclease required to act on a chosen specific DNA sequence. To overcome this challenge, mutagenesis and high-throughput screening methods have been used to create meganuclease variants that recognize unique sequences.[14][15] Others have been able to fuse various meganucleases and create hybrid enzymes that recognize a new sequence.[16][17] Yet others have attempted to alter the DNA-interacting amino acids of the meganuclease to design sequence-specific meganucleases, in a method named rationally designed meganuclease.[18] Another approach involves using computer models to try to predict as accurately as possible the activity of the modified meganucleases and the specificity of the recognized nucleic sequence.[19]

A large bank containing several tens of thousands of protein units has been created. These units can be combined to obtain chimeric meganucleases that recognize the target site, thereby providing research and development tools that meet a wide range of needs (fundamental research, health, agriculture, industry, energy, etc.). These include the industrial-scale production of two meganucleases able to cleave the human XPC gene; mutations in this gene result in xeroderma pigmentosum, a severe monogenic disorder that predisposes patients to skin cancer and burns whenever their skin is exposed to UV rays.[20]

Meganucleases have the benefit of causing less toxicity in cells than methods such as Zinc finger nuclease (ZFN), likely because of more stringent DNA sequence recognition;[14] however, the construction of sequence-specific enzymes for all possible sequences is costly and time consuming, as one is not benefiting from combinatorial possibilities that methods such as ZFNs and TALEN-based fusions utilize.

As opposed to meganucleases, the concept behind ZFN and TALEN technology is based on a non-specific DNA-cutting catalytic domain, which can then be linked to specific DNA sequence-recognizing peptides such as zinc fingers and transcription activator-like effectors (TALEs).[21] The first step to this was to find an endonuclease whose DNA recognition site and cleaving site were separate from each other, a situation that is not the most common among restriction enzymes.[21] Once this enzyme was found, its cleaving portion could be separated, which would make it very non-specific as it would have no recognition ability. This portion could then be linked to sequence-recognizing peptides that could lead to very high specificity.

Zinc finger motifs occur in several transcription factors. The zinc ion, found in 8% of all human proteins, plays an important role in the organization of their three-dimensional structure. In transcription factors, it is most often located at the protein-DNA interaction sites, where it stabilizes the motif. The C-terminal part of each finger is responsible for the specific recognition of the DNA sequence.

The recognized sequences are short, made up of around 3 base pairs, but by combining 6 to 8 zinc fingers whose recognition sites have been characterized, it is possible to obtain specific proteins for sequences of around 20 base pairs. It is therefore possible to control the expression of a specific gene. It has been demonstrated that this strategy can be used to promote a process of angiogenesis in animals.[22] It is also possible to fuse a protein constructed in this way with the catalytic domain of an endonuclease in order to induce a targeted DNA break, and therefore to use these proteins as genome engineering tools.[23]
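The arithmetic behind these numbers is simple: each finger reads roughly 3 bp, and the expected number of times an n-bp site occurs by chance in a random genome of length G is about 2G/4^n (counting both strands). A quick back-of-envelope sketch, treating the genome as a random sequence (which real genomes are not, so this is only an order-of-magnitude estimate):

```python
# Expected chance occurrences of an n-bp recognition site in a random
# double-stranded genome of length G: roughly 2 * G / 4**n.
G = 3e9  # approximate human genome size in base pairs

for fingers in (3, 6, 8):
    n = 3 * fingers          # each zinc finger reads ~3 bp
    expected = 2 * G / 4**n  # expected hits on either strand
    print(f"{fingers} fingers -> {n} bp site -> ~{expected:.3g} chance hits")
```

Three fingers (a 9-bp site) would hit thousands of spurious locations, whereas six to eight fingers (18-24 bp) push the expected number of chance occurrences below one, which is why arrays of that length give effectively unique targeting.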

The method generally adopted for this involves associating two DNA-binding proteins, each containing 3 to 6 specifically chosen zinc fingers, with the catalytic domain of the FokI endonuclease, which must dimerize to cleave double-strand DNA. The two proteins recognize two DNA sequences that are a few nucleotides apart. Linking the two zinc finger proteins to their respective sequences brings the two FokI domains closer together. FokI requires dimerization to have nuclease activity, and this means the specificity increases dramatically, as each nuclease partner recognizes a unique DNA sequence. To enhance this effect, FokI nucleases have been engineered that can only function as heterodimers.[24]

Several approaches are used to design specific zinc finger nucleases for the chosen sequences. The most widespread involves combining zinc-finger units with known specificities (modular assembly). Various selection techniques, using bacteria, yeast, or mammalian cells, have been developed to identify the combinations that offer the best specificity and the best cell tolerance. Although the direct genome-wide characterization of zinc finger nuclease activity has not been reported, an assay that measures the total number of double-strand DNA breaks in cells found that only one to two such breaks occur above background in cells treated with zinc finger nucleases with a 24 bp composite recognition site and obligate heterodimer FokI nuclease domains.[24]

The heterodimer-functioning nucleases avoid the possibility of unwanted homodimer activity and thus increase the specificity of the DSB. Although the nuclease portions of both ZFN and TALEN constructs have similar properties, the difference between these engineered nucleases lies in their DNA recognition peptide. ZFNs rely on Cys2-His2 zinc fingers and TALEN constructs on TALEs. Both of these DNA-recognizing peptide domains are naturally found in combinations in their proteins. Cys2-His2 zinc fingers typically occur in repeats that are 3 bp apart and are found in diverse combinations in a variety of nucleic acid-interacting proteins such as transcription factors. Although each finger of the zinc finger domain is an independent module, the binding capacity of one finger is affected by its neighbor. TALEs, on the other hand, are found in repeats with a one-to-one recognition ratio between the amino acids and the recognized nucleotide pairs. Because both zinc fingers and TALEs occur in repeated patterns, different combinations can be tried to create a wide variety of sequence specificities.[13] Zinc fingers are the more established of the two, and approaches such as modular assembly (where zinc fingers correlated with a triplet sequence are attached in a row to cover the required sequence), OPEN (low-stringency selection of peptide domains vs. triplet nucleotides followed by high-stringency selection of peptide combinations vs. the final target in bacterial systems), and bacterial one-hybrid screening of zinc finger libraries, among other methods, have been used to make site-specific nucleases.

Zinc finger nucleases are research and development tools that have already been used to modify a range of genomes, in particular by the laboratories in the Zinc Finger Consortium. The US company Sangamo BioSciences uses zinc finger nucleases to carry out research into the genetic engineering of stem cells and the modification of immune cells for therapeutic purposes.[25][26] Modified T lymphocytes are currently undergoing phase I clinical trials to treat a type of brain tumor (glioblastoma) and in the fight against AIDS.[24]

Transcription activator-like effector nucleases (TALENs) are specific DNA-binding proteins that feature an array of 33- or 34-amino-acid repeats. TALENs are artificial restriction enzymes designed by fusing the DNA-cutting domain of a nuclease to TALE domains, which can be tailored to specifically recognize a unique DNA sequence. These fusion proteins serve as readily targetable "DNA scissors" for gene editing applications, enabling targeted genome modifications such as sequence insertion, deletion, repair, and replacement in living cells.[27] The DNA-binding domains, which can be designed to bind any desired DNA sequence, come from TAL effectors, DNA-binding proteins secreted by the plant-pathogenic Xanthomonas spp. TAL effectors consist of repeated domains, each of which contains a highly conserved sequence of 34 amino acids and recognizes a single DNA nucleotide within the target site. The nuclease can create double-strand breaks at the target site that can be repaired by error-prone non-homologous end-joining (NHEJ), resulting in gene disruptions through the introduction of small insertions or deletions. Each repeat is conserved, with the exception of the so-called repeat variable di-residues (RVDs) at amino acid positions 12 and 13. The RVDs determine the DNA sequence to which the TALE will bind. This simple one-to-one correspondence between the TALE repeats and the corresponding DNA sequence makes the process of assembling repeat arrays to recognize novel DNA sequences straightforward. These TALEs can be fused to the catalytic domain of a DNA nuclease, FokI, to generate a transcription activator-like effector nuclease (TALEN). The resultant TALEN constructs combine specificity and activity, effectively generating engineered sequence-specific nucleases that bind and cleave DNA sequences only at pre-selected sites. The TALEN target recognition system is based on an easy-to-predict code.
TAL nucleases owe their specificity in part to the length of their binding sites (30 or more base pairs), and a TALEN target site can be designed within 6 base pairs of any single nucleotide in the entire genome.[28]
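
The one-to-one RVD code described above is simple enough to sketch directly. The four canonical RVD-nucleotide pairings (NI for A, HD for C, NN for G, NG for T) are well established; the helper function below is purely illustrative:

```python
# Sketch of the TALE repeat-variable di-residue (RVD) code.
# The four canonical RVD-base pairings are well established;
# the function itself is only an illustration.
RVD_CODE = {"A": "NI", "C": "HD", "G": "NN", "T": "NG"}

def design_tale_repeats(target_site: str) -> list[str]:
    """Return the RVD needed in each repeat to bind target_site (5'->3')."""
    return [RVD_CODE[base] for base in target_site.upper()]

print(design_tale_repeats("GATC"))  # ['NN', 'NI', 'NG', 'HD']
```

Because each repeat reads one base independently, assembling an array for a new target is just this lookup repeated along the site, which is why TALEN design is considered straightforward.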

TALEN constructs are used in a similar way to designed zinc finger nucleases, and have three advantages in targeted mutagenesis: (1) DNA binding specificity is higher, (2) off-target effects are lower, and (3) construction of DNA-binding domains is easier.

CRISPRs (Clustered Regularly Interspaced Short Palindromic Repeats) are genetic elements that bacteria use as a kind of acquired immunity to protect against viruses. They consist of short sequences that originate from viral genomes and have been incorporated into the bacterial genome. Cas (CRISPR associated proteins) process these sequences and cut matching viral DNA sequences. By introducing plasmids containing Cas genes and specifically constructed CRISPRs into eukaryotic cells, the eukaryotic genome can be cut at any desired position.[29] Several companies, including Cellectis[30] and Editas, have been working to monetize the CRISPR method while developing gene-specific therapies.[31][32]
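
The "desired position" at which Cas9 can cut is constrained in practice: the commonly used SpCas9 enzyme requires an NGG protospacer-adjacent motif (PAM) immediately downstream of the roughly 20-nt target. A minimal sketch of enumerating candidate sites (forward strand only, for simplicity) follows; the sequence scanned is arbitrary:

```python
import re

def find_spcas9_targets(seq: str, spacer_len: int = 20):
    """Enumerate candidate SpCas9 sites on the forward strand:
    a spacer_len-nt protospacer immediately 5' of an NGG PAM."""
    seq = seq.upper()
    hits = []
    # Zero-width lookahead so overlapping PAMs are all found.
    for m in re.finditer(r"(?=([ACGT]GG))", seq):
        pam_start = m.start(1)
        if pam_start >= spacer_len:
            hits.append((seq[pam_start - spacer_len:pam_start], m.group(1)))
    return hits

# One qualifying site: 20 nt of sequence followed by a "TGG" PAM.
demo = "GTCACCTCCAATGACTAGGG" + "TGG" + "AAAA"
print(find_spcas9_targets(demo))
```

A real design pipeline would also scan the reverse complement and filter by off-target risk, but the PAM constraint above is the core reason not every genomic position is targetable.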

The meganuclease method of gene editing is the least efficient of the methods mentioned above. Because of the nature of its DNA-binding element and its cleaving element, it is limited to recognizing one potential target every 1,000 nucleotides.[4] ZFNs were developed to overcome the limitations of meganucleases, increasing the number of possible targets to one in every 140 nucleotides.[4] However, both methods are unpredictable because their DNA-binding elements can affect each other; as a result, high degrees of expertise and lengthy, costly validation processes are required.

TALE nucleases, the most precise and specific of these methods, yield a higher efficiency than the previous two. They achieve this because the DNA-binding element consists of an array of TALE subunits, each capable of recognizing a specific DNA nucleotide independently of the others, which yields a larger number of target sites with high precision. New TALE nucleases take about one week and a few hundred dollars to create, given specific expertise in molecular biology and protein engineering.[4]

CRISPR nucleases have a slightly lower precision than TALE nucleases, because the target site must carry a specific nucleotide motif at one end for the guide RNA to direct cleavage at the induced double-strand break. CRISPR has nevertheless proved the quickest and cheapest method, costing less than two hundred dollars and a few days of time.[4] It also requires the least expertise in molecular biology, since the targeting design lies in the guide RNA rather than in a protein. One major advantage of CRISPR over the ZFN and TALEN methods is that it can be directed to different DNA sequences simply by swapping its ~80-nt sgRNA, whereas the ZFN and TALEN methods require construction and testing of a new protein for each targeted DNA sequence.[33]

Because off-target activity of an active nuclease would have potentially dangerous consequences at the genetic and organismal levels, the precision of meganucleases, ZFNs, CRISPR, and TALEN-based fusions has been an active area of research. While variable figures have been reported, ZFNs tend to have more cytotoxicity than TALEN methods or RNA-guided nucleases, while TALEN and RNA-guided approaches tend to have the greatest efficiency and fewer off-target effects.[34] Based on the maximum theoretical distance between DNA binding and nuclease activity, TALEN approaches result in the greatest precision.[4]
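
Off-target analysis of this kind is often framed as a mismatch search: genomic sites that differ from the intended target by only a few substitutions are candidate off-target sites. The toy scan below illustrates the idea with a plain Hamming-distance count; real tools additionally weight mismatch position and PAM context:

```python
def off_target_scan(guide: str, genome: str, max_mismatches: int = 3):
    """Toy off-target scan: report every genome window within
    max_mismatches substitutions (Hamming distance) of the guide."""
    guide, genome = guide.upper(), genome.upper()
    k = len(guide)
    hits = []
    for i in range(len(genome) - k + 1):
        window = genome[i:i + k]
        mm = sum(a != b for a, b in zip(guide, window))
        if mm <= max_mismatches:
            hits.append((i, window, mm))
    return hits

# The second hit differs from the guide at one position.
print(off_target_scan("ACGT", "ACGTAAGT", max_mismatches=1))
```

Exhaustive scans like this are what make genome-wide off-target prediction tractable to automate, even though empirical validation is still needed.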

Before these tools, the methods available to scientists wanting to study genomic diversity and all possible associated phenotypes were very slow, expensive, and inefficient: researchers had to make single-gene manipulations, tweaking the genome one small section at a time, observe the phenotype, and start the process over with a different single-gene manipulation.[35] Researchers at the Wyss Institute at Harvard University therefore designed MAGE (multiplex automated genome engineering), a powerful technology that speeds up the process of in vivo genome editing. It allows quick and efficient manipulation of a genome, all within a machine small enough to sit on a kitchen table. The introduced mutations combine with the variation that naturally arises during cell division, creating billions of cellular variants.

Chemically synthesized single-stranded DNA (ssDNA) oligonucleotides are introduced at targeted sites in the cell, creating genetic modifications. The cyclical process involves transformation of the ssDNA (by electroporation) followed by outgrowth, during which bacteriophage homologous-recombination proteins mediate annealing of the ssDNAs to their genomic targets. Experiments targeting selective phenotypic markers are screened and identified by plating the cells on differential media. Each cycle takes about 2.5 hours, with additional time required to grow isogenic cultures and characterize the mutations. By iteratively introducing libraries of mutagenic ssDNAs targeting multiple sites, MAGE can generate combinatorial genetic diversity in a cell population, making up to 50 genome edits, from single nucleotide base pairs to whole genes or gene networks, simultaneously, with results in a matter of days.[35]
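
The cumulative effect of iterating such cycles can be sketched with a simplified model: if each cycle replaces a given target in a fraction p of cells, and cycles act independently, the fraction of cells carrying the edit after n cycles is 1 − (1 − p)^n. The per-cycle efficiency below is an illustrative assumption, not a measured rate:

```python
def fraction_edited(p_per_cycle: float, n_cycles: int) -> float:
    """Expected fraction of cells carrying a given edit after n MAGE
    cycles, assuming an independent per-cycle replacement efficiency."""
    return 1 - (1 - p_per_cycle) ** n_cycles

# Illustrative numbers only: 10% per-cycle efficiency over 24 cycles
# (at ~2.5 hours per cycle, roughly 2.5 days of machine time).
print(round(fraction_edited(0.10, 24), 3))  # 0.92
```

This is why even a modest per-cycle efficiency, iterated automatically, can saturate a population with edits in days rather than months.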

MAGE experiments can be divided into three classes, characterized by varying degrees of scale and complexity: (i) many target sites, single genetic mutations; (ii) single target site, many genetic mutations; and (iii) many target sites, many genetic mutations.[35] An example of the third class was demonstrated in 2009, when Church and colleagues programmed Escherichia coli to produce five times the normal amount of lycopene, an antioxidant normally found in tomatoes and linked to anti-cancer properties. They applied MAGE to optimize the 1-deoxy-D-xylulose-5-phosphate (DXP) metabolic pathway in Escherichia coli to overproduce the isoprenoid lycopene, which took about 3 days and just over $1,000 in materials. The ease, speed, and cost efficiency with which MAGE can alter genomes could transform how the bioengineering, bioenergy, biomedical-engineering, synthetic-biology, pharmaceutical, agricultural, and chemical industries approach the manufacture and production of important compounds.

As of 2012, efficient genome editing had been developed for a wide range of experimental systems ranging from plants to animals, often beyond clinical interest, and was becoming a standard experimental strategy in research labs.[36] The recent generation of rat, zebrafish, maize and tobacco ZFN-mediated mutants and the improvements in TALEN-based approaches testify to the significance of the methods, and the list is expanding rapidly. Genome editing with engineered nucleases will likely contribute to many fields of the life sciences, from studying gene functions in plants and animals to gene therapy in humans. For instance, the field of synthetic biology, which aims to engineer cells and organisms to perform novel functions, is likely to benefit from the ability of engineered nucleases to add or remove genomic elements and thereby create complex systems.[36] In addition, gene functions can be studied using stem cells with engineered nucleases.

Some specific applications of these methods are described below.

The combination of recent discoveries in genetic engineering, particularly gene editing, with the latest improvements in bovine reproduction technologies (e.g. in vitro embryo culture) allows genome editing directly in fertilised oocytes using synthetic, highly specific endonucleases. RNA-guided endonucleases, such as the clustered regularly interspaced short palindromic repeats-associated Cas9 (CRISPR/Cas9), are a new tool that further increases the range of methods available. In particular, CRISPR/Cas9-engineered endonucleases allow the use of multiple guide RNAs for simultaneous knockouts (KO) in one step by cytoplasmic direct injection (CDI) into mammalian zygotes.[37]

Thanks to the parallel development of single-cell transcriptomics, genome editing and new stem-cell models, we are now entering a scientifically exciting period in which functional genetics is no longer restricted to animal models but can be performed directly in human samples. Single-cell gene-expression analysis has resolved a transcriptional road-map of human development from which key candidate genes are being identified for functional studies. Using global transcriptomics data to guide experimentation, CRISPR-based genome editing has made it feasible to disrupt or remove key genes in order to elucidate their function in a human setting.[38]

Genome editing using meganucleases,[39] ZFNs, and TALEN provides a new strategy for genetic manipulation in plants and is likely to assist in the engineering of desired plant traits by modifying endogenous genes. For instance, site-specific gene addition in major crop species can be used for 'trait stacking', whereby several desired traits are physically linked to ensure their co-segregation during the breeding processes.[24] Progress in such cases has recently been reported in Arabidopsis thaliana[40][41][42] and Zea mays. In Arabidopsis thaliana, ZFN-assisted gene targeting was used to introduce two herbicide-resistance genes (tobacco acetolactate synthase SuRA and SuRB) into SuR loci, with as many as 2% of transformed cells carrying the mutations.[43] In Zea mays, disruption of the target locus was achieved through ZFN-induced DSBs and the resulting NHEJ; in this case, ZFN was also used to drive a herbicide-tolerance gene expression cassette (PAT) into the targeted endogenous locus IPK1.[44] Such genome modification observed in the regenerated plants has been shown to be heritable and was transmitted to the next generation.[44] A potentially successful example of the application of genome editing techniques in crop improvement can be found in banana, where scientists used CRISPR/Cas9 editing to inactivate the endogenous banana streak virus in the B genome of banana (Musa spp.) to overcome a major challenge in banana breeding.[45]

In addition, TALEN-based genome engineering has been extensively tested and optimized for use in plants.[46] TALEN fusions have also been used by a U.S. food-ingredient company, Calyxt,[47] to improve the quality of soybean-oil products[48] and to increase the storage potential of potatoes.[49]

Several optimizations are needed to improve the editing of plant genomes using ZFN-mediated targeting.[50] These include reliable design and subsequent testing of the nucleases, absence of nuclease toxicity, appropriate choice of the plant tissue for targeting, suitable routes for inducing enzyme activity, absence of off-target mutagenesis, and reliable detection of mutated cases.[50]

A common delivery method for CRISPR/Cas9 in plants is Agrobacterium-based transformation.[51] T-DNA is introduced directly into the plant genome by a T4SS mechanism; Cas9 and gRNA expression cassettes are cloned into Ti plasmids, which are then transformed into Agrobacterium for plant application.[51] To improve Cas9 delivery in living plants, viruses are being used as more effective transgene-delivery vehicles.[51]

The ideal gene-therapy practice would replace the defective gene with a normal allele at its natural location. This is advantageous over a virally delivered gene, since there is no need to include the full coding and regulatory sequences when only a small proportion of the gene needs to be altered, as is often the case.[52] The expression of partially replaced genes is also more consistent with normal cell biology than that of full genes carried by viral vectors.

The first clinical use of TALEN-based genome editing was in the treatment of CD19+ acute lymphoblastic leukemia in an 11-month old child in 2015. Modified donor T cells were engineered to attack the leukemia cells, to be resistant to Alemtuzumab, and to evade detection by the host immune system after introduction.[53][54]

Extensive research has been done in cells and animals using CRISPR-Cas9 to attempt to correct genetic mutations which cause genetic diseases such as Down syndrome, spina bifida, anencephaly, and Turner and Klinefelter syndromes.[55]

In February 2019, medical scientists working with Sangamo Therapeutics, headquartered in Richmond, California, announced the first ever "in body" human gene editing therapy to permanently alter DNA - in a patient with Hunter Syndrome.[56] Clinical trials by Sangamo involving gene editing using Zinc Finger Nuclease (ZFN) are ongoing.[57]

Researchers have used CRISPR-Cas9 gene drives to modify genes associated with sterility in A. gambiae, the vector for malaria.[58] This technique has further implications in eradicating other vector borne diseases such as yellow fever, dengue, and Zika.[59]

The CRISPR-Cas9 system can be programmed to modulate the population of any bacterial species by targeting clinical genotypes or epidemiological isolates. It can selectively favor beneficial bacterial species over harmful ones by eliminating pathogens, which gives it an advantage over broad-spectrum antibiotics.[35]

Antiviral applications for therapies targeting human viruses such as HIV, herpes, and hepatitis B virus are under research. CRISPR can be used to target the virus or the host to disrupt genes encoding the virus cell-surface receptor proteins.[33] In November 2018, He Jiankui announced that he had edited two human embryos, to attempt to disable the gene for CCR5, which codes for a receptor that HIV uses to enter cells. He said that twin girls, Lulu and Nana, had been born a few weeks earlier. He said that the girls still carried functional copies of CCR5 along with disabled CCR5 (mosaicism) and were still vulnerable to HIV. The work was widely condemned as unethical, dangerous, and premature.[60]

In January 2019, scientists in China reported the creation of five identical cloned gene-edited monkeys, using the same cloning technique that produced Zhong Zhong and Hua Hua (the first cloned monkeys) and Dolly the sheep, and the same CRISPR-Cas9 gene-editing technique allegedly used by He Jiankui in creating the first gene-modified human babies, Lulu and Nana. The monkey clones were made in order to study several medical diseases.[61][62]

In the future, an important goal of research into genome editing with engineered nucleases must be the improvement of the safety and specificity of the nucleases. For example, improving the ability to detect off-target events can improve our ability to learn about ways of preventing them. In addition, zinc-fingers used in ZFNs are seldom completely specific, and some may cause a toxic reaction. However, the toxicity has been reported to be reduced by modifications done on the cleavage domain of the ZFN.[52]

In addition, research by Dana Carroll into modifying the genome with engineered nucleases has shown the need for better understanding of the basic recombination and repair machinery of DNA. In the future, a possible method to identify secondary targets would be to capture broken ends from cells expressing the ZFNs and to sequence the flanking DNA using high-throughput sequencing.[52]

Because of the ease of use and cost-efficiency of CRISPR, extensive research is currently being done on it; there are now more publications on CRISPR than on ZFN and TALEN, despite how recent the discovery of CRISPR is.[33] Both CRISPR and TALEN are favored for implementation in large-scale production because of their precision and efficiency.

Genome editing also occurs as a natural process without artificial genetic engineering; the agents competent to edit genetic codes are viruses or subviral RNA agents.

Although GEEN is more efficient than many other methods in reverse genetics, it is still not highly efficient; in many cases less than half of the treated population obtains the desired changes.[43] For example, when the cell's NHEJ is used to create a mutation, its HDR systems are also at work, correcting the DSB with lower mutational rates.
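
The competition between repair pathways described above can be illustrated with a toy Monte Carlo model. The pathway probabilities here are illustrative assumptions only; the actual NHEJ/HDR balance varies by cell type and locus:

```python
import random

def simulate_dsb_repair(n_cells: int, p_nhej: float = 0.6,
                        p_indel_given_nhej: float = 0.7,
                        seed: int = 0) -> float:
    """Toy model of why edited fractions stay below 100%: each
    double-strand break is repaired either by error-prone NHEJ
    (probability p_nhej) or by accurate pathways such as HDR; only
    NHEJ events that leave an indel count as edits. All probabilities
    are illustrative assumptions, not measured rates."""
    rng = random.Random(seed)
    edited = sum(
        1 for _ in range(n_cells)
        if rng.random() < p_nhej and rng.random() < p_indel_given_nhej
    )
    return edited / n_cells

# With these assumed rates, under half the population ends up edited.
print(simulate_dsb_repair(10_000))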

Traditionally, mice have been researchers' most common choice of host for disease models. CRISPR can help bridge the gap between such models and human clinical trials by creating transgenic disease models in larger animals such as pigs, dogs, and non-human primates.[63][64] Using the CRISPR-Cas9 system, the programmed Cas9 protein and the sgRNA can be directly introduced into fertilized zygotes to achieve the desired gene modifications when creating transgenic models in rodents. This allows bypassing of the usual cell-targeting stage in generating transgenic lines, and as a result reduces generation time by 90%.[64]

One application that CRISPR's effectiveness brings closer is xenotransplantation. In previous research trials, CRISPR demonstrated the ability to target and eliminate endogenous retroviruses, which reduces the risk of transmitting diseases and lowers immune barriers.[33] Eliminating these problems improves donor-organ function, bringing this application closer to reality.

In plants, genome editing is seen as a viable solution to the conservation of biodiversity. Gene drives are a potential tool to alter the reproductive rate of invasive species, although there are significant associated risks.[65]

Many transhumanists see genome editing as a potential tool for human enhancement.[66][67][68] Australian biologist and Professor of Genetics David Andrew Sinclair notes that "the new technologies with genome editing will allow it to be used on individuals (...) to have (...) healthier children", i.e. designer babies.[69] According to a September 2016 report by the Nuffield Council on Bioethics, in the future it may be possible to enhance people with genes from other organisms or with wholly synthetic genes, for example to improve night vision or sense of smell.[70][71]

The American National Academy of Sciences and National Academy of Medicine issued a report in February 2017 giving qualified support to human genome editing.[72] They recommended that clinical trials for genome editing might one day be permitted once answers have been found to safety and efficiency problems "but only for serious conditions under stringent oversight."[73]

In the 2016 Worldwide Threat Assessment of the US Intelligence Community, United States Director of National Intelligence James R. Clapper named genome editing as a potential weapon of mass destruction, stating that genome editing conducted by countries with regulatory or ethical standards "different from Western countries" probably increases the risk of the creation of harmful biological agents or products. According to the statement, given the broad distribution, low cost, and accelerated pace of development of this technology, its deliberate or unintentional misuse might have far-reaching economic and national-security implications.[74][75][76] For instance, technologies such as CRISPR could be used to make "killer mosquitoes" that cause plagues that wipe out staple crops.[76]

According to a September 2016 report by the Nuffield Council on Bioethics, the simplicity and low cost of tools to edit the genetic code will allow amateurs, or "biohackers", to perform their own experiments, posing a potential risk from the release of genetically modified bugs. The review also found that the risks and benefits of modifying a person's genome, and having those changes pass on to future generations, are so complex that they demand urgent ethical scrutiny. Such modifications might have unintended consequences which could harm not only the child but also their future children, as the altered gene would be in their sperm or eggs.[70][71] In 2001, Australian researchers Ronald Jackson and Ian Ramshaw were criticized for publishing a paper in the Journal of Virology that explored the potential control of mice, a major pest in Australia, by infecting them with an altered mousepox virus that would cause infertility; critics argued that the sensitive information provided could enable potential bioterrorists to create vaccine-resistant strains of other pox viruses, such as smallpox, that could affect humans.[71][77] Furthermore, there are additional concerns about the ecological risks of releasing gene drives into wild populations.[71][78][79]

Excerpt from:
Genome editing - Wikipedia

Ethical Implications of Human Genetic Engineering | SAGE

DNA editing techniques have been available for decades and are crucial tools for understanding gene functions and molecular pathways. Recently, genome editing has stepped back into the limelight because of newer technologies that can quickly and efficiently modify genomes by introducing or genetically correcting mutations in human cells and animal models. These tools include Zinc Finger Nucleases (ZFNs), Transcription activator-like effector nucleases (TALENs), and the most recent player to join the ranks, Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR). In a short time span, CRISPR/Cas9 has completely revolutionized the understanding of protein function, disease modeling, and potential therapeutic applications.

BACKGROUND on CRISPR/Cas9

The CRISPR/Cas9 system functions similarly to ZFNs and TALENs: it also takes advantage of a cell's DNA repair machinery to delete (knock out) or add in (knock in) sequences of DNA. However, CRISPR/Cas9 offers several advantages: it is easier to target a specific gene of interest, since designing the required CRISPR component is simple and efficient whereas generating ZFNs and TALENs is more time-consuming; it is often more proficient in generating the desired recombination results; and it is exponentially more cost-effective, so almost any laboratory in the world can use it. CRISPR/Cas9 has been shown to work in several model organisms, and consequently researchers are keen to apply this technology to correcting genetic mutations in humans with incurable diseases as well as in human embryos, which raises many scientific and ethical considerations.

Human embryonic gene editing

Genome editing technologies have come a long way and have already advanced towards mammalian models and clinical trials in humans. In April 2015, genetic modification of human embryos using CRISPR/Cas9 technology was achieved by the Huang laboratory in China. They genetically modified non-viable embryos obtained from an in vitro fertilization clinic; these embryos had each been fertilized by two sperm, impairing their development. In this study, the Huang group repaired a mutation in the human β-globin gene (HBB) that causes the blood disorder β-thalassaemia. The CRISPR/Cas9 system and a donor DNA sequence containing the normal, healthy version of HBB were injected into 86 embryos. A total of four embryos successfully integrated the corrected version of HBB at the appropriate site. However, the authors reported a high number of off-target effects, meaning that CRISPR/Cas9 modified other locations in the genome, a non-ideal outcome that could disrupt other essential gene functions. The study demonstrated two important findings: genetic engineering is possible in human embryos, and the CRISPR/Cas9 system requires essential improvements before it can be used in future studies on human embryos. More importantly, these results force scientists to question the future and the implications of such a powerful technology. Should we accept genetic engineering of human embryos? If yes, when and in what capacity should we accept it?

Current guidelines and regulation

Scientists in the United States are addressing the need for regulation of human embryonic gene editing. On April 29th, the US National Institutes of Health (NIH) director, Dr. Francis Collins, released a statement emphasizing the agency's policy against funding research involving genome editing of human embryos and the ethical concerns surrounding this technology. However, the policy does not necessarily cover privately funded projects.

Safety regarding genetic engineering is a major concern, and Huang's publication highlights this point. The publication also forces the community to address whether scientists should use non-viable or discarded embryos to improve the efficiency and efficacy of the CRISPR/Cas9 system. The CRISPR/Cas9 system was developed for human genome targeting in 2012 and has seen rapid improvements since. If it is decided that unviable embryos can be used for this type of research, the next step for US lawmakers will be to evaluate new guidelines for the funding and safety of genetic engineering in these embryos.

Ethical concerns

While the interest in and use of CRISPR/Cas9 has exploded since its discovery in 2012, prominent scientists in the field have already initiated conversations regarding the ethical implications that arise when modifying the human genome. Preventing genetic diseases by human genetic engineering is inevitable. The slippery slope is when, or if, we start to use it for cosmetic changes such as eye color or for improving a desired athletic trait. A perfect example is surgery, which we have performed for hundreds of years to treat disease and which is now widely used as a cosmetic tool. Opening the doors to genetic engineering of human embryos could, with time, lead to manipulating genetics for desirable traits, raising the fear of creating a eugenics-driven human population.

Who are we to manipulate nature? For all those who suffer from genetic diseases, however, the answer is not so simple: if we can safely prevent severe genetic diseases and create healthy humans, why not manipulate nature? Have we not already done this in other animal populations? At this time the long-term effects of genome editing remain unknown, raising additional questions. As the field progresses, with appropriate regulations and guidelines it will eventually co-exist alongside other major controversial topics, including nuclear power and genetically modified organisms. Since ethics differ across the world, creating international guidelines will be a challenge, but a necessity. Strict regulations are in place for nuclear power; the same should be possible for genetic engineering of human embryos. To outlaw genetic engineering entirely would be to decline a place at the discussion table, as further use of CRISPR/Cas9 technology is unlikely to be abandoned.

This fall, the National Academy of Sciences and National Academy of Medicine, together with CRISPR/Cas9 discoverers Dr. Jennifer Doudna and Dr. Emmanuelle Charpentier and other leading scientists in the field, are organizing an international summit to consider all aspects, both ethical and scientific, of human genetic engineering and to develop standard guidelines and policies for practicing human genome editing. The NIH already has guidelines in place and may add more as a result of this summit. Other countries are expected to adopt varying guidelines for human genomic engineering. Also, to avoid fear and misunderstanding, scientists will need to communicate about human genome editing responsibly to the general public. This summit is a step in the right direction, encouraging caution and regulation; there is now a need for a timely but thoughtful set of guidelines for the general scientific community as well as for the broader human community.

Read this article:
Ethical Implications of Human Genetic Engineering | SAGE

Human genetic variation – Wikipedia

Human genetic variation is the genetic differences in and among populations. There may be multiple variants of any given gene in the human population (alleles), a situation called polymorphism.

No two humans are genetically identical. Even monozygotic twins (who develop from one zygote) have infrequent genetic differences due to mutations occurring during development and gene copy-number variation.[1] Differences between individuals, even closely related individuals, are the key to techniques such as genetic fingerprinting. As of 2017, there are a total of 324 million known variants from sequenced human genomes.[2] As of 2015, the typical difference between the genomes of two individuals was estimated at 20 million base pairs (or 0.6% of the total of 3.2 billion base pairs).[3]
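
The figures quoted above are mutually consistent, as a quick check shows:

```python
# Quick consistency check of the quoted figures: a 20-million-bp typical
# difference out of a 3.2-billion-bp genome is about 0.6%.
diff_bp = 20_000_000          # typical pairwise difference, base pairs
genome_bp = 3_200_000_000     # approximate genome size, base pairs
print(f"{100 * diff_bp / genome_bp:.3f}%")  # 0.625%
```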

Alleles occur at different frequencies in different human populations. Populations that are more geographically and ancestrally remote tend to differ more. The differences between populations represent a small proportion of overall human genetic variation. Populations also differ in the quantity of variation among their members. The greatest divergence between populations is found in sub-Saharan Africa, consistent with the recent African origin of non-African populations. Populations also vary in the proportion and locus of introgressed genes they received by archaic admixture both inside and outside of Africa.

The study of human genetic variation has evolutionary significance and medical applications. It can help scientists understand ancient human population migrations as well as how human groups are biologically related to one another. For medicine, study of human genetic variation may be important because some disease-causing alleles occur more often in people from specific geographic regions. New findings show that each human has on average 60 new mutations compared to their parents.[4][5]

Causes of differences between individuals include independent assortment, the exchange of genes (crossing over and recombination) during reproduction (through meiosis) and various mutational events.

There are at least three reasons why genetic variation exists between populations. Natural selection may confer an adaptive advantage to individuals in a specific environment if an allele provides a competitive advantage. Alleles under selection are likely to occur only in those geographic regions where they confer an advantage. A second important process is genetic drift, which is the effect of random changes in the gene pool under conditions where most mutations are neutral (that is, they do not appear to have any positive or negative selective effect on the organism). Finally, small migrant populations have statistical differences, called the founder effect, from the overall populations where they originated; when these migrants settle new areas, their descendant population typically differs from their population of origin: different genes predominate and it is less genetically diverse.
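
The strength of genetic drift as a function of population size, central to the founder effect described above, can be illustrated with a minimal Wright-Fisher simulation, a standard population-genetics model; the parameter values below are arbitrary:

```python
import random

def wright_fisher(p0: float, pop_size: int, generations: int,
                  seed: int = 1) -> float:
    """Minimal Wright-Fisher sketch of drift at one neutral locus:
    each generation, 2N gene copies are resampled from the current
    allele frequency. Smaller populations drift (and fix) faster."""
    rng = random.Random(seed)
    p = p0
    for _ in range(generations):
        copies = sum(rng.random() < p for _ in range(2 * pop_size))
        p = copies / (2 * pop_size)
    return p

# A small founder population can end up far from its source frequency,
# often at 0.0 or 1.0 (loss or fixation of the allele).
print(wright_fisher(0.5, pop_size=10, generations=100))
```

Running this with a large `pop_size` instead shows the frequency barely moving, which is the intuition behind why serial founder events and small past population sizes amplify neutral differences between populations.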

In humans, the main cause[citation needed] is genetic drift. Serial founder effects and past small population size (increasing the likelihood of genetic drift) may have had an important influence in neutral differences between populations.[citation needed] The second main cause of genetic variation is due to the high degree of neutrality of most mutations. A small, but significant number of genes appear to have undergone recent natural selection, and these selective pressures are sometimes specific to one region.[6][7]

Genetic variation among humans occurs on many scales, from gross alterations in the human karyotype to single nucleotide changes.[8] Chromosome abnormalities are detected in 1 of 160 live human births. Apart from sex chromosome disorders, most cases of aneuploidy result in death of the developing fetus (miscarriage); the most common extra autosomal chromosomes among live births are 21, 18 and 13.[9]

Nucleotide diversity is the average proportion of nucleotides that differ between two individuals. As of 2004, the human nucleotide diversity was estimated to be 0.1%[10] to 0.4% of base pairs.[11] In 2015, the 1000 Genomes Project, which sequenced one thousand individuals from 26 human populations, found that "a typical [individual] genome differs from the reference human genome at 4.1 million to 5.0 million sites affecting 20 million bases of sequence."[3] Nearly all (>99.9%) of these sites are small differences, either single nucleotide polymorphisms or brief insertion-deletions in the genetic sequence, but structural variations account for a greater number of base-pairs than the SNPs and indels.[3]
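
Nucleotide diversity as defined above can be computed directly from aligned sequences; the short sequences below are invented for illustration:

```python
from itertools import combinations

def nucleotide_diversity(seqs: list[str]) -> float:
    """Average pairwise proportion of differing sites across aligned,
    equal-length sequences: a direct reading of the definition above."""
    length = len(seqs[0])
    pairs = list(combinations(seqs, 2))
    diffs = sum(sum(a != b for a, b in zip(s, t)) for s, t in pairs)
    return diffs / (len(pairs) * length)

# Two of ten aligned sites differ between these two sequences:
print(nucleotide_diversity(["ACGTACGTAA", "ACGTACGTGG"]))  # 0.2
```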

As of 2017, the Single Nucleotide Polymorphism Database (dbSNP), which lists SNPs and other variants, listed 324 million variants found in sequenced human genomes.[2]

A single nucleotide polymorphism (SNP) is a difference in a single nucleotide between members of one species that occurs in at least 1% of the population. The 2,504 individuals characterized by the 1000 Genomes Project had 84.7 million SNPs among them.[3] SNPs are the most common type of sequence variation, estimated in 1998 to account for 90% of all sequence variants.[12] Other sequence variations are single base exchanges, deletions and insertions.[13] SNPs occur on average about every 100 to 300 bases[14] and so are the major source of heterogeneity.
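The 1% cutoff that separates a SNP from a rarer variant can be sketched as a minor-allele-frequency check. This is a simplified model (one allele per sampled chromosome, function names illustrative), not a description of how dbSNP or the 1000 Genomes Project actually classifies variants.

```python
from collections import Counter

def minor_allele_frequency(alleles):
    """Frequency of the less common allele observed at one site."""
    counts = Counter(alleles)
    if len(counts) < 2:
        return 0.0  # monomorphic site: no variation at all
    return min(counts.values()) / len(alleles)

def is_snp(alleles, threshold=0.01):
    """A variable site is conventionally a SNP when its minor allele
    frequency is at least 1% of the population sampled."""
    return minor_allele_frequency(alleles) >= threshold

# 200 sampled chromosomes, 3 of which carry T at this site (MAF = 1.5%)
print(is_snp(["C"] * 197 + ["T"] * 3))  # True
print(is_snp(["C"] * 199 + ["T"]))      # False: 0.5% is below the cutoff
```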

A functional, or non-synonymous, SNP is one that affects some factor such as gene splicing or messenger RNA, and so causes a phenotypic difference between members of the species. About 3% to 5% of human SNPs are functional (see International HapMap Project). Neutral, or synonymous SNPs are still useful as genetic markers in genome-wide association studies, because of their sheer number and the stable inheritance over generations.[12]

A coding SNP is one that occurs inside a gene. There are 105 Human Reference SNPs that result in premature stop codons in 103 genes, corresponding to 0.5% of coding SNPs. They occur due to segmental duplication in the genome. These SNPs result in loss of protein, yet all these SNP alleles are common and have not been purged by negative selection.[15]

Structural variation is the variation in structure of an organism's chromosome. Structural variations, such as copy-number variation and deletions, inversions, insertions and duplications, account for much more human genetic variation than single nucleotide diversity. This was concluded in 2007 from analysis of the diploid full sequences of the genomes of two humans: Craig Venter and James D. Watson. This added to the two haploid sequences which were amalgamations of sequences from many individuals, published by the Human Genome Project and Celera Genomics respectively.[16]

According to the 1000 Genomes Project, a typical human has 2,100 to 2,500 structural variations, which include approximately 1,000 large deletions, 160 copy-number variants, 915 Alu insertions, 128 L1 insertions, 51 SVA insertions, 4 NUMTs, and 10 inversions.[3]

A copy-number variation (CNV) is a difference in the genome due to deleting or duplicating large regions of DNA on some chromosome. It is estimated that 0.4% of the genomes of unrelated humans differ with respect to copy number. When copy number variation is included, human-to-human genetic variation is estimated to be at least 0.5% (99.5% similarity).[17][18][19][20] Copy number variations are inherited but can also arise during development.[21][22][23][24]

A visual map of the regions of high genomic variation of the modern-human reference assembly relative to a Neanderthal of about 50,000 years ago[25] has been built by Pratas et al.[26]

Epigenetic variation is variation in the chemical tags that attach to DNA and affect how genes get read. The tags, "called epigenetic markings, act as switches that control how genes can be read."[27] At some alleles, the epigenetic state of the DNA, and associated phenotype, can be inherited across generations of individuals.[28]

Genetic variability is a measure of the tendency of individual genotypes in a population to vary (become different) from one another. Variability is different from genetic diversity, which is the amount of variation seen in a particular population. The variability of a trait is how much that trait tends to vary in response to environmental and genetic influences.

In biology, a cline is a continuum of species, populations, races, varieties, or forms of organisms that exhibit gradual phenotypic and/or genetic differences over a geographical area, typically as a result of environmental heterogeneity.[29][30][31] In the scientific study of human genetic variation, a gene cline can be rigorously defined and subjected to quantitative metrics.

In the study of molecular evolution, a haplogroup is a group of similar haplotypes that share a common ancestor with a single nucleotide polymorphism (SNP) mutation. Haplogroups pertain to deep ancestral origins dating back thousands of years.[32]

The most commonly studied human haplogroups are Y-chromosome (Y-DNA) haplogroups and mitochondrial DNA (mtDNA) haplogroups, both of which can be used to define genetic populations. Y-DNA is passed solely along the patrilineal line, from father to son, while mtDNA is passed down the matrilineal line, from mother to both daughter and son. The Y-DNA and mtDNA may change by chance mutation at each generation.

A variable number tandem repeat (VNTR) is the variation of length of a tandem repeat. A tandem repeat is the adjacent repetition of a short nucleotide sequence. Tandem repeats exist on many chromosomes, and their length varies between individuals. Each variant acts as an inherited allele, so they are used for personal or parental identification. Their analysis is useful in genetics and biology research, forensics, and DNA fingerprinting.

Short tandem repeats (about 5 base pairs) are called microsatellites, while longer ones are called minisatellites.
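The repeat-count allele described above can be scored with a short scan. This is a sketch only (the function name and sequence are illustrative); real forensic STR typing works from defined loci with capillary electrophoresis or sequencing, not raw string search.

```python
import re

def longest_repeat_count(sequence: str, motif: str) -> int:
    """Longest run of consecutive copies of `motif` in `sequence`.
    The run length is the inherited 'allele' used in DNA fingerprinting."""
    best = 0
    for match in re.finditer(f"(?:{re.escape(motif)})+", sequence):
        best = max(best, len(match.group()) // len(motif))
    return best

# A CA microsatellite repeated 5 times within a short flanking context
print(longest_repeat_count("GGTTCACACACACAGGA", "CA"))  # 5
```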

The recent African origin of modern humans paradigm assumes the dispersal of non-African populations of anatomically modern humans after 70,000 years ago. Dispersal within Africa occurred significantly earlier, at least 130,000 years ago. The "out of Africa" theory originates in the 19th century, as a tentative suggestion in Charles Darwin's Descent of Man,[33] but remained speculative until the 1980s when it was supported by study of present-day mitochondrial DNA, combined with evidence from physical anthropology of archaic specimens.

According to a 2000 study of Y-chromosome sequence variation,[34] human Y-chromosomes trace ancestry to Africa, and the descendants of the derived lineage left Africa and eventually replaced archaic human Y-chromosomes in Eurasia. The study also shows that a minority of contemporary populations in East Africa and the Khoisan are the descendants of the most ancestral patrilineages of anatomically modern humans that left Africa 35,000 to 89,000 years ago.[34] Other evidence supporting the theory is that variations in skull measurements decrease with distance from Africa at the same rate as the decrease in genetic diversity. Human genetic diversity decreases in native populations with migratory distance from Africa, and this is thought to be due to bottlenecks during human migration, which are events that temporarily reduce population size.[35][36]

A 2009 genetic clustering study, which genotyped 1327 polymorphic markers in various African populations, identified six ancestral clusters. The clustering corresponded closely with ethnicity, culture and language.[37] A 2018 whole genome sequencing study of the world's populations observed similar clusters among the populations in Africa. At K=9, distinct ancestral components defined the Afroasiatic-speaking populations inhabiting North Africa and Northeast Africa; the Nilo-Saharan-speaking populations in Northeast Africa and East Africa; the Ari populations in Northeast Africa; the Niger-Congo-speaking populations in West-Central Africa, West Africa, East Africa and Southern Africa; the Pygmy populations in Central Africa; and the Khoisan populations in Southern Africa.[38]

Human genetic variants are mostly rare when individuals are compared, but variants that are common within a population (frequency above 5%) account for most of the variation.[39] The number of shared variants depends on how closely related the populations are: the more closely related two populations are, the higher the percentage of variants they share.

It is commonly assumed that early humans left Africa, and thus must have passed through a population bottleneck before their African-Eurasian divergence around 100,000 years ago (ca. 3,000 generations). The rapid expansion of a previously small population has two important effects on the distribution of genetic variation. First, the so-called founder effect occurs when founder populations bring only a subset of the genetic variation from their ancestral population. Second, as founders become more geographically separated, the probability that two individuals from different founder populations will mate becomes smaller. The effect of this assortative mating is to reduce gene flow between geographical groups and to increase the genetic distance between groups.[citation needed]

The expansion of humans from Africa affected the distribution of genetic variation in two other ways. First, smaller (founder) populations experience greater genetic drift because of increased fluctuations in neutral polymorphisms. Second, new polymorphisms that arose in one group were less likely to be transmitted to other groups as gene flow was restricted.[citation needed]

Populations in Africa tend to have lower amounts of linkage disequilibrium than do populations outside Africa, partly because of the larger size of human populations in Africa over the course of human history and partly because the number of modern humans who left Africa to colonize the rest of the world appears to have been relatively low.[40] In contrast, populations that have undergone dramatic size reductions or rapid expansions in the past and populations formed by the mixture of previously separate ancestral groups can have unusually high levels of linkage disequilibrium.[40]

The distribution of genetic variants within and among human populations is impossible to describe succinctly because of the difficulty of defining a "population," the clinal nature of variation, and heterogeneity across the genome (Long and Kittles 2003). In general, however, an average of 85% of genetic variation exists within local populations, ~7% is between local populations within the same continent, and ~8% of variation occurs between large groups living on different continents (Lewontin 1972; Jorde et al. 2000a). The recent African origin theory for humans would predict that in Africa there exists a great deal more diversity than elsewhere and that diversity should decrease the further from Africa a population is sampled.

Sub-Saharan Africa has the most human genetic diversity and the same has been shown to hold true for phenotypic diversity.[35] Phenotype is connected to genotype through gene expression. Genetic diversity decreases smoothly with migratory distance from that region, which many scientists believe to be the origin of modern humans, and that decrease is mirrored by a decrease in phenotypic variation. Skull measurements are an example of a physical attribute whose within-population variation decreases with distance from Africa.

The distribution of many physical traits resembles the distribution of genetic variation within and between human populations (American Association of Physical Anthropologists 1996; Keita and Kittles 1997). For example, ~90% of the variation in human head shapes occurs within continental groups, and ~10% separates groups, with a greater variability of head shape among individuals with recent African ancestors (Relethford 2002).

A prominent exception to the common distribution of physical characteristics within and among groups is skin color. Approximately 10% of the variance in skin color occurs within groups, and ~90% occurs between groups (Relethford 2002). This distribution of skin color and its geographic patterning, with people whose ancestors lived predominantly near the equator having darker skin than those with ancestors who lived predominantly in higher latitudes, indicate that this attribute has been under strong selective pressure. Darker skin appears to be strongly selected for in equatorial regions to prevent sunburn, skin cancer, the photolysis of folate, and damage to sweat glands.[41]

Understanding how genetic diversity in the human population impacts various levels of gene expression is an active area of research. While earlier studies focused on the relationship between DNA variation and RNA expression, more recent efforts are characterizing the genetic control of various aspects of gene expression including chromatin states,[42] translation,[43] and protein levels.[44] A study published in 2007 found that 25% of genes showed different levels of gene expression between populations of European and Asian descent.[45][46][47][48][49] The primary cause of this difference in gene expression was thought to be SNPs in gene regulatory regions of DNA. Another study published in 2007 found that approximately 83% of genes were expressed at different levels among individuals and about 17% between populations of European and African descent.[50][51]

The population geneticist Sewall Wright developed the fixation index (often abbreviated to FST) as a way of measuring genetic differences between populations. This statistic is often used in taxonomy to compare differences between any two given populations by measuring the genetic differences among and between populations for individual genes, or for many genes simultaneously.[52] It is often stated that the fixation index for humans is about 0.15. This translates to an estimate that 85% of the variation measured in the overall human population is found within individuals of the same population, and about 15% of the variation occurs between populations. These estimates imply that any two individuals from different populations are almost as likely to be more similar to each other than either is to a member of their own group.[53][54][55] Richard Lewontin, who affirmed these ratios, thus concluded that neither "race" nor "subspecies" was an appropriate or useful way to describe human populations.[56]
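For a single biallelic locus, Wright's FST can be written down directly from allele frequencies. The sketch below assumes equally sized subpopulations and uses the textbook form FST = (HT - HS) / HT; the function name is illustrative.

```python
def fst(subpop_freqs):
    """Wright's fixation index F_ST = (H_T - H_S) / H_T for one
    biallelic locus, where H_T is the expected heterozygosity of the
    pooled population and H_S the mean heterozygosity within
    subpopulations (assumed equal in size)."""
    n = len(subpop_freqs)
    p_bar = sum(subpop_freqs) / n                         # pooled allele frequency
    h_t = 2 * p_bar * (1 - p_bar)                         # total heterozygosity
    h_s = sum(2 * p * (1 - p) for p in subpop_freqs) / n  # mean within-group
    return (h_t - h_s) / h_t

# Strongly differentiated populations (allele frequencies 0.2 vs 0.8)
print(round(fst([0.2, 0.8]), 2))  # 0.36
# Identical populations show no differentiation
print(fst([0.5, 0.5]))            # 0.0
```

A human-like value of ~0.15 means within-group heterozygosity is only about 15% lower than that of the pooled population, which is the sense in which most variation lies within populations.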

Wright himself believed that values >0.25 represent very great genetic variation and that an FST of 0.15 to 0.25 represented great variation. However, about 5% of human variation occurs between populations within continents, and FST values between continental groups of humans (or races) as low as 0.1 (or possibly lower) have been found in some studies, suggesting more moderate levels of genetic variation.[52] Graves (1996) has countered that FST should not be used as a marker of subspecies status, as the statistic is used to measure the degree of differentiation between populations,[52] although see also Wright (1978).[57]

Jeffrey Long and Rick Kittles give a long critique of the application of FST to human populations in their 2003 paper "Human Genetic Diversity and the Nonexistence of Biological Races". They find that the figure of 85% is misleading because it implies that all human populations contain on average 85% of all genetic diversity. They argue the underlying statistical model incorrectly assumes equal and independent histories of variation for each large human population. A more realistic approach is to understand that some human groups are parental to other groups and that these groups represent paraphyletic groups to their descent groups. For example, under the recent African origin theory the human population in Africa is paraphyletic to all other human groups because it represents the ancestral group from which all non-African populations derive, but more than that, non-African groups only derive from a small non-representative sample of this African population. This means that all non-African groups are more closely related to each other and to some African groups (probably east Africans) than they are to others, and further that the migration out of Africa represented a genetic bottleneck, with much of the diversity that existed in Africa not being carried out of Africa by the emigrating groups. Under this scenario, human populations do not have equal amounts of local variability, but rather diminished amounts of diversity the further from Africa any population lives. Long and Kittles find that rather than 85% of human genetic diversity existing in all human populations, about 100% of human diversity exists in a single African population, whereas only about 70% of human genetic diversity exists in a population derived from New Guinea. Long and Kittles argued that this still produces a global human population that is genetically homogeneous compared to other mammalian populations.[58]

There is a hypothesis that anatomically modern humans interbred with Neanderthals during the Middle Paleolithic. In May 2010, the Neanderthal Genome Project presented genetic evidence that interbreeding did likely take place and that a small but significant portion of Neanderthal admixture is present in the DNA of modern Eurasians and Oceanians, and nearly absent in sub-Saharan African populations.

Between 4% and 6% of the genome of Melanesians (represented by the Papua New Guinean and Bougainville Islander) is thought to derive from Denisova hominins - a previously unknown species which shares a common origin with Neanderthals. It was possibly introduced during the early migration of the ancestors of Melanesians into Southeast Asia. This history of interaction suggests that Denisovans once ranged widely over eastern Asia.[59]

Thus, Melanesians emerge as the most archaic-admixed population, having Denisovan/Neanderthal-related admixture of ~8%.[59]

In a study published in 2013, Jeffrey Wall from University of California studied whole sequence-genome data and found higher rates of introgression in Asians compared to Europeans.[60] Hammer et al. tested the hypothesis that contemporary African genomes have signatures of gene flow with archaic human ancestors and found evidence of archaic admixture in African genomes, suggesting that modest amounts of gene flow were widespread throughout time and space during the evolution of anatomically modern humans.[61]

New data on human genetic variation has reignited the debate about a possible biological basis for categorization of humans into races. Most of the controversy surrounds the question of how to interpret the genetic data and whether conclusions based on it are sound. Some researchers argue that self-identified race can be used as an indicator of geographic ancestry for certain health risks and medications.

Although the genetic differences among human groups are relatively small, differences in certain genes, such as Duffy, ABCC11, and SLC24A5, called ancestry-informative markers (AIMs), can nevertheless be used to reliably situate many individuals within broad, geographically based groupings. For example, computer analyses of hundreds of polymorphic loci sampled in globally distributed populations have revealed the existence of genetic clustering that roughly is associated with groups that historically have occupied large continental and subcontinental regions (Rosenberg et al. 2002; Bamshad et al. 2003).

Some commentators have argued that these patterns of variation provide a biological justification for the use of traditional racial categories. They argue that the continental clusterings correspond roughly with the division of human beings into sub-Saharan Africans; Europeans, Western Asians, Central Asians, Southern Asians and Northern Africans; Eastern Asians, Southeast Asians, Polynesians and Native Americans; and other inhabitants of Oceania (Melanesians, Micronesians & Australian Aborigines) (Risch et al. 2002). Other observers disagree, saying that the same data undercut traditional notions of racial groups (King and Motulsky 2002; Calafell 2003; Tishkoff and Kidd 2004[11]). They point out, for example, that major populations considered races or subgroups within races do not necessarily form their own clusters.

Furthermore, because human genetic variation is clinal, many individuals affiliate with two or more continental groups. Thus, the genetically based "biogeographical ancestry" assigned to any given person generally will be broadly distributed and will be accompanied by sizable uncertainties (Pfaff et al. 2004).

In many parts of the world, groups have mixed in such a way that many individuals have relatively recent ancestors from widely separated regions. Although genetic analyses of large numbers of loci can produce estimates of the percentage of a person's ancestors coming from various continental populations (Shriver et al. 2003; Bamshad et al. 2004), these estimates may assume a false distinctiveness of the parental populations, since human groups have exchanged mates from local to continental scales throughout history (Cavalli-Sforza et al. 1994; Hoerder 2002). Even with large numbers of markers, information for estimating admixture proportions of individuals or groups is limited, and estimates typically will have wide confidence intervals (Pfaff et al. 2004).

Genetic data can be used to infer population structure and assign individuals to groups that often correspond with their self-identified geographical ancestry. Jorde and Wooding (2004) argued that "Analysis of many loci now yields reasonably accurate estimates of genetic similarity among individuals, rather than populations. Clustering of individuals is correlated with geographic origin or ancestry."[10]
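The idea of assigning an individual to a reference group from multilocus genotype data can be sketched with a naive nearest-panel rule. The panel names and genotypes below are made up, and the distance rule is a deliberately crude stand-in for the model-based clustering (e.g. STRUCTURE-style analyses over hundreds of loci) used in the studies cited above.

```python
def genotype_distance(g1, g2):
    """Mean absolute difference across loci; genotypes are coded
    0/1/2 as copies of a reference allele."""
    return sum(abs(a - b) for a, b in zip(g1, g2)) / len(g1)

def assign_population(individual, reference_panels):
    """Assign the individual to the panel whose members are, on
    average, genetically closest."""
    def mean_dist(name):
        panel = reference_panels[name]
        return sum(genotype_distance(individual, ref) for ref in panel) / len(panel)
    return min(reference_panels, key=mean_dist)

# Hypothetical reference panels over four loci
panels = {
    "pop_A": [[0, 0, 1, 2], [0, 1, 1, 2]],
    "pop_B": [[2, 2, 1, 0], [2, 1, 0, 0]],
}
print(assign_population([0, 0, 1, 1], panels))  # pop_A
```

Because real variation is clinal, an individual can sit nearly equidistant from several panels, which is why such assignments carry the sizable uncertainties noted above.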

An analysis of autosomal SNP data from the International HapMap Project (Phase II) and CEPH Human Genome Diversity Panel samples was published in 2009. The study of 53 populations taken from the HapMap and CEPH data (1138 unrelated individuals) suggested that natural selection may shape the human genome much more slowly than previously thought, with factors such as migration within and among continents more heavily influencing the distribution of genetic variations.[63] A similar study published in 2010 found strong genome-wide evidence for selection due to changes in ecoregion, diet, and subsistence, particularly in connection with polar ecoregions, with foraging, and with a diet rich in roots and tubers.[64] In a 2016 study, principal component analysis of genome-wide data was capable of recovering previously known targets for positive selection (without prior definition of populations) as well as a number of new candidate genes.[65]

Forensic anthropologists can determine aspects of geographic ancestry (i.e. Asian, African, or European) from skeletal remains with a high degree of accuracy by analyzing skeletal measurements.[66] According to some studies, individual test methods such as mid-facial measurements and femur traits can identify the geographic ancestry and by extension the racial category to which an individual would have been assigned during their lifetime, with over 80% accuracy, and in combination can be even more accurate. However, the skeletons of people who have recent ancestry in different geographical regions can exhibit characteristics of more than one ancestral group and, hence, cannot be identified as belonging to any single ancestral group.

Gene flow between two populations reduces the average genetic distance between them. Only totally isolated human populations experience no gene flow; most populations have continuous gene flow with neighboring populations, which creates the clinal distribution observed for most genetic variation. When gene flow takes place between well-differentiated genetic populations the result is referred to as "genetic admixture".

Admixture mapping is a technique used to study how genetic variants cause differences in disease rates between populations.[67] Recently admixed populations that trace their ancestry to multiple continents are well suited for identifying genes for traits and diseases that differ in prevalence between parental populations. African-American populations have been the focus of numerous population genetic and admixture mapping studies, including studies of complex genetic traits such as white cell count, body-mass index, prostate cancer and renal disease.[68]

An analysis of phenotypic and genetic variation, including skin color and socio-economic status, was carried out in the population of Cape Verde, which has a well-documented history of contact between Europeans and Africans. The studies showed that the pattern of admixture in this population has been sex-biased and that there is a significant interaction between socio-economic status and skin color, independent of ancestry.[69] Another study shows an increased risk of graft-versus-host disease complications after transplantation due to genetic variants in human leukocyte antigen (HLA) and non-HLA proteins.[70]

Differences in allele frequencies contribute to group differences in the incidence of some monogenic diseases, and they may contribute to differences in the incidence of some common diseases.[71] For the monogenic diseases, the frequency of causative alleles usually correlates best with ancestry, whether familial (for example, Ellis-van Creveld syndrome among the Pennsylvania Amish), ethnic (Tay-Sachs disease among Ashkenazi Jewish populations), or geographical (hemoglobinopathies among people with ancestors who lived in malarial regions). To the extent that ancestry corresponds with racial or ethnic groups or subgroups, the incidence of monogenic diseases can differ between groups categorized by race or ethnicity, and health-care professionals typically take these patterns into account in making diagnoses.[72]

Even with common diseases involving numerous genetic variants and environmental factors, investigators point to evidence suggesting the involvement of differentially distributed alleles with small to moderate effects. Frequently cited examples include hypertension (Douglas et al. 1996), diabetes (Gower et al. 2003), obesity (Fernandez et al. 2003), and prostate cancer (Platz et al. 2000). However, in none of these cases has allelic variation in a susceptibility gene been shown to account for a significant fraction of the difference in disease prevalence among groups, and the role of genetic factors in generating these differences remains uncertain (Mountain and Risch 2004).

Some other variations, on the other hand, are beneficial to humans, as they prevent certain diseases and increase the ability to adapt to the environment. An example is the mutation in the CCR5 gene that protects against AIDS: in carriers of the mutation, the CCR5 receptor is absent from the cell surface, so HIV has nothing to grab onto and bind to, and the mutation therefore decreases an individual's risk of HIV infection. The CCR5 mutation is also quite common in certain areas, with more than 14% of the population carrying it in Europe and about 6-10% in Asia and North Africa.[73]

Apart from mutations, many genes that may have aided humans in ancient times plague humans today. For example, it is suspected that genes that allow humans to more efficiently process food are those that make people susceptible to obesity and diabetes today.[74]

Neil Risch of Stanford University has proposed that self-identified race/ethnic group could be a valid means of categorization in the USA for public health and policy considerations.[75][76] A 2002 paper by Noah Rosenberg's group makes a similar claim: "The structure of human populations is relevant in various epidemiological contexts. As a result of variation in frequencies of both genetic and nongenetic risk factors, rates of disease and of such phenotypes as adverse drug response vary across populations. Further, information about a patient's population of origin might provide health care practitioners with information about risk when direct causes of disease are unknown."[77]

Human genome projects are scientific endeavors that determine or study the structure of the human genome. The Human Genome Project was a landmark genome project.

See the article here:
Human genetic variation - Wikipedia

A new tool for genetically engineering the oldest branch of life – Phys.Org

March 8, 2017 G. William Arends Professor of Microbiology and theme leader of the IGB's Mining Microbial Genomes theme Bill Metcalf, left, with IGB Fellow Dipti Nayak. Credit: University of Illinois at Urbana-Champaign

A new study by G. William Arends Professor of Microbiology at the University of Illinois Bill Metcalf with postdoctoral Fellow Dipti Nayak has documented the use of CRISPR-Cas9 mediated genome editing in the third domain of life, Archaea, for the first time. Their groundbreaking work, reported in Proceedings of the National Academy of Sciences, has the potential to vastly accelerate future studies of these organisms, with implications for research including global climate change. Metcalf and Nayak are members of the Carl R. Woese Institute for Genomic Biology at Illinois.

"Under most circumstances our model archaeon, Methanosarcina acetivorans, has a doubling time of eight to ten hours, as compared to E. coli, which can double in about 30 minutes. What that means is that doing genetics, getting a mutant, can take months; the same thing would take three days in E. coli," explains Nayak. "What CRISPR-Cas9 enables us to do, at a very basic level, is speed up the whole process. It removes a major bottleneck... in doing genetics research with this archaeon."

"Even more," continues Nayak, "with our previous techniques, mutations had to be introduced one step at a time. Using this new technology, we can introduce multiple mutations at the same time. We can scale up the process of mutant generation exponentially with CRISPR."

CRISPR, short for Clustered Regularly Interspaced Short Palindromic Repeats, began as an immune defense system in archaea and bacteria. By identifying and storing short fragments of foreign DNA, Cas (CRISPR-associated system) proteins are able to quickly identify that DNA in the future, so that it can then quickly be destroyed, protecting the organism from viral invasion.

Since its discovery, a version of this immune system, CRISPR-Cas9, has been modified to edit genomes in the lab. By pairing Cas9 with a specifically engineered RNA guide rather than a fragment of invasive DNA, the CRISPR system can be directed to cut a cell's genome in an arbitrary location such that existing genes can be removed or new ones added. This system has been prolifically useful in editing eukaryotic systems from yeast, to plant, to fish and even human cells, earning it the American Association for the Advancement of Science's 2015 Breakthrough of the Year award. However, its implementation in prokaryotic species has been met with hurdles, due in part to their different cellular processes.
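The guide-RNA targeting described here follows a simple rule for the commonly used SpCas9: a ~20-base protospacer must sit immediately upstream of an NGG PAM. The scan below sketches that rule on the forward strand only; it is purely illustrative (real guide design also checks the reverse strand, off-target matches, and cutting efficiency).

```python
def find_cas9_targets(genome: str, guide_length: int = 20):
    """Return (position, protospacer) pairs where a guide_length-base
    protospacer is immediately followed by an NGG PAM, as SpCas9
    requires. Forward strand only, for brevity."""
    targets = []
    for i in range(len(genome) - guide_length - 2):
        pam = genome[i + guide_length : i + guide_length + 3]
        if pam[1:] == "GG":  # N can be any base; the GG is required
            targets.append((i, genome[i : i + guide_length]))
    return targets

# One valid site: a 20-base protospacer followed by the PAM "CGG"
seq = "ACGT" * 5 + "CGG" + "TTTT"
print(find_cas9_targets(seq))  # [(0, 'ACGTACGTACGTACGTACGT')]
```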

To use CRISPR in a cellular system, researchers have to develop a protocol that takes into account a cell's preferred mechanism of DNA repair: after CRISPR's "molecular scissors" cut the chromosome, the cell's repair system steps in to mend the damage through a mechanism that can be harnessed to remove or add additional genetic material. In eukaryotic cells, this takes the form of Non-Homologous End Joining (NHEJ). Though this pathway has been used for CRISPR-mediated editing, it has the tendency to introduce genetic errors during its repair process: nucleotides, the rungs of the DNA ladder, are often added or deleted at the cut site.

NHEJ is very uncommon in prokaryotes, including Archaea; instead, their DNA is more often repaired through a process known as homology-directed repair. By comparing the damage to a DNA template, homology-directed repair creates what Nayak calls a "deterministic template": the end result can be predicted in advance and tailored to the exact needs of the researcher.

In many ways, homology-directed repair is actually preferable for genome editing: "As much as we want CRISPR-Cas9 to make directed edits in eukaryotic systems, we often end up with things that we don't want, because of NHEJ," explains Nayak. "In this regard, it was a good thing that most archaeal strains don't have a non-homologous end joining repair system, so the only way DNA can be repaired is through this deterministic homologous repair route."

Though it may seem counter-intuitive, one of Nayak and Metcalf's first uses of CRISPR-Cas9 was to introduce an NHEJ mechanism in Methanosarcina acetivorans. Though generally not preferable for genome editing, says Nayak, NHEJ has one use for which it's superior to homologous repair: "If you just want to delete a gene, if you don't care how ... non-homologous end joining is actually more efficient."

By using the introduced NHEJ repair system to perform what are known as "knock-out" studies, wherein a single gene is removed or silenced to see what changes are produced and what processes that gene might affect, Nayak says that future research will be able to assemble a genetic atlas of M. acetivorans and other archaeal species. Such an atlas would be incredibly useful for a variety of fields of research involving Archaea, including an area of particular interest to the Metcalf lab: climate change.

"Methanosarcina acetivorans is one of the most genetically tractable archaeal strains," says Nayak. "[Methanogens are] a class of archaea that produce gigatons of [methane,] this potent greenhouse gas every year, play a keystone role in the global carbon cycle, and therefore contribute significantly to global climate change." By studying the genetics of this and similar organisms, Nayak and Metcalf hope to gain not only a deeper understanding of archaeal genetics, but of their role in broader environmental processes.

In all, this research represents an exciting new direction in studying and manipulating archaea. "We began this research to determine if the use of CRISPR-Cas9 genome editing in archaea was even possible," concludes Nayak. "What we've discovered is that it's not only possible, but it works remarkably well, even as compared to eukaryotic systems."


More information: Dipti D. Nayak et al, Cas9-mediated genome editing in the methanogenic archaeon, Proceedings of the National Academy of Sciences (2017). DOI: 10.1073/pnas.1618596114


Read the rest here:
A new tool for genetically engineering the oldest branch of life - Phys.Org