Proceeding with Caution – Harvard Medical School

Click on any icon to hear that co-author's perspective on what the proposed guidelines mean for the region in which they work. Map compiled by Stephanie Dutchen

HMNews: What is the main goal of having a set of international guidelines?

Kendra Sirak: While some countries have developed rigorous standards that guide the scientific analysis of human remains, many others have few or no guidelines that ensure that this work is carried out responsibly and is both scientifically robust and sensitive to community perspectives. Everyone wants practical guidance that will be positive about the research enterprise while embracing high ethical standards.

It's our hope that these guidelines will raise the integrity of ancient DNA research around the world by minimizing damage to collections of human remains; ensuring sensitivity to the perspectives of stakeholder groups, especially when these groups are marginalized; and reducing opportunities for the misuse of results. We expect these guidelines will undergo further development as the field continues to evolve.

HMNews: Why now?

Jakob Sedig: Ancient DNA as a field has been growing rapidly, evolving from a promising technology to a mature field. The discussion about how to handle human remains and how to meaningfully involve diverse stakeholders has not yet caught up. More and more people are calling for clear, strong guidance that all researchers engaged in ancient DNA work can embrace.

Ancient DNA analysis has contributed vital new insights about the human past and has helped us understand the genetic roots of human diversity. It has disrupted nationalist and xenophobic narratives. It has challenged what many of us thought we knew about who we are and where we came from. But like any field that matters, it's complex.

Because of the number of ancient individuals being analyzed, the social and political nature of the work, and the challenges that ancient DNA findings have raised about theories proposed before we had such data, people are paying attention to ancient DNA. That makes it even more vital to articulate and adopt strong guidelines that work well everywhere.

HMNews: How did the team come up with these five guidelines?

Sedig: We took cues from archaeology and modern human genetics, which have established protocols for carrying out analyses on human remains and establishing stakeholder consent. We built on aspects of existing guidelines, such as those crafted by a group of North American scholars, including Indigenous scholars, published last year in the American Journal of Human Genetics.

Our diverse co-author group, particularly those in Central and South America, Africa, Europe, South Asia, the Pacific, and East Asia, felt that these and other suggestions, while valuable, were not applicable in all world regions. Our virtual workshop led to monthslong discussions that took many different value systems and histories into account and sought balance between local contexts and general principles. We then wrote the manuscript.

Given that there was near-unanimous support and excitement about the final document among the workshop participants, we hope the broader community will embrace and build on these proposals. It would be wonderful if the proposals form a basis for official guidelines in the future.

HMNews: Why not just follow national or local government regulations wherever a project is being conducted?

Sirak: There are some places where laws are robust enough for that to be appropriate, but in other locales, we feel that researchers need to hold themselves to a higher standard than required by the laws currently in place.

HMNews: What are some of the needs and unique circumstances in different regions that shaped the guidelines?

Sirak: We have found that guidelines that work well for one region can come across as condescending or even colonialist in another. Many co-authors on this manuscript raised the point that indigeneity has different meanings in different places and is even used in some regions as a framework for oppression and discrimination against minority groups argued to be non-Indigenous. Thus, basing research ethics on a single definition can inadvertently reinforce rather than mitigate power imbalances in conducting and interpreting genetic analyses.

The videos our co-authors have shared speak to the many nuances of ethical ancient DNA research in the places where they live and work.

HMNews: Some critics say that ancient DNA research, which to a large extent has focused on and been conducted by white people from wealthy nations, has been a colonialist endeavor that siphons agency from marginalized groups. How do the proposed guidelines address these discussions about power and ownership?

Sedig: These are important conversations. We can't reiterate enough that our goal is to learn about the past in a sensitive, thoughtful, and ethical way. We do not want to contribute to exploitation; we want to do the opposite. We need to listen to and respect the people who are stakeholders in ancient DNA studies, including groups from the place of origin of the human remains being studied, and make sure their perspectives are represented in discussions about study design, research questions, and whether a project should proceed at all. There's been a huge amount of progress in recent years in seeking local perspectives from the start to the conclusion of a study and incorporating that feedback into the project and publication. We have increasingly diverse groups of people who conduct the research as well.

We want to minimize harm and reduce inequity, and I believe the ancient DNA community has an extraordinary track record of providing arguments that do so. We know that in regions with histories of settler colonialism, we have to center Indigenous perspectives. We have to confront the colonial legacies of human remains collected in unethical ways and often sent abroad, and we should seek ways to mend the harms done, such as by considering how our research findings or the methods we are using might be helpful tools for facilitating repatriation of remains. We must ensure that local scientists and communities are as engaged as can be in ancient DNA research, particularly in places with histories of scientists conducting exploitative research. Researchers working in countries outside their own must prioritize establishing equitable collaborations that benefit local scholars and avoid carrying out parachute research at all costs.

When possible, those of us in positions of privilege should contribute to reducing structural inequities. Some ideas we propose in the guidelines are to help educate and train local community members and other stakeholders, assist with raising the curatorial standards of collections or developing museum exhibits, provide funds for training or attending professional meetings, and advocate for funding agencies to build more capacity for equitable ancient DNA research. We also need to ensure that we communicate results in ways that are accessible to nonscientists and the broader scholarly community. Lastly, we have to oppose those who use genetic data to support narratives of group superiority or to justify exclusionary policies.

At the same time, as scientists we need to make sure we can proceed in a way consistent with the scientific method. We can't ethically conduct a study without the guarantee that we can follow the data where they lead. This means that once stakeholder communities agree that publishing results would not cause them harm, the relevant portion of a manuscript won't be restricted. It also means the data must be made accessible at least so others can replicate or reevaluate results.

We have a loyalty to the facts we uncover as we learn about our shared humanity. In cases where the data we generate don't align with other forms of knowledge, such as traditional expertise or cultural beliefs, it is not our job to discredit or diminish that knowledge. Rather, those discrepancies highlight how complex an undertaking it is to understand the past and should be flagged in papers that result from the work.

Regarding ownership, we believe that whenever researchers are granted permission to study the remains of ancient individuals, they become stewards of that material with a responsibility to care for and respect it. They do not assume ownership of the remains, or of the data that arise from sequencing them.

HMNews: Some groups assert that stakeholder communities should decide whether and how certain kinds of ancient DNA data can be used in future analyses. How does this fit with the team's push for open data?

Sirak: We advocate for stakeholders having input into how data should be distributed and we advocate for open data. We believe that both goals can coexist.

Many of our co-authors felt strongly that ancient DNA data should always be made fully and publicly available. Other co-authors argued that when it comes to data from remains that might be meaningfully connected to present-day Indigenous communities, it could be appropriate to have usage restrictions. This was one of many debates we had, and in listening to one another, some of us changed our positions.

We all agreed that open data for ancient DNA is something to strive for. The data must be made available after publication, either through full open access, which is ideal, or distributed by a professional organization without a stake in the research results, so scholars can reproduce or challenge analyses. This also lowers the chances study results will be misused. We are proud that the raw data for nearly all ancient genomes published so far were made publicly available at or before the time of publication.

Finally, we agreed that Indigenous-led data repositories such as those now being developed could help mediate permissions when scholars wish to use data for purposes beyond those articulated in an original study plan.

HMNews: Given that equity is a priority, how accessible will this paper be to those who, for example, don't have paid access to the journal in which it's being published or who aren't fluent in English?

Sirak: We've made our paper open access and applied the most flexible Creative Commons license to it, known as CC BY 4.0. That means it's available for free to anyone in the public to read, distribute, adapt, and build upon. Our team members also have translated the text into more than 20 languages that they speak.

HMNews: Do you expect pushback from scientists who feel that the guidelines are too onerous and will make it harder to carry out research?

Sedig: We did receive feedback during the review process that the guidelines were too strong, that they would create a heavy burden for researchers from smaller labs or who are in the early stages of their careers. We respect this perspective and understand that we're requesting a lot in terms of engaging with stakeholders and what could be called overhead beyond the research itself. However, we firmly believe that all ancient DNA studies, from an early-career stage onwards, should meet these ethical standards.

In a way, the proposals are merely concretizing the standards that are already emerging in the field. We believe that authors and journal editors feel their way toward this ethical framework during the review process. We believe that the proposals are practical and that early-career researchers, including many who co-authored our article, will benefit from having the principles clearly articulated and the guesswork reduced as they aim to carry out their research in an ethically principled way.

HMNews: What enforcement would there be if someone involved in ancient DNA research didn't follow these guidelines?

Sirak: Our co-authors do not represent any official organization, so we cannot make or enforce rules for anyone except ourselves. What our paper does represent is a grassroots, community-led pledge from representatives of a nontrivial fraction of worldwide researchers engaged in this type of work. We have committed to adhering to a set of strong principles, and we invite others to hold us accountable to them.

It would be a great outcome if scientific journals, professional societies, or granting agencies found these proposals useful enough to turn into official guidelines, which would mean there could be professional repercussions for not adhering to them. The fact that scholars from such a diverse array of nations and disciplines have signed on to the guidelines at this stage makes us optimistic that they will be embraced in practice by laboratories and research groups as well as other groups engaged in ancient DNA research all over the world. But either way, its important to continue the global conversation.

This work was supported by the Australian Research Council Discovery Project (DP160100811), National Research Foundation South Africa, Brazilian National Council for Scientific and Technological Development (302163/2017-4), São Paulo Research Foundation/FAPESP (2018/23282-5), Francis Crick Institute (FC001595), Cancer Research UK, UK Medical Research Council, Wellcome Trust, Dutch Research Council (VI.C.191.070), Hungarian Academy of Sciences, Science and Engineering Research Board of India, Council of Scientific and Industrial Research in India (Ministry of Science and Technology, Government of India), European Research Council (ERC-2017-StG 804844-DAIRYCULTURES), Werner Siemens-Stiftung, John Templeton Foundation (6122), Howard Hughes Medical Institute, Max Planck Society, the Max Planck-Harvard Research Center for the Archaeoscience of the Ancient Mediterranean, and the National Geographic Society.

Interviews were edited for length and clarity.


UT Southwestern Team Awarded $8.8M to Participate in Genomic Variation Consortium – Dallas Innovates

Left to right: Gary Hon, Ph.D., UTSW Assistant Professor of Obstetrics and Gynecology; Nikhil Munshi, M.D., Ph.D., Associate Professor of Internal Medicine and Molecular Biology; W. Lee Kraus, Ph.D., Professor and Director of the Cecil H. and Ida Green Center for Reproductive Biology Sciences

The Human Genome Project identified and mapped all of the genes of the human genome, achieving the world's largest international, collaborative biological project. That opened the door to a wide array of innovative research projects, including a prestigious one that UT Southwestern has just joined.

A team of UT Southwestern faculty led by Gary Hon, Ph.D., has been awarded a five-year, $8.8 million grant to participate in the National Human Genome Research Institute's Impact of Genomic Variation on Function (IGVF) Consortium. The consortium's goal is understanding how genomic variants contribute to developmental diseases.

Dr. Hon is an assistant professor of obstetrics and gynecology in the Cecil H. and Ida Green Center for Reproductive Biology Sciences and a member of the Lyda Hill Department of Bioinformatics.

Hon developed Mosaic-seq, a genome engineering technique that helped lead to the awarding of the $8.8 million grant. In a statement, he said the IGVF Consortium is the National Human Genome Research Institute's next step to unveiling the genome's role in disease.

"The Human Genome Project told us that most of the genome doesn't contain genes," Hon said. "One big surprise from genome-wide association studies is that gene-poor regions contain many disease signatures."

"It turns out that the signatures largely overlap with DNA elements, found by the Encyclopedia of DNA Elements (ENCODE) Consortium, that control when genes turn on," Hon added. "The goal of this consortium is to fill in the gaps, linking DNA sequences to genes, cell phenotypes, and disease. Ultimately, this knowledge will allow us to interpret the disease potential of any person's genome sequence."

In their work with the consortium, the UTSW team will combine molecular biology, genomics, high-throughput screens, and computational analyses to focus on potential disease-causing genetic variations in the cardiovascular, nervous, and placental systems.

Besides Hon, the team also includes principal investigators Nikhil Munshi, M.D., Ph.D., associate professor of internal medicine and molecular biology, and W. Lee Kraus, Ph.D., professor and director of the Green Center.

Mosaic-seq allows high throughput analysis of the molecular events that occur during programming of embryonic stem cells into other cell types. This technique uses single-cell sequencing to study different regions of the genome at the same time.

Just one experiment can perturb thousands of regions in the genome to better understand their function, according to the UTSW team.

With Mosaic-seq, researchers no longer have to study one region at a time. Hon's lab received national attention in 2017 for this significant advance, which was part of his team's grant application.

UTSW now joins Harvard, Stanford, and Yale universities as one of the 30 research sites taking part in the IGVF Consortium nationwide. The consortium will study noncoding regions of the human genome that are known to contribute to genetic diseases, including congenital heart disease, autoimmune disease, and blood disorders.

Dr. Kraus, a professor of obstetrics and gynecology and pharmacology who holds the Cecil H. and Ida Green Distinguished Chair in reproductive biology sciences, will use additional CRISPR-based technologies in the consortium research project. Kraus will use them to study how genetic variation in non-coding RNAs originating from the regulatory elements impacts the development of the placenta.

The placenta's development is important because it supports the human fetus as it grows, as well as the fetus's heart and central nervous system.

"Studying the role of genetic variation in the embryonic development of these key organs could point the way to understanding human diseases in adults," Kraus said in the statement.

Dr. Munshi believes the IGVF Consortium initiative could potentially fill in huge pieces of the puzzle for many diseases.

"If we can determine all of the noncoding elements in the genome that impact a particular developmental pathway, then those could become candidates for disease-associated mutations," Munshi said.

"By generating catalogs of tens of thousands of functional variants, we don't have to search the billions of base pairs to find where the disease-causing mutations might lie," he added. "We can really focus the search on these tens of thousands of variants. It really gives us an encyclopedia to narrow the search."



Synthetic auxotrophy remains stable after continuous evolution and in coculture with mammalian cells – Science Advances

Abstract

Understanding the evolutionary stability and possible context dependence of biological containment techniques is critical as engineered microbes are increasingly under consideration for applications beyond biomanufacturing. While synthetic auxotrophy previously prevented Escherichia coli from exhibiting detectable escape from batch cultures, its long-term effectiveness is unknown. Here, we report automated continuous evolution of a synthetic auxotroph while supplying a decreasing concentration of essential biphenylalanine (BipA). After 100 days of evolution, triplicate populations exhibit no observable escape and exhibit normal growth rates at 10-fold lower BipA concentration than the ancestral synthetic auxotroph. Allelic reconstruction reveals the contribution of three genes to increased fitness at low BipA concentrations. Based on its evolutionary stability, we introduce the progenitor strain directly to mammalian cell culture and observe containment of bacteria without detrimental effects on HEK293T cells. Overall, our findings reveal that synthetic auxotrophy is effective on time scales and in contexts that enable diverse applications.

New safeguards are needed for the deliberate release of engineered microbes into the environment, which has promise for applications in agriculture, environmental remediation, and medicine (1). Genetically encoded biocontainment strategies enable attenuation of engineered live bacteria for diverse biomedical applications (2–4), including as potential vaccines (5–10), diagnostics (11), and therapeutics (12–15). Auxotrophy, which is the inability of an organism to synthesize a compound needed for its growth, is an existing strategy for containment. However, foundational studies of auxotrophic pathogens demonstrated proliferation in relevant biological fluids (16) and reversion to prototrophy upon serial passaging (17, 18). Modern genome engineering strategies can prevent auxotrophic reversion, and auxotrophy has been a key component of microbial therapies that have reached advanced clinical trials. However, the ability for auxotrophs to access required metabolites within many host microenvironments, and after leaving the host, remains unaddressed. Auxotrophy may not be effective in scenarios where engineered living bacteria encounter metabolites from dead host cells (19) or invade host cells (20). Growth of double auxotrophs is supported in vivo by neoplastic tissue (13). Auxotrophy may also be insufficient for tight control of cell proliferation in environments rich with microbial sources of cross-feeding (21), such as gut, oral, skin, and vaginal microbiomes. Given that most naturally occurring microorganisms are auxotrophs (22), it is also unlikely that auxotrophy will limit the spread of an engineered microbe once it leaves the body and enters the environment.

Synthetic auxotrophy may overcome these hurdles by requiring provision of a synthetic molecule for survival of the engineered bacteria. This strategy was first implemented successfully in Escherichia coli by engineering essential proteins to depend on incorporation of a nonstandard amino acid (nsAA) (23, 24). We previously engineered E. coli strains for dependence on the nsAA biphenylalanine (BipA) by computer-aided redesign of essential enzymes in conjunction with expression of orthogonal translation machinery for BipA incorporation (23). Among several synthetic auxotrophs originally constructed, one strain harbored three redesigned, nsAA-dependent genes: adenylate kinase (adk.d6), tyrosyl-tRNA synthetase (tyrS.d8), and a BipA-dependent aminoacyl-tRNA synthetase for aminoacylation of BipA (BipARS.d6). This BipA-dependent strain, dubbed DEP, exhibited undetectable escape throughout 14 days of monitoring at an assay detection limit of 2.2 × 10⁻¹² escapees per colony-forming unit (CFU) (23). Although this strain demonstrates effective biocontainment in 1-liter batch experiments, its precise escape frequency and long-term stability remained unexplored.

Here, we perform the first study of evolutionary stability of a synthetic auxotroph with the aid of automated continuous evolution. Continuous evolution better emulates scenarios where biocontainment may be needed by fostering greater genetic variability within a population. We posited that decreasing BipA concentrations would add selective pressure for adaptation or for escape, either of which would be enlightening. Adaptive laboratory evolution of DEP may improve its fitness in relevant growth contexts, as previously demonstrated for its nonauxotrophic but recoded ancestor, C321.ΔA (25). We report that DEP maintains its inability to grow in the absence of synthetic nutrient, even after three parallel 100-day chemostat trials. In addition, we find evidence of adaptation, with evolved DEP isolates requiring 10-fold lower BipA concentration to achieve optimal growth than ancestral DEP (0.5 µM rather than 5 µM). We resequence evolved populations and perform allelic reconstruction in ancestral DEP using multiplex automatable genome engineering (MAGE), identifying alleles that partially restore the adaptive phenotype. Last, we advance this technology toward host-microbe coculture applications, demonstrating direct mixed culture of DEP and mammalian cells without the need for physical barriers or complex fluidics.

To perform continuous evolution of E. coli, we constructed custom chemostats for parallelized and automated culturing (Fig. 1A). Our design and construction were based on the eVOLVER system (26), an open-source, do-it-yourself automated culturing platform (figs. S1 to S4). By decreasing BipA concentration over time in our chemostats, we provide an initial mild selection for escape and steadily increase its stringency. This design is analogous to a morbidostat, where a lethal drug is introduced dynamically at sublethal concentrations to study microbial drug resistance (27), but with synthetic auxotrophy providing selective pressure. Our working algorithm for automated adjustment of BipA concentration as a function of turbidity is shown in Fig. 1B, and a representative image of our hardware is shown in Fig. 1C (see also fig. S5).

(A) Illustration of a smart sleeve connected to separate nonpermissive media and biphenylalanine (BipA; structure shown in blue) feed lines for automated adjustment of BipA concentration based on growth rate. Pumps and optics are integrated with Arduino controller hardware and Python software based on the eVOLVER do-it-yourself automated culturing framework. (B) Working algorithm for maintenance of cultures in continuous evolution mode. Criteria for lowering the BipA concentration are based on the difference in time elapsing between OD peaks (Δt peak OD). Smaller time elapsed between OD peaks is indicative of higher growth rates, triggering a decrease in BipA concentration when below a threshold value. (C) Representative configuration of hardware for parallelized evolution in triplicate, with three empty sleeves shown. Photo credit: Michael Napolitano, Harvard Medical School.

Our long-term culturing experiments featured two phases. The first phase included one chemostat (N = 1) that was inoculated with DEP for an 11-day incubation, with an initial BipA concentration of 100 µM and automated adjustment based on growth rate (Fig. 2A). Because we observed no colony formation when the outgrowth from this population was plated on nonpermissive media, we then began a second phase in replicate. We used our population grown for 11 days to inoculate three chemostats in parallel (N = 3) where BipA supply decreased automatically over the following 90 days from 100 µM to nearly 100 nM. One controller provided identical BipA concentrations to all three vials at any given time. To determine whether the decrease in BipA supply was due to escape from dependence on BipA, we periodically performed escape assays. We continued to observe no escape, including when we seeded liter-scale cultures and plated the associated outgrowth on nonpermissive media. Evolved isolates were obtained after this procedure (fig. S6), and their growth was characterized across BipA concentrations (Fig. 2B and fig. S7). At 0.5 to 1 µM BipA, we observed growth of all evolved isolates and no growth of the ancestral DEP strain.

(A) Timeline for continuous evolution, with detection limits for escape frequency assays shown in parentheses. (B) Doubling times of progenitor and evolved synthetic auxotrophs as a function of BipA concentration, normalized to the doubling time of DEP at 100 µM BipA. Error bars represent the SD across technical triplicates within the same experiment.

To identify the causal alleles contributing to decreased BipA requirement of all three evolved isolates, we performed whole-genome sequencing and mutational analysis. We expected that mutations in auxotrophic markers or orthogonal translation machinery associated with aminoacylation of BipA would be observed. However, no variants were detected in the plasmid-expressed orthogonal translation machinery (aminoacyl-tRNA synthetase and tRNA) reference sequence. Instead, in all three evolved isolates, variants were observed in three nonessential genes, all of which are implicated in molecular transport: acrB, emrD, and trkH (Fig. 3A). AcrB and EmrD are biochemically and structurally well-characterized multidrug efflux proteins (28), and TrkH is a potassium ion transporter (29). These exact mutations have no precedent in the literature to our knowledge. Because they are missense mutations or in-frame deletions, it is unclear whether they cause loss of function or altered function (table S1). Because permissive media contain four artificial targets of efflux (BipA, l-arabinose, chloramphenicol, and SDS), mutations that confer a selective advantage during continuous evolution could disable BipA/l-arabinose efflux, improve chloramphenicol/SDS efflux, or affect transport of these or other species more indirectly. Given the strong selective pressure enforced by decreasing BipA concentration, we hypothesize that mutations observed are more likely to affect BipA transport. We also observed mutations in all evolved populations to the 23S ribosomal RNA (rRNA) gene rrlA (table S2). 23S rRNA mutations have been found to enhance tolerance for D-amino acids (30) and β-amino acids (31). However, 23S rRNA mutations could also be related to increased tolerance of chloramphenicol (32).

(A) List of alleles identified through next-generation sequencing. Sequencing results originally obtained during the project identified this EmrD allele as a 33-bp deletion, which was then reconstructed in the experiment shown in (B). However, resequencing performed at the end of the project identified the allele as a 39-bp deletion and was confirmed by Sanger sequencing. A repetitive GGCGCG nucleotide sequence corresponding to G323-A324 and G336-A337 creates ambiguity about the precise positional numbering of the deletion. However, the three possible 13-amino acid deletions (323–335, 324–336, and 325–337) result in the same final protein sequence. (B) Effect of reconstructed allele in DEP progenitor on doubling time as a function of BipA concentration, normalized to the doubling time of DEP at 100 µM BipA. Error bars represent the SD across technical triplicates within the same experiment.

To learn how identified transporter alleles may contribute to increased growth rates at low BipA concentration, we performed allelic reconstruction in the progenitor DEP strain using MAGE (33). Among four mutants that we generated in DEP, we observed growth of all mutants at 2 µM BipA, a condition in which progenitor DEP could not grow (Fig. 3B and fig. S8). Furthermore, only emrD mutants exhibited near-normal growth at 1 µM BipA. To investigate possible differential sensitivity of strains that contain reconstructed alleles to other media components of interest (SDS, l-arabinose, tris buffer, and chloramphenicol), we varied the concentration of these components and measured doubling times (fig. S9). We observed no significant deviation in doubling time from DEP in any of these cases. These results collectively suggest that observed transporter alleles are linked to BipA utilization.

The unobservable escape of DEP even after 100 days of evolution encouraged us to explore the possibility of an improved in vitro model for host-microbe interactions. In vitro models allow direct visualization and measurement of cells and effectors during processes such as pathogenesis (34). They are more relevant than animal studies for several human cell-specific interactions due to biological differences across animal types (35, 36). A nonpathogenic E. coli strain engineered to express heterologous proteins could be particularly useful for studying or identifying virulence factors and disease progression. However, an obstacle associated with coculture of microbial and mammalian cells is microbial takeover of the population. Approaches used to address this are bacteriostatic antibiotics (37), semipermeable Transwell membranes (38–40), microcarrier beads (41), microfluidic cell trapping (42), peristaltic microfluidic flow (43, 44), and microfluidic perfusion (45). However, the use of a well-characterized synthetic auxotroph capable of limited persistence could offer a superior alternative for spatiotemporal control of microbial growth, especially for studying longer duration phenomena such as chronic infection or wound healing. Our study demonstrates how temporal control can be achieved by removal of BipA; we anticipate that spatial control could be achieved by patterning BipA onto a variety of solid surfaces with limited diffusion, such as a skin patch.

We investigated mammalian cell culture health, growth, and morphology after simple transient exposure to a hypermutator variant of DEP that we engineered by inactivating mutS during allelic reconstruction (DEP*). The use of DEP* rather than DEP is yet another form of a stress test to increase opportunity for escape under coculture conditions. We directly cocultured the adherent human cell line human embryonic kidney (HEK) 293T with either no bacteria, nonauxotrophic E. coli DH5α, or DEP* overnight (24 hours). HEK293T cells were cultured in selection media that allow only growth of desired but not contaminant strains while selecting for bacterial plasmid maintenance. After coculture, we washed cells and replenished them with media varying in inclusion of BipA and/or an antibiotic cocktail (penicillin/streptomycin/amphotericin B). We continued incubation and imaged cells at days 2, 4, and 7 after initial coincubation. HEK293T cells contain a copy of mCherry integrated into the AAVS1 locus, and they appear red. DH5α and DEP* were transformed with Clover green fluorescent protein before coculture and appear green.

Compared to the control culture where bacteria were not added (Fig. 4A), HEK293T cells cocultured with DH5α display visible bacterial lawns with no attached human cells in the absence of the antibiotic cocktail at all days of observation (Fig. 4B). In the presence of antibiotic, cocultures containing DH5α sharply transition from bacterial overgrowth to apparent bacterial elimination (Fig. 4C). In contrast, cells cocultured with DEP* in the absence of BipA exhibited similar morphology to the control at all days of observation and no detectable bacteria by fluorescence microscopy on day 7, without the need for antibiotics to achieve bacterial clearance (Fig. 4D). Thus, DEP* addition was not detrimental to HEK293T cells in the absence of BipA, and DEP* remains biocontained and cannot survive because of cross-feeding. Clearance of bacterial cells from human cells appears to occur faster for DEP* when not provided BipA (Fig. 4D) than for DH5α when provided with the antibiotic cocktail (Fig. 4C).

Bacteria were added to HEK293T cell cultures and coincubated for 24 hours before washing and replenishing media. HEK293T cells express mCherry, whereas bacterial cells express Clover green protein marker. Images were taken at days 2, 4, and 7 after coincubation. (A) Untreated HEK293T cells. (B) HEK293T with commercial E. coli DH5α in the absence of antibiotic cocktail. (C) HEK293T with DH5α in presence of antibiotic cocktail. (D) HEK293T and DEP* (mismatch repair inactivated to create hypermutator phenotype) in the absence of BipA. (E) HEK293T cells and DEP* in the presence of BipA. (F) HEK293T and DEP* in the absence of BipA until day 2 [identical at this point to condition in (D)], and then 100 µM BipA was added to this condition daily until day 7.

To learn how the synthetic auxotroph behaves when supplied its essential nutrient in these coculture settings, we tested DEP* cocultures with continual resupply of 100 µM BipA. Here, DEP* proliferates and in turn decreases proliferation and viability of HEK293T cells (Fig. 4E). A bacterial lawn begins to form on day 2, and at later times, human cell debris is overtaken by DEP*. This demonstrates that DEP* is fully capable of taking over the coculture if supplied with BipA. Replicates for these experiments can be found in figs. S10 to S12.

Given that DEP* grows in cocultures when BipA is provided, we sought to understand whether it could be rescued by readdition of BipA after multiple days of withholding. The possible time scale of reemergence influences applications where the duration of bacterial activity would need to be prolonged and/or repeated via limited BipA introduction while remaining contained. We find that coculturing DEP* with HEK293T cells for 2 days in the absence of BipA followed by the addition of BipA at day 2 does not rescue the DEP* growth (Fig. 4F and fig. S13). Human cells still grow and look morphologically similar to untreated cells, and bacteria are not visible. To look at analogous questions for nonauxotrophic E. coli, we removed antibiotics after 2 days of coculturing and do not observe bacterial rescue (fig. S13). We also investigated whether bacterial clearance could be delayed by the addition of antibiotic after some growth of DH5α. DH5α cells grown in the absence of the antibiotic cocktail for 2 days before addition of the cocktail and maintenance to day 7 result in bacterial lawns (fig. S13, A and D). This demonstrates that antibiotic cocktails ordinarily used in mammalian cell culture maintenance can become ineffective beyond a certain amount of nonauxotrophic bacterial growth, whereas synthetic auxotrophy is subject to fewer and different constraints.

To further investigate the persistence of progenitor DEP and its evolved descendants, we performed BipA readdition studies in Lennox lysogeny broth (LB-Lennox) monoculture. Within 7 hours of BipA removal, DEP cell populations that are harvested from midexponential or stationary phases can be reactivated upon delayed BipA addition with unperturbed growth kinetics after a highly tunable lag phase (fig. S14). Further studies are ongoing to investigate the amount of time after which BipA reintroduction can recover growth of synthetic auxotrophs under different contexts.

We have shown that synthetic auxotrophy can exhibit long-term stability and function in unique contexts, enabling reliable control of microbial proliferation. Recent work has also shown that the escape rate and fitness of multiple synthetic auxotrophs can be improved by increasing the specificity of nsAA incorporation machinery (46). Collectively, these engineering and characterization efforts advance synthetic auxotrophy as a powerful safeguard for basic and applied research when using engineered microbes.

Cultures for general culturing, growth rate assays, biocontainment escape assays, MAGE, and fluorescent protein assays were prepared in LB-Lennox medium [bacto tryptone (10 g/liter), sodium chloride (5 g/liter), and yeast extract (5 g/liter)] supplemented with chloramphenicol (15 µg/ml), 0.2% (w/v) l-arabinose, 20 mM tris-HCl buffer, 0.005% SDS, and a variable concentration of l-4,4′-biphenylalanine (BipA). Unless otherwise indicated, all cultures were grown in 96-well deep plates in 300-µl culture volumes at 34°C and 400 rpm. The above media are permissive for growth of the synthetic auxotroph. Nonpermissive media are identically formulated as permissive media except for BipA, which is not included.

Construction of appropriate fluidics and chambers followed the eVOLVER framework (26) (figs. S1 and S2). The following components were included: (i) fluidics and chambers (reactor vial, inlet and outlet lines, filters, pumps, stirrers, and inlet and outlet reservoirs); (ii) light source and detector (LED and photodiode); (iii) controller hardware (circuit and microprocessors); and (iv) controller software (Arduino for controlling tasks, Raspberry Pi for computing tasks, and Python code for programming tasks) (full bill of materials included in table S3). Briefly, our apparatus consisted of a custom smart sleeve (fig. S3), with the following modifications: Each vial was constructed without temperature control and was supplied by two media pumps (one for permissive media and another for nonpermissive media) and connected to one waste pump. All pumps were RP-Q1 from Takasago Fluidics, each driven off a standard N-channel power MOSFET (metal oxide semiconductor field-effect transistor) with an Arduino controlling the gate. Like the eVOLVER system, we installed a stirring fan underneath each sleeve that consisted of magnets attached to a computer fan. By including a small stir bar within each reactor vial, we enabled efficient mixing of 1-ml working volumes. To enable automated measurement of turbidity [optical density (OD)], we used a 605-nm LED (LO Q976-PS-25) and an OPT101P-J photodiode detector. We mounted the LED and detector on custom printed circuit boards attached to the vial sleeve to enable easier construction and better control of ambient light leakage into the light path (fig. S4). To monitor turbidity within each vial and to control pump arrays in response, we constructed printed circuit board designs in Gerber format as is standard for circuit fabrication. We attached an Arduino Mega microcontroller with an analog-digital converter and directed it using a PyMata script (47).

Chemostats were operated by automated maintenance of culture OD within a specified parameter range within exponential growth phase (20 to 80% of dynamic range) depending on linearity of photodiode measurements. Constant fixed dilutions of permissive media were used to decrease OD until the desired equilibrium of cell growth and dilution rates was reached. This resulted in a sawtooth curve (27), where the time between peaks is recorded as a proxy for growth rate. Our program gradually decreased the ratio of permissive to nonpermissive media as step functions, with a specified number of dilution cycles allowed to elapse before the next decrease to provide time for acclimation. Time between OD peaks lengthened as strain fitness decreased. Once a threshold difference between ancestral peak-to-peak time and current peak-to-peak time was passed, the ratio of permissive to nonpermissive media remained fixed. This allowed cells to evolve until peak-to-peak time returned to ancestral values, which initiated the next phase of decrease in BipA concentration. To assess the quality of our continuous evolution process, we paused chemostat trials on a weekly basis for strain storage, strain evaluation, chemostat cleaning, and investigation of contamination.
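A minimal sketch of this feedback loop, assuming hypothetical helper functions for OD readout, dilution, and media-ratio control, might look like the following; it is not the authors' published eVOLVER code, and the thresholds, acclimation counts, and polling interval are placeholder values chosen only to illustrate how peak-to-peak OD timing can gate the step-down in BipA supply.

```python
# Illustrative control-loop sketch; callables and constants are assumptions.
import time

OD_LOW, OD_HIGH = 0.2, 0.8             # fraction of photodiode dynamic range (assumed)
ANCESTRAL_PEAK_TO_PEAK = 3600.0        # s, baseline peak-to-peak time (assumed)
SLOWDOWN_FACTOR = 1.5                  # hold the current ratio if growth slows this much
CYCLES_PER_STEP = 10                   # dilution cycles allowed for acclimation (assumed)

def run_vial(read_od, dilute, set_media_ratio, ratios):
    """ratios: decreasing fractions of permissive (BipA-containing) media, e.g. [1.0, 0.5, 0.25]."""
    step, cycles_at_step = 0, 0
    last_peak = None
    set_media_ratio(ratios[step])
    while True:
        if read_od() >= OD_HIGH:                    # OD peak reached; time to dilute
            now = time.time()
            if last_peak is not None:
                peak_to_peak = now - last_peak      # proxy for growth rate
                cycles_at_step += 1
                growing_well = peak_to_peak < SLOWDOWN_FACTOR * ANCESTRAL_PEAK_TO_PEAK
                # Step the permissive fraction down only once growth is back near
                # ancestral rates and enough acclimation cycles have elapsed.
                if growing_well and cycles_at_step >= CYCLES_PER_STEP and step < len(ratios) - 1:
                    step, cycles_at_step = step + 1, 0
                    set_media_ratio(ratios[step])
            last_peak = now
            dilute(target_od=OD_LOW)
        time.sleep(30)                              # polling interval (arbitrary)
```

In this sketch the BipA-containing fraction only ever steps down, and only after the culture has matched near-ancestral peak-to-peak times for several dilution cycles, mirroring the acclimation periods described above.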

Growth assays were performed by plate reader with blanking as previously described (25). Overnight cultures were supplemented with different BipA concentrations depending on the strain. The DEP progenitor strain was grown in permissive media containing 100 µM BipA, and evolved DEP strains DEP.e3, DEP.e4, and DEP.e5 were grown in permissive media containing 1 µM BipA. Saturated overnight cultures were washed twice in LB and resuspended in LB. Resuspended cultures were diluted 100-fold into three 150-µl volumes of permissive media. BipA concentrations used in this assay were 0, 0.001, 0.01, 0.1, 0.5, 1, 10, and 100 µM. Cultures were incubated in a flat-bottom 96-well plate (34°C, 300 rpm). Kinetic growth (OD600) was monitored in a Biotek Eon H1 microplate spectrophotometer reader at 5-min intervals for 48 hours. The doubling times across technical replicates were calculated as previously indicated. We refer to these as technical replicates because although triplicate overnight cultures were used to seed triplicate experiment cultures, the overnight cultures were most often seeded from one glycerol stock.
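One common way to compute doubling times of this kind is a log-linear fit of OD600 over the exponential window, as in the hedged sketch below; the window bounds and use of a simple polynomial fit are assumptions for illustration, and the authors' own "previously indicated" procedure may differ.

```python
# Illustrative doubling-time calculation from plate-reader OD600 kinetics.
import numpy as np

def doubling_time(times_min, od600, od_min=0.02, od_max=0.4):
    """Return doubling time in minutes estimated from an OD600 time series."""
    t, od = np.asarray(times_min, float), np.asarray(od600, float)
    mask = (od > od_min) & (od < od_max)          # crude exponential-phase window (assumed bounds)
    if mask.sum() < 3:
        return np.nan                             # too few points to fit
    slope, _ = np.polyfit(t[mask], np.log(od[mask]), 1)   # ln(OD) = slope*t + intercept
    return np.log(2) / slope if slope > 0 else np.nan

# Normalization as in Fig. 2B (hypothetical data arrays): divide an isolate's
# doubling time at a given BipA concentration by DEP's doubling time at 100 uM BipA.
# relative = doubling_time(t, od_evolved) / doubling_time(t, od_dep_100uM)
```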

Escape assays were performed as previously described with minor adjustments to decrease the lower detection limit for final evolved populations (23, 46). Strains were grown in permissive media and harvested in late exponential phase. Cells were washed twice with LB and resuspended in LB. Viable CFU were calculated from the mean and SEM of three technical replicates of 10-fold serial dilutions on permissive media. Twelve technical replicates were plated on Noble agar combined with nonpermissive media in 500-cm² BioAssay Dishes (Thermo Fisher Scientific 240835) and monitored daily for 4 days. If synthetic auxotrophs exhibited escape frequencies above the detection limit (lawns) on nonpermissive media, escape frequencies were calculated from additional platings at lower density. The SEM across technical replicates of the cumulative escape frequency was calculated as previously indicated.
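The bookkeeping behind these assays is simple enough to show explicitly: escapee colonies on nonpermissive plates divided by total viable CFU plated, with the detection limit set by the frequency corresponding to a single colony. The plating numbers in the sketch below are placeholders chosen so the limit works out near the ~2.2 × 10⁻¹² per CFU figure quoted earlier; they are not the study's actual counts.

```python
# Illustrative escape-frequency and detection-limit arithmetic (placeholder numbers).
def escape_stats(escapee_colonies, cfu_per_plate, n_plates):
    total_cfu = cfu_per_plate * n_plates
    detection_limit = 1.0 / total_cfu             # frequency corresponding to one colony
    frequency = escapee_colonies / total_cfu if escapee_colonies else None
    return frequency, detection_limit

freq, limit = escape_stats(0, cfu_per_plate=3.8e10, n_plates=12)
print(freq, limit)   # None, ~2.2e-12 escapees per CFU when no escapees are observed
```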

Genomic DNA was obtained from evolved populations and the ancestral clone using the Wizard Genomic DNA purification kit (Promega). Sequencing libraries were prepared as described in Baym et al. (48). Sequencing was performed using a NextSeq instrument, producing 75-base pair (bp), paired-end reads. Resulting data were aligned to the E. coli C321.ΔA nonauxotrophic but recoded reference sequence (GenBank no. CP006698.1) and the sequence of the plasmid encoding nsAA incorporation machinery. The Millstone software suite was used to identify variants, provide measures of sequencing confidence, and predict their likelihood of altering gene function (49). Genomic variants of low confidence, low sequence coverage, or presence in the ancestral strain were discarded, prioritizing variants observed in three nonessential genes that encode membrane proteins: acrB, emrD, and trkH.
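The triage logic in that last step can be expressed as a short filtering routine; the sketch below uses assumed record fields and thresholds rather than Millstone's actual output format, and is meant only to make the discard-and-prioritize criteria concrete.

```python
# Illustrative variant triage: drop low-confidence, low-coverage, or ancestral
# calls, then flag hits in the transport genes of interest. Field names and
# thresholds are assumptions, not Millstone's schema.
GENES_OF_INTEREST = {"acrB", "emrD", "trkH"}

def triage(variants, ancestral_positions, min_qual=30, min_depth=10):
    """variants: iterable of dicts, e.g.
    {"pos": 123456, "gene": "acrB", "qual": 99, "depth": 42, "effect": "missense"}."""
    kept = []
    for v in variants:
        if v["qual"] < min_qual or v["depth"] < min_depth:
            continue                    # low confidence or low coverage
        if v["pos"] in ancestral_positions:
            continue                    # already present in the ancestral strain
        kept.append(dict(v, priority=v["gene"] in GENES_OF_INTEREST))
    return sorted(kept, key=lambda v: not v["priority"])   # prioritized genes first
```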

Subsequent genomic sequencing was performed on genomic DNA extracted from the evolved populations and ancestral clone using the DNeasy Blood and Tissue Kit (Qiagen). Genomic DNA was then sent to the Microbial Genome Sequencing Center (MiGS) in Pittsburgh, PA. Variants were identified through the variant calling service from MiGS.

MAGE (33) was used to inactivate the endogenous mutS gene in the DEP strain. Overnight cultures were diluted 100-fold into 3 ml of LB containing chloramphenicol, BipA, l-arabinose, and tris-HCl buffer and grown at 34°C until midlog. The genome-integrated lambda Red cassette in this C321.ΔA-derived strain was induced in a shaking water bath (42°C, 300 rpm, 15 min), followed by cooling the culture tube on ice for at least 2 min. The cells were made electrocompetent at 4°C by pelleting 1 ml of culture (8000 rcf, 30 s) and washing thrice with 1 ml of ice-cold 10% glycerol. Electrocompetent pellets were resuspended in 50 µl of dH2O containing the desired DNA; for MAGE oligonucleotides, 5 µM of each oligonucleotide was used. Allele-specific colony polymerase chain reaction (PCR) was used to identify desired colonies resulting from MAGE as previously described (50). Oligonucleotides used for MAGE and for allele-specific colony PCR are included in table S4.

This assay was performed using a similar protocol as described in the Measurement of doubling times section. The cultures for DEP and its single mutants were grown overnight in 100 µM BipA. Then, cultures were diluted 100-fold in the media specified. Those conditions include standard media conditions and single-component changes: 0% SDS, 0.01% SDS, 0.02% (w/v) arabinose, 0 mM tris-HCl, and chloramphenicol (30 µg/ml). The cultures were grown in triplicate for each condition in a SpectraMax i3 plate reader, shaking at 34°C for 24 hours. The OD600 was measured about every 5 min. The doubling times were then calculated as previously described.

HEK293T cells containing one copy of mCherry marker (red) integrated into the AAVS1 locus were grown at 40 to 50% confluency in DMEM (Dulbecco's modified Eagle's medium) high-glucose medium (Thermo Fisher Scientific, catalog no. 11965175) with 10% inactivated fetal bovine serum (FBS; Thermo Fisher Scientific, catalog no. 10082147), 100× MEM NEAA (nonessential amino acids; Thermo Fisher Scientific, catalog no. 11140050), and 100×-diluted anti-anti cocktail [antibiotic-antimycotic: penicillin (10,000 U/ml), streptomycin (10,000 µg/ml), and Gibco amphotericin B (25 µg/ml); Thermo Fisher Scientific, catalog no. 15240112]. Commercially acquired E. coli DH5α bacteria were used as a control for the E. coli DEP ΔmutS, or DEP*, strain. A plasmid encoding Clover (green marker), containing a UAA stop codon compatible with the biocontained strain DEP and carrying an ampicillin selection marker, was transformed into both DH5α and DEP* strains to visualize them alongside the mammalian cells (red). BipA-dependent auxotroph DEP* bacteria were grown to an OD of 0.6 in LB medium supplemented with 1% l-arabinose, 100 µM BipA, carbenicillin (100 µg/ml), and chloramphenicol (25 µg/ml) and then washed three times with 1× phosphate-buffered saline (PBS). DEP* culture conditions with l-arabinose, carbenicillin, and chloramphenicol supplements did slightly affect HEK293T early cell growth compared to untreated cells, although insufficiently to affect conclusions drawn from these experiments. The DH5α strain was grown to an OD of 0.6 with carbenicillin (100 µg/ml). The pellet of a 10-ml bacterial cell culture was resuspended in mammalian cell medium as described above without any antibiotics or anti-anti, and split equally among all conditions and their replicates. Auxotrophic bacteria were added to HEK293T cells plated in pretreated 12-well plates in 2 ml of mammalian cell medium. The coculture was incubated overnight before the medium containing the bacterial cells was removed. HEK293T cells were washed three times with 1× PBS (Thermo Fisher Scientific, catalog no. 10010023) and replenished with fresh media as the conditions indicate. Media were replaced and added fresh to all conditions daily for 7 days. Imaging of cells was done with the inverted Nikon Eclipse TS100 microscope at days 2, 4, and 7 after initial coculture at 200× magnification.

Conditions:

Control: HEK293T grown in regular 10% FBS media with anti-anti and NEAA as described above.

DH5α: HEK293T cells cocultured with this strain in mammalian cell media supplemented with carbenicillin (100 µg/ml) to maintain plasmid during growth and absence of anti-anti.

DH5α; anti-anti (antibiotic cocktail): HEK293T cells cocultured with this strain in mammalian cell media supplemented with carbenicillin (100 µg/ml) to maintain plasmid during growth and presence of anti-anti cocktail.

DH5α; anti-anti after day 2: HEK293T cells cocultured with this strain in mammalian cell media supplemented with carbenicillin (100 µg/ml) to maintain plasmid during growth and absence of anti-anti cocktail. At 48 hours, anti-anti was added and maintained to day 7.

DH5α; anti-anti; no anti-anti after day 2: HEK293T cells cocultured with this strain in mammalian cell media supplemented with carbenicillin (100 µg/ml) to maintain plasmid during growth and presence of anti-anti until day 2. After day 2, no anti-anti was added, and the culture was maintained to day 7.

DEP*: HEK293T cells cocultured with the biocontained strain in media supplemented with l-arabinose, chloramphenicol (25 µg/ml), and carbenicillin (100 µg/ml) to maintain bacteria and green marker. No BipA or anti-anti was added.

DEP*; BipA: HEK293T cells cocultured with the biocontained strain in media supplemented with l-arabinose, chloramphenicol (25 µg/ml), and carbenicillin (100 µg/ml) to maintain bacteria and green marker. One hundred micromolar BipA and no anti-anti were added.

DEP*; BipA after day 2: HEK293T cells cocultured with the biocontained strain in media supplemented with l-arabinose, chloramphenicol (25 µg/ml), and carbenicillin (100 µg/ml) to maintain bacteria and green marker. No BipA or anti-anti was added initially. At 48 hours, BipA at 100 µM concentration was added and maintained to day 7.

DEP*; anti-anti: HEK293T cells cocultured with the biocontained strain in media supplemented with anti-anti, l-arabinose, chloramphenicol (25 µg/ml), and carbenicillin (100 µg/ml) to maintain bacteria and green marker. No BipA was added.

DEP*; BipA; anti-anti: HEK293T cells cocultured with the biocontained strain in media supplemented with anti-anti, l-arabinose, chloramphenicol (25 µg/ml), and carbenicillin (100 µg/ml) to maintain bacteria and green marker. One hundred micromolar BipA was added.

Persistence was evaluated by two kinds of assays: plate reader and colony count. For the plate reader case, DEP, DEP.e3, DEP.e4, and DEP.e5 cultures were grown overnight in permissive media conditions with 100 µM BipA. For cells harvested at midexponential phase, the cultures were diluted 100-fold and grown to that state. Both stationary-phase and midexponential-phase cultures were then washed twice with LB media and resuspended in the original volume of nonpermissive media containing all specified media components except BipA. The resuspended cultures were then diluted 100-fold into nonpermissive media in triplicate for each time point to be tested. The specified concentration of BipA was then added back to those cultures at the specified time points. Typically, the BipA readdition occurred at 10 or 5 µM concentrations and at hourly or daily intervals. The cultures were then incubated with shaking in SpectraMax i3 plate readers in a flat, clear-bottom 96-well plate with a breathable and optically transparent seal for up to 84 hours at 34°C. Approximately every 5 min, the OD600 was measured to determine cell growth kinetics.


Science, industry team up in Italy to zap virus with laser – Reuters

A rendering of an air purifier prototype developed by Italian tech company Eltech K-Laser is seen in this image obtained by Reuters on June 30, 2021. Eltech K-Laser/Handout via REUTERS

ROME, July 2 (Reuters) - A United Nations-backed scientific research centre has teamed up with an Italian tech firm to explore whether laser light can be used to kill coronavirus particles suspended in the air and help keep indoor spaces safe.

The joint effort between the International Centre for Genetic Engineering and Biotechnology (ICGEB) of Trieste, a city in the north of Italy, and the nearby Eltech K-Laser company, was launched last year as COVID-19 was battering the country.

They created a device that forces air through a sterilization chamber which contains a laser beam filter that pulverizes viruses and bacteria.

"I thought lasers were more for a shaman rather than a doctor but I have had to change my mind. The device proved able to kill the viruses in less than 50 milliseconds," said Serena Zacchigna, group leader for Cardiovascular Biology at the ICGEB.

Healthy indoor environments with a substantially reduced pathogen count are deemed essential for public health in the wake of COVID-19, a respiratory infection that has caused more than four million deaths worldwide in barely 18 months.

Zacchigna hooked up with Italian engineer Francesco Zanata, the founder of Eltech K-Laser, a firm specialised in medical lasers whose products are used by sports stars to treat muscle inflammation and fractures.

Some experts have warned against the possible pitfalls of using light-based technologies to attack the virus that causes COVID-19.

A study published by the Journal of Photochemistry & Photobiology in November 2020 highlighted concerns ranging from potential cancer risks to the cost of expensive light sources.

But Zacchigna and Zanata dismissed any health issues, saying the laser never comes into contact with human skin.

"Our device uses nature against nature. It is 100% safe for people and almost fully recyclable," Zanata told Reuters.

The technology, however, does not eliminate viruses and bacteria when they drop from the air onto surfaces or the floor. Nor can it prevent direct contagion when someone who is infected sneezes or talks loudly in the proximity of someone else.

Eltech K-Laser has received a patent from Italian authorities and is seeking to extend this globally.

The portable version of the invention is some 1.8 metres (5.9 ft) high and weighs about 25 kg (55 lb). The company said the technology can also be placed within air-conditioning units.

In the meantime, the first potential customers are lining up, including Germany's EcoCare, a service provider of testing and vaccination solutions.

"The company aims to license the technology for German and UAE markets," an EcoCare spokesperson said in an email to Reuters.

Reporting by Giselda Vagnoni; Editing by Crispian Balmer, William Maclean

See the article here:
Science, industry team up in Italy to zap virus with laser - Reuters

ASU event to address human dignity and technoscience – ASU Now

September 24, 2019

The Pew Research Center has reported that more and more people identify themselves as spiritual but not religious. How can this be explained in our highly technoscientific age? Since technoscience is taken to be secular, how can we make sense of the relationship between our radical technoscientific advances and our search for spirituality?

A group of Arizona State University researchers will explore these and other questions through a project titled Beyond Secularization: A New Approach to Religion, Science and Technology, which has received a $1.7M grant from the Templeton Religion Trust.

The Center for the Study of Religion and Conflict will serve as the lead unit for this major interdisciplinary initiative that seeks to explore the underlying assumptions about science and technology research, exploring whether religious ideas shape scientific research directions and revealing new models for understanding ideas of progress.

Conflicts at the borders of religion, science and technology have been a major research area of the center since its inception in 2003. Partnering with Hava Tirosh-Samuelson, now a Regents Professor and director of Jewish studies, the center launched a faculty seminar in 2004 that met for almost 15 years. Several externally funded projects that grew out of the seminar supported a major lecture series, international research conferences and numerous publications.

All of this work positioned the center for this latest project, which has the potential to have a major impact in how we understand not only the interplay between religion, science and technology in public life, but also how we understand ideas and meanings of progress.

Beyond Secularization builds on a small pilot project that produced over 20 articles, including a cover story in the January issue of Sojourners magazine. It will establish a collaboratory that will include graduate students, postdocs and faculty who will develop and advance new research methods and understandings over the next three to four years.

To learn more about the subject, ASU Now sat down with Tirosh-Samuelson, Ben Hurlbut, School of Life Sciences associate professor, and Gaymon Bennett, School of Historical, Philosophical and Religious Studies associate director of research and associate professor, who will serve as co-principal investigators.

Question: What does the title of this project refer to?

Hurlbut: The project looks at the relationships between religion, science and technology in several important domains of public life: in environmental movements, in shifting ideas of the spiritual self that draw upon science, in arenas of high-technology innovation that are reshaping how we live and in the ways societies debate and govern the ethical implications of biotechnological transformation of life, including human life. We want to understand how science, technology and religion are related in those domains, including how lines are drawn between them. There is a pretty widespread assumption that as scientific knowledge and technological capacity increase, religion retreats into the background. And yet, if you look at how people think and talk, things are a lot messier. Go to Silicon Valley and you will encounter a lot of people who are imagining a technological future in terms of its potential to bring a kind of redemption and transcendence, a kind of eschatology. In other domains, like in public debate about biotechnologies, like human genome editing, there is a lot of drawing of lines between scientifically-grounded ethical views versus religious ones. But in all these areas, the boundaries are less clear than we tend to assume. They are a lot more mixed, a lot more hybrid, a lot fuzzier. And understanding that is important for how we think about the relationships between science, technology and religion in contemporary public life.

Q: How is this project unique?

Tirosh-Samuelson: The core work of this project will be done by a collaborative lab (co-lab, for short), which will include the three principal investigators, invited faculty, postdoctoral fellows and graduate students. This group will be studying together and will host visiting scholars from other universities around the world who will help enrich the discussion about big-picture questions. The work of the co-lab will be distinctly interdisciplinary, crossing boundaries between history, science and technology studies, religious studies, sociology and anthropology. Our basic conviction is that to understand the interplay between religion, science and technology, we need to pose new questions and engage new methods. The artificial dichotomy between science and religion is no longer valid, and even talking about a dialogue between religion and science is insufficient. We need to develop deeper ways to understand how these domains operate in our public life, and to do so, we must engage new disciplines that previously have not been applied to this field of inquiry. Since the project engages religion, science and technology in public life, it will have a public component, including public lectures that will involve the entire ASU community as well as an outreach program for people outside the ASU community, such as high-tech innovators in various innovation enclaves (e.g., Silicon Valley). The public aspect of the project exemplifies ASU's commitment to social embeddedness and to breaking down the boundaries between the academy and the community.

"Theres been this sort of assumption that as technology progresses, as knowledge progresses, we get less religious, we become more secular."

Ben Hurlbut, School of Life Sciences associate professor

Q: Why do we see such pronounced boundaries between the religious and the secular in academia?

Hurlbut: There's been this sort of assumption that as technology progresses, as knowledge progresses, we get less religious, we become more secular. That assumption has also been built into the way some fields study modern life, whether or not that actually corresponds with people's lived experience. So one of the things that we want to do is ask, "What are the things that we're overlooking?" Because we have operated in the social sciences, to a very significant degree, under the assumption that secularization is an inevitable result of modernization and progress, religion is either left behind or pushed to the side. It drops out of public life and becomes privatized. So, the disciplines have sort of carved themselves up in ways that are mapped onto assumptions about the world and knowledge that may not actually be correct.

Q: How have the boundaries between the religious and the secular changed over time?

Bennett: There's this widespread belief today that if you want to transform the world, you don't really need religion. You just need science and technology. And yet if you go someplace like Silicon Valley and you walk down Sand Hill Road and walk into a coffee shop and you sit and listen to innovators talk about what they're doing, they're all talking about transforming the most fundamental aspects of what it means to be human. And if you tune in closely, all sorts of topics that we used to associate with religion or spirituality are being talked about in relation to technology. Questions like: What does it mean to be a being with a finite body? Can we overcome our own frailty and even cure aging? What does it mean to be connected to other people and to our environments? What does it mean for us to build infrastructures in the world that promised to unite us together but have become the engine for so much division?

"When we study religious environmentalism, we have to think anew about terms such as 'secular,' 'religious,' 'worldliness' and 'otherworldliness.'"

Hava Tirosh-Samuelson, Regents Professor and director of Jewish studies

Q: What are some other areas where we see this happening?

Tirosh-Samuelson: The area that I work on is religious environmentalism. This movement emerged in the U.S. in the 1960s when people began to be aware of the ecological crisis. Interestingly, some of the scientists who were first to note the crisis were religious practitioners who considered the environmental crisis an assault on God's created world. The interreligious movement of religious environmentalism and the academic discourse on religion and ecology illustrate the porous boundaries between science and religion, or between the religious and secular aspects of life. For religious environmentalists, the natural world, or the environment, is not simply inert matter that can be known only through science, but rather the expression of divine creativity. When we study religious environmentalism, we have to think anew about terms such as 'secular,' 'religious,' 'worldliness' and 'otherworldliness.' Our analysis of religious environmentalism is not only historically grounded, it is also attentive to religious diversity and religious differences. The way we think about the relationship between religion and science reflects the legacy of Christianity. But other world religions, for example Judaism, Islam, Hinduism or Buddhism, approach these issues quite differently. In addition to religious diversity, we are going to interrogate the category of spirituality as a hybrid category that fuses the secular and the religious. We can see it in regard to environmentalism but also in other domains such as medicine and the wellness industry. But what does it mean to be spiritual but not religious, and how does spirituality express itself? We will seek to address these questions.

"What does it mean to alter a world our children will inherit?"

Gaymon Bennett, School of Historical, Philosophical and Religious Studies associate director of research

Q: How are these questions relevant to people's everyday lives?

Bennett: It's not incidental that the three research areas for this project are three of the major areas of collective crisis in the world today. On one level, these areas seem so timely, so current: the question of how bioengineering will transform our bodies, or how digital innovation will change our sense of ourselves. But on another level, these are really old, really fundamental questions: What does it mean to alter a world our children will inherit? How do our religious and spiritual views of reality shape what gets to count as important, or desirable, or dangerous? Our lives are saturated with science and technology. It's fundamentally changing how we relate to ourselves: our bodies, our planet, our food, our lovers, our sense of a higher reality. And then of course there's the environmental crisis and the question of what we modern people have done to our relationship with nature, whether it has intrinsic meaning and what that might be. All of these areas cut across time, place, culture and tradition, and are some of the most pressing issues that humanity is facing today.

Read more here:
ASU event to address human dignity and technoscience - ASU Now

Global CRISPR Gene Editing Market: Focus on Products, Applications, End Users, Country Data (16 Countries), and Competitive Landscape – Analysis and…

New York, Feb. 01, 2021 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Global CRISPR Gene Editing Market: Focus on Products, Applications, End Users, Country Data (16 Countries), and Competitive Landscape - Analysis and Forecast, 2020-2030" - https://www.reportlinker.com/p06018975/?utm_source=GNW

Application - Agricultural, Biomedical (Gene Therapy, Drug Discovery, and Diagnostics), Industrial, and Other Applications [Genetically Modified Foods (GM Foods), Biofuel, and Animal (Livestock) Breeding]

End-User - Academic Institutes and Research Centers, Biotechnology Companies, Contract Research Organizations (CROs), and Pharmaceutical and Biopharmaceutical Companies

Regional Segmentation

North America - U.S., Canada
Europe - Germany, France, Italy, U.K., Spain, Switzerland, and Rest-of-Europe
Asia-Pacific - China, Japan, India, South Korea, Singapore, Australia, and Rest-of-Asia-Pacific (RoAPAC)
Latin America - Brazil, Mexico, and Rest-of-Latin America
Rest-of-the-World

Growth Drivers

Prevalence of Genetic Disorders and Use of Genome Editing
Government and Private Funding
Technology Advancement in CRISPR Gene Editing

Market Restraints

CRISPR Gene Editing: Off-Target Effects and Delivery
Ethical Concerns and Implications with Respect to Human Genome Editing

Market Opportunities

Expanding Gene and Cell Therapy Area
CRISPR Gene Editing Scope in Agriculture

Key Companies Profiled

Abcam, Inc., Applied StemCell, Inc., Agilent Technologies, Inc., Cellecta, Inc., CRISPR Therapeutics AG, Thermo Fisher Scientific, Inc., GeneCopoeia, Inc., GeneScript Biotech Corporation, Horizon Discovery Group PLC, Integrated DNA Technologies, Inc., Merck KGaA, New England Biolabs, Inc., Origene Technologies, Inc., Rockland Immunochemicals, Inc., Synthego Corporation, System Biosciences LLC, ToolGen, Inc., Takara Bio

Key Questions Answered in this Report:
What is CRISPR gene editing?
What is the timeline for the development of CRISPR technology?
How did the CRISPR gene editing market evolve, and what is its scope in the future?
What are the major market drivers, restraints, and opportunities in the global CRISPR gene editing market?
What are the key developmental strategies that are being implemented by the key players to sustain this market?
What is the patent landscape of this market? What will be the impact of patent expiry on this market?
What is the impact of COVID-19 on this market?
What are the guidelines implemented by different government bodies to regulate the approval of CRISPR products/therapies?
How is CRISPR gene editing being utilized for the development of therapeutics?
How will the investments by public and private companies and government organizations affect the global CRISPR gene editing market?
What was the market size of the leading segments and sub-segments of the global CRISPR gene editing market in 2019?
How will the industry evolve during the forecast period 2020-2030?
What will be the growth rate of the CRISPR gene editing market during the forecast period?
How will each of the segments of the global CRISPR gene editing market grow during the forecast period, and what revenue will be generated by each of the segments by the end of 2030?
Which product segment and application segment are expected to register the highest CAGR for the global CRISPR gene editing market?
What are the major benefits of implementing CRISPR gene editing in different fields of application, including biomedical research, agricultural research, industrial research, gene therapy, drug discovery, and diagnostics?
What is the market size of the CRISPR gene editing market in different countries of the world?
Which geographical region is expected to contribute the highest sales of the CRISPR gene editing market?
What are the reimbursement scenario and regulatory structure for the CRISPR gene editing market in different regions?
What are the key strategies incorporated by the players of the global CRISPR gene editing market to sustain the competition and retain their supremacy?

Market Overview

The development of genome engineering, with its potential applications, has had a remarkable impact on the future of the healthcare and life science industry. The high efficiency of the CRISPR-Cas9 system has been demonstrated in various studies of genome editing, which has resulted in significant investments within the field of genome engineering.

However, there are several limitations which need consideration before clinical application. Further, many researchers are working on the limitations of CRISPR gene editing technology for better results.

The potential of CRISPR gene editing to alter the human genome and modify disease conditions is remarkable, but it comes with ethical and social concerns. The global CRISPR gene editing market was valued at $846.2 million in 2019 and is expected to reach $10,825.1 million by 2030, registering a CAGR of 26.86% during the forecast period.
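
As a sanity check on figures like these, the compound annual growth rate (CAGR) arithmetic is straightforward. The report's 26.86% presumably uses an unstated 2020 base value; the sketch below, using the 2019 value as a proxy base, gives a similar rate of roughly 26%.

    # Minimal sketch of the CAGR arithmetic (proxy base year; illustrative only)
    def cagr(start_value: float, end_value: float, years: int) -> float:
        """CAGR = (end / start) ** (1 / years) - 1"""
        return (end_value / start_value) ** (1 / years) - 1

    print(f"{cagr(846.2, 10_825.1, 2030 - 2019):.2%}")  # roughly 26.1% over 2019-2030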

The growth is attributed to increasing demand in the food industry for products with improved quality and nutrient enrichment, and in the pharmaceutical industry for targeted treatments for various diseases. Further, continued significant investments by healthcare companies to meet industry demand and the growing prominence of gene therapy procedures with shorter turnaround times are prominent factors propelling the growth of the global CRISPR gene editing market.

Research organizations, pharmaceutical and biotechnology industries, and institutes are looking for more efficient genome editing technologies to increase specificity and cost-effectiveness and to reduce turnaround time and human error. Further, the evolution of genome editing technologies has enabled a wide range of applications in various fields, such as industrial biotech and agricultural research.

These advanced methods are simple, highly efficient, and cost-effective, and provide multiplexing and high-throughput capabilities. The increase in the geriatric population and the increasing number of cancer cases and genetic disorders across the globe are expected to translate into significantly higher demand in the CRISPR gene editing market.

Furthermore, companies are investing heavily in the research and development of CRISPR gene editing products and gene therapies. The clinical trial landscape of various genetic and chronic diseases has been on the rise in recent years, and this will fuel the CRISPR gene editing market in the future.

Within the research report, the market is segmented based on product type, application, end-user, and region. Each of these segments covers a snapshot of the market over the projected years, the trajectory of market revenue, and underlying patterns and trends, based on analysis of the primary and secondary data obtained.

Competitive Landscape

The exponential rise in the application of precision medicine on a global level has created a buzz among companies to invest in the development of novel CRISPR gene editing products. Due to their diverse product portfolios and intense market penetration, Merck KGaA and Thermo Fisher Scientific Inc. have been the pioneers in this field and have been the major competitors in this market. The other major contributors to the market include companies such as Integrated DNA Technologies (IDT), Genscript Biotech Corporation, Takara Bio Inc., Agilent Technologies, Inc., and New England Biolabs, Inc.

Based on region, North America holds the largest share of the CRISPR gene editing market due to substantial investments made by biotechnology and pharmaceutical companies, improved healthcare infrastructure, a rise in per capita income, early availability of approved therapies, and the availability of state-of-the-art research laboratories and institutions in the region. Apart from this, the Asia-Pacific region is anticipated to grow at the fastest CAGR during the forecast period.

Countries Covered
North America - U.S., Canada
Europe - Germany, Italy, France, Spain, U.K., Switzerland, Rest-of-Europe
Asia-Pacific - China, India, Australia, South Korea, Singapore, Japan, Rest-of-Asia-Pacific
Latin America - Brazil, Mexico, Rest-of-Latin America
Rest-of-the-World

Read the full report: https://www.reportlinker.com/p06018975/?utm_source=GNW

About Reportlinker

ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

Here is the original post:
Global CRISPR Gene Editing Market: Focus on Products, Applications, End Users, Country Data (16 Countries), and Competitive Landscape - Analysis and...

Viewpoint: Promoting science with ideology Pro-GMO vegans use animal rights advocacy to boost vaccine, biotech acceptance – Genetic Literacy Project

The COVID-19 pandemic has reminded us that we are part of a living, evolving ecosystem and often at its mercy. Despite all our accomplishments as a species, a virus accidentally unleashed on the world has wrought enormous destruction around the globe, the effects of which we probably will not be able to fully assess for many years. Although we cannot always anticipate the damage an infectious disease will do, our best bet at surviving the fallout is a commitment to science-based policies that fuel the development of better preventative strategies, most importantly vaccines. The same lesson extends to most environmental and public health challenges we face.

To many people, though, a vaccine isn't a biological roadblock to the spread of infectious disease, but a scheme hatched by Big Pharma and their stooges in government to control humanity. It's appropriate to maintain some skepticism of corporations and the governments that regulate them; indeed, such critical thinking should be encouraged among consumers. Nevertheless, healthy skepticism and cynicism are not the same, and people must learn to distinguish the two if we are going to make progress in our never-ending battle against infectious disease and other maladies that threaten humanity.

While this sometimes seems like an impossible task to science advocates, the pro-GMO vegan community has illustrated how people with deep ideological commitments can embrace science, specifically crop biotechnology and vaccines, without compromising their personal beliefs.

If you want to convince someone to change their mind on a controversial issue, don't attack their worldview, which all but guarantees they will dismiss your arguments as a threat to their identity. This is a lesson Vegan GMO, a small community founded by friends with a passion for animal welfare, has taken to heart. Rather than attack the ideology of their target audience, the group uses their shared beliefs to encourage acceptance of crop biotechnology and vaccines in the broader vegan community.

Vegans sometimes oppose biotechnology because a particular application of the technology may be tested on animals or developed using animal products. This categorizes animals as property to be used for human benefit rather than as sentient, living beings - an outlook many vegans find abhorrent.

But vegans do not just make animal-welfare arguments; they often rely on anti-GMO misinformation, like the long-debunked link between consuming GM crops and developing liver and kidney problems. Popular veganism proponents such as retired activist Gary Yourofsky have also latched onto "playing God" arguments based on the assumption that natural food is better food. "God made a tomato perfectly when he created it. Leave it at that," he argued during a 2015 interview. "Stop altering tomatoes, stop altering everything on this planet. It's fine the way it was created."

Jayson Merkley, a pro-GMO vegan and fellow at Cornell University's Alliance for Science, says the answer to this sort of rhetoric is simple: stop testing GM crops on animals, which is sometimes required before a new product can enter the food supply. This simple change in the GM crop approval process would discourage vegans from repeating pseudo-scientific anti-GMO arguments to defend their position on animal welfare.

Read the original here:
Viewpoint: Promoting science with ideology Pro-GMO vegans use animal rights advocacy to boost vaccine, biotech acceptance - Genetic Literacy Project

A kidney transplant from a pig to a human has worked. What you need to know – World Economic Forum

Earlier this week, surgeons at New York University's Langone Transplant Institute successfully performed a pig kidney transplant. This in itself would be unremarkable. What marks the achievement as unprecedented is the identity of the donor: a genetically modified pig.

Some days post-surgery, the recipient, a brain-dead patient whose family consented to the experimental procedure, has not rejected the kidney and tests show that it is functioning normally. This incredible feat is significant both as a demonstration of scientific control over biological systems and as a beacon of hope to others in line for a transplant.

The idea of using other species for organ transplants is not new; we have used pig heart-valves for over 50 years. Yet whole organs have presented several challenges, most notably the risk of rejection. This occurs because the body believes the transplant is an invader that must be destroyed, leading to an immune response that attacks the organ. While the triggers for rejection are not completely understood, one of the biggest barriers to cross-species transplantation is a molecule known as alpha-gal, a carbohydrate that immediately elicits a massive immune response.

To counteract this, scientists used a powerful tool of genetic engineering, CRISPR, to modify the pig's genome so that it does not produce alpha-gal. CRISPR has existed for less than a decade, yet its ability to accurately cut and paste specific pieces of genomes is already leading to breakthroughs in many areas of biology, including in the development of COVID-19 vaccines.
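
For a sense of the kind of computation such editing involves, here is a toy sketch, using a made-up sequence and not the actual pig genome edit, of one early design step: Cas9 cuts next to an "NGG" PAM, so candidate 20-nucleotide target sites are those immediately followed by NGG.

    # Toy sketch (hypothetical sequence; illustrative only): find candidate Cas9
    # target sites, i.e. 20-nt protospacers immediately followed by an NGG PAM.
    import re

    def find_cas9_targets(seq: str):
        """Return (position, protospacer, PAM) for every 20-nt site followed by NGG."""
        hits = []
        for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq.upper()):
            hits.append((m.start(), m.group(1), m.group(2)))
        return hits

    example = "TTGACCATGGCTAGCTTAGGACCTAGCTAGGCTAGCTAAGGTTACGG"  # made-up sequence
    for pos, protospacer, pam in find_cas9_targets(example):
        print(pos, protospacer, pam)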

At present, over 100,000 people in the United States are awaiting an organ donation, among whom 83%, ~91,000, are in need of a kidney. Though 54% of US citizens are registered organ donors, less than 1% of deaths result in useable organs, so demand will always outstrip supply.

Consequently, wait times for a kidney can range from four months to six years depending on blood type, geographic location, disease severity, immune system activity, and other factors. Most of those on the waiting list must have their blood cleaned via hemodialysis, a process that entails commuting to a dialysis centre and spending four hours a day, three times a week, attached to a machine simply to stay alive. The longer they are on dialysis, the smaller their chance of a successful kidney transplant becomes as the procedure can only partially compensate for the damaged organ.

Every year, 5,000 people die waiting for a transplant and another 5,000 are removed from the list because they are no longer healthy enough to receive it, meaning that only 65% of those placed on transplant lists will receive a kidney in time. This latest development could prove to be a gamechanger.

But there will be difficult questions about the ethics of modifying other species to fit our needs, and the event may spark further dialogue on the conditions pigs and other animals are currently raised in. There are also still many unanswered questions surrounding the efficacy of cross-species transplantation. Can pig kidney transplants to humans save lives? Well, before we get to an answer, more robust, longer-term trials will have to take place.

Yet the significance of this pig kidney transplant demonstration should not be underestimated this is a momentous step towards saving the lives of tens of thousands of people awaiting a transplant, not to mention the half a million with kidney failure who do not even qualify because of scarcity. It also speaks to the potential of biotechnology more broadly to transform the health outcomes of millions of people.

Written by

Cameron Fox, Project Specialist, Shaping the Future of Health and Healthcare, World Economic Forum

The views expressed in this article are those of the author alone and not the World Economic Forum.

Continued here:
A kidney transplant from a pig to a human has worked. What you need to know - World Economic Forum

SAB Biotherapeutics Debuts as Publicly Traded Next-Generation Immunotherapy Company – BioSpace

Completed Business Combination with Big Cypress Acquisition Corp.

Common stock to commence trading on the Nasdaq Global Market October 25, 2021, under the ticker symbol SABS

SIOUX FALLS, S.D., Oct. 25, 2021 (GLOBE NEWSWIRE) -- SAB Biotherapeutics, Inc. (Nasdaq: SABS) (SAB), a clinical-stage biopharmaceutical company with a novel immunotherapy platform that produces specifically targeted, high-potency, fully-human polyclonal antibodies without the need for human donors, today announced the completion of its business combination with Big Cypress Acquisition Corp. (Nasdaq: BCYP) (Big Cypress), a publicly traded special purpose acquisition company (SPAC) focused on innovative biopharmaceutical firms. The common stock and warrants of the resulting combined company, SAB Biotherapeutics, Inc., will commence trading on the Nasdaq Global Market (the NASDAQ) under the ticker symbols SABS and SABSW, respectively, on October 25, 2021.

The stockholders of Big Cypress approved the transaction at a Special Meeting held on October 20, 2021. The transaction was previously approved by SAB's shareholders. SAB's management team will be led by Eddie Sullivan, Ph.D., Co-Founder, President, and Chief Executive Officer, who previously served as President and Chief Executive Officer. Samuel J. Reich, formerly Chief Executive Officer and Chief Financial Officer of Big Cypress, and Jeffrey G. Spragens, Big Cypress Chairman of the Board of Directors, will join the SAB Board of Directors, with Mr. Reich assuming the role of Executive Chairman.

"We are excited to enter the public markets at such a pivotal time when next-generation immunotherapies like ours are essential in driving improvement in the global health landscape. We extend our gratitude to the Big Cypress team for being our partner in driving our vision of developing scalable and highly potent polyclonal antibody therapies," said Dr. Eddie Sullivan. "We would also like to thank the SAB team, as well as our new and existing shareholders, who are making our important work possible. The SAB team is committed to progressing our science and expanding the reach of our unique DiversitAb platform, now as a public company."

Dr. Sullivan added, "We look forward to reporting clinical data from a number of our programs in the coming months. SAB expects to announce topline clinical data for our seasonal influenza program before the end of the year, and we expect to report clinical data from our NIH-sponsored COVID-19 clinical trials as soon as it becomes available."

"SAB's innovative and versatile DiversitAb platform and talented team bring a unique approach to the development of immunotherapies, which is why we chose them as our merger partner," said Samuel Reich. "Our experienced biopharmaceutical team was initially impressed by the ability of SAB's platform to produce high-potency, fully-human polyclonal antibodies with the potential to address a variety of serious diseases with high unmet medical need. In the few months since we announced our intention to merge, SAB has achieved multiple significant milestones, reinforcing our confidence in their ability to execute and deliver on the promise of their technology. I'm delighted to be joining the SAB team to advance the company's clinical programs and business strategy, with the goal of building a differentiated biopharmaceutical company committed to creating shareholder value and having a significant positive impact on human health."

Summary of Transaction

On June 22, 2021, SAB entered into a definitive business combination agreement with Big Cypress Acquisition Corp., a blank check company focused on innovative biopharmaceutical firms, which was created for the purpose of entering into a business combination with a selected biopharmaceutical company and bringing the combined entity to the NASDAQ.

The gross proceeds from the transaction after redemptions and before deducting transaction fees are approximately $30 million. SAB intends to use the proceeds from the transaction to progress its pipeline programs and to augment its recent $60.5 million award from the government, in addition to approximately $27 million remaining from previous awards.

Recent Developments Demonstrate Momentum Across Key Initiatives

Advisors

Lazard served as exclusive financial advisor to SAB. Stradling Yocca Carlson & Rauth is serving as legal counsel to SAB. Chardan served as exclusive M&A advisor and financial advisor to Big Cypress and Dentons US LLP is serving as legal counsel. Ladenburg Thalmann & Co. Inc. is acting as a capital markets advisor to Big Cypress.

About SAB Biotherapeutics, Inc.

SAB Biotherapeutics, Inc. (SAB) is a clinical-stage, biopharmaceutical company advancing a new class of immunotherapies leveraging fully human polyclonal antibodies. SAB has applied advanced genetic engineering and antibody science to develop transchromosomic (Tc) Bovine that produce fully-human antibodies targeted at specific diseases, including infectious diseases such as COVID-19 and influenza, immune system disorders including type 1 diabetes and organ transplantation, and cancer. SAB's versatile DiversitAb platform is applicable to a wide range of serious unmet needs in human diseases. It produces natural, specifically targeted, high-potency, human polyclonal immunotherapies. SAB is currently advancing multiple clinical programs and has collaborations with the US government and global pharmaceutical companies. For more information on SAB, visit: https://www.sabbiotherapeutics.com and follow @SABBantibody on Twitter.

Contact:
Melissa Ullerich
+1 605-679-4609
mullerich@sabbiotherapeutics.com

Courtney Turiano (investors)
Stern IR
212-698-8687
Courtney.Turiano@sternir.com

Forward-Looking Statements Certain statements made herein that are not historical facts are forward-looking statements for purposes of the safe harbor provisions under The Private Securities Litigation Reform Act of 1995. Forward-looking statements generally are accompanied by words such as believe, may, will, estimate, continue, anticipate, intend, expect, should, would, plan, predict, potential, seem, seek, future, outlook and similar expressions that predict or indicate future events or trends or that are not statements of historical matters. These forward-looking statements include, but are not limited to, statements regarding future events. These statements are based on the current expectations of SAB and are not predictions of actual performance. These forward-looking statements are provided for illustrative purposes only and are not intended to serve as, and must not be relied on, by any investor as a guarantee, an assurance, a prediction or a definitive statement of fact or probability. Actual events and circumstances are difficult or impossible to predict, will differ from assumption and are beyond the control of SAB.

Read the original here:
SAB Biotherapeutics Debuts as Publicly Traded Next-Generation Immunotherapy Company - BioSpace

Top 4 Applications of Genetic Engineering

The following points highlight the top four applications of genetic engineering. The applications are: 1. Application in Agriculture 2. Application to Medicine 3. Energy Production 4. Application to Industries.

An important application of recombinant DNA technology is to alter the genotype of crop plants to make them more productive, nutritious, rich in proteins, disease resistant, and less fertilizer consuming. Recombinant DNA technology and tissue culture techniques can produce high yielding cereals, pulses and vegetable crops.

Some plants have been genetically programmed to yield high protein grains that could show resistance to heat, moisture and diseases.

Some plants may even develop their own fertilizers; some have been genetically transformed to make their own insecticides. Through genetic engineering, some varieties have been produced that can directly fix atmospheric nitrogen, so there is no dependence on fertilizers.

Scientists have developed transgenic potato, tobacco, cotton, corn, strawberry, and rapeseed plants that are resistant to insect pests and certain weedicides.

The bacterium Bacillus thuringiensis produces a protein which is toxic to insects. Using the techniques of genetic engineering, the gene coding for this toxic protein, called the Bt gene, has been isolated from the bacterium and engineered into tomato and tobacco plants. Such transgenic plants showed resistance to tobacco hornworms and tomato fruitworms. These genotypes are awaiting release in the USA.

There are certain weed killers which are not specific to weeds alone but kill useful crops as well. Glyphosate is a commonly used weed killer which inhibits a particular essential enzyme in weeds and other crop plants. A target gene of glyphosate is present in the bacterium Salmonella typhimurium, and a mutant of S. typhimurium is resistant to glyphosate.

The mutant gene was first cloned into E. coli and then recloned into Agrobacterium tumefaciens through its Ti plasmid. Infection of plants with a Ti plasmid containing the glyphosate-resistance gene has yielded crops such as cotton, tobacco and maize, all of which are resistant to glyphosate.

This makes it possible to spray crop fields with glyphosate, which will kill only the weeds, while the genetically modified crops with resistance genes remain unaffected.

Recently Calgene, a biotech company, has isolated a bacterial gene that detoxifies the side effects of herbicides. Transgenic tobacco plants resistant to tobacco mosaic virus (TMV) and tomato plants resistant to golden mosaic virus have been developed by transferring virus coat protein genes into susceptible plants. These are yet to be released.

The gene transfer technology can also play a significant role in producing new and improved varieties of timber trees.

Several species of microorganisms have been produced that can degrade toxic chemicals and could be used for killing harmful pathogens and insect pests.

For using genetic engineering techniques to transfer foreign genes into host plant cells, a number of genes have already been cloned, and complete libraries of the DNA and mtDNA of pea are now known.

Some of the cloned genes include:

(i) Genes for phaseolin of French bean,

(ii) Genes for leghaemoglobin of soybean,

(iii) Genes for the small sub-unit of RuBP carboxylase of pea, and

(iv) Genes for storage proteins in some cereals.

Efforts are being made to improve several agricultural crops using various techniques of genetic engineering which include:

(i) Transfer of nitrogen fixing genes (nif genes) from leguminous plants into cereals.

(ii) Transfer of resistance against pathogens and pests from wild plants to crop plants.

(iii) Improvement in quality and quantity of seed proteins.

(iv) Transfer of genes for animal proteins to crop plants.

(v) Elimination of unwanted genes for susceptibility to different diseases from cytoplasmic male sterile lines in crops like maize, where cytoplasmic male sterility and susceptibility are located in the mitochondrial plasmid.

(vi) Improvement of photosynthetic efficiency by reassembling nuclear and chloroplast genes and by the possible conversion of C3 plants into C4 plants.

(vii) Development of cell lines which may produce nutritious food in bioreactors.

Genetic engineering has been gaining importance over the last few years and it will become more important in the current century as genetic diseases become more prevalent and agricultural area is reduced. Genetic engineering plays significant role in the production of medicines.

Microorganisms and plant-based substances are now being manipulated to produce large amounts of useful drugs, vaccines, enzymes and hormones at low cost. Genetic engineering is also concerned with the study of the inheritance pattern of diseases in man and the collection of human genes that could provide a complete map for the inheritance of healthy individuals.

Gene therapy, by which healthy genes can be inserted directly into a person with malfunctioning genes, is perhaps the most revolutionary and most promising aspect of genetic engineering. The use of gene therapy has been approved in more than 400 clinical trials for diseases such as cystic fibrosis, emphysema, muscular dystrophy and adenosine deaminase deficiency.

Gene therapy may someday be exploited to cure hereditary human diseases like haemophilia and cystic fibrosis, which are caused by missing or defective genes. In one type of gene therapy, new functional genes are inserted by genetically engineered viruses into the cells of people who are unable to produce certain hormones or proteins needed for normal body functions.

Introduction of new genes into an organism through recombinant DNA technology essentially alters its protein makeup and, ultimately, its body characteristics.

Vaccines:

Recombinant DNA technology is also used in the production of vaccines against diseases. A vaccine contains a form of an infectious organism that does not cause severe disease but does cause the body's immune system to form protective antibodies against the infective organism. Vaccines are prepared by isolating the antigen, or protein, present on the surface of viral particles.

When a person is vaccinated against a viral disease, the antigens induce antibodies that act against the viral proteins and inactivate them. With recombinant DNA technology, scientists have been able to transfer the genes for some viral sheath proteins to the vaccinia virus, which was used against smallpox.

Vaccines produced by gene cloning are contamination-free and safe because they contain only coat proteins against which antibodies are made. A few vaccines are being produced by gene cloning, e.g., vaccines against viral hepatitis, influenza, herpes simplex virus, and virus-induced foot-and-mouth disease in animals.

Hormones:

Until recently, the hormone insulin was extracted only in limited quantities from the pancreases of cows and pigs. The process was not only costly, but the hormone sometimes caused allergic reactions in some diabetic patients.

The commercial production of insulin through biogenetic, or recombinant DNA, technology was started in 1982, and the medical use of the hormone insulin was approved by the Food and Drug Administration (FDA) of the USA in 1982.

The human insulin gene has been cloned in large quantities in the bacterium E. coli, which can be used for the synthesis of insulin. Genetically engineered insulin is commercially available as Humulin.

Lymphokines:

Lymphokines are proteins which regulate the immune system in the human body; interferon is one example. Interferon is used to fight viral diseases such as hepatitis, herpes and the common cold, as well as cancer. Such drugs can be manufactured in bacterial cells in large quantities.

Lymphokines can also be helpful for AIDS patients. Genetically engineered interleukin-2, a substance that stimulates the multiplication of lymphocytes, is also available and is being currently tested on AIDS patients.

Somatostatin:

Somatostatin, a fourteen-amino-acid polypeptide hormone synthesized by the hypothalamus, was previously obtained only in small quantities from human cadavers. Somatostatin, used as a drug for certain growth-related abnormalities, appears to be species-specific, and the polypeptide obtained from other mammals has no effect on humans, hence its extraction from the hypothalamus of cadavers.

Genetic engineering has enabled the chemical synthesis of the somatostatin gene, which is joined to pBR322 plasmid DNA and cloned into a bacterium. The transformed bacterium is converted into a somatostatin-synthesising factory. ADA (adenosine deaminase) deficiency is a disease like the combined immune deficiency which killed the "bubble boy" David in 1984.

Children with ADA deficiency die before they are two years old. In one treatment, bone marrow cells removed from the child's body were infected with a harmless virus into which the ADA gene had been inserted.

Erythropoietin, a genetically engineered hormone, is used to stimulate the production of red blood cells in people suffering from severe anaemia.

Production of Blood clotting factors:

Normally, a heart attack is caused when coronary arteries are blocked by cholesterol or a blood clot. Plasminogen is a substance found in blood clots. Genetically engineered tissue plasminogen activator (tPA) enzyme dissolves blood clots in people who have suffered heart attacks. The plasminogen activator protein, produced by the Genentech company, is so potent and specific that it may even arrest a heart attack underway.

Cancer:

Cancer is a dreaded disease. Antibodies cloned from a single source and targeted against a specific antigen (monoclonal antibodies) have proved very useful in cancer treatment. Monoclonal antibodies have been tagged with radioactive elements or cytotoxins like ricin from castor seed to make them more deadly. Such antibodies seek out cancer cells and specifically kill them with their radioactivity or toxin.

Recombinant DNA technology has tremendous scope in energy production. Through this technology it is now possible to bioengineer energy crops, or biofuels, that grow rapidly to yield a huge biomass that can be used as fuel or processed into oils, alcohols, diesel, or other energy products.

The waste from these can be converted into methane. Genetic engineers are trying to transfer the gene for cellulase to suitable organisms, which could be used to convert wastes like sawdust and cornstalks first to sugar and then to alcohol.

Genetically designed bacteria are put into use for generating industrial chemicals. A variety of organic chemicals can be synthesised at large scale with the help of genetically engineered microorganisms. Glucose can be synthesised from sucrose with the help of enzymes obtained from genetically modified organisms.

Nowadays, with the help of genetic engineering, strains of bacteria and cyanobacteria have been developed which can synthesize ammonia on a large scale, and this can be used in the manufacture of fertilisers at much cheaper cost. Microbes are also being developed which will help in the conversion of cellulose to sugar and of sugar to ethanol.

Recombinant DNA technology can also be used to monitor the degradation of garbage, petroleum products, naphthalene and other industrial wastes.

For example, the bacterium Pseudomonas fluorescens, genetically altered by transfer of the light-producing enzyme luciferase found in the bacterium Vibrio fischeri, produces light in proportion to its naphthalene-degrading activity, which provides a way to monitor the efficiency of the process.

Maize and soybeans are extensively damaged by the black cutworm. Pseudomonas fluorescens is found in association with maize and soybeans. Bacillus thuringiensis contains a gene whose product is toxic to the pest. The pest has, over the years, not only become dangerous to the crops but has developed resistance to a number of pesticides.

When the gene from B. thuringiensis (Bt) was cloned into Pseudomonas fluorescens and the bacterium inoculated into the soil, it was found that the genetically engineered P. fluorescens could cause the death of cutworms.

Continued here:
Top 4 Applications of Genetic Engineering

What is gain of function research in genetics? – Cosmos Magazine

It's the rumour that won't go away: that SARS-CoV-2 was accidentally leaked from a high-biosecurity lab in Wuhan, China. The allegation is that the laboratory was conducting gain of function (GOF) research, and that this produced a potent version of coronavirus that led to the pandemic.

This has led to some scepticism and distrust of the field of research, and to questions about whether it is necessary to conduct experiments using GOF techniques.

Essentially, GOF research is used to learn how viruses gain new functions through mutation and evolution.

A function is simply a property of an organism, such as plants that are more tolerant to drought or disease, or enzymes that evolved to make our bodies work.

The language about GOF has become loaded with negative connotations that associate this work with dangerous or risky research. But like rhetoric about genetic modification, these connections don't represent the diversity of the field or the security precautions that regulate the research. At its core, though, the research does exactly what the name suggests.

GOF research observes these mutations and sees how certain stimuli might affect evolutionary changes and properties of a virus or organism.

However, in our current climate it's often spoken about in a much narrower context, as though it's specifically about how a virus changes to move more easily between humans, or how viruses become more lethal. This just doesn't represent the full picture of GOF research.

Viruses evolve rapidly - that's why there are so many new SARS-CoV-2 variants. GOF seeks to understand why and how these changes occur, and what environmental factors might influence the process.

In a sense, this is a know-your-enemy approach.

Beyond the benefit to fundamental biology research about the nature of viruses and evolution, GOF contributes to three clear areas: pandemic preparedness, vaccine development, and identification of new or potential pathogens.

GOF research can help us understand the rate at which mutations occur, and how many generations may be needed for a virus to change in a way that will require extra precautions in the community, which is information that is fed into epidemiological modelling.

This GOF information helps predict things such as how likely a virus is to become a nasty variant in a certain population size or density, during a certain season, or within a particular period of time. This informs how we react to a pandemic. Beyond this, it also informs how quickly a virus might mutate to overcome vaccines, and provides genetic information that may be useful in vaccine development. Specifically, GOF research can accumulate potential vaccine candidates in a database that can be accessed if an outbreak occurs because of natural evolution.

In turn, this means vaccine development can be sped up exponentially because candidates are already available.

For instance, a report from a 2015 GOF risk-assessment workshop for expert organisations summarised genomic information from GOF research. It showed that bat-borne, SARS-like coronaviruses had many strains and mutations with pandemic potential, against which countermeasures needed to be developed.

This information fed into current pandemic responses and vaccine development - in a sense, the pandemic was already predicted because of a thorough understanding of the evolution of coronaviruses.

In another example, GOF experiments about influenza showed that the virus had the potential to be transmitted between different mammals with only a few changes to the genetic code, and has contributed to seasonal flu vaccines.

GOF research is based on observed evolution and changes to DNA or RNA.

The genome is the sum of all the genetic information in an organism. Some of this DNA or RNA is made up of genes, which often hold information on how to make a protein. These proteins perform functions in our body to make everything work.

These genes can naturally change a bit every generation. This happens because, to reproduce, the DNA of the parent must be replicated. The mechanisms that do this arent perfect, so little mistakes can be made when the DNA is copied.

Most of the time, the changes are tiny just a single unit of DNA (called a nucleotide) could be changed, and it may have no effect on the proteins made. At other times, the tiny change of a single nucleotide can make a gene gain a whole new function, which could be beneficial to an organism.
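
As a toy illustration of this idea, and not a model of any real virus, the sketch below copies a short made-up sequence each generation and lets each base mutate with a small, arbitrary probability, mimicking imperfect replication.

    # Toy illustration only (assumed error rate, made-up sequence): copy a DNA
    # sequence each "generation" and let each base mutate with a small probability.
    import random

    BASES = "ACGT"

    def replicate(seq: str, error_rate: float = 0.01) -> str:
        """Return a copy of seq in which each base mutates with probability error_rate."""
        return "".join(
            random.choice([b for b in BASES if b != base])
            if random.random() < error_rate else base
            for base in seq
        )

    random.seed(0)
    genome = "ATGGCTAGCTTAGGA"
    for generation in range(5):
        genome = replicate(genome)
        print(generation + 1, genome)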

Natural mutations that occur during reproduction are one example of evolution in action.

These changes happen every generation, so organisms that can breed quickly, such as flies, can also evolve quickly as a species.

This process happens in essentially the same way with viruses, except that viruses have RNA instead of DNA and reproduce asexually. They still make proteins, and they still accumulate mutations, but the major difference is that they can reproduce very, very fast - they can start reproducing within hours of being born and evolve at an exceptionally rapid rate.

This is why we have identified so many new variants of SARS-CoV-2 since the beginning of 2020. Every time the virus enters a new host, it reproduces rapidly, and mutations occur. Over time these mutations change the properties of the virus itself.

For example, new mutations may end up making the virus more virulent or cause worse symptoms because the proteins have changed their properties.

In these cases, we would say that the mutant strain has gained a function, and this is what GOF research aims to understand.

The viruses in a lab don't have a human host in which to grow, so researchers grow them in Petri dishes or animals instead.

There are two ways of using GOF in a lab: you can observe the virus mutate on its own (without intervention), or you can control small changes through genetic modification.

The first type of use involves putting the virus in different situations to see how it will evolve without intervention or aid.

A widely shared video experiment is an example of GOF research with bacteria (not a virus, but the method is similar). The researchers put bacteria onto a giant Petri dish with bands of different antibiotic concentrations, then leave the bacteria and watch how they naturally evolve to overcome the antibiotic.

The new strains of bacteria were then genetically sequenced to see which genetic changes had caused them to become antibiotic-resistant. This kind of experiment can show how quickly the bacteria evolve, which can inform when or how often antibiotics are given, and whether there is a high enough antibiotic concentration to halt the speed at which resistance develops.
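
A crude way to convey the stepwise nature of that process is the toy sketch below (assumed numbers, not a model of the experiment described above): bacteria can only grow in a band whose antibiotic concentration is below their resistance level, so the population advances across the plate in steps as resistance mutations arise and spread.

    # Toy sketch (arbitrary parameters; illustrative only): a population colonises
    # successive antibiotic bands as rare resistance-boosting mutations accumulate.
    import random

    random.seed(1)
    bands = [0.0, 1.0, 10.0, 100.0, 1000.0]   # antibiotic concentration per band
    resistance = 0.5                          # resistance level of the dominant strain
    colonised = 1                             # bands already colonised (drug-free band)

    for generation in range(1, 501):
        if random.random() < 0.02:            # a resistance mutation appears and spreads
            resistance *= 10
        while colonised < len(bands) and resistance > bands[colonised]:
            colonised += 1
            print(f"gen {generation}: colonised band {colonised} (conc {bands[colonised - 1]})")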

Similar experiments can be conducted with viruses to see how they might change to overcome human antibodies and other immune system protections.

Read more: What happens in a virology lab?

The second type of use is through small changes using genetic modification. This type of experiment occurs after a lot of other genetic information has already been gathered to identify which nucleotides in virus RNA might particularly contribute to a new function.

After these have been identified, a single or small nucleotide change will be made to the virus to confirm the predictions gained from genomic research. The modified virus will then be placed on a petri dish or inserted into an animal, such as a rabbit or a mouse, to see how the change affects the properties of the virus.

This type of research is done in specialised laboratories that are tightly controlled and heavily regulated under biosecurity laws that involve containment and decontamination processes.

Read more: How are dangerous viruses contained in Australia?

While the benefits of virus GOF research centre around pandemic preparedness, concerns have been raised about whether the research is ethical or safe.

In 2005, researchers used this technique for viruses when they reconstructed influenza (H1N1) from samples taken in 1918. The aim was to learn more about the properties of influenza and future pandemics, as influenza still circulates, but the controversial study sparked heavy debate about whether it should be acceptable.

The two major concerns are about whether this poses any threat to public health if a virus escapes the lab, or whether the techniques could be used for nefarious purposes.

In the past year, 16 years after the H1N1 study, there has been debate about whether SARS-CoV-2 had spontaneous zoonotic origins, or whether it was created in a lab in GOF experiments, and then escaped.

So now this speculation has pushed GOF research back into the public eye and led to many criticisms of the research field, and of the regulation of laboratories that use this technique.

In 2017, the US government lifted bans on GOF pathogen research after the National Institutes of Health concluded that the risks of research into influenza and MERS were outweighed by the benefits, and that few posed significant threats to public health.

Following concerns about the origins of SARS-CoV-2, however, the rules surrounding GOF research, risk assessments and disclosure of experiments are now under review again, in order to clarify policy.

Read more: The COVID lab-leak hypothesis: what scientists do and dont know

Beyond this, the speculation has sparked further inquiries into the origin of SARS-CoV-2, although the World Health Organization concluded that viral escape from a laboratory was very unlikely.

Regardless, it's never a bad thing to review biosafety, biosecurity and transparency policy as new evidence becomes available, and such policies have been reviewed frequently throughout history.

As for the concern that a government or private entity might abuse scientific techniques for malevolent purposes, scientists can, and do, support bans on research they deem ethically irresponsible, such as the controversial CRISPR babies.

Ultimately, deciding how scientific techniques like GOF are used, and by whom, is not a scientific question, but one that must be answered by ethicists.

Visit link:
What is gain of function research in genetics? - Cosmos Magazine

Biotech fit for the Red Planet – Newswise

Newswise – NASA, in collaboration with other leading space agencies, aims to send its first human missions to Mars in the early 2030s, while companies like SpaceX may do so even earlier. Astronauts on Mars will need oxygen, water, food, and other consumables. These will need to be sourced from Mars, because importing them from Earth would be impractical in the long term. In Frontiers in Microbiology, scientists show for the first time that Anabaena cyanobacteria can be grown with only local gases, water, and other nutrients, and at low pressure. This makes it much easier to develop sustainable biological life support systems.

"Here we show that cyanobacteria can use gases available in the Martian atmosphere, at a low total pressure, as their source of carbon and nitrogen. Under these conditions, cyanobacteria kept their ability to grow in water containing only Mars-like dust and could still be used for feeding other microbes. This could help make long-term missions to Mars sustainable," says lead author Dr Cyprien Verseux, an astrobiologist who heads the Laboratory of Applied Space Microbiology at the Center of Applied Space Technology and Microgravity (ZARM) of the University of Bremen, Germany.

Low-pressure atmosphere

Cyanobacteria have long been targeted as candidates to drive biological life support on space missions, as all species produce oxygen through photosynthesis while some can fix atmospheric nitrogen into nutrients. A difficulty is that they cannot grow directly in the Martian atmosphere, where the total pressure is less than 1% of Earth's - 6 to 11 hPa, too low for the presence of liquid water - while the partial pressure of nitrogen gas - 0.2 to 0.3 hPa - is too low for their metabolism. But recreating an Earth-like atmosphere would be expensive: gases would need to be imported, while the culture system would need to be robust - hence, heavy to freight - to resist the pressure differences: "Think of a pressure cooker," Verseux says. So the researchers looked for a middle ground: an atmosphere close to Mars's which allows the cyanobacteria to grow well.

To find suitable atmospheric conditions, Verseux et al. developed a bioreactor called Atmos (for "Atmosphere Tester for Mars-bound Organic Systems"), in which cyanobacteria can be grown in artificial atmospheres at low pressure. Any input must come from the Red Planet itself: apart from nitrogen and carbon dioxide, gases abundant in the Martian atmosphere, and water which could be mined from ice, nutrients should come from "regolith", the dust covering Earth-like planets and moons. Martian regolith has been shown to be rich in nutrients such as phosphorus, sulphur, and calcium.

Anabaena: versatile cyanobacteria grown on Mars-like dust

Atmos has nine 1 L vessels made of glass and steel, each of which is sterile, heated, pressure-controlled, and digitally monitored, while the cultures inside are continuously stirred. The authors chose a strain of nitrogen-fixing cyanobacteria called Anabaena sp. PCC 7938, because preliminary tests showed that it would be particularly good at using Martian resources and helping to grow other organisms. Closely related species have been shown to be edible, suitable for genetic engineering, and able to form specialized dormant cells to survive harsh conditions.

Verseux and his colleagues first grew Anabaena for 10 days under a mixture of 96% nitrogen and 4% carbon dioxide at a pressure of 100 hPa - ten times lower than on Earth. The cyanobacteria grew as well as under ambient air. Then they tested the combination of the modified atmosphere with regolith. Because no regolith has ever been brought from Mars, they used a substrate developed by the University of Central Florida (called "Mars Global Simulant") instead to create a growth medium. As controls, Anabaena were grown in standard medium, either at ambient air or under the same low-pressure artificial atmosphere.
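As a quick sanity check on the numbers quoted above, the short sketch below (illustrative only, not part of the published study) converts the mole fractions of the Atmos test atmosphere into partial pressures and compares them with an approximate Martian ambient atmosphere. The Martian composition used here (roughly 95% CO2 and about 3% N2) is an assumed approximation, not a figure taken from the article.

def partial_pressures(total_hpa, mole_fractions):
    # partial pressure of each gas = total pressure x its mole fraction
    return {gas: total_hpa * fraction for gas, fraction in mole_fractions.items()}

# The Atmos test atmosphere described above: 100 hPa total, 96% N2, 4% CO2
print(partial_pressures(100, {"N2": 0.96, "CO2": 0.04}))
# -> {'N2': 96.0, 'CO2': 4.0} in hPa: ample nitrogen and carbon dioxide,
#    at a tenth of Earth's ~1013 hPa surface pressure

# An approximate Martian ambient atmosphere (assumed composition): 8 hPa total
print(partial_pressures(8, {"CO2": 0.95, "N2": 0.03}))
# -> about 7.6 hPa of CO2 and roughly 0.24 hPa of N2, consistent with the
#    0.2-0.3 hPa of nitrogen quoted earlier: too little nitrogen, at too low
#    a total pressure, for the cyanobacteria to grow unassisted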

The cyanobacteria grew well under all conditions, including in regolith under the nitrogen- and carbon dioxide-rich mixture at low pressure. As expected, they grew faster on standard medium optimized for cyanobacteria than on Mars Global Simulant, under either atmosphere. But this is still a major success: while standard medium would need to be imported from Earth, regolith is ubiquitous on Mars. "We want to use as nutrients resources available on Mars, and only those," says Verseux.

Dried Anabaena biomass was ground, suspended in sterile water, filtered, and successfully used as a substrate for growing E. coli bacteria, proving that sugars, amino acids, and other nutrients can be extracted from the cyanobacteria to feed other bacteria, which are less hardy but tried-and-tested tools for biotechnology. For example, E. coli could be engineered more easily than Anabaena to produce some food products and medicines on Mars that Anabaena cannot.

The researchers conclude that nitrogen-fixing, oxygen-producing cyanobacteria can be efficiently grown on Mars at low pressure under controlled conditions, with exclusively local ingredients.

Further refinements in the pipeline

These results are an important advance. But the authors caution that further studies are necessary: "We want to go from this proof-of-concept to a system that can be used on Mars efficiently," Verseux says. They suggest fine-tuning the combination of pressure, carbon dioxide, and nitrogen optimal for growth, while testing other genera of cyanobacteria, perhaps genetically tailored for space missions. A cultivation system for Mars also needs to be designed:

"Our bioreactor, Atmos, is not the cultivation system we would use on Mars: it is meant to test, on Earth, the conditions we would provide there. But our results will help guide the design of a Martian cultivation system. For example, the lower pressure means that we can develop a more lightweight structure that is more easily freighted, as it won't have to withstand great differences between inside and outside," concludes Verseux.


The project was funded by the Alexander von Humboldt Foundation.

Read the original here:
Biotech fit for the Red Planet - Newswise

Science, industry team up in Italy to zap COVID with laser – New York Post

ROME, July 2 – A United Nations-backed scientific research centre has teamed up with an Italian tech firm to explore whether laser light can be used to kill coronavirus particles suspended in the air and help keep indoor spaces safe.

The joint effort between the International Centre for Genetic Engineering and Biotechnology (ICGEB) of Trieste, a city in the north of Italy, and the nearby Eltech K-Laser company, was launched last year as COVID-19 was battering the country.

They created a device that forces air through a sterilization chamber which contains a laser beam filter that pulverizes viruses and bacteria.

"I thought lasers were more for a shaman rather than a doctor, but I have had to change my mind. The device proved able to kill the viruses in less than 50 milliseconds," said Serena Zacchigna, group leader for Cardiovascular Biology at the ICGEB.

Healthy indoor environments with a substantially reduced pathogen count are deemed essential for public health in the wake of the COVID-19 crisis; the respiratory infection has caused more than four million deaths worldwide in barely 18 months.

Zacchigna hooked up with Italian engineer Francesco Zanata, the founder of Eltech K-Laser, a firm specialised in medical lasers whose products are used by sports stars to treat muscle inflammation and fractures.

Some experts have warned against the possible pitfalls of using light-based technologies to attack the virus that causes COVID-19.

A study published by the Journal of Photochemistry & Photobiology in November 2020 highlighted concerns ranging from potential cancer risks to the cost of expensive light sources.

But Zacchigna and Zanata dismissed any health issues, saying the laser never comes into contact with human skin.

"Our device uses nature against nature. It is 100% safe for people and almost fully recyclable," Zanata told Reuters.

The technology, however, does not eliminateviruses and bacteria when they drop from the air onto surfaces or the floor. Nor can it prevent direct contagion when someone who is infected sneezes or talks loudly in the proximity of someone else.

Eltech K-Laser has received a patent from Italian authorities and is seeking to extend this globally.

The portable version of the invention is some 1.8 metres (5.9 ft) high and weighs about 55 lb (25 kg). The company said the technology can also be placed within air-conditioning units.

In the meantime, the first potential customers are lining up, including Germany's EcoCare, a service provider of testing and vaccination solutions.

The company aims to license the technology for German and UAE markets, an EcoCare spokesperson said in an email to Reuters.

See the rest here:
Science, industry team up in Italy to zap COVID with laser - New York Post

Will We Ever Fully Understand Humans Impact on Nature? – The Nation

Elizabeth Kolbert. (Photo by John Kleiner)

To say that Earth is in crisis is an understatement. Atmospheric warming, ocean warming, ocean acidification, sea-level rise, deglaciation, desertification, eutrophication – these are just some of the by-products of our species's success that journalist Elizabeth Kolbert warns us about in her new book, Under a White Sky: The Nature of the Future. Kolbert has been studying the consequences of humanity's impact on Earth for decades as a contributor to The New Yorker and as the author of such books as the 2015 Pulitzer Prize-winning The Sixth Extinction, an exploration of the concept of extinction that posits mankind as a cataclysm as great as the asteroid that annihilated the dinosaurs.

In Under a White Sky, Kolbert ponders the nature of the future by examining a new pattern she attributes to the recursive logic of the Anthropocene: human interventions attempting to answer for past human interventions in the environment. The book chronicles the casualties of short-sighted human meddling with the planet and its resources and the present-day efforts being made to address that meddling – or, as Kolbert puts it, efforts to control the control of nature. Interviews with scientists in a wide array of disciplines – climate scientists, climate entrepreneurs, biologists, glaciologists, and geneticists – reveal a trend of projects aiming to transform nature in order to save it. From the Mojave to lava fields in Iceland, Kolbert takes readers on a globe-spanning journey to explore these projects while weighing their pros, cons, and ethical implications (the book's title refers to the way the sky could be bleached of color as a potential side effect of solar geoengineering, one of the proposed interventions to combat global warming). The issue, at this point, Kolbert writes, is not whether we're going to alter nature, but to what end?

I spoke to Kolbert over the phone the day after President Joe Biden's inauguration. We talked about what it's like to write a book about a big question you don't yet have the answer to, and what it will take to undo the environmental damage incurred during the Trump years.

Naomi Elias

Naomi Elias: You describe Under A White Sky as a book about people trying to solve problems created by people trying to solve problems. Can you explain that a little?

Elizabeth Kolbert: The pattern that I'm looking at in the book is ways in which humans have intervened in – or, if you prefer, mucked around with – the natural world and then have decided that the consequences are bad and are now looking for new forms of intervention to try to solve those problems. I start with the example of the Chicago River, which was reversed in an extraordinary engineering project at the beginning of the 20th century. The Chicago River used to flow east into Lake Michigan, which also happened to be Chicago's only source of drinking water. All of Chicago's human and animal waste flowed into Lake Michigan and there were constant outbreaks of typhoid and cholera. So Chicago decided, "Well, we really have to do something about it," and what they did was this incredible engineering project, and now it flows basically to the southwest and eventually into the Mississippi, and all of Chicago's waste flows in the same direction. When the canal that reversed the river was put into place, there was a headline in The New York Times that ran something like, "Water Flows in the Chicago River Again." It was so thick with muck that people joked a chicken could walk across it without getting its feet wet. That created a big problem that connected two huge drainage systems, the Great Lakes drainage system and the Mississippi drainage system, that has now led to all these species, including many invasive species, crossing from one basin into the other. It was having bad effects on the ecology of both systems, so to try to prevent these species from crossing from one basin to the other, they've now electrified a significant chunk of this canal. So that's an intervention, as it were, on top of an intervention, and that is really the pattern that the book explores.

NE: The book visits project sites in Iceland, Australia, New Orleans, and the California desert. What drew you to the projects you write about?

EK: The first project that got me started down this whole path was the super coral project, which is currently in Hawaii and partly in Australia. As the oceans warm, corals are having a lot of trouble surviving. We get these coral-bleaching events that I'm sure people have heard about. Some scientists were looking at how we can save coral reefs, and the idea they came up with was that we need to intervene and try to coax along evolution so that these creatures can survive climate change. That struck me as a really interesting project, and got me thinking about this question of, "Can we intervene to redress our own interventions?" Once I started seeing that pattern, I started to see it everywhere. I could have gone to many different parts of the world and written stories that made the same point, but the projects that I went to were emblematic in some way. They were taking on different issues like climate change or invasive species, the loss of wetlands – the list goes on.

NE: Did any of these efforts – be it the Harvard team trying to combat global warming by firing diamonds into the stratosphere or the group looking to reduce rodent populations with genetic manipulation – convince you that our best chance of averting climate apocalypse really is to control the control of nature? Are we digging ourselves out of a hole or just digging a deeper one?

EK: You know, you have identified the question at the center of the book. That is a question that I don't claim to answer. I'm not a prophet. I'm really trying to tease out that question in the book. Look at it, and have some fun with it, to be honest, and get people to think about the pattern. In many cases, these solutions are working to a certain extent. New Orleans would not exist without massive human intervention to solve the problems of water. In New Orleans – a city that's essentially significantly below sea level – it turns out you need flooding to keep the land from subsiding even further, because that's actually what built the land: the flooding that dropped a lot of sediment across the Mississippi Delta over many millennia. Are you getting into a trap when you pile these interventions on top of each other? Do you have alternatives? These are the big questions of our time.


NE: I'd like to talk about your feelings about the popular phrase for the geological epoch we're living in, the Anthropocene. In 2017 you gave a lecture at Manhattan's New School, in which you said, Thinking scientifically about man's place in the world used to mean acknowledging our insignificance. This new human-centered term, the age of man, completely upends that. Can you talk about your feelings about the term and what it means for how we think about our relationship to the Earth?

EK: I think we are at this interesting turning point that's on some level been the subject of all the books I've written, and a lot of the articles as well. We first decentered humans, right? It wasn't that the sun revolved around the Earth, it was that the Earth revolved around the sun. There's a lot of these discoveries that have proved people are not the center of the universe, but then we get to the present moment, where we do have to acknowledge that we are becoming the dominant force in many very essential ways. We have to acknowledge that and, on some level, take responsibility for that. This term, the Anthropocene, is kind of a shorthand for all the ways that humans are affecting the Earth on what is sometimes called a geological scale. We are changing the carbon cycle very dramatically, we're changing the nitrogen cycle, we're acidifying the ocean. We've even got to the point where we regularly cause earthquakes. We are definitely driving evolution; we are probably driving speciation. We are at this moment of tremendous human impact and we need to rise to that challenge of thinking about what we want the world to look like now that we are such a dominant force.

NE: In the book, you take note of the way the scientists you speak to encode a sense of moral urgency into their analysis of the climate crisis, which is something I feel present in contemporary climate reporting too. You've been on the climate beat for decades. Have you felt a shift in the work? Do you feel like you now have an agenda when you write?

EK: There's definitely been a shift in the sense that, when I started out almost 20 years ago, there was still, among a lot of pretty knowledgeable people, a lot of confusion. What is climate change? Is it real? Do I have to worry about it? The conversation has moved dramatically, at least in a big chunk of the US and a big chunk of the world. But I do not consider myself an advocate. I'm a journalist and I try to report stories that I think illuminate the situation that we're in. I've thought about, you know, "Should I be writing some sort of prescriptive journalism?" But that's not really me.

NE: At the end of that same 2017 lecture you conclude, "We are the fate of Earth." You call humans ethical agents and say that we're failing as ethical agents if we don't acknowledge our impact.

EK: Yes, I certainly stand by those words. I mean, this book is on the one hand grappling with, on the other hand sort of playing around with, those questions. Our impact on the planet and the untold number of other species with whom we share the planet – and whom we frankly don't spend a lot of time thinking about, and don't even understand to a great extent – I think will come to be seen as one of the great tragedies and the great ethical failings of humanity.

NE: So many of the things you discuss in the book were set in motion long before the 2016 election, but it's hard to overstate what a setback the last four years of the Trump administration have been for the climate. An analysis from The New York Times cites over 100 environmental protections Trump reversed concerning areas like wetland and wildlife protection, and air and water pollution. In his inauguration speech yesterday, President Biden talked about answering the cry for survival Earth was letting out, and he immediately signed an order to rejoin the Paris climate accord. I'm wondering what your thoughts are on what your job is going to look like under the Biden administration, and if you think we dodged some kind of metaphorical asteroid?

EK: I think what Trump did was egregious. It was an attempt to set us off completely on the wrong trajectory. It's a very complicated situation legally, because now a lot of regulations will have to be rewritten. It's going to occupy the EPA for years, unfortunately. That's very sad and just a waste of time and of human effort, when we should be doing a lot of other things. But, you know, there are great forces at work here and fortunately some of those continue to go in the right direction, like the tremendous decrease in prices of wind power and solar power that continued despite Donald Trump's best efforts to try to undermine renewable power. One could spend the next four years doing nothing but looking at the legal ins and outs of trying to undo that, and I think that that would be a noble thing to do. What I'm thinking about are – I don't want to call them bigger questions, but they're the questions of our human impact on the planet, which are not going to change because Joe Biden suddenly rejoined the Paris Agreement, unfortunately.

View original post here:
Will We Ever Fully Understand Humans Impact on Nature? - The Nation

Geoengineering: What could possibly go wrong? Elizabeth Kolbert’s take, in her new book – Bulletin of the Atomic Scientists

Editor's note: This story was originally published by Grist. It appears here as part of the Climate Desk collaboration. Elizabeth Kolbert is a former member of the Science and Security Board of the Bulletin of the Atomic Scientists.

In Australia, scientists collect buckets of coral sperm, mixing one species with another in an attempt to create a new super coral that can withstand rising temperatures and acidifying seas. In Nevada, scientists nurse a tiny colony of one-inch-long Devils Hole pupfish in an uncomfortably hot, Styrofoam-molded pool. And in Massachusetts, Harvard University scientists research injecting chemicals into the atmosphere to dim the sun's light – and slow down the runaway pace of global warming.

These are some of the scenes from Elizabeth Kolbert's new book, Under a White Sky, a global exploration of the ways that humanity is attempting to engineer, fix, or reroute the course of nature in a climate-changed world. (The title refers to one of the consequences of engineering the Earth to better reflect sunlight: Our usual blue sky could turn a pale white.)

Kolbert, a New Yorker staff writer, has been covering the environment for decades: Her first book, Field Notes from a Catastrophe, traced the scientific evidence for global warming from Greenland to Alaska; her second, The Sixth Extinction, followed the growing pace of animal extinctions.

Under a White Sky covers slightly different ground. Humanity is now, Kolbert explains, in the midst of the Anthropocene – a geologic era in which we are the dominant force shaping earth, sea, and sky. Faced with that reality, humans have gotten more creative at using technology to fix the problems that we unwittingly spawned: Stamping out Australia's cane toad invasion with genetic engineering, for example, or using giant air conditioners to suck carbon dioxide out of the air and turn it into rock. As Kolbert notes, tongue-in-cheek: What could possibly go wrong?

This interview has been condensed and lightly edited for clarity.

Osaka: Under a White Sky is about a lot of things – rivers, solar geoengineering, coral reefs – but it's also about what nature means in our current world. What got you interested in that topic?

Kolbert: All books have complicated births, as it were. But about four years ago, I went to Hawaii to report on a project that had been nicknamed the super coral project. And it was run by a very charismatic scientist named Ruth Gates, who very sadly passed away about two years ago. We have very radically altered the oceans by pouring hundreds of billions of tons of carbon dioxide into the air – and we can't get that heat out of the oceans in any foreseeable timescale. We can't change the chemistry back. And if we want coral reefs in the future, we're going to have to counter what we've done to the oceans by remaking reefs so they can withstand warmer temperatures. The aim of the project was to see if you could hybridize or crossbreed corals to get more vigorous varieties.

This idea – that we have to counteract one form of intervention in the natural world (climate change) with another form of intervention (trying to recreate reefs) – just struck me as a very interesting new chapter in our long and very complicated relationship with nature. And once I started to think about it that way, I started to see that as a pretty widespread pattern. That's really what prompted the book.

Osaka: Some of these human interventions to save nature seem hopeful and positive – and others go wrong in pretty epic ways. How do you balance those two types of stories?

Kolbert: The book starts with examples that probably will strike many readers as "Okay, that makes sense. That makes sense." But it goes from regional engineering solutions through biotechnology, through gene editing, and all the way up to solar geoengineering. So it kind of leads you down what we might call a slippery slope. And one of the interesting things about these cases is that they will divide up people differently. Even people who consider themselves environmentalists will come down on different sides of some of these technologies. The bind we're in is so profound that there's no right answer.

Osaka: So someone who accepts what we're doing to save the Devils Hole pupfish might not necessarily accept gene-editing mosquitos or dimming the sun through solar geoengineering.

Kolbert: Exactly. And I think sometimes those lines seem clearer than they are once you start to think about it.

Osaka: At one point in the book, there's a quote that is (apocryphally) attributed to Einstein: "We cannot solve our problems with the same thinking we used when we created them." But you don't say whether you agree with that sentiment or not. Is that on purpose?

Kolbert: Yeah, you can read the book and say, "I'm really glad people are doing these things, and I feel better." Or you can read the book and say, as one scientist quoted in the book does, "This is a broad highway to hell." And both of those are very valid reactions.

Osaka: When you write about geoengineering, you point out that many scientists conclude that it's necessary to avoid catastrophic levels of warming, but that it could also be a really bad idea. Do you think that in 15 or 20 years you'll be writing about a geoengineering experiment gone wrong, much as you're writing now about failed attempts to protect Louisiana from flooding?

Kolbert: I might argue about the timescales. I'm not sure I'll be reporting on it in 15 years, but I think you might be reporting on it in 30 years.

At the moment, it's still the realm of sci-fi, and I'm not claiming to have any particular insight into how people are going to respond in the future. But the case that's made in the book by some very smart scientists is that we don't have very many tools in our toolbox for dealing with climate change quickly, because the system has so much inertia. It's like turning around a supertanker: It takes literally decades, even if we do everything absolutely right.

Osaka: You've reported on climate change for a long time. How does it feel to see geoengineering being explored as a more valuable – and potentially necessary – option?

Kolbert: Well, one thing I learned in the course of reporting the book was that what we now refer to as geoengineering was actually the very first thing that people started to think about when they first realized we were warming the climate. The very first report about climate change that was handed to Lyndon Johnson in 1965 wasn't about how we should stop emitting – it was: Maybe we should find some reflective stuff to throw into the ocean to bounce more sunlight back into space!

It's odd, it's kind of almost freakish, and I can't explain it, except to say that it sort of fits the pattern of the book.

Osaka: There's been a longstanding fight in environmentalism between a technology-will-save-us philosophy and a return-to-nature philosophy. Based on the reporting in this book, do you think that the technology camp has won?

Kolbert: I think the book is an attempt to take on both of those schools of thought. On some level, technology has won – even people who would say don't do geoengineering still want to put up solar panels and build huge arrays of batteries, and those are technologies! But where does that leave us? It goes back to Ruth Gates and the super coral project. There was a big fight among coral biologists about whether a project like that should even be pursued. The Great Barrier Reef is the size of Italy – even if you have some replacement coral, how are you going to get them out on the reef? But Gates's point was, we're not returning. Even if we stopped emitting carbon dioxide tomorrow, you're not getting the Great Barrier Reef back as it was in a foreseeable timeframe.

My impulse as an old-school environmentalist is to say, Well, let's just leave things alone. But the sad fact is that we've intervened so much at this point that even not intervening is itself an intervention.

Osaka: Now that we have a US president who takes climate change seriously, do you think we could actually start cutting carbon emissions quickly?

Kolbert: I really do want to applaud the first steps that the Biden administration has taken. I think they show a pretty profound understanding of the problem. But the question, and it's a big one, is What are the limits? Will Congress do anything? What will happen in the Supreme Court? The United States is no longer the biggest emitter on an annual basis, but on a cumulative basis we're still the biggest. And we still don't have resolution on how much carbon dioxide we can put up there to avoid 1.5 or 2 degrees Celsius (2.7 or 3.6 degrees Fahrenheit) of warming. Those are questions with big error bars. If we're lucky, I think we can avoid disastrous climate change. But if we're not lucky, we're already in deep trouble.

Osaka: Is there anything else you want to say about the book?

Kolbert: It sounds kind of weird after our conversation, but the book was actually a lot of fun to write. It sounds odd when you're talking about a book where the subject is so immensely serious.

Osaka: You mean like when the undergraduates in Australia are tossing each other buckets of coral sperm?

Kolbert: Yes! There is always humor in all these situations. I hope that sense of fun comes through.

See more here:
Geoengineering: What could possibly go wrong? Elizabeth Kolbert's take, in her new book - Bulletin of the Atomic Scientists

Genomics and genre – Science

If the double helix is an icon of the modern age, then the genome is one of the last grand narratives of modernity, writes Lara Choksey in her new book, Narrative in the Age of the Genome. Hybridizing literary criticism with a genre-spanning consideration of a dozen distinct literary works, and imbued throughout with deep concern for the peripheral, the possible, and the political, the book seeks to challenge the whole imaginative apparatus for constructing the self into a coherent narrative, via the lexicon and syntax of the molecular.

To a reading of Richard Dawkins's The Selfish Gene (1976) as a repudiation of class struggle and E. O. Wilson's Sociobiology (1975) as a defense of warfare, Choksey juxtaposes another kind of ambiguous heterotopia in which genetic engineering is a tool of neoliberal self-fashioning. In Samuel R. Delany's Trouble on Triton (1976), Bron, a transgender ex-gigolo turned informatics expert, is caught between sociobiology and the selfish gene, between the liberal developmentalism of progressive evolution, and the neoliberal extraction and rearrangement of biological information. Even the undulating interruptions and parentheticals of Bron's thoughts [mimic] the description of the activation and silencing of genes, she suggests, tying together gene and genre in a way that encapsulates neoliberal alienation.

Choksey next explores the ways in which collectivist fantasies of biological reinvention under Soviet Lysenkoism fused code and cultivation through a close reading of Arkady and Boris Strugatsky's Roadside Picnic (1972) in which cultivated utopian dreamworlds become contaminated by alien forces, resulting in fundamental ecological transformations beyond the promised reach of human control. The novel brings to light not forgotten Soviet utopias but literal zombies and mutations. In a world where planned cultivation fails entirely in the face of the unfamiliar, even as new biological weapons are being developed, Earth itself viscerally reflects a fractured reality of lost promisesa world in crisis with all meaning gone, and survival itself a chancy proposition.

Framed as a family history, The Immortal Life of Henrietta Lacks is actually a horror story, argues Choksey.

As the promise of precision medicine emerged, so too did new forms of memoir. In Kazuo Ishiguro's Never Let Me Go (2005) and the film Gattaca (1997), for example, the traditional aspirational narrative of a pilgrim's progress is subverted: As the unitary subject disappears into data, algorithms, and commodities, a new grammar of existence emerges, albeit one in which the inherited problems of the pastracism, ableism, and the fiction of heteronormativityremain ever-present.

In Saidiya Hartman's Lose Your Mother (2006) and Yaa Gyasi's Homegoing (2016), Choksey sees a reorientation of genomics away from the reduction of self to code and toward new forms of kinship and belonging that offer a reckoning with the histories of brutalization and displacement upon which liberal humanism is founded. Even as genomics seeks to locate the trauma of enslavement at the level of the molecular, communities seeking reunion and reparation know that technology alone cannot do the cultural work of caring for history that narrative can offer.

Reading Rebecca Skloot's The Immortal Life of Henrietta Lacks (2010) as a biography of Black horror which tries, time and again, to resolve itself as family romance, Choksey identifies the perils of narratives unable to recognize their own genre. She argues that by blurring the lines not between fact and fiction but between horror and family history, the dehumanization of Black lives as experimental biomatter echoes inescapably with larger histories of the extraction of Black flesh for the expansion of colonial-capitalist production.

What emerges as most compelling out of this entire tapestry of readings is the author's interpretation of the limits and failures of the extraordinary cultural power of the genome. Concluding that genomics has privileged a particular conception of the human that is in the process of being reconfigured, Choksey ventures that the uncomplicated subject, the Vitruvian Man of the Human Genome Project, has reached its end. What is left is neither dust, stardust, nor a face erased in the sand (as Foucault would have it) but rather whatever might emerge next from the unwieldy kaleidoscope of possible meanings.

Originally posted here:
Genomics and genre - Science

Could the Immune System Hold the Key To Alzheimer’s Disease? – The Wire Science

Photo: Matt Artz.

For nearly 30 years, the hunt for a cure for Alzheimer's disease has focused on a protein called beta-amyloid. Amyloid, the hypothesis goes, builds up inside the brain to bring about this memory-robbing disorder, which afflicts some 47 million people worldwide.

Billions of dollars have poured into developing therapies aimed at reducing amyloid thus far, to no avail. Trials of anti-amyloid treatments have repeatedly failed to help patients, sparking a reckoning among the field's leaders.

All along, some researchers have toiled in the relative shadows, developing potential strategies that target other aspects of cells that go awry in Alzheimer's: molecular pathways that regulate energy production, or clean up cellular debris, or regulate the flow of calcium, an ion critical to nerve cell function. And increasingly, some of these scientists have focused on what they suspect may be another, more central factor in Alzheimer's and other dementias: dysfunction of the immune system.

With the field's thinking narrowed around the amyloid hypothesis, immunological ideas have struggled to win favour and funding. "There was no traction," says Malú Tansey, a University of Florida neuroscientist whose work focuses on immunology of the brain. The committees that review grant applications didn't want to hear about immunological studies, she says.

But over the past decade, the immune system connection to Alzheimer's has become clearer. In several massive studies that analysed the genomes of tens of thousands of people, many DNA variants that were linked to heightened Alzheimer's risk turned out to be in genes involved in immunity – specifically, a branch of the body's defences known as the innate immune system. This branch attacks viruses, bacteria and other invaders quickly and indiscriminately. It works, in part, by triggering inflammation.

A further connection between inflammation and Alzheimer's turned up in March 2020, in an analysis of electronic health records from 56 million patients, including about 1.6 million with rheumatoid arthritis, psoriasis and other inflammatory diseases. When researchers searched those records for Alzheimer's diagnoses, they found that patients taking drugs that block a key molecular trigger of inflammation, called tumour necrosis factor (TNF), had about 50 to 70% lower odds of having an Alzheimer's diagnosis than patients who were prescribed those drugs but did not take them.
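For readers unfamiliar with how a figure like "50 to 70% lower odds" is derived, the sketch below works through the arithmetic of an odds ratio using a deliberately invented 2x2 table; the counts are purely hypothetical and are not taken from the health-records study described above.

def odds_ratio(treated_cases, treated_without, untreated_cases, untreated_without):
    # odds of a diagnosis among patients taking the drug,
    # divided by the odds among patients prescribed it but not taking it
    odds_treated = treated_cases / treated_without
    odds_untreated = untreated_cases / untreated_without
    return odds_treated / odds_untreated

# Hypothetical counts, chosen only to illustrate the calculation:
or_tnf = odds_ratio(treated_cases=30, treated_without=9_970,
                    untreated_cases=100, untreated_without=9_900)
print(f"odds ratio = {or_tnf:.2f} -> about {(1 - or_tnf) * 100:.0f}% lower odds")
# prints: odds ratio = 0.30 -> about 70% lower odds

In the real study, such raw odds ratios would also be adjusted for age, sex and other confounders, which this bare-bones calculation does not attempt.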

"This newer wave of studies opened people's eyes to the idea that the immune system might be a major driver of Alzheimer's pathology," says Sharon Cohen, a behavioural neurologist who serves as medical director at the Toronto Memory Program in Canada. Over time, Cohen says, researchers began thinking that maybe inflammation is not just an aftereffect, but actually a pivotal, early effect.

Tansey is trying to harness this growing realisation to develop new therapies. A drug she helped to develop nearly 20 years ago relieved Alzheimer's-like features in mice and recently showed encouraging results in a small study of people with the disease. "I think we were onto something way back when," she says.

Early hunch

Tansey got interested in neurodegenerative disease in the late 1990s, while working as a postdoctoral fellow at Washington University in St. Louis. Her research focused on molecules that promote the survival of certain neurons that degenerate in Parkinson's disease – in lab dish experiments, anyway. But after six years on a meagre postdoc salary, and with her husband about to start neurology training at UCLA, she took a job at a biotech company in the Los Angeles area, called Xencor. She tackled a project that the company had on the back burner: designing new drugs to inhibit that inflammatory molecule TNF.

At the time, doctors already used two such drugs to treat autoimmune disorders such as psoriasis and rheumatoid arthritis. But these drugs have harmful side effects, largely owing to TNF's complicated biology. TNF comes in two forms: one that's anchored to the membranes of cells, and a soluble form that floats around in the spaces in between. The soluble TNF causes inflammation and can kill cells infected with viruses or bacteria – it's a necessary job – but, in excess, destroys healthy tissues. The membrane-bound form of TNF, on the other hand, confers protection against infection to begin with. The drugs in use at the time inhibited both forms of TNF, leaving people at risk for infections by viruses, bacteria and fungi that typically only cause problems for people with weakened immune systems.

Using genetic engineering, Tansey and her Xencor colleagues designed a drug that prevents this potentially dangerous side effect by targeting only the harmful, soluble form of TNF. It gloms onto the harmful TNF and takes it out of circulation. In tests, injections of the drug reduced joint swelling in rats with a condition akin to arthritis.

By the time the work was published in Science in 2003, Tansey had returned to academia, starting up her own lab at the University of Texas Southwestern Medical Centre in Dallas. And as she scoured the scientific literature on TNF, she began to think again about those experiments she'd done as a postdoc, on neurons destroyed during Parkinson's disease. She read studies showing that the brains of Parkinson's patients have high levels of TNF, and she wondered if TNF could be killing the neurons. There was a clear way to find out: Put the TNF-blocking drug she'd helped to develop at Xencor into the brains of rats that were manipulated to develop Parkinson's-like symptoms and watch to see what happened.

Her hunch proved correct – the drug slowed the loss of neurons in Parkinson's rats. And that led Tansey to wonder: Could TNF also be involved in the loss of neurons in other forms of neurodegeneration, including Alzheimer's disease? Mulling over the nuanced roles of innate immune cells, which seem to help or hurt depending on the context, she started rethinking the prevailing amyloid hypothesis. Perhaps, she thought, amyloid ends up clumping in the Alzheimer's brain because immune cells that would normally gobble it up get sluggish as people age: In other words, the amyloid accumulated as a consequence of the disease, not a cause.

The double-edged nature of immune activity also meant that our immune systems might, if unchecked, exacerbate problems. In that case, blocking aspects of immune function – specifically, inflammation – might prove helpful.

The idea that blocking inflammation could preserve cognition and other aspects of brain function has now found support in dozens of studies, including several by Tansey's lab. Using an approach that induced Alzheimer's-like neurological symptoms in mice, neuroscientist Michael Heneka, a researcher at Germany's University of Bonn, and his colleagues found that mice engineered to lack a key molecule of the innate immune system didn't form the hallmark amyloid clumps found in Alzheimer's.

Tansey and colleagues, for their part, showed that relieving inflammation with the drug Tansey helped develop at Xencor, called XPro1595, could reduce amyloid buildup and strengthen nerve cell connections in mice with Alzheimer's-like memory problems and pathology. Her team has also found that mice on a high-fat, high-sugar diet – which causes insulin resistance and drives up Alzheimer's risk – have reduced inflammation and improved behavior on tests of sociability and anxiety when treated with XPro1595.

All told, hints from human genetic and epidemiologic data, combined with growing evidence from mouse models, were pointing toward the role of the immune system, says Heneka, who coauthored a 2018 article in the Annual Review of Medicine about innate immunity and neurodegeneration. And the evidence is growing: In 2019, a study of more than 12,000 older adults found that people with chronic inflammation suffered greater mental losses over a period of 20 years – a clue, again, that inflammation could be an early driver of cognitive decline.

The accumulating data convinced Tansey that it was time to test this idea in people – that instead of targeting amyloid, we need to start targeting the immune system, she says. And it needs to be early. Once too much damage is done, it may be impossible to reverse.

Targeting innate immunity

Immune-based strategies against Alzheimer's are already being pursued, but most are quite different than what Tansey was proposing. Companies mostly work with the adaptive immune system, which attacks pathogens or molecules very specifically, recognising them and marking them for destruction. Experimental therapies include antibodies that recognise amyloid and target it for removal.

INmune Bio, in La Jolla, California, is one of several biotech companies taking a different approach: trying to fight degenerative brain disease by targeting the less specific innate immune system. "The immune system is a 50-50 partnership," says RJ Tesi, the CEO. "If you're about to have a prize fight, you're not going to jump in with one hand tied behind your back. Likewise, with Alzheimer's or cancer, you don't want to go into the ring with half the immune system being ignored." To pursue this strategy, INmune Bio bought commercial rights to XPro1595. (Tansey is a paid consultant for INmune Bio but is not involved in any of the company's trials.)

INmune Bio initially focused on cancer, so when it designed its Alzheimer's trial, it used a strategy commonly used in cancer drug trials. In Tesi's view, a key reason that experimental cancer drugs succeed far more often than experimental neurology drugs is the use of molecular disease indicators called biomarkers. These are measures such as genetic variants or blood proteins that help to distinguish patients who, from the outside, may all seem to have the exact same disease, but may actually differ from one another.

By using biomarkers to select participants, cancer researchers can enrol the patients most likely to respond to a given drug – but many neurology trials enrol patients based solely on their diagnosis. And that's problematic, says Tesi, because scientists are coming to realise that a diagnosis of Alzheimer's, for instance, might actually encompass various subtypes of disease – each with its own underlying biology and each, perhaps, requiring a different treatment.

In an ongoing trial of XPro1595, INmune Bio aims to enrol 18 people with mild to moderate Alzheimer's disease, all of whom have elevated levels of biomarkers for excessive inflammation, including one called C-reactive protein. In July, the company reported early data from six participants who were treated with the TNF inhibitor once a week for 12 weeks and assessed for brain inflammation using a specialised magnetic resonance imaging (MRI) technique.

Over the 12-week period, brain inflammation fell 2.3 percent in three participants who received the high-dose TNF inhibitor, compared with a 5.1 percent increase in 25 Alzheimer's patients whose data were collected previously as part of a major long-term study of Alzheimer's disease. Three participants who got a low dose of XPro1595 had a smaller, 1.7 percent increase in brain inflammation. In this small trial, the researchers did not track changes in cognition. But their MRI analysis showed that inflammation was reduced by about 40 percent in a particular bundle of nerve fibres called the arcuate fasciculus that is important for language processing and short-term memory.

"It's early days," Cohen says – interim results in just six people. However, in a small sample size like that, you might not expect to see anything. Past studies of anti-inflammatory drugs did not show a benefit in Alzheimer's patients, but scientists are now reexamining these trial failures, Cohen says. "Maybe the idea of the immune system is important, but our therapies were too blunt," she says.

It's not just INmune Bio that has researchers excited about the prospect of tinkering with innate immunity to tackle brain disease. Alector, a South San Francisco biotech company, is developing potential therapeutics to activate the innate immune system to fight Alzheimer's. Some of their experimental drugs are intended to boost the activity of innate immune cells in the brain called microglia. Tiaki Therapeutics in Cambridge, Massachusetts, meanwhile, is using computational methods to identify potential treatments for people with neuroinflammatory diseases who have specific gene signatures. And another company, Shanghai-based Green Valley, is investigating a drug that includes a mix of seaweed sugars that, the company claims, alters gut bacteria to tamp down brain inflammation.

"It's encouraging to see so many different approaches to harnessing the innate immune system to fight Alzheimer's," Heneka says. He predicts, however, that a variety of treatments will be needed to tackle such a multifaceted, complicated disease.

But Tansey suspects that chronic inflammation is a crucial factor that takes a toll on the brain over the course of many years. Although lowering inflammation will not solve everything, she says, "I think it will buy you a lot. Because it's the dark passenger of the journey."

This article originally appeared in Knowable Magazine, an independent journalistic endeavour from Annual Reviews.

Read more here:
Could the Immune System Hold the Key To Alzheimer's Disease? - The Wire Science

Synthetic Biology Used To Develop a New Type of Genetic Design – Technology Networks

Richard Feynman, one of the most respected physicists of the twentieth century, said "What I cannot create, I do not understand". Not surprisingly, many physicists and mathematicians have observed fundamental biological processes with the aim of precisely identifying the minimum ingredients that could generate them. One such example is the patterns of nature described by Alan Turing. The brilliant English mathematician demonstrated in 1952 that it was possible to explain how a completely homogeneous tissue could give rise to a complex embryo, and he did so using one of the simplest, most elegant mathematical models ever written. One of the results of such models is that the symmetry shown by a cell or a tissue can "break" under a set of conditions. However, Turing was not able to test his ideas, and it took nearly 70 years before a breakthrough in biological technique made it possible to evaluate them decisively. Can Turing's dream be made a reality through Feynman's proposal? Genetic engineering has proved it can.
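For readers who want to see Turing's mechanism in action, the sketch below is a minimal one-dimensional reaction-diffusion simulation using textbook Schnakenberg kinetics. It illustrates the general principle only: the parameter values are invented for this illustration, and it is not the genetic circuit or the model from the IBE study described below. A slowly diffusing activator and a rapidly diffusing substrate, started from an almost uniform state plus tiny noise, break symmetry on their own and settle into regularly spaced peaks.

import numpy as np

rng = np.random.default_rng(1)

# grid and time stepping (dimensionless, illustrative values)
n, dx, dt, steps = 200, 1.0, 0.005, 20_000
Du, Dv = 1.0, 40.0   # the inhibitory substrate must diffuse much faster than the activator
a, b = 0.1, 0.9      # feed rates; homogeneous steady state is u = a + b, v = b / (a + b)**2

u = (a + b) + 0.01 * rng.standard_normal(n)             # activator: uniform state ...
v = b / (a + b) ** 2 + 0.01 * rng.standard_normal(n)    # substrate: ... plus tiny noise

def laplacian(field):
    # discrete Laplacian with periodic boundaries (a ring of "tissue")
    return (np.roll(field, 1) - 2 * field + np.roll(field, -1)) / dx ** 2

for _ in range(steps):
    uvv = u * u * v
    u += dt * (Du * laplacian(u) + a - u + uvv)
    v += dt * (Dv * laplacian(v) + b - uvv)

# the almost-homogeneous starting field ends up as regularly spaced activator peaks
peaks = np.sum((u > np.roll(u, 1)) & (u > np.roll(u, -1)))
print(f"activator peaks after t = {steps * dt:.0f}: {peaks}")

The essential requirement, as in Turing's original analysis, is the large gap between the two diffusion rates; set Dv close to Du in the sketch and the noise simply decays back to a uniform state instead of growing into a pattern.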

Now, a research team from the Institute of Evolutionary Biology (IBE), a joint centre of UPF and the Spanish National Research Council (CSIC), has developed a new type of model whose implementation using synthetic biology can reproduce the symmetry breaking observed in embryos with the minimum number of ingredients possible.

The research team has managed to implement, via synthetic biology (by introducing parts of genes of other species into E. coli bacteria), a mechanism that generates spatial patterns observed in more complex animals, such as Drosophila melanogaster (the fruit fly) or humans. In the study, the team observed that the modified strains of E. coli, which normally grow in (symmetrical) circular colonies, instead grew in the shape of a flower, with petals at regular intervals, just as Turing had predicted.

"We wanted to build symmetry breaking that is never seen in colonies of E. coli, but is seen in patterns of animals, and then to discover which are the essential ingredients needed to generate these patterns", says Salva Duran-Nebreda, who conducted this research for his doctorate in the Complex Systems laboratory and is currently a postdoctoral researcher at the IBE Evolution of Technology laboratory.

E. coli bacteria forming patterns induced by the new synthetic system. Credit: Jordi Pla/ACS.

Using the new synthetic platform, the research team was able to identify the parameters that modulate the emergence of spatial patterns in E. coli. "We have seen that by modulating three ingredients we can induce symmetry breaking. In essence, we have altered cell division, adhesion between cells and long-distance communication capacity (quorum sensing), that is to say, perceive when there is a collective decision", Duran-Nebreda comments.

The observations made in the E. coli model could be applied to more complex animal models or to insect colony design principles. "In the same way that organoids or miniature organs can help us develop therapies without having to resort to animal models, this synthetic system paves the way to understanding as universal a phenomenon as embryonic development in a far simpler in vitro system", says Ricard Solé, ICREA researcher with the Complex Systems group at the IBE, and head of the research.

The model developed in this study, the first of its kind, could be key to understanding some embryonic development events. "We must think of this synthetic system as a platform for learning to design different fundamental biological mechanisms that generate structures, such as the step from a zygote to the formation of a complete organism. Moreover, such knowledge on the frontier between mechanical and biological processes, could be very useful for understanding developmental disorders", Duran-Nebreda concludes.

Reference: Duran-Nebreda S, Pla J, Vidiella B, Piñero J, Conde-Pueyo N, Solé R. Synthetic Lateral Inhibition in Periodic Pattern Forming Microbial Colonies. ACS Synth Biol. 2021. doi:10.1021/acssynbio.0c00318.

This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.

Visit link:
Synthetic Biology Used To Develop a New Type of Genetic Design - Technology Networks

Experts Predict the Hottest Life Science Tech in 2021 and Beyond – The Scientist

Through the social and economic disruption that COVID-19 caused in 2020, the biomedical research community rose to the challenge and accomplished unprecedented feats of scientific acumen. With a new year ahead of us, even as the pandemic grinds on, we at The Scientist thought it was an opportune time to ask what might be on the life science innovation radar for 2021 and beyond. We tapped three members of the independent judging panel that helped name our Top 10 Innovations of 2020 to share their thoughts (via email) on the year ahead.

Paul Blainey: Value is shifting from the impact of individual technologies (mass spectrometry, cloning, sequencing, PCR, induced pluripotent stem cells, next generation sequencing, genome editing, etc.) to impact across technologies. In 2021, I think researchers will increasingly leverage multiple technologies together in order to generate new insights, as well as become more technology-agnostic as multiple technologies present plausible paths toward research goals.

Kim Kamdar: Partially in reaction to the COVID-19 pandemic, one 2021 headline will be the continued innovation focused on the consumerization of healthcare, which is redefining how consumers engage with providers across each stage of care. Consumers are increasingly selective about their healthcare choices now, and retail powerhouses like CVS and Walmart have developed, and will continue to develop, solutions to meet the needs of their customers. While this was already underway prior to the pandemic, the crisis has spurred on this activity with the goal of making healthcare more accessible and affordable and ultimately delivering on better health outcomes for all Americans.

Robert Meagher: I think this is easy – mRNA delivery. This is something that has been in development for years for numerous applications, but the successful development and FDA emergency use authorization of two COVID-19 vaccines based on this technology shines a very bright spotlight on it. The vaccine trials and now widespread use of the vaccines will give developers a lot of data about the technology, and set a baseline for understanding safety and side effects when considering future therapeutic applications outside of infectious disease.

PB: Single-cell technology is here to stay, although its use will continue to change. One analogy to be drawn is the shift we saw from the popularity of de novo genome sequencing (during the Human Genome Project and the early part of the NGS [next-generation sequencing] era) to the rich array of re-sequencing applications practiced today. I expect new ways to use single-cell technology will continue to be discovered for some time to come.

KK: Innovation in single-cell technology has the potential to transform biological research, driving it to a level of resolution that provides a more nuanced picture of complex biology. Cost has been a key barrier to broader adoption of single-cell analysis. As better technology is developed, cost will be reduced and there will be an explosion in single-cell research. This dynamic will also allow for broader adoption of single-cell technology, from translational research to clinical applications, particularly in oncology and immunology.

RM: Yes – there is continuing innovation in this space, and room for continued innovation. One area where we have seen development recently, and I see it continuing, is the study of single cells not just in isolation, but coupled with spatial information: understanding single cells and their interactions with their neighbors. I also wonder if the COVID-19 pandemic will spur increased interest in applying single-cell techniques to problems in infectious disease, immunology, and microbiology. A lot of the existing methods for single-cell RNA analysis (for example) work well for human or mammalian cells, but don't work for bacteria or viruses.

PB: The promises of CRISPR and gene editing are extraordinary. I can't wait to see how that field continues to develop.

KK: Much of the CRISPR technology focus since it was unveiled in 2012 has been on its utility to modify genes in human cells with the goal of treating genetic disease. More recently, scientists have shown the potential of using CRISPR gene editing for the treatment of viral disease (essentially a programmable antiviral that could be used to treat diseases such as HIV, HBV, and SARS). These findings, published in Nature Communications, showed that CRISPR can be used to eliminate simian immunodeficiency virus (SIV) in rhesus macaque monkeys. If the results are replicated in humans, in studies slated to begin this year, CRISPR could be used to address HIV/AIDS and potentially make a major impact by turning a chronic disease into one with a functional cure.
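
To make "programmable antiviral" concrete: Cas9 is directed by a guide RNA whose roughly 20-nucleotide targeting sequence is taken from the pathogen's own genome, immediately adjacent to a protospacer adjacent motif (PAM; NGG for SpCas9). The toy sketch below uses an invented sequence fragment and a hypothetical helper function, nothing from the cited SIV study, purely to illustrate how candidate target sites are enumerated.

```python
# Illustrative only: enumerate candidate SpCas9 target sites (20-nt
# protospacers followed by an NGG PAM) in a stretch of sequence.
def find_protospacers(genome: str, guide_len: int = 20):
    genome = genome.upper()
    hits = []
    for i in range(guide_len, len(genome) - 2):
        # Protospacer occupies genome[i - guide_len:i]; the PAM is the
        # three bases starting at i, of which the last two must be "GG".
        if genome[i + 1:i + 3] == "GG":
            hits.append((i - guide_len, genome[i - guide_len:i]))
    return hits

# Toy fragment standing in for a real viral sequence (not actual SIV/HIV data).
fragment = "ATGCCTACGGATTCACGATCGGTACCGATTAGGCTAACGGTTACGCTAGG"
for pos, guide in find_protospacers(fragment):
    print(f"position {pos:2d}: candidate guide target {guide}")
```

Only the forward strand is scanned here; a real design tool would also scan the reverse complement and score each candidate guide for specificity against the host genome.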

PB: New therapeutic modalities that expand the addressable set of diseases are particularly exciting. Cell-based therapies offer versatile platforms for biological engineering that leverage the power of human biology. It is also encouraging to see somatic cell genome editing technology advance toward the clinic for the treatment of serious diseases.

RM: Besides the great success with mRNA-based vaccines, which sets the stage for other clinical technologies based on mRNA delivery, the other area that is really in the spotlight this year is diagnostics. There are a lot of labs and companies, both small and large, that have some really innovative products and ideas for portable and point-of-care diagnostics. For a long time, this was often thought of as a problem for the developing world or resource-limited locations: think, for example, of diagnostics for neglected tropical diseases. But the COVID-19 pandemic and the associated need for diagnostic testing on a massive scale has caused us to rethink what "resource-limited" means, and to understand the challenge posed by bottlenecks in supply chains, skilled personnel, and high-complexity laboratory facilities. There has been a lot of foundational research over the past couple of decades in rapid, portable, easy-to-use diagnostics, but translating it into clinically useful products often seemed to stall, I suspect for lack of a lucrative market for such tests. But we are now starting to see FDA [emergency use authorization for] home-based tests and other novel diagnostic technologies to address needs arising from the COVID-19 pandemic, and I suspect that this paves the way for these technologies to start being applied to other diagnostic testing needs.

PB: Seeing the suffering and destruction wrought by COVID-19, it is obvious that we need to be prepared with more extensive, equitable, and better-coordinated response plans going forward. While rapid vaccine development and testing were two bright spots last year, there are so many important areas that demand progress. As we learn how important even small or mundane details become in a crisis, diagnostic technologies and the calibration of public health measures are two areas that merit major focus.

KK: The life science community response to the COVID-19 pandemic has already proven to be light-years ahead of previous responses, particularly in areas such as vaccine development and diagnostics. It took more than a year to sequence the genome of the SARS virus in 2002. The COVID-19 genome was sequenced in under a month from the first case being identified. Scientists and clinicians were able to turn that initial information into multiple approved vaccines at blazing speed. Utilizing messenger RNA (mRNA) as a new therapeutic modality for vaccine development has now been validated. Vaccine science has been forever changed. The pandemic has also focused a much-needed level of attention on diagnostics, forcing a rethink of how to increase the access, affordability, and actionability of diagnostic testing. The level of innovation that occurred in 2020 to combat COVID-19 will provide a more rapid, focused, and actionable reaction to future pandemics. In addition, the elevation of a science advisor (Dr. Eric Lander) to a cabinet-level position in the Biden administration bodes well for our future ability to ground decisions in data and, as President Biden himself framed it, "refresh and reinvigorate our national science and technology strategy to set us on a strong course for the next 75 years, so that our children and grandchildren may inhabit a healthier, safer, more just, peaceful, and prosperous world."

RM: One thing that really kick-started research to address COVID-19 was the early availability of the complete genome sequence of the SARS-CoV-2 virus, and the ongoing, timely deposition of new sequences in near real-time as isolates were sequenced. This is in contrast to cases where the deposition of large numbers of sequences may lag an outbreak by months or even years. I foresee near real-time sharing of sequence information becoming the new standard. Making the virus itself widely and inexpensively available in inactivated form, along with well-characterized synthetic viral RNA standards and proteins, also helped spur research.
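
As a small illustration of how accessible that early sequence data remains, the sketch below, which assumes Biopython, network access, and an email address for NCBI's records (none of which Meagher mentions), fetches the SARS-CoV-2 reference genome, GenBank accession NC_045512.2, directly from NCBI.

```python
# Illustrative only: download the SARS-CoV-2 reference genome (Wuhan-Hu-1,
# GenBank NC_045512.2) using Biopython's Entrez interface.
from Bio import Entrez, SeqIO

Entrez.email = "your.name@example.org"  # placeholder; NCBI asks for a real address

handle = Entrez.efetch(db="nucleotide", id="NC_045512.2",
                       rettype="fasta", retmode="text")
record = SeqIO.read(handle, "fasta")
handle.close()

print(record.id, len(record.seq), "bases")  # roughly 29,903 bases
```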

A trend I'm less fond of is the rapid publication of non-peer-reviewed results as preprints online. There's a great benefit to getting new information out to the community ASAP, but unfortunately I think the rush to get preprints up in some cases results in spreading misleading information. This problem is compounded by uncritical, breathless press releases accompanying the posting of preprints, as opposed to waiting for peer-review acceptance of a manuscript to issue a press release. I think the solution may lie in journals considering innovative approaches to speeding up peer review, or at least a way to perform a basic check for rigor prior to posting a preliminary version of a manuscript. Right now the extremes are posting an unreviewed preprint, or waiting months or even years through multiple rounds of peer review, including extensive additional experiments to satisfy the curiosity of multiple reviewers, for high-impact publications. Is there a way to prevent manuscripts with obvious methodological or statistical errors from being posted as preprints, while also enabling interesting, well-done, yet not fully polished manuscripts to be available to the community?

Paul Blainey is an associate professor of biological engineering at MIT and a core member of the Broad Institute of MIT and Harvard. The Blainey lab integrates new microfluidic, optical, molecular, and computational tools for application in biology and medicine. The group emphasizes quantitative single-cell and single-molecule approaches, aiming to enable studies that generate data with the power to reveal the workings of natural and engineered biological systems across a range of scales. Blainey has a financial interest in several companies that develop and/or apply life science technologies: 10X Genomics, GALT, Celsius Therapeutics, Next Generation Diagnostics, Cache DNA, and Concerto Biosciences.

Kim Kamdar is managing partner at Domain Associates, a healthcare-focused venture fund creating and investing in biopharma, device, and diagnostic companies. She began her career as a scientist and pursued drug-discovery research at Novartis/Syngenta for nine years.

Robert Meagher is a principal member of Technical Staff at Sandia National Laboratories. His main research interest is the development of novel techniques and devices for nucleic acid analysis, particularly applied to problems in infectious disease, biodefense, and microbial communities. Most recently this has led to approaches for simplified molecular diagnostics for emerging viral pathogens that are suitable for use at the point of need or in the developing world. Meagher's comments represent his professional opinion but do not necessarily represent the views of the US Department of Energy or the United States government.

More here:
Experts Predict the Hottest Life Science Tech in 2021 and Beyond - The Scientist

Uncertain future: Will Europe’s Green Deal encourage or cripple crop gene-editing innovation? – Genetic Literacy Project

The EU Green Deal and its Farm-to-Fork and Biodiversity Strategies stipulate ambitious policy objectives that will fundamentally impact agricultural businesses and value chains. Are these objectives realistic? And how do they fit with the EU's policies on food security, the internal market, international trade and multilateral economic agreements? As significant conflicts of goals become apparent, the discussion on expectations, preconditions and consequences is now underway.

The Farm to Fork Strategy concretely foresees reductions in pesticide and fertilizer use of 50% and 20%, respectively, by 2030. In addition, 25% of the EU's agricultural land is supposed to be put under organic farming conditions, which generally means a reduction in productivity. Unfortunately, the strategy is less concrete about the important role of innovation in general, and of plant breeding innovation specifically, in compensating for productivity losses and contributing to a more sustainable agriculture.

On July 25, 2018, the European Court of Justice (ECJ) published its ruling on mutagenesis breeding, including targeted genome editing techniques. This ruling subjected new tools like CRISPR-Cas9 to the EU's strict rules and requirements for GMOs, and with that effectively prohibited European plant breeders and farmers from utilizing these powerful technologies. These regulatory obstacles are not based on evidence showing that genome editing poses a risk to human health or the environment, but rather on political interference in the regulatory approval process. The COVID pandemic made this abundantly clear. In July 2020, for example, the EU suspended some of its excessive genetic engineering rules to facilitate the development of COVID vaccines, and has since celebrated the approval of these important drugs while trying to prevent the use of biotechnology in agriculture.

Since the discovery of the laws of genetics by Gregor Mendel in 1866, plant breeders have continuously integrated the latest plant biology innovations into their toolbox to develop enhanced crops that help farmers sustainably grow the food we all depend on.

Europe's seed sector, technology developers and public researchers have always been important actors in this evolving effort and remain global leaders in developing improved plant breeding methods. They work tirelessly to provide farmers with crop varieties that fit the needs of a highly productive and sustainable agriculture system and meet the exacting demands of consumers. It is no secret that these experts understand the value of new breeding techniques (NBTs) like CRISPR and want to employ them.

Contrary to the claim of some environmental groups that genome editing provides new avenues of control through modifying specific plant traits, most notably insect and herbicide resistance, industrial applications of this sort are only one aspect of NBT research, and a minor one at that. Our recent survey of 62 private plant breeding companies, 90% of which are small and medium-size firms (SMEs), confirms that EU plant breeders are able and willing to use these technologies to develop a wide range of crop species and traits for farmers. From grapevine to wheat, NBTs can generate innovation to protect Europe's traditional crops from pests, diseases and other threats posed by climate change.

Independent of their size, many companies are already using NBTs in their R&D pipelines for technology development, gene discovery and the production of improved plant varieties. These activities cover a wide range of agricultural and horticultural crops, from so-called cash crops like maize and soybean to minor crops like pulses, forage crops and chicory, and span a wide diversity of characteristics, including yield, plant architecture, disease and pest resistance, food-quality traits and abiotic stresses like drought and heat.

Link:
Uncertain future: Will Europe's Green Deal encourage or cripple crop gene-editing innovation? - Genetic Literacy Project