The Prometheus League
Daily Archives: July 9, 2022
The 10 best albums of the 1970s year by year – Far Out Magazine
Posted: July 9, 2022 at 8:07 am
Today I've given myself the task of picking out the ten best albums of the 1970s, with only one selection allowed for each year. The task was in equal measure pleasurable and gruelling; I found myself having to ruthlessly omit some of my favourite artists and albums as I very quickly remembered what a deliciously brilliant decade the 1970s were for music.
After waving goodbye to the Beatles and the hippie dream, the 70s brought on an age of darker and dirtier music as we welcomed heavier forms of rock such as metal and punk. Meanwhile, the abstract creative ideas of the 60s seemed to propagate into a wider variety of musical styles as genres were blended amid the burgeoning technological advancement in production and synthesised sound.
In my selections, I wanted to display the evolution of popular music across the decade and disallowed myself from picking any one artist twice. Otherwise, you would have a list chock full of David Bowie and Rolling Stones albums. Rather than crudely allowing my subjective taste to rain on the parade, I've left my mind ajar to albums that must take pride of place because of their dominating contemporary impact and ongoing influence.
That said, there are some selections below that don't follow the usual grain and will likely stick in many a craw. If you find yourself simmering over some of the selections, just remember, it's only a bit of fun and I found it immensely difficult, especially for 1972.
So, without further ado, please read on and see how many you agree with. Better still, have a go at coming up with your own list using the same limitations. You may discover, as I did, that it's a little like tackling the advanced sudoku in the Sunday paper.
As we kicked off the decade, we said goodbye to the Beatles, the most important rock group in history. For this selection, I was close to picking their farewell album, Let It Be; however, despite containing some great hits, the album is tarnished by the lack of cohesion within the group at the time of recording and was nothing on the band's previous four albums.
Elsewhere in 1970, The Velvet Underground earned a shoutout with Loaded, their most commercial yet least adventurous album, but for me, the battle came to a head between the three post-Beatles solo albums released in the year. In the end, George Harrison's All Things Must Pass took home the prize for its creative scope and the way it seems to embody so passionately everything the Quiet Beatle stood for: a true, epic masterpiece.
Funkadelic were the true pioneers of funk-rock music, and they hit the gold mine with their seminal third studio album, Maggot Brain, in 1971. Best known for its ten-minute title track, the George Clinton-produced masterpiece seems to embody the dark and dingy miasma of urban decay and hedonism that cloaked many US cities in the 1970s.
The experimental sound blended the psychedelic rock of Jimi Hendrix with funk, folk-blues and flecks of gospel music. While 1971 was host to a few other serious contenders, nothing before or after Maggot Brain sounds quite like it. It's one of the few experimental albums I can keep coming back to without skipping any of the tracks.
The Rolling Stones were churning out increasingly impressive albums over the late 1960s, and as they entered the '70s, they were undoubtedly the biggest thing in rock music. For me, and as I understand it for most others, the Stones reached their absolute peak in 1971-72 with Sticky Fingers and Exile on Main St.
Deciding between the two was as easy as flipping a coin, if the coin was solid lead and the size of a car. The two albums are difficult to compare as they are so sonically and thematically disparate. Ultimately, I landed on Exile on Main St. because it seems more permeated with the band's DNA thanks to the enveloped history of their tax exile in France.
As they departed from the psychedelic rock sound of their early years in the 1960s, helmed by Syd Barrett, Pink Floyd grew toward the more spacious and atmospheric prog-rock sound that would ultimately define their peak success in the 1970s. The 1971 album Meddle threw a few ideas together but was undeniably highlighted by its 23-minute side-two epic, 'Echoes'. It was the sound of 'Echoes' that Pink Floyd ran with as they looked to record The Dark Side of the Moon.
The 1973 album is seen by many as Pink Floyd's greatest achievement, as it marries the maturation of Roger Waters' conceptual songwriting with the band's blossoming instrumental chemistry. While containing elements of jazz, gospel and blues music, the album runs from start to finish seamlessly as a deeply absorbing journey through the cheery themes of greed, death, mental illness and the relentless attrition of time.
Joni Mitchell released her sixth studio album, Court and Spark, to a chorus of critical and commercial acclaim, having established herself as a mainstay of the early '70s singer-songwriter wave with Ladies of the Canyon (1970) and Blue (1971) earlier in the decade. The 1974 release showed a distinct shift in style to a highly accessible folk sound infused with jazz.
The album was highly influential on subsequent pop acts of the 1970s, especially Fleetwood Mac's Stevie Nicks, who once recalled taking LSD to listen to the album: "I was with my producer, at his house, with a set of speakers that were taller than that fireplace, and I was in a safe place. And I sat there on the floor and listened to that record... That was a pretty dynamic experience." Meanwhile, in an interview with Rolling Stone, Mitchell recalled Bob Dylan falling asleep when she first played it to him. I suppose you can't please everyone.
Bob Dylan's form waned toward the late 1960s, following his heyday between 1963's The Freewheelin' Bob Dylan and 1966's Blonde on Blonde. Over the early 1970s, Dylan's commercial and critical peak looked to be sinking on the horizon following 1970's New Morning. He hadn't released much to entice his fans apart from a few uneven Self Portrait outtakes and his soundtrack for Pat Garrett & Billy the Kid (1973).
Fortunately, the tides changed in 1974 with the release of Planet Waves, which somehow became Dylan's first-ever number one album. Though 1974's moderate return to form has since withered into the background, save for 'Forever Young', Dylan took this momentum into Blood on the Tracks. The 1975 album came like a bolt from the blue and is now considered Dylan's greatest album outside of the '60s.
David Bowie undoubtedly hit a fruitful peak in the 1970s following the release of Hunky Dory and Ziggy Stardust in '71 and '72, respectively. For many, and for myself on another day, this list would have featured Ziggy Stardust or Low (1977), but today, I want to show my undying love for this mid-decade masterpiece.
Station to Station saw the genesis of a new alter ego for the Starman, The Thin White Duke, who is stylishly introduced in the album's eponymous ten-minute opener. The remainder of the record is near faultless, with a great balance between the upbeat ('Golden Years'/'Stay') and the slower and introspective ('Wild Is the Wind'/'Word on a Wing').
Fleetwood Mac created one of the decade's most popular albums in Rumours. The album is a masterclass from start to finish, with the endlessly talented, feuding bunch of musicians pouring pure emotion into the cauldron. What resulted was a finely balanced selection of candid pop-rock classics.
1977 was another particularly challenging year to choose from. It was home to some cracking albums, including four from the co-habiting David Bowie and Iggy Pop, Steely Dan's Aja, The Clash, and the Sex Pistols' iconic debut. But I'm hoping you will agree that Rumours is just one of those flawless releases that touched far too many among us to be ignored.
This gem from the Manchester post-punk pioneers is probably among the more obscure picks on this list. In 1978, punk music was transitioning into its more round-cut and erudite phase, post-punk. Magazine were a band that distinctly embodied this transition, as Buzzcocks founding member Howard Devoto split off in search of a new sound.
Real Life was a bold and brilliant debut venture for the band, one that incorporated more complex arrangements and textures into the classic sound of punk. Devoto shows the true depth of his lyrical capabilities with a diverse range of absorbing themes and memorable, often chilling lines.
Released in 1979 on the famous Factory Records, Unknown Pleasures was the seminal debut album for Ian Curtis and co. The album was produced by Martin Hannett, who incorporated a number of unconventional production techniques into the industrial and gritty sound of the band's earlier singles.
1979 was home to a vast array of brilliant albums, most of which fall within the then-booming post-punk genre. But for me, none were quite so iconic and influential as this belter, boasting hits like 'Shadowplay', 'She's Lost Control', 'Disorder' and 'New Dawn Fades'.
Gene Expression Visualized in Brains of Live Mice in Real Time – Genetic Engineering & Biotechnology News
Posted: at 8:03 am
Scientists led by a team at the University of Minnesota Twin Cities say they have developed a novel method that allows scientists and engineers to visualize mRNA molecules in the brains of living mice, reportedly for the first time. The study reveals new insights into how memories are formed and stored in the brain and could provide scientists with new information about diseases such as Alzheimer's, according to the researchers, who published their work, "Real-time visualization of mRNA synthesis during memory formation in live mice," in Proceedings of the National Academy of Sciences (PNAS).
It is well known that mRNA is produced during the process of forming and storing memories, but the technology for studying this process on the cellular level has been limited. Previous studies have often involved dissecting mice in order to examine their brains. The new technique gives scientists a window into RNA synthesis in the brain of a mouse while it is still alive.
"Memories are thought to be encoded in populations of neurons called memory trace or engram cells. However, little is known about the dynamics of these cells because of the difficulty in real-time monitoring of them over long periods of time in vivo. To overcome this limitation, we present a genetically encoded RNA indicator (GERI) mouse for intravital chronic imaging of endogenous Arc messenger RNA (mRNA), a popular marker for memory trace cells," write the investigators.
"We used our GERI to identify Arc-positive neurons in real time without the delay associated with reporter protein expression in conventional approaches. We found that the Arc-positive neuronal populations rapidly turned over within two days in the hippocampal CA1 region, whereas 4% of neurons in the retrosplenial cortex consistently expressed Arc following contextual fear conditioning and repeated memory retrievals. Dual imaging of GERI and a calcium indicator in CA1 of mice navigating a virtual reality environment revealed that only the population of neurons expressing Arc during both encoding and retrieval exhibited relatively high calcium activity in a context-specific manner.
"This in vivo RNA-imaging approach opens the possibility of unraveling the dynamics of the neuronal population underlying various learning and memory processes."
"We still know little about memories in the brain," explained Hye Yoon Park, PhD, an associate professor in the University of Minnesota department of electrical and computer engineering and the study's lead author. "It's well known that mRNA synthesis is important for memory, but it was never possible to image this in a live brain. Our work is an important contribution to this field. We now have this new technology that neurobiologists can use for various different experiments and memory tests in the future."
The process involved genetic engineering, two-photon excitation microscopy, and optimized image processing software. By genetically modifying a mouse so that it produced mRNA labeled with green fluorescent proteins, the researchers were able to see when and where the mouse's brain generated Arc mRNA, the specific type of molecule they were looking for.
Because the mouse is alive, the scientists could study it for longer periods of time. Using this new process, the researchers performed two experiments on the mouse in which they were able to see in real time over a month what the neurons were doing as the mouse was forming and storing memories.
Historically, neuroscientists have theorized that certain groups of neurons in the brain fire when a memory is formed, and that those same cells fire again when that moment or event is remembered. However, in both experiments, the researchers found that different groups of neurons fired each day they triggered the memory in the mouse.
Over the course of several days after the mouse created this memory, they were able to locate a small group of cells that overlapped, or consistently generated the Arc mRNA each day, in the retrosplenial cortex (RSC) region of the brain, a group which they believe is responsible for the long-term storage of that memory.
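For readers who want the logic of that overlap analysis in concrete form, here is a purely illustrative Python sketch (not the authors' analysis code, and with made-up cell IDs): it treats each imaging day as the set of cells that expressed Arc mRNA and asks which cells recur across days.

```python
# Purely illustrative sketch (not the authors' analysis code): treat each
# imaging day as the set of cell IDs that expressed Arc mRNA, then ask which
# cells recur. All IDs below are invented.
arc_positive = {
    "day1": {101, 102, 103, 104, 110},
    "day2": {103, 104, 120, 121},
    "day3": {104, 110, 130, 131},
}

# Cells that were Arc-positive on every imaging day: candidates for a stable,
# long-term memory trace population (analogous to the RSC subset in the study).
persistent = set.intersection(*arc_positive.values())

# Day-to-day turnover, measured as the Jaccard overlap of consecutive days.
days = list(arc_positive)
for a, b in zip(days, days[1:]):
    overlap = len(arc_positive[a] & arc_positive[b]) / len(arc_positive[a] | arc_positive[b])
    print(f"{a} vs {b}: overlap = {overlap:.2f}")

print("persistently Arc-positive cells:", persistent)
```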
"Our research is about memory generation and retrieval," Park said. "If we can understand how this happens, it will be helpful for us in understanding Alzheimer's disease and other memory-related diseases. Maybe people with Alzheimer's disease still store the memories somewhere; they just can't retrieve them. So in the very long term, perhaps this research can help us overcome these diseases."
Scientists from Seoul National University and the Korea Institute of Science and Technology were also involved in this research.
Watch a 3D video visualizing the hippocampus region of a live mouse brain.
Manufacturing Biotherapeutics Based On Synthetic Biology Lessons Learned – BioProcess Online
Posted: at 8:03 am
By Antoine Awad, chief operating officer, Synlogic, Inc.
The rise of high-throughput molecular biology and DNA sequencing, in parallel with the increased sophistication of computational models, has enabled the field of synthetic biology, where precision genetic engineering is used to program bacterial cells in much the same way we program computers to perform different functions. In 2014, our co-founders, Jim Collins and Tim Lu, recognized world experts in synthetic biology, pitched the idea to Atlas Ventures of forming the first company that would apply the principles of synthetic biology to the creation and development of biotherapeutics. The idea was that this approach would allow us to address significant medical needs using a completely new approach based on our drug candidates, which we call synthetic biotics. Within eight years, Synlogic opened five INDs with the FDA, dosed more than 350 patients, and built a clinical-stage pipeline focused on metabolic and immunological diseases. This includes achieving proof of concept in one program (in phenylketonuria, or PKU), and proof of mechanism in another (hyperoxaluria, or HOX). From the beginning, we knew that as pioneers, manufacturing would present challenges and also would be a critical success factor.
Our drug candidates to date have used the same starter strain, or chassis, a well-studied probiotic called E. coli Nissle 1917. As live potential biotherapeutics, these present unique challenges. As is the case with many biotechnology companies, especially those advancing innovative therapeutic approaches, we evaluated the benefits of outsourcing manufacturing to third parties that have specialized expertise in producing medicines based on synthetic biology. We started discussions early in our development programs and assessed all our options to determine the optimal pathway that would deliver the levels of quality and precision that are essential in development of drugs based on synthetic biology.
Using E. coli Nissle (ECN) has the advantage of a history of robust safety data validated in more than 100 years of clinical research. A challenge, however, when producing ECN as synthetic biotics is the need to strike a balance between increasing cell densities and inducing target enzymes. A disproportionate focus on one of these parameters can have an adverse effect on the other. While the technology to grow cells is very effective, the cells need to be kept alive and able to maintain high viability, which is imperative to their proper function in disease targeting. Fermenting bacteria for protein production is common, but expertise in maintaining high cell viability is both essential and rare.
To help address these challenges, we reached out to contract development and manufacturing organizations (CDMOs) with specialized expertise. While many CDMOs were using fermentation techniques for industrial purposes, these technologies would not meet good manufacturing practice (GMP) standards and FDA compliance guidelines for production of biotherapeutics. Many also do not have both fermentation and lyophilization (freeze drying) capabilities under one roof. The ones that do often have limited lyophilization capacity that does not align with fermentation scaling. Among the limited number of CDMOs that will work with live bacteria, most have long lead times and high costs, especially following demands on production associated with the COVID-19 pandemic.
Given the limited options for third-party support available, Synlogic invested in manufacturing to meet our needs at every phase of development and to keep a vigilant focus on product viability. Our drugs include cells that must remain metabolically active; over time, they will die unless they are formulated into a stable powder. To minimize the duration of our processing time, we decided to co-locate fermentation and downstream processing and lyophilization to prevent cell death and maintain high drug viability. We also implemented a lyophilization step that enhances the shelf life of our therapies and allows for more patient-friendly presentation as an oral powder.
In operations involving fermenters, lyophilizers, and analytical instruments in quality control settings, automation is critical to make processes efficient and minimize production costs. For example, a fermenter for E. coli Nissle must run between 16 and 22 hours. Without automated capabilities, this process would require manufacturing operators to be on-site around the clock. Automated technologies also play a central role in helping us meet both demand and quality control (QC) requirements at every stage of the product life cycle.
Our ambr 15 and ambr 250 high-throughput automated bioreactors, or fermenters, are used in process development, process optimization, and scale-down models. With these systems, we can test different conditions and process parameters in a short timeframe and at low volumes, which gives us a quicker path to an established process while reducing costs per experiment. We have another high-throughput automated analyzer that enables screening and analysis of fermentation metabolites. With this production system in place, we can better understand what is required to keep cells healthy, growing, and active. The technology also allows us to be faster and more confident in our decision-making and potentially reduce cycle time.
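To illustrate conceptually what such a high-throughput screen does, the short Python sketch below enumerates a small grid of hypothetical process parameters and scores each combination with a toy response function. Every number, and the response model itself, is an assumption for illustration only; a real screen measures cell density and viability empirically on the instrument.

```python
# Illustrative sketch of a miniature condition screen of the kind a
# high-throughput bioreactor system enables. The response function is a toy
# stand-in: real screens measure cell density and viability empirically.
from itertools import product

temperatures_c = [30, 34, 37]     # growth temperature (deg C) -- hypothetical levels
induction_hours = [8, 12, 16]     # time at which the target enzyme is induced
feed_rates = [0.5, 1.0, 1.5]      # relative carbon feed rate

def simulated_response(temp_c, induce_h, feed):
    """Toy trade-off: pushing density too hard erodes viability."""
    density = feed * (temp_c / 37.0) * (induce_h / 16.0)                     # arbitrary units
    viability = max(0.0, 1.0 - 0.25 * (feed - 1.0) ** 2 - 0.02 * (37 - temp_c))
    return density * viability                                               # joint objective

best = max(product(temperatures_c, induction_hours, feed_rates),
           key=lambda cond: simulated_response(*cond))
print("best (temperature, induction hour, feed rate):", best)
```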
We also implemented a range of single-use technologies throughout our facility as well as customized processes to address specific challenges in manufacturing our biotherapeutics. Single-use technology allows us to switch between programs faster by minimizing required cleaning and risk of cross-contamination. It also reduces the facility footprint, thus decreasing the necessary up-front capital investment. We also established a cleanroom that incorporates procedures and layouts that reduce the risk of microbial contamination and product cross-contamination through an air pressure cascade, segregation of product operations, and cleaning requirements.
One of the major challenges with any new technology or therapeutic approach is the ability to rapidly scale manufacturing as needed from early-stage research through to commercialization. Recognizing our needs in terms of scaling up as well as the challenges in considering both in-house capabilities and engagement of CDMOs, we quickly recognized the potential benefits of a hybrid approach.
Our physical cleanrooms come with a menu of services that can be handled by CDMOs, including inventory control, warehousing, environmental monitoring, and other support areas. Meanwhile, we built an internal infrastructure at Synlogic that is able to meet product needs based on available resources and our own highly experienced staff who are trained in GMPs. Our in-house capabilities include process development, analytical development, formulation, current GMP production, packaging and labeling, QC, and quality assurance. In a hybrid model, we have the flexibility to outsource some of the required tests and assays to labs/CROs when needed. The facility was also designed to handle our process needs with the ability to readily scale up and expand further as our development programs advance.
When planning a manufacturing strategy, it can be advantageous for biotechnology companies to co-establish research and CMC process development in the same facility, allowing for more efficient exchange of technical expertise. Generally, companies advancing a program into clinical development can often handle production needs related to Phase 1 or Phase 2 clinical trials internally when required scales are more modest.
It is important that companies consider investing in automated processes wherever feasible and recognize that scaling up can require larger equipment and potentially exponential increases in the need for raw materials and consumables, many of which can have long procurement times. Planning early is essential to address potential supply chain issues and avoid bottlenecks. It can also often be advantageous to consider collaborating with regulators and other stakeholders early in the development process. Early input from regulatory agency contacts and consultants can support smoother transitions as companies advance to later stage clinical development.
Whether companies decide to establish in-house manufacturing capabilities, outsource to CDMOs, or build a hybrid model, planning to meet production goals at every stage can require significant levels of innovation and flexibility. Teams must be prepared to address new challenges and make quick, thoughtful decisions throughout the product life cycle to be successful. These demands can be even more important in emerging areas of research such as synthetic biology that can require development of entirely new and previously untried strategies and technologies to keep manufacturing on track.
About the Author:
Antoine (Tony) Awad is chief operating officer at Synlogic. He has more than 18 years of experience in the biotech and pharma industry, with substantial expertise in the development and manufacturing of novel therapeutics from pre-IND studies through global commercialization. Prior to joining Synlogic, he was most recently at Abpro Therapeutics, where he served as senior vice president of CMC and operations and was responsible for the development of bi-specific antibodies for oncology and for leading corporate operational functions. Previously, he was at L.E.A.F. Pharmaceuticals and Merrimack Pharmaceuticals. Awad is a graduate of Boston University, holds a bachelor's degree in biochemistry and molecular biology, and conducted graduate research at the Boston University School of Dental Medicine.
Bitcoin: Why Environmentalists Need to Relax, It’s Better Than You Think – BeInCrypto
Posted: at 8:03 am
Bitcoin isn't the great evil we have been led to believe. "Until you've looked deeply into something, you're not in a position to judge it," says Daniel Batten.
John Lennon once said, "Life is what happens when you're busy making other plans." I remembered this line today when I recalled the plans I was making for this year, and what actually happened.
For as long as I can remember, protecting the environment was my #1 value. At drama school, a friend affectionately teased me by christening me "Daniel loves trees more than people" Batten. In my teenage years and twenties, I was a regular on protest marches: against the confiscation of indigenous land, against genetic engineering, and against the logging of native forest, to name a few. The most recent one I went on was nine years ago, against deep sea oil drilling off New Zealand's coast.
I still remember the day a friend from Greenpeace, an organization I had supported for over four decades, rang me up to ask me to lead an action against one of McDonald's environmental practices. Together with a flock of humans dressed as chickens, we stormed the McDonald's HQ and I, dressed in Ronald McDonald-like regalia, announced my resignation. "Oh, and there is a slight chance you'll be arrested," he mentioned at the end. I was CEO of a technology company I'd founded at the time. Without telling my board what I was planning, I said yes.
Slowly it dawned on me though, I was a better supporter of technology than I was a protestor. What if I used that skill to make a difference? That led me to create a ClimateTech VC fund.
It was an easy decision. I'd been investing in technology companies for 19 years, so I knew what to look for, what to avoid and how to optimize a founding team's chance of success.
We invested in companies that not only made commercial sense but who we felt proud of. One company is on a mission to decarbonize the entire Zinc industry by 2045. Another has a goal to remove 50% of all CO2 emissions from the Greenhouse industry by 2030. The way theyre going, theyll probably make it.
About the same time, a friend of mine started talking to me a lot more about Bitcoin. I felt conflicted. On one side, I could see the social good: how it helped build a world where wealth transfer from the poor to the rich via quantitative easing was no longer possible. But as an environmentalist, I'd heard the stories about its energy usage and was unconvinced that it did enough good to justify its carbon emissions.
When Greenpeace came out against Bitcoin the inner conflict intensified. My Bitcoiner friend was telling me that it helped build out the renewable grid. My friends at Greenpeace were saying this was Greenwash propagated by greedy Bitcoin investors who would say anything to increase the user-adoption upon which their returns depended.
I realized I had to do my own research.
I didn't know what I'd find, but I suspected the truth would lie somewhere in the middle.
For the first time in my life, I spent an extended period of time simply researching something I was curious about and wanted the answer to.
My research led me to read more about climate change, CO2 emissions, and methane emissions than I'd ever read. I'll be honest: discovering the true enormity of the climate crisis and the task ahead of us was not easy reading. I forced myself to understand physics, energy, the jargon of the electrical grid, energy trading, and Bitcoin mining. I interviewed or listened to interviews with climate scientists, solar engineers, grid operators, analysts at utility companies, utility-scale wind operators, solar installers, battery experts and onchain analysts.
Slowly but surely, a picture started to emerge from the haze. It was a consistent picture, consistently espoused by all the people I spoke to who had looked into it deeply.
The conclusion was this:
1. Bitcoin mining can be used in a way that is bad for the environment. Examples of this include the re-opening of a gas plant in NY State for the sole purpose of Bitcoin mining.
2. Bitcoin mining can also be used in a way that is good for the environment. Examples of this include the solar and wind operators I discovered who would not have got financing to build their plants had it not been for having a Bitcoin mining customer.
That looks like a neutral outcome, with some arguments for and some against. But the outcome was anything but neutral.
I also found out:
1. That the direction Bitcoin mining is heading is towards renewable energy
2. That the rate at which it is transitioning to renewable energy is faster than in any other industry I'd seen (as a VC who sees around 50 cleantech pitches a year, we'd seen a whole stack!)
3. That the current percentage of renewable energy use is also higher than in any other industry
4. That all the solar engineers, battery engineers, grid operators and utility analysts I spoke to, people who'd widely studied how you build out a renewable grid, said the same thing:
1. You cannot build a renewable grid without having flexible load customers
2. Bitcoin miners are the best flexible load customers they've seen
3. Bitcoin mining is increasingly using energy that would otherwise have been wasted (such as solar energy at midday or wind at midnight, when people didn't need it)
4. Bitcoin mining provided a path to retire the fossil fuel-based turbines needed as backups during peak load times
5. Bitcoin mining helped with maintaining the frequency and voltage regulation of the grid (which becomes progressively harder with every 10% of variable renewable energy you add)
6. Bitcoin mining could make power more affordable to consumers by reducing the curtailment fees that utilities otherwise had to pay to renewable operators for not taking their surplus power.
There were a host of other benefits too. But this would require some deeper analysis (and jargon) about how electrical grids work.
But really, the first two points say it all: grids built on variable renewable energy must have flexible customers who can adjust their usage according to generation supply. They must also be able to reduce their usage given minutes' notice. Bitcoin miners are the only customers who provide this flexibility.
Or to put it even more bluntly: without Bitcoin mining, the renewable grid will simply not happen; it will remain an ideal that grid operators and utilities say they are working towards.
So what problem with renewables does Bitcoin uniquely solve?
Grid operators' #1 goal is to maintain the stability of the grid. Cost-effectiveness and renewable composition are important, but not as important as stability. That's because when grids fail, people die and grid operators lose their jobs (as recently happened during the Texas 2021 winter blackouts). Even with battery technology, this stability becomes progressively harder to achieve when you base your grid on variable renewable energy.
While we must move away from fossil fuel plants, coal and gas did offer one advantage over renewables: you could increase or decrease the generation of a gas plant at will. Solar and wind don't have that flexibility. They are also highly unpredictable.
When you add inflexible and unpredictable generators such as solar and wind, then unless you counterbalance them with flexible, predictable customers, the entire grid will become unstable because of either under-supply or over-supply of electricity, and there will be a higher risk of blackouts.
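To make that balancing argument concrete, here is a deliberately simple Python sketch with invented numbers (an editorial illustration, not the author's data or a real grid model): in each interval, a flexible mining load soaks up surplus renewable generation and drops to zero whenever other demand exceeds generation.

```python
# Toy balancing example with invented numbers: a flexible mining load absorbs
# surplus renewable generation and sheds itself entirely when inflexible
# demand exceeds generation.
renewable_mw = [900, 1200, 1500, 1100, 600]    # variable generation per interval
other_demand_mw = [800, 850, 900, 1000, 1100]  # inflexible customer demand
miner_capacity_mw = 400

for gen, demand in zip(renewable_mw, other_demand_mw):
    surplus = gen - demand
    if surplus >= 0:
        miner_load = min(miner_capacity_mw, surplus)  # soak up surplus, avoiding curtailment
        curtailed = surplus - miner_load
        backup_needed = 0
    else:
        miner_load = 0                                # shed load within minutes
        curtailed = 0
        backup_needed = -surplus                      # must come from storage or peaker plants
    print(f"gen={gen} MW demand={demand} MW -> miner={miner_load} MW, "
          f"curtailed={curtailed} MW, backup needed={backup_needed} MW")
```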
Ironically, as climate change bites, extreme weather events are becoming more common. This means that grid operators are faced with the Herculean task of trying to transition to a grid made up of variable renewable energy at a time when even the existing grid is becoming more unstable due to climate events.
Grid operators have investigated a number of other options: hydrogen, batteries, pumped hydro, device control programs, and demand response based on curtailing steel plants. None of them come anywhere close to the flexibility of Bitcoin miners. Even batteries are only a partial solution, for reasons I cover in detail in my separate article.
Bitcoin's location-agnostic and time-of-day-agnostic features turn out to make it the ideal way to remove most of the world's atmospheric methane caused by human intervention too, but that's another story.
In summary: here are some quotes directly from some of the key players.
"Never in my wildest dreams would I have imagined a customer as ideal as Bitcoin miners." [Utility-scale wind operator]
"I started off researching batteries as a solution to the intermittency of solar. I soon realized that without another offtaker for surplus power, batteries were incomplete. After testing a number of possible offtakers, I realized that the best one by far was Bitcoin mining." [Sam Kivi, solar engineer]
"We can use that cryptocurrency to find a home for more solar and more wind to come to our grid. Then they reduce consumption when we need that power for other customers. So it's a great balancing act." [Brad Jones, interim CEO, ERCOT (the grid operator for Texas)]
"A big component of how we get [to our 2050 renewable grid goal] is new demand response strategies. And Bitcoin mining is different because you can reduce demand in minutes to the exact level you need with pinpoint precision. You just don't see that level of flexibility or response in these legacy (demand response) programs." [Lead analyst, major US electrical utility]
So, there you have it. It wasn't what I was expecting to find. I had to put aside everything I thought I knew about Bitcoin. The more I go through life, the more I realize that until you've looked deeply into something, you're not in a position to judge it.
A deeper look at Bitcoin reveals a surprising truth. Bitcoiners sometimes despair that the mainstream media narrative does not take this deeper look. I would encourage these Bitcoiners to not be concerned. Every novel disruptive technology gets attacked because, well, it disrupts some people who do not want to be disrupted.
Meanwhile behind the scenes, Bitcoin is enabling one of the single most important transitions of a generation: the transition to the renewable grid. Sooner or later this truth will become so undeniable that Bitcoin detractors will be forced to choose a different attack vector. In the meantime, as an environmentalist, I could not be more delighted that we have Bitcoin in our world.
There is a phrase in the climate movement: the perfect is the enemy of the good. Solar isn't perfect, but it's good. The same is true of wind. Bitcoin is in the same club: not perfect, but good. Bitcoin offers us a practical way to build out the renewable grid at a time when we're in a footrace against a fast-warming world.
Daniel Batten is a ClimateTech investor, author, ESG analyst and environmental campaigner who previously founded and led his own tech company which exited in 2019.
Dickson Prize Day will celebrate past three winners – University Times
Posted: at 8:03 am
The Pitt research community will celebrate the Dickson Prize in Medicine winners for 2020, 2021 and 2022 with the first-ever Dickson Prize Day on July 19.
The Dickson Prize in Medicine is the most prestigious honor given by the Pitt School of Medicine. It has been given out annually since 1971 to an American biomedical researcher who has made significant, progressive contributions to medicine. The award consists of a specially commissioned medal, a $50,000 honorarium and an invitation to present a seminal lecture at Pitt. The winners for the past two years were unable to travel to campus because of the pandemic.
For Dickson Prize Day, the past three years' winners will give in-person talks to the University's research community, followed by talks from Pitt faculty whose work complements that of the Dickson Prize recipients.
Pitt faculty panelists include Warren Ruder, Vaughn Cooper, Alexander Deiters, Alison Morris, Toren Finkel, Arjumand Ghazi, Aditi U. Gurkar, Andrey A. Parkhitko, Kay Brummond, Robert Ferris and JoAnne L. Flynn.
The event will be from 8 a.m. to 5:15 p.m. July 19 in the University Club, with a reception to follow. All events also will be livestreamed. Register now.
Carolyn Bertozzi, the Dickson Prize honoree for 2022, will give a talk, "Therapeutic Opportunities in Glycoscience." Bertozzi is a professor of chemistry at Stanford University.
Bertozzi's research interests span the disciplines of chemistry and biology, with an emphasis on studies of how sugar molecules on cell surfaces are important contributors to diseases like cancer, inflammation and bacterial infection. Her lab has identified ways to modify these sugar molecules through bioorthogonal chemistry, a method that employs chemical reactions that do not interfere with normal cellular processes. This approach has allowed her to develop new therapeutic approaches to treat many diseases, including most recently in the field of cancer immunotherapy.
In addition to her research, Bertozzi works actively to translate her science into new therapies. She has cofounded several startups, including Redwood Bioscience, Enable Biosciences, InterVenn Biosciences, OliLux Biosciences and Lycia Therapeutics.
Cynthia Kenyon, the 2021 Dickson Prize winner, will discuss "The Plasticity of Aging." Kenyon is vice president of aging research at Calico Life Sciences, an American Cancer Society professor, and emeritus professor of biochemistry and biophysics at the University of California, San Francisco.
In 1993, as a faculty member at UCSF, Kenyon discovered that a single gene mutation could double the lifespan of healthy roundworms, a finding that sparked an intensive study of the molecular biology of aging. Her research showed that the aging process is not random and haphazard as previously thought but, instead, is subject to active genetic regulation. Her work led to the realization that a hormonal network influences the rate of aging in many organisms, possibly including humans.
Since then, Kenyon and her lab members have discovered a variety of genes that influence aging by coordinating diverse processes that protect cells and tissues. In addition, Kenyon and her team found that different kinds of tissues work together to control the pace of the aging process, and that individual neurons and germ cells can control an animal's lifespan.
James J. Collins, the 2020 Dickson Prize honoree, will give a talk on "Harnessing Synthetic Biology and Deep Learning to Fight Pathogens." Collins is a professor of medical engineering and science and professor of biological engineering at the Massachusetts Institute of Technology. He also is affiliated faculty with the Broad Institute of MIT and the Wyss Institute at Harvard University.
Using engineering principles to design and construct synthetic gene networks, he was one of the first to harness the biochemical and biophysical properties of nucleic acids and proteins to create biological circuits. A seminal 2000 publication in Nature describing the successful creation of a bistable, synthetic gene switch in Escherichia coli has been cited more than 4,000 times and marks the arrival of an important new discipline in biomedicine.
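For readers curious what "bistable" means here, the Python sketch below integrates the standard textbook form of a two-repressor toggle-switch model of the kind described in that paper: two genes whose products mutually repress each other. The parameter values are illustrative and are not taken from the publication.

```python
# Generic mutual-repression (toggle switch) model: two repressors u and v,
# each inhibiting the other's synthesis. Illustrative parameters only.
import numpy as np
from scipy.integrate import odeint

alpha1, alpha2 = 5.0, 5.0   # effective synthesis rates of the two repressors
beta, gamma = 2.0, 2.0      # cooperativity of repression

def toggle(state, t):
    u, v = state
    du = alpha1 / (1.0 + v ** beta) - u    # repressor 1, inhibited by repressor 2
    dv = alpha2 / (1.0 + u ** gamma) - v   # repressor 2, inhibited by repressor 1
    return [du, dv]

t = np.linspace(0, 50, 500)
# Two different initial conditions settle into two different stable states;
# the coexistence of those states is what "bistable" means.
state_a = odeint(toggle, [4.0, 0.1], t)[-1]
state_b = odeint(toggle, [0.1, 4.0], t)[-1]
print("steady state from u-biased start:", state_a)
print("steady state from v-biased start:", state_b)
```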
Collins later demonstrated that synthetic gene networks could be linked with a cell's genetic circuitry as a regulatory mechanism to create programmable cells for biomedical applications. More recently, Collins has created engineered microbes and whole-cell biosensors to serve as in vivo diagnostics and therapeutics. One innovative platform that he and colleagues developed embeds freeze-dried, cell-free synthetic gene networks onto paper and other materials, with a wide range of potential clinical and research applications.
The resulting materials contain properties of a living cell, are stable at room temperature and can be activated by simply adding water. Collins' work on freeze-dried, cell-free synthetic biology has established a platform for a new class of rapid, programmable in vitro diagnostics for emerging pathogens, including drug-resistant bacteria and viruses. Collins and his team are currently developing a rapid, self-activating COVID-19 face mask as a wearable diagnostic.
Climate Action Farming: What farmers need to do to adapt as our planet heats upand why organic and ‘regenerative agriculture’ cannot meet the…
Posted: at 8:03 am
If you are doubtful whether climate change is real, just ask a farmer. Setting aside the debate over the degree to which humans or natural cycles are driving it, farmers globally are grappling with temperature extremes, droughts broken by intense rainfalls and flooding, increasing wind intensity, and more and more dangerous pests. It's a global threat.
As crop productivity is particularly sensitive to shifting and increasingly unfavorable weather patterns, farmers need to find ways to enhance their systems' resilience. As with other sectors, agriculture needs to address its unique emissions footprint. But looking beyond these challenges, agriculture has the opportunity to help mitigate the threat of climate change.
There are a number of practices that farmers could adopt to manage their land in ways that would optimally remove carbon dioxide from the atmosphere and store it underground in stable forms of organic matter. Such systems could be called climate action farming. While these practices are fairly well known they are not yet widely adopted because of practical and economic challenges. It will take a true paradigm shift in mainstream farming for these methods to be adopted at the scale and in the time frame necessary to have sufficient impact on the trajectory of climate change.
Fortunately, there is a historical precedent: a similarly significant farming paradigm shift, a response to a different environmental crisis of human origin, helped rescue American agriculture 90 years ago. The crisis was the Dust Bowl of the 1930s, and the solution that evolved was something called no-till farming.
This year marks the 60th anniversary of the first commercial no-till field grown in Kentucky in 1962. What is no-till farming and why is it so important for addressing climate change?
Minimum tillage is the foundation for climate action farming, as the potential to capture and store carbon is enhanced by complementing that system with things like cover crops, some diverse crop rotation or double cropping options, certain forms of animal farming integrations, and controlled wheel traffic to minimize soil compaction.
Unfortunately, traditional farming methods depend on plowing and tillage to control weeds and to prepare a seed bed. In the process, they degrade the soil. Much of the captured carbon dioxide is then released through accelerated microbial breakdown of the organic matter, the soil particles become smaller and more susceptible to erosion, and earthworm populations crash. These tilled soils are less able to capture and store water. The impacts of tillage have always been problematic, but they are more so in an age of climate change with greater heat stress and more extreme rainfall and drought events.
In contrast, no-till farming systems developed over the last 60 years are able to restore soil health, give crops greater climate resilience and reduce the use of fossil fuel for the plowing process. No-till farming is a system that avoids virtually all mechanical disruption of soil. It was originally developed as a solution to soil erosion.
Here is how it works. Undisturbed, natural soils are diverse, living systems that are fed by the roots of growing plants. Over time some of the carbon captured by the plants is stored in the soil in the form of long-lived organic chemicals that help to form three-dimensional aggregated particles that give the soil physical stability, air circulation and nutrient buffering. The plants are able to absorb key nutrients through their roots for growth, and then earthworms help to recycle those nutrients by carrying some of last seasons plant material down from the surface through their tunnels that also enhance aeration. The resulting, healthy soils are good at capturing and storing rainwater and at resisting erosion by wind or water.
Sixty years ago, no-till was a radical concept. Today, more than 100 million acres of US farmland are tended using no-till and related methods. According to the United Nations Environment Programme, no-tillage operations in the United States have helped avoid 241 million metric tons of carbon dioxide since the 1970s. That's equivalent to the annual emissions of about 50 million cars.
It is important to understand what enabled the large-scale shift to minimum tillage and also to identify the factors that stood in the way, because these same factors will almost certainly influence the trajectory of agriculture's role in addressing climate challenges going forward.
The Dust Bowl was a prolonged and massive series of wind erosion events that occurred during a drought in the 1930s.
Leading up to that time, what had once been deep, rich prairie soil had been degraded by decades of plow-intensive farming methods. In 1934 a soil surveyor named Hugh Hammond Bennett timed his congressional testimony about the crisis to a day when one of those dust clouds was sweeping over Washington DC.
Congress acted unanimously to establish the Soil Conservation Service, which made some progress in reducing erosion with conservation tillage methods that involved less dramatic soil disturbance, but the fundamental farming paradigm had not changed, nor had the erosion problem been fully addressed. Then in 1943, an agricultural agent in Ohio named Edward Faulkner published a book titled Plowman's Folly, which raised the then radical concept of farming without any plowing or tillage as the solution to wind and water erosion.
Faulkner's vision of farming without the use of the plow was initially advanced through the efforts of visionary mechanical engineers and agronomists in the public sector. They developed prototype equipment and did the trial and error in field plots until they had systems that could be shown to farmers, often in the form of a demonstration plot. It was one such demo by an agronomist from Illinois named George McKibben that inspired that Kentucky farmer to try no-tilling in 1962. There was also a good deal of ongoing cooperation between the researchers and the early adopters of no-till, because both then and now the growers have much to add to the process, as described in the next section.
There has been a great deal of public sector research on the other climate action procedures such as cover cropping, forage double cropping, and non-traditional crop rotations. A watch-out for today is declining funding for that sort of public, applied research, and a shortage of young people pursuing careers in agricultural sciences at just the time when the baby boom generation is retiring. Hopefully increasing climate concern could help remedy that situation.
After the research phase for new cropping methods, the next stage was early commercialization. Dwane Beck, the long-term director of the Dakota Lakes Research Farm, says that there are three groups that provided the leadership during this phase: innovators, adapters and early adopters.
The innovators are those who have the inclination to try out-of-the-box ideas and who are not dissuaded by the potential for failure. As Beck puts it, "if you don't ever fail, you aren't being creative enough." Some of their ideas actually worked. It then fell to the adapters to work out the practical details. These two groups were often involved in the development of new, specialized machinery long before the mainstream manufacturers stepped in.
In the next stage, early adopter farmers are open to new ideas, but they need to see and touch real-world examples before trying them out on their own farms.
These three kinds of farmers are the kind of leaders needed to advance climate-action farming. Agricultural trade publications such as No-till Farmer, Strip-Till Farmer and Progressive Farmer are full of stories about these leaders and their innovation with new methods and systems. This is the base for the next phase of wider adoption.
The early adopters of no-till faced criticism and even ridicule about their "trashy" farms, so much so that many avoided the traditional coffee shop meeting places where they would be subject to that kind of harassment. Fellow no-tillers ended up developing their own supportive communities in which they could find affirmation, encouragement, advice and assistance.
By 1972 they had their own magazine, No-till Farmer, which featured stories based on research and experience as well as a cartoon series featuring tongue-in-cheek exchanges between Plowman Pete and No-till Ned.
They instituted annual meetings and gave out No-till Innovator awards.
Farmers who self-identify as no-tillers still comprise a distinct community today, as do strip-tillers, who favor a different but still soil health-compatible approach. Strip-till is a field tillage system that combines no-till with shallow tillage of strips that make up a small portion of the field used to produce row crops. Strip-till systems remove residue from the soil surface over the seedbed, resulting in soil temperatures similar to conventional tillage systems.
Most observers agree that these cutting-edge farmers are most likely to drive the expansion of climate action farming. Climate activists in the non-farming population need to see past the unfair but all too common portrayal of agriculture as a faceless, monolithic sector engaged in industrial farming. Yes, it is an industry (and an amazingly productive one), but it is made up of nearly all family farms and it includes a sizable subset of thoughtful people who should be seen and respected as climate care allies, not as an enemy.
As one of the main purposes of tillage is to take care of weeds, no-till farmers had to find other ways to manage them. The adoption of no-till increased more rapidly once certain key herbicides, such as glyphosate, paraquat, and 2,4-D became available in the 1970s through the early 90s. Herbicides also enabled termination of the vegetation that was kept on the field from fall through spring without soil disturbance.
Another major driver of no-till expansion was the introduction of genetically engineered, herbicide-tolerant crops, starting in the 1990s, with glyphosate-tolerant crops the most popular. As farmers adapted to the new technology, some complications emerged. Weed control, an ongoing challenge for all of agriculture, is now complicated by the emergence of herbicide-resistant weed species, regulatory limitations and the difficulty of finding new herbicide chemistry.
Some of the climate care practices can help with weed management while others create new weed and vegetation management issues. Herbicides will continue to be important, but there is a great deal of room for innovation here including things like small, autonomous mobile devices that are guided by aerial imagery to hunt down and mechanically terminate invasive weeds.
Farming is a business, and generally one that is high risk with moderate reward. The transition to no-till does save money because of reduced fuel consumption, but there are capital expenses for the specialized equipment. Over time, untilled fields can produce higher yields, particularly under drought stress. This is because undisturbed soil has larger aggregates and more earthworm activity, which allows it to better aerate, better capture and retain rain or irrigation water, and more effectively buffer nutrient supplies. Thus, the system tends to pay for itself in the medium term.
Climate action farming systems further enhance these soil health benefits, and they have some bottom-line impacts through reduced fertilizer requirements or through revenue from grazing or double cropping. Some systems can reduce income if some growing seasons are devoted to less profitable rotational crops, for instance in order to increase carbon capture with deep-rooted crops. There are seed and fertilizer costs for some practices. In any case, these systems may eventually be able to pay for themselves. These economic tradeoffs are a logical topic for applied research in the public sector and something that would also logically fall to the adapters described above.
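As a purely hypothetical illustration of that payback logic (none of these figures come from the article, and actual costs and savings vary widely by farm, region and crop), a back-of-the-envelope calculation might look like this:

```python
# Back-of-the-envelope payback sketch. Every figure here is hypothetical;
# actual costs and savings vary widely by farm, region and crop.
acres = 1000
equipment_cost = 120_000          # capital cost of no-till drill/planter ($)
fuel_and_labor_savings = 18.0     # $/acre/year from eliminating tillage passes
yield_gain = 7.0                  # $/acre/year from better moisture retention over time

annual_benefit = acres * (fuel_and_labor_savings + yield_gain)
payback_years = equipment_cost / annual_benefit
print(f"approximate payback period: {payback_years:.1f} years")
```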
There are ways that farmers can be paid to farm in ways that sequester carbon. Companies can buy carbon credits if they have a carbon footprint goal that they can't achieve by reducing their own emissions. McKinsey estimates that there will be a $50 billion market for carbon credits by 2030. There are programs specifically designed to pay farmers to adopt practices in the climate action category.
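The revenue arithmetic for such programs is simple in principle, even if the measurement questions below are not. Here is a sketch with invented numbers; the per-acre sequestration rate and the credit price are assumptions, not figures from the article:

```python
# Hypothetical carbon-credit revenue estimate. Sequestration rate and credit
# price are illustrative assumptions only.
acres = 1000
tons_co2_per_acre_per_year = 0.5   # assumed sequestration under climate action practices
price_per_ton = 20.0               # assumed credit price ($/t CO2)

annual_revenue = acres * tons_co2_per_acre_per_year * price_per_ton
print(f"estimated annual credit revenue: ${annual_revenue:,.0f}")
```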
But questions abound both from within and outside the farming community. Can it pay enough to make it worthwhile for the farmers? Does it discriminate against those who have already made desirable changes? How should the sequestration be measured and verified? What about rented land on which the next lease-holder may change practices? What happens after a contract expires? [There is a thorough exploration of these questions in the farming trade publication AgriPulse.]
There are major efforts underway to use genetic improvement methods to develop crops that are more climate resilient, more productive and more pest resistant. Some of this can be accomplished through conventional breeding, but CRISPR gene editing and other modern biotechnology tools are in many cases more effective. It may also be possible to modify crops in ways that make them even better at carbon capture and storage.
Unfortunately, there are significant barriers that limit the use of advanced genetic technologies for all but a few crops in selected regions. Some industries, such as citrus growers, have been reluctant to adopt biotechnology techniques for fear of consumer resistance, even in cases in which genetic engineering could help contain their biggest nemesis, citrus greening.
The European Union, the most prominent of several regions and countries to do so, has ignored the advice of its own scientists and blocked most uses of genetic engineering that could benefit its own farmers and the public. EU rejectionism has had global implications. Through its leverage as a major importer and its post-colonial influence in Africa and elsewhere, the EU has effectively blocked these technologies from being used on other crops (e.g. wheat, although a climate-adaptive GE wheat is now grown in Argentina and exported elsewhere). There had been some hope that the next generation of genetic technology, gene editing, would break through these barriers, but this remains uncertain at best.
The term regenerative agriculture is often used in discussions of how agriculture can contribute to climate action. As with organic, the diverse and evolving definitions of what counts as regenerative are often ideological rather than scientific, excluding synthetic fertilizers and crop protection products as well as GMOs and gene editing.
In terms of soil disturbance, regenerative agriculture definitions typically call for conservation tillage, because without herbicides true no-till or strip-till is impractical at scale.
A technologically constrained, organic-like system is definitely not a good model for climate action farming. Organic farming has a clearly documented yield disadvantage, on average 15-40%, which means that if it were ever widely practiced it would require much more land just to maintain current production, a climate-unfriendly proposition in a world with no arable land to spare, short of clear-cutting forests.
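The land-use arithmetic behind that claim is worth spelling out. If the fractional yield deficit is $d$, holding total production constant requires

    \[ \text{extra land needed} = \frac{1}{1-d} - 1, \]

so the 15-40% range quoted above translates into roughly $1/0.85 - 1 \approx 18\%$ to $1/0.60 - 1 \approx 67\%$ more cropland for the same output.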
Also, the organic business model depends on a price premium, sometimes double the price of conventionally produced food, with no evidence of nutrition or health advantages. This makes it inherently worse at serving those in lower economic strata. No-till has seen vastly more adoption than organic, particularly for the row crops which represent the majority of land available for agricultural carbon sequestration. Yet there was never a price premium associated with the crops that were managed with that system.
If society hopes to see farmers play a role in climate change mitigation, it is more appropriate to respectfully ask them "what can you do?" rather than to tell them to do something without the help of key technologies. The ideas, innovation, communities, and technologies that enabled large-scale adoption of no-till farming represent an encouraging model for how agriculture could make a next-generation paradigm shift in farming methods.
Only farmers can make this happen. They need research support, rational regulations, possibly carbon market income, understanding landowners, and the respect of the broader climate action movement. This challenge is too important to intermingle with ideological or marketing agendas.
Steve Savage is a plant pathologist and senior contributor to the GLP. Follow Steve on Twitter @grapedoc
Jon Entine is the founding executive director of the Genetic Literacy Project, and winner of 19 major journalism awards. He has written extensively in the popular and academic press on agricultural and population genetics. You can follow him on Twitter @JonEntine
Posted in Genetic Engineering
Comments Off on Climate Action Farming: What farmers need to do to adapt as our planet heats upand why organic and ‘regenerative agriculture’ cannot meet the…
Legal confusion on essentially derived varieties must be cleared up for the sake of innovation – Green light to gene-editing plants risks uncertainty…
Posted: at 8:03 am
The UK Government announced on 10 May in the Queen's Speech that it intends to pass the Genetic Technology (Precision Breeding) Bill to permit the use of gene-editing technologies in the commercialisation of plants and animals to improve agricultural efficiency and food production in the UK. This comes on top of a recent change in legislation, back in April, to permit research and development using gene-editing techniques.
Both changes are part of a post-Brexit shift in the regulatory landscape, bringing the contentious issue of GMOs and intellectual property to the fore once again. The ground-breaking news follows a decade of a largely EU-wide moratorium on the approval and commercialisation of gene-edited crops.
With the fast pace of change brought by genome editing as part of new breeding techniques (NBTs) to develop plant varieties, there is a real concern that breeders may seek to launch new crops carrying a biotech trait by using existing plant material without authorisation or any compensation to the holder of plant breeders' rights in the existing variety.
Meanwhile, some biotech companies argue that the uncertainty of the legal concept of essentially derived varieties (EDVs) is stifling innovation. Others cite the difficulty of policing the use of material from their protected initial variety and of bringing a claim for infringement, as well as the risk of damage to their business from competition from a derived variety.
We expect to see increased regulatory challenges and disputes in this area, and businesses must monitor the latest developments to ensure they are prepared.
A revolution in genetic engineering has transformed plant breeding. Development of all new plant varieties typically involves access to, and use of, plant material of existing plant varieties. The traditional process of crossing and selection relies largely on unintended recombination of genes and random mutants occurring naturally, which are selected and developed further over many years. Using multiplication techniques, a new variety can take over ten years to get to market. However, new breeding techniques can reduce the development time for a new variety to only two to four years.
One of the most controversial legal areas of plant breeding is the concept of essentially derived varieties (EDVs). The concept, first set out in the 1991 UPOV Convention, is predicated upon an open model of innovation, which allows breeders to use plant material for the purpose of breeding a new variety, the so-called breeders' exception (Art 15(1)(iii)).
Recognising the advent of biotechnology, UPOV introduced a compulsory limitation upon the breeders' exception in respect of EDVs under Art 14(5)(b). The main driver for the limitation was to prevent plagiarism of traditionally bred varieties using new technologies, but it was also a response to erosion of the minimum-distance requirement in phenotypical characteristics that forms part of the DUS (distinctness, uniformity and stability) requirements for protecting a new variety.
The complex EDV concept was intended to prevent the commercialisation of a derived variety without the consent of the owner of the plant breeders' rights in the protected initial variety. The idea was that, given the significant contribution that the work of breeding the initial variety had made to the derived variety, the owner of rights in the initial variety should be entitled to license the material embodied in the derived variety and obtain a financial benefit. This would prevent a developer from simply patenting a biotech trait and then commercialising it by inserting it into plant material already protected by plant breeders' rights to produce an improved variety. It is not intended to prevent the developer of a derived variety from obtaining protection for that variety.
The complex definition of EDV raises many legal issues and uncertainties, including what amounts to "predominantly derived", and whether it matters, for the expression of the essential characteristics, that the EDV has strong phenotypical similarity to the initial variety as well as genetic identity with it. The concept has polarised views and been applied differently around the world, making this an even more complex issue to grasp.
The lack of clarity on EDVs has already led to significant controversies and litigation, notably in the case of Danziger 'Dan' Flower Farm v. Astée Flowers B.V., where the Dutch Court of Appeal, The Hague, held that the Gypsophila variety Blancanieves was not an EDV of Million Stars ('Dangypmini').
There is also a long-running dispute over the mandarin varieties Nadorcott and Tango, centring on whether Tango is an EDV of Nadorcott. The latest spin-off, in 2021, pitted the King of Morocco (owner of Nadorcott) against a Spanish farming business over alleged unlicensed production of the variety, with the Advocate General giving his Opinion on 14 October 2021 on a preliminary reference to the CJEU from the Supreme Court of Spain. Cases have been heard in the US, Australia and South Africa over whether Tango is an EDV, and before the Community Plant Variety Office as to whether Tango is a distinct variety capable of protection.
So, what is the way forward? The debate continues, but there have been some recent developments that businesses should be aware of. For example, a Working Group on EDVs was set up by UPOV to revise the Guidance on EDVs, and a draft text was last discussed on 19 October 2021, with a view to its adoption by the UPOV Council.
Some experts are keen that the new Guidance provides a more practical framework to distinguish between the cases which are clearer instances of EDV or non-EDV and how borderline cases may be determined. They argue that the inventive contribution of both the initial variety and the derived variety should be reviewed, in terms of the expression of their essential characteristics.
The legal presumption may be upon the owner of the protected initial variety to prove essential derivation and then claim dependency. However, some argue that the burden of proof should be reversed as the developer of the derived variety may have better data about the gene edit.
There is a role for biochemical and molecular markers, though they need to be applied cautiously if used to determine whether a variety is an EDV.
There may also be a role for plant variety or patent offices to be able to provide an early declaratory decision, based upon receiving technical evidence. Alternative dispute resolution, such as expert determination or mediation facilitated by an industry expert, may also become more commonplace.
In addition, there may be value in developing industry-standard approaches to licensing and model form contractual clauses, perhaps drawing upon other licensing regimes (such as SEPs) or perhaps even under the Nagoya Protocol for plant genetic resources.
Biotech companies and plant breeders alike should closely monitor these developments and weigh up the risks of dispute, to avoid disruption to a smooth launch of a new crop into the desired market.
The UPOV Council needs to take a clear position soon on how an EDV can readily be determined, to give industry certainty when using gene editing to develop new varieties, and on the situations in which a licence may be required and royalties payable before commercial launch.
This article was first published in May 2022, on IAM media.com here.
Read the rest here:
Legal confusion on essentially derived varieties must be cleared up for the sake of innovation - Green light to gene-editing plants risks uncertainty...
Posted in Genetic Engineering
Comments Off on Legal confusion on essentially derived varieties must be cleared up for the sake of innovation – Green light to gene-editing plants risks uncertainty…
Helen Clark vs John Campbell: the Corngate interview, 20 years on – The Spinoff
Posted: at 8:03 am
It was the election interview that gripped the nation, taking the then white-hot issue of genetically modified food and turning it into incredible political TV drama. Duncan Greive reflects on Corngate, an epochal moment in our media history.
The first thing that strikes you is the staging: the studio is pitch dark, with bright spotlights on Helen Clark and John Campbell. He is rounding into the early era of his cult status as a probing, fearless interviewer; she is at the height of her power and influence as prime minister.
Clark looks fierce, Campbell locked in. He's holding his clipboard and unleashing a volley of very heated statements. "Did you mislead the Royal Commission?" he asks, and later, "It's about whether or not we can trust you," repeating it for emphasis. "Feel free to shoot the messenger," he says toward the end.
Clark was not told about the specifics of the interview, and is clearly furious as a result, responding to Campbell's questions with relentless real-time media criticism: "You may think this is a really smart way to set up the prime minister," she says at one point. "The more this interview goes on, the more offended I am," she says later. "It's simply preposterous to carry on."
Every moment of it is extraordinary. The original tape seems to have essentially vanished, with Three not responding to requests for the archive and only a grainy six-minute clip available on NZ On Screen (Update, July 9: Three has now responded and surfaced the full interview from their archives!). Still, you can feel the heat even after all these years.
Given the challenges we confront today, it's head-spinning to think all this arose over some delicious corn. The core of the issue was whether, during a time of major debate about the safety of genetically modified crops, GM corn was accidentally released into New Zealand's food supply. Campbell had been given an advance copy of Nicky Hager's book Seeds of Distrust, which alleged that a field of GM corn was mistakenly grown and distributed to consumers here, and furthermore, that cabinet had known and conspired with officials from the Ministry for Agriculture to cover it up.
The interview became known as Corngate and was one of the defining flashpoints of the decade, accurately described by a contemporary report as a bomb dropped right into the late stages of the 2002 election period. It derailed Labour's campaign, pitted the party bitterly against the Greens, caused a lasting rift between TV3 and Clark, and played a role in making genetic engineering a politically untouchable subject to this day.
The idea that a television interview could have such impact speaks to the era in which it aired. Twenty years ago TV news presenters were figures of huge socio-cultural power, and sometimes major newsmakers in their own right.
John Hawkesby received a $5.2m payout after his ill-fated three-week stint presenting the 6pm news on One. A few years later, Paul Holmes resigned from TVNZ in a fury over what he perceived as an insulting contract offer, and for a brief, glorious moment we had an impossible bounty of current affairs in primetime: Close Up on One, Campbell Live on TV3 and Paul Holmes (the show) on Prime all competing with the juggernaut that was mid-2000s Shortland Street.
This was the absolute apex of television as the agenda-setting centre of our lives, and TV3, playing David to TVNZ's Goliath, had a pair of white-hot young stars reading the 6pm news in Carol Hirschfeld and John Campbell. The channel had become a beloved challenger to state-owned monolith TVNZ, innovating on style and form. Hirschfeld and Campbell had come to embody the network: young, sharp and fearless. Campbell was a brilliant interviewer, smart yet with a rare ability to emotionally connect with the audience.
He had been inserting live political interviews into bulletins for some years, yet Corngate represented a massive escalation. It was a major break from the schedule: TV3 was still years away from elevating Campbell to his own show in Campbell Live, and had settled on syndicated airings of broad American sitcoms like Home Improvement as its best weapon to confront TVNZ's Holmes and Shortland Street at 7pm. The very fact of the interview breaking that 7pm routine gave it a huge sense of occasion.
It came towards the end of an oddly discombobulated election campaign. National was at its lowest ebb, careening toward its worst-ever election result under the leadership of a baby-faced Bill English, who barely warranted the withering stare of Clark. She was a prime minister of immense force of will and personality, probably our most imposing since Rob Muldoon. But with the capitulation of the right, the chaos of a divided left bloc became the focal point of the election.
Labour had governed its first term in coalition with the Alliance, which became a cautionary tale for minor parties thereafter, collapsing to a less than 2% share of the party vote in 2002. In their stead came a thicket of smaller players that collectively amassed an MMP record 37% of the vote, with NZ First, Act, United Future and the Greens all attracting over 6.5% of the electorates support.
It was the latter party that became Clark and Labour's biggest headache, resulting in some memorable lines, including the prime minister referring to the Greens as "goths and anarcho-feminists" in the days before the Corngate interview. That this has not become the Greens' official slogan is one of the enduring mysteries of our politics.
The focal point of much of this rancour was genetic modification (GM), a then relatively new branch of science involving the direct alteration of an organism's genetic material to produce novel or altered organisms. It has applications across medicine and industry, but its use in agriculture drew the most attention. Proponents saw the potential for higher-yielding or more drought-resistant crops, or livestock less prone to particular forms of disease. The science has subsequently essentially settled in favour of GM, but it was highly contentious in our politics at the time, and had been the subject of a Royal Commission in 2000.
The commission came back with a cautious endorsement of the technology, saying "New Zealand should keep its options open. It would be unwise to turn our back on the potential advantages on offer." It did little to resolve divisions over the issue, which saw the country split into two camps. Broadly speaking, business and the agricultural sector saw GM as a science crucial to growth, while much of the environmentally minded left saw it as a dangerous, unproven practice that risked our clean, green reputation. Following the commission's report, a two-year moratorium preventing applications for the release of genetically modified organisms was put in place in 2001.
As the country rounded into the 2002 election, investigative journalist Nicky Hager was working on his third book. Seeds of Distrust examined the accidental release of GM corn, the potential for contamination of other crops and the decision not to notify the public of the incident. He scheduled it for publication on July 10, less than three weeks before the general election on July 27.
Campbell had worked with Hager twice before, fronting major stories accompanying both Secret Power, which covered the Waihōpai spy base and its links to international espionage networks, and Secrets and Lies, which exposed the infiltration of West Coast environmental groups. The pair had become close, and Hager sent Campbell the manuscript for Seeds of Distrust, which emerged on a gigantic roll of fax paper. Campbell pored over it for weeks, recalls Hirschfeld, with the support of news editors Mark Jennings and Mike Brockie. (Jennings did not respond to a request for comment for this story.)
The team knew that this was a big story, but had to approach it carefully. TV3 went to Clark's office with an innocuous cover, suggesting a general conversation about genetic engineering rather than a laser focus on the decision not to publicise the potential release of the GM corn in November 2000. The fateful interview took place in-studio, but 20 years on, Hirschfeld says the infamous spotlit aesthetic was not an attempt to dial up the drama. It instead had a much more banal explanation. "What was the thinking? We had no money," she laughs.
They recorded enough footage for that electric half hour of television, cutting only around five minutes, which Campbell says was an attempt to give viewers the whole context. Afterwards, Clark left in a hurry, but not before telling Campbell exactly what she thought of him. She used the word "treachery", Campbell recalls, while Hirschfeld remembers "you traitor". Clark declined to be interviewed for this story, but has in the aftermath made her view of the interview abundantly clear.
TV3 cut it together into the half-hour special and aired it the following evening, July 9, 2002. It began with an opening segment in which Campbell interviewed Hager and travelled to the location of the alleged leak of the corn; the final two segments were dominated by that extraordinary interview with Clark. It was an immediate sensation, deeply uncomfortable yet incredibly compelling TV.
"It was almost unbearable to watch," says Hager now. Writing for the NZ Herald, Jeremy Rees described it as "a study of outrage and anger". In the days that followed, media reporting of the special tended to side with Clark, with Russell Brown typifying the criticism on his Hard News segment on bFM, saying that the way it did emerge, dropped like a bomb on the election campaign, was simply wrong.
Clark certainly thought so. She labelled Campbell "a sanctimonious little creep", and interrupted her campaign to respond with a fusillade delivered from the lectern at a hastily arranged press conference. There she made it clear she blamed the Greens, and leader Jeanette Fitzsimons: "I am going to sing from the rooftops that this is a very dirty campaign where the Greens and their supporters have descended to the gutter of the National Party."
Fitzsimons did not deter that impression when she issued a press release saying she was "deeply distressed that the prime minister apparently decided to let this contaminated crop be grown, harvested, eaten and possibly exported in 2000/2001", and that the government participated in efforts to keep the truth from the public. Similarly, the fact Seeds of Distrust was published by Craig Potton, a former Greens candidate, made it easy for Clark and Labour to frame it as an orchestrated hit, though Hager is adamant there was no collusion, and says the Green Party was privately furious with him for distracting from its policy agenda.
The interview became the defining moment of the campaign, and while it didn't impact the result, the bitter taste lingered, and reared up again a year later.
The Broadcasting Standards Authority received numerous complaints about the episode, including one from Mike Munro, the prime minister's chief press secretary, on behalf of himself and Clark. The regulator ultimately released a highly publicised ruling in July 2003. It ran to 92 pages and broadly vindicated the complainants, saying that standards were breached on multiple counts around balance and fairness. It faulted the tenor of the interview with Hager versus that with Clark as neither impartial nor objective, and the fact that Clark was not advised of the source of the allegations.
By BSA standards it was damning, but not unequivocal, and allowed TV3 to issue a press release that quoted Jennings as saying "we knew the story was right, we knew we had done our homework and the BSA ruling largely validates that view".
Russell Brown wrote a reflective response for Public Address afterwards in which he acknowledged that many parties got consumed by the heat of the moment and overdid their reactions, himself included, but faulted TV3 for only preserving the raw footage of the Clark interview, and not that of Hager. Brown says this left the unavoidable impression that it had treated Hager's allegations far more credulously than it did Clark's response.
Campbell disputes that characterisation today, saying he asked for and received the primary materials on which the book was based, and Hirschfeld says Campbell spent weeks on the story. "There were endless sessions going over the details," she says. "I've never seen him so prepared."
Clark herself was clearly completely blindsided, and spent much of the interview underlining that fact through gritted teeth. "It is simply not acceptable to set up the prime minister on something which happened a long time back in the term of government, that she was not the minister responsible for," she says at one point. Campbell was unconvinced of this then ("I think you do remember what happened," he says at one point) and remains so today. "She had a forensic rigour about her, she was across the detail of all the portfolios," he says.
While it ultimately had little obvious effect on the election outcome, the interview had a huge impact on the journalists involved, who had come up against a popular PM at the height of her influence. Hirschfeld recalls Campbell being so spaced out after it aired that he was nearly run down on Ponsonby Road while getting out of a car. Hager remains frustrated by what happened, and the way the furore completely overwhelmed the book upon which it was based. "I couldn't bear to look at it for years afterwards," he says, adding that he believes Clark has never forgiven him.
Campbell got it most personally, though, with one incident still seared into his memory. One night, not long after the special aired, he was out walking his one-year-old daughter through Three Lamps in Ponsonby, near his home. A woman he describes as having a patrician bearing approached him, and bent down to peer into the buggy. "I pity you, having him for a father," Campbell recalls her saying. His relationship with the prime minister was also seriously damaged by the incident. It took years to recover.
Clark's fury dimmed but did not pass, and Hirschfeld remembers TV3 being pointedly left until dead last for an interview as late as the 2005 election campaign. But by that stage their lives had changed, and in some ways Corngate, for all its complexity as an incident, had helped them grow.
Hirschfeld went on to become the producer of Campbell Live, a full-time 7pm current affairs show that cemented Campbell as a star. The fearlessness of that 2002 interview was present in his live interviews as he supplanted Holmes to become the emblematic broadcaster of the era. Hager moved on, too. During the next election he picked up the threads of what was to become The Hollow Men, his book about National's 2005 campaign, and the one he considers his best.
As for Clark, she would go on to win a third term, and leave the rancour of Corngate behind to bulldoze through future controversies, from anti-smacking legislation to the pain of the Foreshore and Seabed Bill. Perhaps the most lasting scar of the era is a political circumspection around genetic modification issues that lingers to this day, with the Greens' food policy embodying a tension in stressing affordability while confining GM to the lab.
Despite the BSA ruling, Campbell remains proud of the work. "To me, it wasn't a GM story, it was a political story. At a time of huge public interest in and fear about genetic modification, did bureaucrats and politicians combine to cover up the release of GM material into the environment?" Hager and Campbell both remain convinced that they did. Campbell's only regret is that his team were not more candid about the topic of the interview, but he isn't sure whether the prime minister would have fronted had they been more direct. "Was it the best of all the shitty options? I don't know," he says.
Twenty years on, our politics and media have changed immeasurably, and Hirschfeld expresses a sadness that such an interview has no place in primetime today. Watching it now, it's obvious why it had such an impact. "It was enormous," says Hirschfeld, and it was immensely compelling television that still retains its power to this day.
Follow Duncan Greive's NZ media podcast The Fold on Apple Podcasts, Spotify or your favourite podcast provider.
View post:
Helen Clark vs John Campbell: the Corngate interview, 20 years on - The Spinoff
Posted in Genetic Engineering
Comments Off on Helen Clark vs John Campbell: the Corngate interview, 20 years on – The Spinoff
How Cuba is eradicating child mortality and banishing the diseases of the poor – Peoples Dispatch
Posted: at 8:03 am
The authors at a clinic in Palpite in Cuba. Photo: Odalys Miranda/Twitter
Palpite, Cuba, is just a few miles away from Playa Girón, along the Bay of Pigs, where the United States attempted to overthrow the Cuban Revolution in 1961. Down a modest street, in a small building with a Cuban flag and a large picture of Fidel Castro near the front door, Dr. Dayamis Gómez La Rosa sees patients from 8 am to 5 pm. In fact, that is an inaccurate sentence. Dr. Dayamis, like most primary care doctors in Cuba, lives above the clinic that she runs. "I became a doctor," she told us as we sat in the clinic's waiting room, "because I wanted to make the world a better place." Her father was a bartender, and her mother was a housecleaner, but thanks to the Revolution, she says, she is a primary care doctor, and her brother is a dentist. Patients come when they need care, even in the middle of the night.
Apart from the waiting room, the clinic has only three other rooms, all of them small and clean. The 1,970 people in Palpite come to see Dr. Dayamis, who emphasizes that she has several pregnant women and infants in her care. She wants to talk about pregnancy and children because she wants to let us know that over the past three years, not one infant has died in her town or in the municipality. The last time an infant died, she said, was in 2008, when a child was born prematurely and had great difficulty breathing. When we asked her how she remembered that death with such clarity, she said that for her as a doctor any death is terrible, but the death of a child must be avoided at all costs. "I wish I did not have to experience that," she said.
Before the Revolution, the region of the Zapata Swamp, where the Bay of Pigs is located, had an infant mortality rate of 59 per 1,000 live births. The population of the area, mostly engaged in subsistence fishing and in the charcoal trade, lived in great poverty. Fidel spent the first Christmas Eve after the Revolution of 1959 with the newly formed cooperative of charcoal producers, listening to them talk about their problems and working with them to find a way out of the condition of hunger, illiteracy, and ill-health. A large-scale project of transformation had been set into motion a few months before, which drew hundreds of very poor people into a process of lifting themselves up from the wretched conditions that afflicted them. This is the reason why these people rose in large numbers to defend the Revolution against the attack by the US and its mercenaries in 1961.
To move from 59 infant deaths out of every 1,000 live births to no infant deaths in the space of a few decades is an extraordinary feat. It was done, Dr. Dayamis says, because the Cuban Revolution pays enormous attention to the health of the population. Pregnant mothers receive regular care from primary care doctors and gynecologists, and their infants are tended by pediatricians, all of it paid for from the social wealth of the country. Small towns such as Palpite do not have specialists such as gynecologists and pediatricians, but within a short ride a few miles away, they can access these doctors in Playa Larga.
Walking through the Playa Girón museum earlier that day, the museum's director Dulce María Limonta del Pozo tells us that many of the captured mercenaries were returned to the US in exchange for food and medicines for children; it is telling that this is what the Cuban Revolution demanded. From early in the Revolution, literacy campaigns and vaccination campaigns were developed to address the facts of poverty. Now, Dr. Dayamis reports, each child gets between 12 and 16 vaccinations for such ailments as smallpox and hepatitis.
In Havana's Center for Genetic Engineering and Biotechnology (CIGB), Dr. Merardo Pujol Ferrer tells us that the country has almost eradicated hepatitis B using a vaccine developed by their Center. That vaccine, Heberbiovac HB, has been administered to 70 million people around the world. "We believe that this vaccine is safe and effective," he said. "It could help to eradicate hepatitis around the world, particularly in poorer countries." All the children in her town are vaccinated against hepatitis, Dr. Dayamis says. The health care system ensures that not one person dies from diarrhea or malnutrition, and not one person dies from diseases of poverty.
What ails the people of Palpite, Dr. Dayamis says, are now the diseases that one sees in richer countries. It is one of the paradoxes of Cuba, which remains a country of limited means, largely because of the US government's blockade of this island of 11 million people, and yet has transcended the diseases of poverty. The new illnesses, she says, are hypertension and cardiovascular diseases as well as prostate and breast cancer. These problems, she points out, must be dealt with through public education, which is why she hosts a radio show on Radio Victoria de Girón, the local community station, each Thursday, called Education for Health.
"If we invest in sports," says Raúl Fornés Valenciano, the vice-president of the Institute of Physical Education and Recreation (INDER), "then we will have less problems of health." Across the country, INDER focuses on getting the entire population active with a variety of sports and physical exercises. Over 70,000 sports health workers collaborate with the schools and the centers for the elderly to provide opportunities for leisure time to be spent in physical activity. This, along with the public education campaign that Dr. Dayamis told us about, is a key mechanism for preventing chronic diseases from harming the population.
If you take a boat out of the Bay of Pigs and land in other Caribbean countries, you will find yourself in a situation where healthcare is almost nonexistent. In the Dominican Republic, for example, infant mortality is at 34 per 1,000 live births. These countries, unlike Cuba, have not been able to harness the commitment and ingenuity of people such as Dr. Dayamis and Dr. Merardo. In these other countries, children die in conditions where no doctor is present to mourn their loss decades later.
Excerpt from:
How Cuba is eradicating child mortality and banishing the diseases of the poor - Peoples Dispatch
Posted in Genetic Engineering
Comments Off on How Cuba is eradicating child mortality and banishing the diseases of the poor – Peoples Dispatch
Sickle cell disease gene therapy study set back by the mice – Cosmos
Posted: at 8:03 am
Sickle cell disease (SCD) is a debilitating illness affecting up to 40% of the population in some African countries. It's caused by mutations in the gene that makes haemoglobin, the protein that carries oxygen in red blood cells.
It might one day be possible to treat this disease using gene editing, by switching back on the production of a healthy form of haemoglobin called foetal haemoglobin, which is usually only produced by the body when we're in the womb.
But a new study testing this promising new treatment in mice has found that scientists still have a long way to go before it can be attempted in humans. The research has been published in Disease Models & Mechanisms.
Healthy red blood cells (RBCs) are shaped like a donut, but with an indentation instead of a hole.
In sickle cell disease, the abnormal haemoglobin distorts the RBCs' shape when they aren't carrying oxygen. Instead, sickled RBCs are C-shaped, like the farm tool called a sickle, and they become hard and sticky and die earlier.
Because of their shape, sickled RBCs can become stuck and block blood flow when travelling through small blood vessels. This causes patients to suffer episodes of excruciating pain, organ damage and reduced life expectancy.
Although current treatments have reduced complications and extended the life expectancies of affected children, most still die prematurely.
Red blood cells are made from haematopoietic stem cells in our bone marrow. These stem cells are able to develop into more than one cell type, in a process called haematopoiesis.
Researchers hope to edit the genes of these stem cells so that they produce RBCs with foetal haemoglobin instead of the abnormal protein and can be reintroduced into the body to alleviate the symptoms of SCD.
Unfortunately, they found that although two types of lab mice had the symptoms of sickle cell disease, their foetal haemoglobin gene and surrounding DNA were not properly configured, making the stem-cell treatment ineffective or even harmful.
These mice, called Berkeley and Townes mice, were genetically engineered in different ways to carry several human haemoglobin genes (replacing the mouse genes) so that scientists could study sickle cell disease in an animal model.
The researchers removed stem cells from the mice and used CRISPR-Cas9 to try to turn on the healthy foetal haemoglobin gene. They then put the reprogrammed stem cells back into the mice and monitored the animals for 18 weeks to find out how the treatment affected them.
Surprisingly, 70% of Berkeley mice died from the therapy, and production of foetal haemoglobin was activated in only 3.1% of their stem cells. On the other hand, the treatment did not affect the survival of Townes mice and even activated the foetal haemoglobin gene in 57% of RBCs.
Even then, the levels of foetal haemoglobin produced were seven to 10 times lower than those seen when this approach was used in human cells grown in the laboratory, and were not high enough to reduce clinical signs of sickle cell disease.
"We realised that we did not know enough about the genetic configurations of these mice," says senior author Dr Mitchell Weiss, chair of the haematology department at St Jude Children's Research Hospital, US.
The researchers sequenced the mice's haemoglobin genes and surrounding DNA, and discovered that Berkeley mice, instead of having a single copy of the mutated human gene, had 22 randomly arranged, broken-up copies of the mutated human sickle cell disease gene and 27 copies of the human foetal haemoglobin gene.
This caused the fatal effects seen and meant that the mice cannot be used to test this treatment in the future.
"Our findings will help scientists using the Berkeley and Townes mice decide which to use to address their specific research question relating to sickle cell disease or haemoglobin," concludes Weiss.
Additionally, this work provides a reminder for scientists to carefully consider the genetics of the mice that they are using to study human diseases and find the right mouse for the job.
See the original post here:
Sickle cell disease gene therapy study set back by the mice - Cosmos
Posted in Genetic Engineering
Comments Off on Sickle cell disease gene therapy study set back by the mice – Cosmos