The Prometheus League
Breaking News and Updates
Monthly Archives: March 2024
Analysis | House AI task force leaders take long view on regulating the tools – The Washington Post
Posted: March 8, 2024 at 6:25 am
Happy Thursday! Maybe this year we'll get lucky and the State of the Union will feature a discussion of intermediary liability or duty of care standards. Send predictions and observations to: cristiano.lima@washpost.com.
House leaders took a key step toward hatching a game plan on artificial intelligence last month by launching a new bipartisan task force, which will issue recommendations for how Congress could boost AI innovation while keeping the tools in check.
But the lawmakers leading the effort told The Technology 202 in a joint interview that implementing a full response will probably be a lengthy undertaking as they consider the technology's vast impact across elections, national security, the economy and more.
Rep. Jay Obernolte (R-Calif.), who was tapped by House leaders to chair the group, pointed to Europe's efforts to agree on a comprehensive AI law as a cautionary tale.
"If you look at the attempts in Europe to create an omnibus bill for the regulation of AI, you'll see some of the fallacies in that," said Obernolte, one of the few lawmakers with computer science bona fides. "They've had to rewrite that bill several times as the face of AI has changed."
"We don't envision a 5,000-page bill that deals with 57 topics and then we're done with AI," said Rep. Ted Lieu (D-Calif.), the task force's co-chair. "It's going to be a multiyear process, and there'll be a variety of different bills that try to tackle different aspects of AI."
The task force is set to release a report by the end of the year, but that doesn't preclude more immediate legislative action on discrete issues, Obernolte and Lieu said.
Like Senate Majority Leader Charles E. Schumer (D-N.Y.), Obernolte pointed to the risks that AI-generated content poses to elections as one area with potential for fast action.
"There should be broad bipartisan agreement that no one should be allowed to impersonate a candidate with AI, so we're going to be looking at what we can do to tighten up the regulations to try and prevent that," he said.
Lieu seconded the sentiment and floated the idea of criminal and civil enhancements to make fines or jail time steeper if certain crimes are perpetrated using AI.
"One way to provide more deterrence is to say, look, if you use AI to impersonate a voice that defrauds someone, [that] would enhance the punishment that you may get," he said.
Obernolte said he's hopeful that Congress will prioritize taking up the Create AI Act, which aims to fully stand up the National Artificial Intelligence Research Resource (NAIRR). The White House in January launched a pilot version of the center, which is set to run for two years.
In the Senate, Schumer has come under fire from some of his colleagues for keeping his series of AI insight forums closed to the public. (In response, he has noted that the chamber has held many public committee hearings on AI over the years.)
In the House, Obernolte and Lieu said they are planning to have both public and private sessions to dig into the many facets of AI.
"We want to have open meetings in a traditional hearing format to make sure that we're being transparent with the public," Obernolte said. "But we're also going to have some closed meetings because it's very important to me that everyone feels comfortable asking questions that could come off as ignorant."
While Schumer's bipartisan AI working group has yet to unveil any proposals or legislative text, he predicted in June that there would be action from the Senate in "months not years."
House leaders, meanwhile, did not launch the task force until nearly a year after Schumer unveiled his plans, prompting concern from some members that the chamber was absent from the debate.
Obernolte and Lieu pushed back on those suggestions.
"We're going to chip away at this over the next several years, and we can do that because there are short-term harms, medium-term harms and long-term harms that need to be mitigated," Obernolte said. "I don't think that that's inconsistent with what the Senate is doing at all."
Their offices have had informal contacts over the last year with the leaders of Schumer's working group, Obernolte added, saying they are "very aware that we want to work with them and I think they're very open to working with us." Lieu agreed: "We're just getting started."
Don’t Give Your Business Data to AI Companies – Dark Reading
Posted: at 6:25 am
COMMENTARY
Artificial intelligence (AI) is challenging our preexisting ideas of what's possible with technology. AI's transformative potential could upend a variety of tasks and business scenarios by applying computer vision and large vision models (LVMs) to usher in a new age of efficiency and innovation.
Yet, as businesses embrace the promises of AI, they encounter a common peril: Every AI company seems to have an insatiable appetite for the world's precious data. These companies are eager to train their proprietary AI models using any available images and videos, employing tactics that sometimes involve inconveniencing users, like CAPTCHAs making you identify traffic lights. Unfortunately, this clandestine approach has become the standard playbook for many AI providers, enticing customers to unwittingly surrender their data and intellectual contributions, only to be monetized by these companies.
This isn't an isolated incident confined to a single bad apple in the industry. Even well-known companies such as Dropbox and GitHub have faced accusations. And while Zoom has since shifted its stance on data privacy, such exceptions merely underscore the norm within the industry.
Handing over your business data to AI companies comes with inherent risks. Why should you help train models that may ultimately benefit your competitors? Moreover, in instances where the application of AI could contribute to societal well-being, such as identifying wildfires or enhancing public safety, why should such data be confined to the exclusive benefit of a few tech giants? The potential benefits of freely sharing and collaboratively improving such data should be harnessed by communities worldwide, not sequestered within the vaults of a select few tech corporations.
To address these concerns, transparency is the key. AI companies should be obligated to clearly outline how they intend to use your data and for what specific purposes. This transparency will empower businesses to make informed decisions about the fate of their data and guard against exploitative practices.
In addition, businesses should maintain control over how their data is used. Granting AI companies unrestricted access risks unintended consequences and compromises privacy. Companies must be able to assert their authority in dictating the terms under which their data is used, ensuring alignment with their values and objectives.
Permission should be nonnegotiable. AI companies must seek explicit consent from businesses before utilizing their data. This not only upholds ethical standards but also establishes a foundation of trust between companies and AI providers.
Lastly, businesses aren't just data donors; they are contributors to the development and refinement of AI models. They deserve compensation for the use of their data. A fair and equitable system should be in place, acknowledging the value businesses bring to the further development of AI models.
The responsibility lies with businesses to safeguard their data and interests. A collective demand for transparency, control, permission, and fair compensation can pave the way for an era in which AI benefits businesses and society at large, fostering collaboration and innovation while safeguarding against the pitfalls of unchecked data exploitation.
Don't surrender your business data blindly; demand a future where AI works for you, not the other way around.
NIST, the lab at the center of Biden's AI safety push, is decaying – The Washington Post
Posted: at 6:25 am
At the National Institute of Standards and Technology, the government lab overseeing the most anticipated technology on the planet, black mold has forced some workers out of their offices. Researchers sleep in their labs to protect their work during frequent blackouts. Some employees have to carry hard drives to other buildings; flaky internet won't allow for the sending of large files.
And a leaky roof forces others to break out plastic sheeting.
"If we knew rain was coming, we'd tarp up the microscope," said James Fekete, who served as chief of NIST's applied chemicals and materials division until 2018. "It leaked enough that we were prepared."
NIST is at the heart of President Biden's ambitious plans to oversee a new generation of artificial intelligence models; through an executive order, the agency is tasked with developing tests for security flaws and other harms. But budget constraints have left the 123-year-old lab with a skeletal staff on key tech teams and most facilities on its main Gaithersburg, Md., and Boulder, Colo., campuses below acceptable building standards.
Interviews with more than a dozen current and former NIST employees, Biden administration officials, congressional aides and tech company executives, along with reports commissioned by the government, detail a massive resources gap between NIST and the tech firms it is tasked with evaluating, a discrepancy some say risks undermining the White House's ambitious plans to set guardrails for the burgeoning technology. Many of the people spoke to The Washington Post on the condition of anonymity because they were not authorized to speak to the media.
Even as NIST races to set up the new U.S. AI Safety Institute, the crisis at the degrading lab is becoming more acute. On Sunday, lawmakers released a new spending plan that would cut NIST's overall budget by more than 10 percent, to $1.46 billion. While lawmakers propose to invest $10 million in the new AI institute, that's a fraction of the tens of billions of dollars tech giants like Google and Microsoft are pouring into the race to develop artificial intelligence. It pales in comparison to Britain, which has invested more than $125 million into its AI safety efforts.
"The cuts to the agency are a self-inflicted wound in the global tech race," said Divyansh Kaushik, the associate director for emerging technologies and national security at the Federation of American Scientists.
Some in the AI community worry that underfunding NIST makes it vulnerable to industry influence. Tech companies are chipping in for the expensive computing infrastructure that will allow the institute to examine AI models. Amazon announced that it would donate $5 million in computing credits. Microsoft, a key investor in OpenAI, will provide engineering teams along with computing resources. (Amazon founder Jeff Bezos owns The Post.)
Tech executives, including OpenAI CEO Sam Altman, are regularly in communication with officials at the Commerce Department about the agency's AI work. OpenAI has lobbied NIST on artificial intelligence issues, according to federal disclosures. NIST asked TechNet, an industry trade group whose members include OpenAI, Google and other major tech companies, if its member companies can advise the AI Safety Institute.
NIST is also seeking feedback from academics and civil society groups on its AI work. "The agency has a long history of working with a variety of stakeholders to gather input on technologies," Commerce Department spokesman Charlie Andrews said.
AI staff, unlike their more ergonomically challenged colleagues, will be working in well-equipped offices in the Gaithersburg campus, the Commerce Department's D.C. office and the NIST National Cybersecurity Center of Excellence in Rockville, Md., Andrews said.
White House spokeswoman Robyn Patterson said the appointment of Elizabeth Kelly to the helm of the new AI Safety Institute underscores the White House's "commitment to getting this work done right and on time." Kelly previously served as special assistant to the president for economic policy.
"The Biden-Harris administration has so far met every single milestone outlined by the president's landmark executive order," Patterson said. "We are confident in our ability to continue to effectively and expeditiously meet the milestones and directives set forth by President Biden to protect Americans from the potential risks of AI systems while catalyzing innovation in AI and beyond."
NIST's financial struggles highlight the limitations of the administration's plan to regulate AI exclusively through the executive branch. Without an act of Congress, there is no new funding for initiatives like the AI Safety Institute, and the programs could be easily overturned by the next president. And as the presidential elections approach, the prospects of Congress moving on AI in 2024 are growing dim.
During his State of the Union address on Thursday, Biden called on Congress to "harness the promise of AI and protect us from its peril."
Congressional aides and former NIST employees say the agency has not been able to break through as a funding priority even as lawmakers increasingly tout its role in addressing technological developments, including AI, chips and quantum computing.
After this article was published, Senate Majority Leader Charles E. Schumer (D-N.Y.) on Thursday touted the $10 million investment in the institute in the proposed budget, saying he "fought for this funding to make sure that the development of AI prioritizes both innovation and safety."
A review of NIST's safety practices in August found that the budgetary issues endanger employees, alleging that the agency has "an incomplete and superficial approach" to safety.
"Chronic underfunding of the NIST facilities and maintenance budget has created unsafe work conditions and further fueled the impression among researchers that safety is not a priority," said the NIST safety commission report, which was commissioned following the 2022 death of an engineering technician at the agency's fire research lab.
NIST is one of the federal government's oldest science agencies with one of the smallest budgets. Initially called the National Bureau of Standards, it began at the dawn of the 20th century, as Congress realized the need to develop more standardized measurements amid the expansion of electricity, the steam engine and railways.
The need for such an agency was underscored three years after its founding, when fires ravaged Baltimore. Firefighters from Washington, Philadelphia and even New York rushed to help put out the flames, but without standard couplings, their hoses couldn't connect to the Baltimore hydrants. The firefighters watched as the flames overtook more than 70 city blocks in 30 hours.
NIST developed a standard fitting, unifying more than 600 different types of hose couplings deployed across the country at the time.
Ever since, the agency has played a critical role in using research and science to help the country learn from catastrophes and prevent new ones. Its work expanded after World War II: It developed an early version of the digital computer, crucial Space Race instruments and atomic clocks, which underpin GPS. In the 1950s and 1960s, the agency moved to new campuses in Boulder and Gaithersburg after its early headquarters in Washington fell into disrepair.
Now, scientists at NIST joke that they work at "the most advanced labs in the world in the 1960s." Former employees describe cutting-edge scientific equipment surrounded by decades-old buildings that make it impossible to control the temperature or humidity needed to conduct critical experiments.
"You see dust everywhere because the windows don't seal," former acting NIST director Kent Rochford said. "You see a bucket catching drips from a leak in the roof. You see Home Depot dehumidifiers or portable AC units all over the place."
The flooding was so bad that Rochford said he once requested money for scuba gear. That request was denied, but he did receive funding for an emergency kit that included squeegees to clean up water.
Pests and wildlife have at times infiltrated its campuses, including an incident where a garter snake entered a Boulder building.
More than 60 percent of NIST facilities do not meet federal standards for acceptable building conditions, according to a February 2023 report commissioned by Congress from the National Academies of Sciences, Engineering and Medicine. The poor conditions impact employee output. Workarounds and do-it-yourself repairs reduce the productivity of research staff by up to 40 percent, according to the committee's interviews with employees during a laboratory visit.
Years after Rochford's 2018 departure, NIST employees are still deploying similar MacGyver-style workarounds. Each year between October and March, low humidity in one lab creates a static charge, making it impossible to operate an instrument ensuring companies meet environmental standards for greenhouse gases.
Problems with the HVAC and specialized lights have made the agency unable to meet demand for reference materials, which manufacturers use to check whether their measurements are accurate in products like baby formula.
Facility problems have also delayed critical work on biometrics, including evaluations of facial recognition systems used by the FBI and other law enforcement agencies. The data center in the 1966 building that houses that work receives inadequate cooling, and employees there spend about 30 percent of their time trying to mitigate problems with the lab, according to the academies' report. Scheduled outages are required to maintain the data centers that hold technology work, knocking all biometric evaluations offline for a month each year.
Fekete, the scientist who recalled covering the microscope, said his team's device never completely stopped working due to rainwater.
But other NIST employees haven't been so lucky. Leaks and floods destroyed an electron microscope worth $2.5 million used for semiconductor research, and permanently damaged an advanced scale called a Kibble balance. The tool was out of commission for nearly five years.
Despite these constraints, NIST has built a reputation as a natural interrogator of swiftly advancing AI systems.
In 2019, the agency released a landmark study confirming facial recognition systems misidentify people of color more often than White people, casting scrutiny on the technology's popularity among law enforcement. Due to personnel constraints, only a handful of people worked on that project.
Four years later, NIST released early guidelines around AI, cementing its reputation as a government leader on the technology. To develop the framework, the agency connected with leaders in industry, civil society and other groups, earning a strong reputation among numerous parties as lawmakers began to grapple with the swiftly evolving technology.
The work made NIST a natural home for the Biden administration's AI red-teaming efforts and the AI Safety Institute, which were formalized in the November executive order. Vice President Harris touted the institute at the U.K. AI Safety Summit in November. More than 200 civil society organizations, academics and companies, including OpenAI and Google, have signed on to participate in a consortium within the institute.
OpenAI spokeswoman Kayla Wood said in a statement that the company supports NIST's work, and that the company plans to continue to work with the lab to "support the development of effective AI oversight measures."
Under the executive order, NIST has a laundry list of initiatives that it needs to complete by this summer, including publishing guidelines for how to red-team AI models and launching an initiative to guide evaluating AI capabilities. In a December speech at the machine learning conference NeurIPS, the agency's chief AI adviser, Elham Tabassi, said this would be "an almost impossible deadline."
"It is a hard problem," said Tabassi, who was recently named the chief technology officer of the AI Safety Institute. "We don't know quite how to evaluate AI."
"The NIST staff has worked tirelessly to complete the work it is assigned by the AI executive order," said Andrews, the Commerce spokesperson.
"While the administration has been clear that additional resources will be required to fully address all of the issues posed by AI in the long term, NIST has been effectively carrying out its responsibilities under the [executive order] and is prepared to continue to lead on AI-related research and other work," he said.
Commerce Secretary Gina Raimondo asked Congress to allocate $10 million for the AI Safety Institute during an event at the Atlantic Council in January. The Biden administration also requested more funding for NIST facilities, including $262 million for safety, maintenance and repairs. Congressional appropriators responded by cutting NIST's facilities budget.
The administration's ask falls far below the recommendations of the national academies' study, which urged Congress to provide $300 million to $400 million in additional annual funding over 12 years to overcome a backlog of facilities damage. The report also calls for $120 million to $150 million per year for the same period to stabilize the effects of further deterioration and obsolescence.
Ross B. Corotis, who chaired the academies' committee that produced the facilities report, said Congress needs to ensure that NIST is funded because it is the go-to lab when any new technology emerges, whether that's chips or AI.
"Unless you're going to build a whole new laboratory for some particular issue, you're going to turn first to NIST," Corotis said. "And NIST needs to be ready for that."
Eva Dou and Nitasha Tiku contributed to this report.
Mapping Disease Trajectories from Birth to Death with AI – Neuroscience News
Posted: at 6:25 am
Summary: Researchers mapped disease trajectories from birth to death, analyzing over 44 million hospital stays in Austria to uncover patterns of multimorbidity across different age groups.
Their groundbreaking study identified 1,260 distinct disease trajectories, revealing critical moments where early and personalized prevention could alter a patient's health outcome significantly. For instance, young men with sleep disorders showed two different paths, indicating varying risks for developing metabolic or movement disorders later in life.
These insights provide a powerful tool for healthcare professionals to implement targeted interventions, potentially easing the growing healthcare burden due to an aging population and improving individuals' quality of life.
Source: CSH
The world population is aging at an increasing pace. According to the World Health Organization (WHO), in 2023, one in six people were over 60 years old. By 2050, the number of people over 60 is expected to double to 2.1 billion.
"As age increases, the risk of multiple, often chronic diseases occurring simultaneously, known as multimorbidity, significantly rises," explains Elma Dervic from the Complexity Science Hub (CSH). "Given the demographic shift we are facing, this poses several challenges."
"On one hand, multimorbidity diminishes the quality of life for those affected. On the other hand, this demographic shift creates a massive additional burden for healthcare and social systems."
Identifying typical disease trajectories
"We wanted to find out which typical disease trajectories occur in multimorbid patients from birth to death and which critical moments in their lives significantly shape the further course. This provides clues for very early and personalized prevention strategies," explains Dervic.
Together with researchers from the Medical University of Vienna, Dervic analyzed all hospital stays in Austria between 2003 and 2014, totaling around 44 million. To make sense of this vast amount of data, the team constructed multilayered networks: each ten-year age group forms a layer, and each diagnosis is represented by nodes within these layers.
Using this method, the researchers were able to identify correlations between different diseases among different age groups (for example, how frequently obesity, hypertension, and diabetes occur together in 20-29-year-olds) and which diseases have a higher risk of occurring after them in the 30s, 40s or 50s.
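To make the idea of linking diagnoses by co-occurrence concrete, here is a minimal Python sketch of the kind of statistic involved: the relative risk of two diagnoses appearing together in one age layer, compared with an independence baseline. The toy records and field layout are invented for illustration; the actual study used roughly 44 million hospital-stay records and additionally required statistical significance (p < 0.001) alongside a relative-risk threshold.

```python
from collections import defaultdict
from itertools import combinations

# Toy records: (patient_id, age_group, diagnosis). Layout is illustrative only.
records = [
    (1, "20-29", "obesity"), (1, "20-29", "hypertension"),
    (2, "20-29", "obesity"), (2, "20-29", "diabetes"),
    (3, "20-29", "hypertension"), (4, "20-29", "obesity"),
]

def relative_risk(records, age_group):
    """Relative risk that two diagnoses co-occur in one age layer,
    versus what independent prevalences would predict."""
    patients = defaultdict(set)
    for pid, ag, dx in records:
        if ag == age_group:
            patients[pid].add(dx)
    n = len(patients)
    prevalence = defaultdict(int)
    pair_count = defaultdict(int)
    for dxs in patients.values():
        for dx in dxs:
            prevalence[dx] += 1
        for a, b in combinations(sorted(dxs), 2):
            pair_count[(a, b)] += 1
    rr = {}
    for (a, b), observed in pair_count.items():
        expected = prevalence[a] * prevalence[b] / n  # independence baseline
        rr[(a, b)] = observed / expected if expected else float("inf")
    return rr

# Pairs exceeding the paper's RR > 1.5 threshold (and passing a significance
# test, omitted here) would become links in the "20-29" layer of the network.
print(relative_risk(records, "20-29"))
```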
The team identified 1,260 different disease trajectories (618 in women and 642 in men) over a 70-year period. "On average, one of these disease trajectories includes nine different diagnoses, highlighting how common multimorbidity actually is," emphasizes Dervic.
Critical moments
In particular, 70 trajectories have been identified where patients exhibited similar diagnoses in their younger years, but later evolved into significantly different clinical profiles.
"If these trajectories, despite similar starting conditions, significantly differ later in life in terms of severity and the corresponding required hospitalizations, this is a critical moment that plays an important role in prevention," says Dervic.
Men with sleep disorders
The model, for instance, shows two typical trajectory paths for men between 20 and 29 years old who suffer from sleep disorders. In trajectory A, metabolic diseases such as diabetes mellitus, obesity, and lipid disorders appear years later. In trajectory B, movement disorders occur, among other conditions.
This suggests that organic sleep disorders could be an early marker for the risk of developing neurodegenerative diseases such as Parkinson's disease.
"If someone suffers from sleep disorders at a young age, that can be a critical event prompting doctors' attention," explains Dervic.
The results of the study show that patients who follow trajectory B spend nine days less in hospital in their 20s but 29 days longer in hospital in their 30s, and also suffer from more additional diagnoses. As sleep disorders become more prevalent, the distinction in the course of these illnesses matters not only for those affected but also for the healthcare system.
Women with high blood pressure
Similarly, when adolescent girls between the ages of ten and nineteen have high blood pressure, their trajectory varies as well. While some develop additional metabolic diseases, others experience chronic kidney disease in their twenties, leading to increased mortality at a young age.
This is of particular clinical importance as childhood hypertension is on the rise worldwide and is closely linked to the increasing prevalence of childhood obesity.
There are specific trajectories that deserve special attention and should be monitored closely, according to the authors of the study.
"With these insights derived from real-life data, doctors can monitor various diseases more intensively and implement targeted, personalized preventive measures decades before serious problems arise," explains Dervic.
By doing so, they are not only reducing the burden on healthcare systems but also improving patients' quality of life.
Author: Eliza Muto. Source: CSH. Image credit: Neuroscience News.
Original Research: Open access. "Unraveling cradle-to-grave disease trajectories from multilayer comorbidity networks" by Elma Dervic et al. npj Digital Medicine
Abstract
Unraveling cradle-to-grave disease trajectories from multilayer comorbidity networks
We aim to comprehensively identify typical life-spanning trajectories and critical events that impact patients' hospital utilization and mortality. We use a unique dataset containing 44 million records of almost all inpatient stays from 2003 to 2014 in Austria to investigate disease trajectories.
We develop a new, multilayer disease network approach to quantitatively analyze how co-occurrences of two or more diagnoses form and evolve over the life course of patients. Nodes represent diagnoses in age groups of ten years; each age group makes up a layer of the comorbidity multilayer network.
Intra-layer links encode a significant correlation between diagnoses within an age group (p < 0.001, relative risk > 1.5), while inter-layer links encode correlations between diagnoses across different age groups. We use an unsupervised clustering algorithm for detecting typical disease trajectories as overlapping clusters in the multilayer comorbidity network.
We identify critical events in a patient's career as points where initially overlapping trajectories start to diverge towards different states. We identified 1260 distinct disease trajectories (618 for females, 642 for males) that on average contain 9 (IQR 26) different diagnoses and cover up to 70 years (mean 23 years).
We found 70 pairs of diverging trajectories that share some diagnoses at younger ages but develop into markedly different groups of diagnoses at older ages. The disease trajectory framework can help us to identify critical events as specific combinations of risk factors that put patients at high risk for different diagnoses decades later.
Our findings enable a data-driven integration of personalized life-course perspectives into clinical decision-making.
SAP enhances Datasphere and SAC for AI-driven transformation – CIO
Posted: at 6:25 am
SAP announced today a host of new AI copilot and AI governance features for SAP Datasphere and SAP Analytics Cloud (SAC). Jurgen Mueller, SAP CTO and executive board member, called the innovations, which include an expanded partnership with data governance specialist Collibra, "a quantum leap" in the company's ability to help customers drive intelligent business transformation through data.
"SAP is executing on a roadmap that brings an important semantic layer to enterprise data, and creates the critical foundation for implementing AI-based use cases," said analyst Robert Parker, SVP of industry, software, and services research at IDC.
SAP unveiled Datasphere a year ago as a comprehensive data service, built on SAP Business Technology Platform (BTP), to provide a unified experience for data integration, data cataloging, semantic modeling, data warehousing, data federation, and data virtualization. At SAP Datasphere's core is the concept of the business data fabric, a data management architecture delivering an integrated, semantically rich data layer over the existing data landscape, and providing seamless and scalable access to data without duplication while retaining business context and logic.
With today's announcements, SAP is building on that vision. The company is expanding its partnership with Collibra to integrate Collibra's AI Governance platform with SAP data assets to facilitate data governance for non-SAP data assets in customer environments.
"We have cataloging inside Datasphere: It allows you to catalog, manage metadata, all the SAP data assets we're seeing," said JG Chirapurath, chief marketing and solutions officer for SAP. "We are also seeing customers bringing in other data assets from other apps or data sources. In this model, it doesn't make sense for us to say our catalog has to understand all of these corpuses of data. Collibra does a fantastic job of understanding it."
The expanded partnership gives customers the ability to use Collibra as a catalog of catalogs, with Datasphere's catalog also managed by the Collibra platform.
Microsoft confirms Surface and Windows AI event for March 21st – The Verge
Posted: at 6:25 am
The Surface Pro 10 and Surface Laptop 6 commercial versions will be minor spec bumps, according to sources familiar with Microsoft's plans. Microsoft will also offer an OLED display on the Surface Pro 10 for consumers, which the company is expected to reveal later this spring.
The Surface Laptop 6 may include a new design
The new Surface Laptop 6 could be the most interesting device, thanks to a new design that will reportedly include thinner display bezels, rounded corners, a haptic touchpad, and two USB-C ports and one USB-A port. Microsoft is rumored to be shipping both Intel Core Ultra and Snapdragon X Elite-based models of its latest Surface hardware, with Intel models expected in April and the Arm ones in June.
The event page for Microsoft's March event simply says "tune in here for the latest in scaling AI in your environment with Copilot, Windows, and Surface," suggesting this will be a rather low-key event that's focused on Microsoft's big AI PC push.
Microsoft is also working on a new AI Explorer experience for Windows 11 that's designed as a far more advanced version of its AI assistant. Windows Central reports that it will catalog everything you do on your PC so you can search for moments in a timeline using natural language. Microsoft tried to bring this same idea to life as a Timeline feature in Windows 10, but the lack of app support meant it never really took off, and it was eventually removed years later.
Adobe's new Express app brings Firefly AI tools to iOS and Android – The Verge
Posted: at 6:25 am
Adobe has released a new app for Adobe Express, its cloud-based mobile design platform, bringing the same creative, editing, and Firefly-powered generative AI features enjoyed by desktop users to iOS and Android devices. Available to try for free today in beta, the new Adobe Express app allows users to easily produce creative assets like social media posts, posters, and website banners, with Creative Cloud members able to access and edit Photoshop and Illustrator files directly within the mobile app.
The Adobe Express beta is a free download, with Premium features (which will eventually require a $9.99 monthly subscription), like the erase and remove background tools, available at no additional cost while the app is in testing. Firefly-powered generative AI features like Generative Fill and Text to Image effects, however, will require Adobe's generative credits. Adobe Express beta users will receive 25 credits per month. The number of monthly credits received will be tied to the user's subscription tier when the mobile app is generally available.
Adobe Express users won't see their projects from the existing mobile app in the new beta app on day one. However, when the new app leaves beta, it will then have all the historical data from the old app carried over in a seamless migration, according to Ian Wang, vice president of product for Adobe Express, on a call with The Verge.
The new Express mobile beta shares the same platform as the desktop version that was updated last year, which means collaborative workflows have been restored if you're using the beta app, allowing teams to work together on the same creative projects across both desktop and mobile devices. Anyone still using the current Adobe Express mobile app won't be able to use these features.
The processing for generative AI features is cloud-based rather than on the device itself, but not every smartphone is compatible with the new beta. You can find a list of supported devices here. The Adobe Express mobile app beta is available on the Google Play Store for Android, but iOS users will need to sign up here due to restrictions Apple places on the number of beta users.
Adobe's Firefly AI features have been available as standalone web apps since September 2023 (and are very much accessible on mobile devices), which are good enough to experiment with but inconvenient to use in design workflows. By contrast, the new Express beta is the first mobile app to feature them alongside other design tools, giving it a much-needed leg up over Canva, a rival design platform that hasn't made its own Magic Studio AI features available to mobile users.
Correction, March 7th, 4:00PM ET: Adobe's original press release said that premium features are available at no cost during the Express beta. The company informed us after publication that these premium features do not include generative AI tools, which use a separate credit-based system.
A Google AI Watched 30,000 Hours of Video Games. Now It Makes Its Own – Singularity Hub
Posted: at 6:25 am
AI continues to generate plenty of light and heat. The best models in text and images, now commanding subscriptions and being woven into consumer products, are competing for inches. OpenAI, Google, and Anthropic are all, more or less, neck and neck.
It's no surprise, then, that AI researchers are looking to push generative models into new territory. As AI requires prodigious amounts of data, one way to forecast where things are going next is to look at what data is widely available online, but still largely untapped.
Video, of which there is plenty, is an obvious next step. Indeed, last month, OpenAI previewed a new text-to-video AI called Sora that stunned onlookers.
But what about videogames?
It turns out there are quite a few gamer videos online. Google DeepMind says it trained a new AI, Genie, on 30,000 hours of curated video footage showing gamers playing simple platformers (think early Nintendo games), and now it can create examples of its own.
Genie turns a simple image, photo, or sketch into an interactive video game.
Given a prompt, say a drawing of a character and its surroundings, the AI can then take input from a player to move a character through its world. In a blog post, DeepMind showed Genie's creations navigating 2D landscapes, walking around or jumping between platforms. Like a snake eating its tail, some of these worlds were even sourced from AI-generated images.
In contrast to traditional video games, Genie generates these interactive worlds frame by frame. Given a prompt and command to move, it predicts the most likely next frames and creates them on the fly. It even learned to include a sense of parallax, a common feature in platformers where the foreground moves faster than the background.
Notably, the AI's training didn't include labels. Rather, Genie learned to correlate input commands, like go left, right, or jump, with in-game movements simply by observing examples in its training. That is, when a character in a video moved left, there was no label linking the command to the motion. Genie figured that part out by itself. That means, potentially, future versions could be trained on as much applicable video as there is online.
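For a rough sense of what action-conditioned, frame-by-frame generation looks like in code, here is a minimal PyTorch sketch. Everything about it, the layer sizes, the three-action vocabulary, the architecture, is invented for illustration; Genie's actual 11-billion-parameter model is far more sophisticated, and this toy network is untrained, so its output is noise. It only demonstrates the loop: predict the next frame from the current frame plus an action, then feed the prediction back in.

```python
import torch
import torch.nn as nn

NUM_ACTIONS = 3  # e.g., left, right, jump (hypothetical action set)

class NextFramePredictor(nn.Module):
    """Toy action-conditioned next-frame model, for illustration only."""
    def __init__(self, channels=3, hidden=32):
        super().__init__()
        self.action_embed = nn.Embedding(NUM_ACTIONS, hidden)
        self.encoder = nn.Conv2d(channels, hidden, kernel_size=3, padding=1)
        self.decoder = nn.Conv2d(hidden, channels, kernel_size=3, padding=1)

    def forward(self, frame, action):
        # frame: (B, C, H, W); action: (B,) integer ids
        h = torch.relu(self.encoder(frame))
        a = self.action_embed(action)[:, :, None, None]  # broadcast over H, W
        return torch.sigmoid(self.decoder(h + a))

model = NextFramePredictor()
frame = torch.rand(1, 3, 64, 64)  # starting frame (e.g., from a sketch)
for action in [0, 1, 2]:
    # Autoregressive rollout: each predicted frame becomes the next input.
    frame = model(frame, torch.tensor([action]))
print(frame.shape)  # torch.Size([1, 3, 64, 64])
```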
The AI is an impressive proof of concept, but it's still very early in development, and DeepMind isn't planning to make the model public yet.
The games themselves are pixellated worlds streaming by at a plodding one frame per second. By comparison, contemporary video games can hit 60 or 120 frames per second. Also, like all generative algorithms, Genie generates strange or inconsistent visual artifacts. It's also prone to hallucinating "unrealistic futures," the team wrote in their paper describing the AI.
That said, there are a few reasons to believe Genie will improve from here.
Because the AI can learn from unlabeled online videos and is still a modest size, just 11 billion parameters, there's ample opportunity to scale up. Bigger models trained on more information tend to improve dramatically. And with a growing industry focused on inference, the process by which a trained AI performs tasks, like generating images or text, it's likely to get faster.
DeepMind says Genie could help people, like professional developers, make video games. But like OpenAI, which believes Sora is about more than videos, the team is thinking bigger. The approach could go well beyond video games.
One example: AI that can control robots. The team trained a separate model on video of robotic arms completing various tasks. The model learned to manipulate the robots and handle a variety of objects.
DeepMind also said Genie-generated video game environments could be used to train AI agents. It's not a new strategy. In a 2021 paper, another DeepMind team outlined a video game called XLand that was populated by AI agents and an AI overlord generating tasks and games to challenge them. The idea that the next big step in AI will require algorithms that can train one another or generate synthetic training data is gaining traction.
All this is the latest salvo in an intense competition between OpenAI and Google to show progress in AI. While others in the field, like Anthropic, are advancing multimodal models akin to GPT-4, Google and OpenAI also seem focused on algorithms that simulate the world. Such algorithms may be better at planning and interaction. Both will be crucial skills for the AI agents both organizations seem intent on producing.
"Genie can be prompted with images it has never seen before, such as real world photographs or sketches, enabling people to interact with their imagined virtual worlds, essentially acting as a foundation world model," the researchers wrote in the Genie blog post. "We focus on videos of 2D platformer games and robotics, but our method is general and should work for any type of domain, and is scalable to ever larger internet datasets."
Similarly, when OpenAI previewed Sora last month, researchers suggested it might herald something more foundational: a world simulator. That is, both teams seem to view the enormous cache of online video as a way to train AI to generate its own video, yes, but also to more effectively understand and operate out in the world, online or off.
Whether this pays dividends, or is sustainable long term, is an open question. The human brain operates on a light bulb's worth of power; generative AI uses up whole data centers. But it's best not to underestimate the forces at play right now, in terms of talent, tech, brains, and cash, aiming to not only improve AI but make it more efficient.
Weve seen impressive progress in text, images, audio, and all three together. Videos are the next ingredient being thrown in the pot, and they may make for an even more potent brew.
Image Credit: Google DeepMind
Elliptic Curve Murmurations Found With AI Take Flight – Quanta Magazine
Posted: at 6:25 am
Almost immediately, the preprint garnered interest, particularly from Andrew Sutherland, a research scientist at MIT who is one of the managing editors of the LMFDB. Sutherland realized that 3 million elliptic curves weren't enough for his purposes. He wanted to look at much larger conductor ranges to see how robust the murmurations were. He pulled data from another immense repository of about 150 million elliptic curves. Still unsatisfied, he then pulled in data from a different repository with 300 million curves.
"But even those weren't enough, so I actually computed a new data set of over a billion elliptic curves, and that's what I used to compute the really high-res pictures," Sutherland said. The murmurations showed up whether he averaged over 15,000 elliptic curves at a time or a million at a time. The shape stayed the same even as he looked at the curves over larger and larger prime numbers, a phenomenon called scale invariance. Sutherland also realized that murmurations are not unique to elliptic curves, but also appear in more general L-functions. He wrote a letter summarizing his findings and sent it to Sarnak and Michael Rubinstein at the University of Waterloo.
"If there is a known explanation for it I expect you will know it," Sutherland wrote.
They didn't.
Lee, He and Oliver organized a workshop on murmurations in August 2023 at Brown University's Institute for Computational and Experimental Research in Mathematics (ICERM). Sarnak and Rubinstein came, as did Sarnak's student Nina Zubrilina.
Zubrilina presented her research into murmuration patterns in modular forms, special complex functions which, like elliptic curves, have associated L-functions. In modular forms with large conductors, the murmurations converge into a sharply defined curve, rather than forming a discernible but dispersed pattern. In a paper posted on October 11, 2023, Zubrilina proved that this type of murmuration follows an explicit formula she discovered.
"Nina's big achievement is that she's given a formula for this; I call it the Zubrilina murmuration density formula," Sarnak said. "Using very sophisticated math, she has proven an exact formula which fits the data perfectly."
Her formula is complicated, but Sarnak hails it as an important new kind of function, comparable to the Airy functions that define solutions to differential equations used in a variety of contexts in physics, ranging from optics to quantum mechanics.
Though Zubrilinas formula was the first, others have followed. Every week now, theres a new paper out, Sarnak said, mainly using Zubrilinas tools, explaining other aspects of murmurations.
Jonathan Bober, Andrew Booker and Min Lee of the University of Bristol, together with David Lowry-Duda of ICERM, proved the existence of a different type of murmuration in modular forms in another October paper. And Kyu-Hwan Lee, Oliver and Pozdnyakov proved the existence of murmurations in objects called Dirichlet characters that are closely related to L-functions.
Sutherland was impressed by the significant dose of luck that had led to the discovery of murmurations. If the elliptic curve data hadn't been ordered by conductor, the murmurations would have disappeared. "They were fortunate to be taking data from the LMFDB, which came pre-sorted according to the conductor," he said. "It's what relates an elliptic curve to the corresponding modular form, but that's not at all obvious." Two curves whose equations look very similar can have very different conductors. For example, Sutherland noted that y² = x³ - 11x + 6 has conductor 17, but flipping the minus sign to a plus sign, y² = x³ + 11x + 6 has conductor 100,736.
Even then, the murmurations were only found because of Pozdnyakov's inexperience. "I don't think we would have found it without him," Oliver said, "because the experts traditionally normalize a_p to have absolute value 1. But he didn't normalize them, so the oscillations were very big and visible."
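For readers who want to see the quantity at stake, here is a minimal pure-Python sketch using the standard definition a_p = p + 1 - #E(F_p) for a curve y² = x³ + Ax + B over the finite field F_p. The murmuration plots come from averaging these unnormalized a_p values over many curves ordered by conductor; this naive O(p) point count is illustrative only, and computations at the billion-curve scale rely on much faster algorithms such as Schoof's.

```python
def a_p(A, B, p):
    """Trace of Frobenius a_p = p + 1 - #E(F_p) for y^2 = x^3 + A*x + B,
    counted naively over F_p (fine for small primes, illustration only)."""
    # Count how many y in F_p have each square value y^2 mod p.
    squares = {}
    for y in range(p):
        s = y * y % p
        squares[s] = squares.get(s, 0) + 1
    points = 1  # the point at infinity
    for x in range(p):
        rhs = (x * x * x + A * x + B) % p
        points += squares.get(rhs, 0)  # solutions y of y^2 = rhs
    return p + 1 - points

# The curve y^2 = x^3 - 11x + 6 mentioned above (conductor 17):
print([a_p(-11, 6, p) for p in [3, 5, 7, 11, 13]])
```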
The statistical patterns that AI algorithms use to sort elliptic curves by rank exist in a parameter space with hundreds of dimensions, too many for people to sort through in their minds, let alone visualize, Oliver noted. But though machine learning found the hidden oscillations, "only later did we understand them to be the murmurations."
Editor's Note: Andrew Sutherland, Kyu-Hwan Lee and the L-functions and modular forms database (LMFDB) have all received funding from the Simons Foundation, which also funds this editorially independent publication. Simons Foundation funding decisions have no influence on our coverage. More information is available here.
Amid record high energy demand, America is running out of electricity – The Washington Post
Posted: at 6:24 am
Vast swaths of the United States are at risk of running short of power as electricity-hungry data centers and clean-technology factories proliferate around the country, leaving utilities and regulators grasping for credible plans to expand the nation's creaking power grid.
In Georgia, demand for industrial power is surging to record highs, with the projection of new electricity use for the next decade now 17 times what it was only recently. Arizona Public Service, the largest utility in that state, is also struggling to keep up, projecting it will be out of transmission capacity before the end of the decade absent major upgrades.
Northern Virginia needs the equivalent of several large nuclear power plants to serve all the new data centers planned and under construction. Texas, where electricity shortages are already routine on hot summer days, faces the same dilemma.
The soaring demand is touching off a scramble to try to squeeze more juice out of an aging power grid while pushing commercial customers to go to extraordinary lengths to lock down energy sources, such as building their own power plants.
"When you look at the numbers, it is staggering," said Jason Shaw, chairman of the Georgia Public Service Commission, which regulates electricity. "It makes you scratch your head and wonder how we ended up in this situation. How were the projections that far off? This has created a challenge like we have never seen before."
A major factor behind the skyrocketing demand is the rapid innovation in artificial intelligence, which is driving the construction of large warehouses of computing infrastructure that require exponentially more power than traditional data centers. AI is also part of a huge scale-up of cloud computing. Tech firms like Amazon, Apple, Google, Meta and Microsoft are scouring the nation for sites for new data centers, and many lesser-known firms are also on the hunt.
The proliferation of crypto-mining, in which currencies like bitcoin are transacted and minted, is also driving data center growth. It is all putting new pressures on an overtaxed grid, the network of transmission lines and power stations that move electricity around the country. Bottlenecks are mounting, leaving both new generators of energy, particularly clean energy, and large consumers facing growing wait times for hookups.
The situation is sparking battles across the nation over who will pay for new power supplies, with regulators worrying that residential ratepayers could be stuck with the bill for costly upgrades. It also threatens to stifle the transition to cleaner energy, as utility executives lobby to delay the retirement of fossil fuel plants and bring more online. The power crunch imperils their ability to supply the energy that will be needed to charge the millions of electric cars and household appliances required to meet state and federal climate goals.
The nation's 2,700 data centers sapped more than 4 percent of the country's total electricity in 2022, according to the International Energy Agency. Its projections show that by 2026, they will consume 6 percent. Industry forecasts show the centers eating up a larger share of U.S. electricity in the years that follow, as demand from residential and smaller commercial facilities stays relatively flat thanks to steadily increasing efficiencies in appliances and heating and cooling systems.
Data center operators are clamoring to hook up to regional electricity grids at the same time the Biden administration's industrial policy is luring companies to build factories in the United States at a pace not seen in decades. That includes manufacturers of clean tech, such as solar panels and electric car batteries, which are being enticed by lucrative federal incentives. Companies announced plans to build or expand more than 155 factories in this country during the first half of the Biden administration, according to the Electric Power Research Institute, a research and development organization. Not since the early 1990s has factory-building accounted for such a large share of U.S. construction spending, according to the group.
Utility projections for the amount of power they will need over the next five years have nearly doubled and are expected to grow, according to a review of regulatory filings by the research firm Grid Strategies.
In the past, companies tried to site their data centers in areas with major internet infrastructure, a large pool of tech talent, and attractive government incentives. But these locations are getting tapped out.
Communities that had little connection to the computing industry now find themselves in the middle of a land rush, with data center developers flooding their markets with requests for grid hookups. Officials in Columbus, Ohio; Altoona, Iowa; and Fort Wayne, Ind., are being aggressively courted by data center developers. But power supply in some of these second-choice markets is already running low, pushing developers ever farther out, in some cases into cornfields, according to JLL, a commercial real estate firm that serves the tech industry.
Grid Strategies warns in its report that "there are real risks some regions may miss out on economic development opportunities because the grid can't keep up."
"Across the board, we are seeing power companies say, 'We don't know if we can handle this; we have to audit our system; we've never dealt with this kind of influx before,'" said Andy Cvengros, managing director of data center markets at JLL. "Everyone is now chasing power. They are willing to look everywhere for it."
"We saw a quadrupling of land values in some parts of Columbus, and a tripling in areas of Chicago," he said. "It's not about the land. It is about access to power." Some developers, he said, have had to sell the property they bought at inflated prices at a loss, after utilities became overwhelmed by the rush for grid hookups.
It is all happening at the same time the energy transition is steering large numbers of Americans to rely on the power grid to fuel vehicles, heat pumps, induction stoves and all manner of other household appliances that previously ran on fossil fuels. A huge amount of clean energy is also needed to create the green hydrogen championed by the White House, as developers rush to build plants that can produce the powerful zero-emissions fuel, lured by generous federal subsidies.
Planners are increasingly concerned that the grid won't be green enough or powerful enough to meet these demands.
Already, soaring power consumption is delaying coal plant closures in Kansas, Nebraska, Wisconsin and South Carolina.
In Georgia, the state's major power company, Georgia Power, stunned regulators when it revealed recently how wildly off its projections were, pointing to data centers as the main culprit.
The demand has Georgia officials rethinking the state's policy of offering incentives to lure computing operations, which generate few jobs but can boost community budgets through the hefty property taxes they pay. The top leaders of Georgia's House and Senate, both Republicans, are championing a pause in data center incentives.
Georgia regulators, meanwhile, are exploring how to protect ratepayers while ensuring there is enough power for the state's most prized new tenants: clean-technology companies. Factories supplying the electric vehicle and green-energy markets have been rushing to locate in Georgia, drawn in large part by promises of cheap, reliable electricity.
"When the data center industry began looking for new hubs, Atlanta was like, 'Bring it on,'" said Pat Lynch, who leads the Data Center Solutions team at real estate giant CBRE. "Now Georgia Power is warning of limitations. ... Utility shortages in the face of these data center demands are happening in almost every market."
A similar dynamic is playing out in a very different region: the Pacific Northwest. In Oregon, Portland General Electric recently doubled its forecast for new electricity demand over the next five years, citing data centers and rapid industrial growth as the drivers.
That power crunch threw a wrench into the plans of Michael Halaburda and Arman Khalili, longtime data center developers whose latest project involves converting a mothballed tile factory in the Portland area. Only a couple of months ago, the two were under the impression that they would have no problem getting the electricity they needed to run the place. Then the power company alerted them that it would need to do a "line and load" study to assess whether it could supply the facility with 60 megawatts of electricity, roughly the amount needed to power 45,000 homes.
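That comparison is easy to sanity-check: 60 megawatts spread across 45,000 homes implies an average draw of about 1.3 kilowatts per home, in line with typical U.S. household consumption of roughly 10,000 to 11,000 kilowatt-hours a year. A quick illustrative calculation:

```python
# Sanity check of the "60 MW is roughly 45,000 homes" comparison.
facility_mw = 60
homes = 45_000

kw_per_home = facility_mw * 1_000 / homes  # ~1.33 kW average draw
kwh_per_home_year = kw_per_home * 8_760    # ~11,700 kWh/year

print(f"~{kw_per_home:.2f} kW per home, "
      f"~{kwh_per_home_year:,.0f} kWh/year")
```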
The Portland project Halaburda and Khalili are developing will now be powered in large part by off-the-grid, high-tech fuel cells that convert natural gas into low-emissions electricity. That supply will be supplemented by whatever power can be secured from the grid. The partners decided that on their next project, in South Texas, they're not going to take their chances with the grid at all. Instead, they will drill thousands of feet into the ground to draw geothermal energy.
Halaburda sees the growth as good for the country and the economy. "But no one took into consideration where this is all going," he said. "In the next couple of years, unless there is a real focus on expanding the grid and making it more robust, we are going to see opportunities fall by the wayside because we can't get power to where it is needed."
Companies are increasingly turning to such off-the-grid experiments as their frustration mounts with the logjam in the nation's traditional electricity network. Microsoft and Google are among the firms hoping that energy-intensive industrial operations can ultimately be powered by small nuclear plants on-site, with Microsoft even putting AI to work trying to streamline the burdensome process of getting plants approved. Microsoft has also inked a deal to buy power from a company trying to develop zero-emissions fusion power. But going off the grid brings its own big regulatory and land acquisition challenges. The type of small nuclear plant envisioned, for example, is not yet operational in the United States, and commercial fusion power does not yet exist.
The big tech companies are also exploring ways AI can help make the grid operate more efficiently. And they are developing platforms that, during times of peak power demand, can shift compute tasks and their associated energy consumption to the times and places where carbon-free energy is available on the grid, according to Google. But meeting both their zero-emissions pledges and their AI innovation ambitions is becoming increasingly complicated as the energy needs of their data centers grow.
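The article does not spell out how these load-shifting platforms work; Google has publicly described the general approach as "carbon-intelligent computing." Purely as illustration, the sketch below shows the core idea, greedily steering deferrable batch jobs toward the forecast hours with the lowest grid carbon intensity. Every name and number in it is hypothetical.

```python
# Illustrative sketch of carbon-aware load shifting (hypothetical
# data): schedule deferrable batch jobs into the hours with the
# lowest forecast grid carbon intensity.

# Hypothetical carbon-intensity forecast (gCO2/kWh), next 8 hours.
forecast = [450, 380, 210, 140, 130, 220, 390, 470]

def schedule(jobs: int, capacity_per_hour: int) -> list[int]:
    """Greedily assign equal-sized deferrable jobs to the cleanest
    hours, placing at most capacity_per_hour jobs in any one hour.
    Returns the chosen hour index for each job."""
    cleanest_first = sorted(range(len(forecast)), key=lambda h: forecast[h])
    plan = []
    for hour in cleanest_first:
        while jobs > 0 and plan.count(hour) < capacity_per_hour:
            plan.append(hour)
            jobs -= 1
    return sorted(plan)

# Five jobs, at most two per hour: they land in hours 2-4, the
# cleanest stretch of the forecast.
print(schedule(jobs=5, capacity_per_hour=2))  # [2, 3, 3, 4, 4]
```

A production system would also weigh job deadlines, data locality and local grid capacity, but the principle of moving flexible work to cleaner hours is the same.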
"These problems are not going to go away," said Michael Ortiz, CEO of Layer 9 Data Centers, a U.S. company that is looking to avoid the logjam here by building in Mexico. "Data centers are going to have to become more efficient, and we need to be using more clean sources of efficient energy, like nuclear."
Officials at Equinix, one of the world's largest data center companies, said they have been experimenting with fuel cells as backup power, but they remain hopeful they can keep the power grid as their main source of electricity for new projects.
The logjam is already pushing officials overseeing the clean-energy transition at some of the nation's largest airports to look beyond the grid. The amount of energy they will need just to charge fleets of electric rental vehicles and ground maintenance trucks is immense. Electricity demand will double by 2030 at both the Denver and Minneapolis airports, and by 2040 they will need more than triple the electricity they use now, according to a study commissioned by car rental giant Enterprise, Xcel Energy and Jacobs, a consulting firm.
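Those figures imply steep but not implausible growth rates. As a rough illustration, assuming a base year of about 2024, which the study's framing in the article does not state:

```python
# Implied compound annual growth rates from the airport study's
# figures: demand doubles by 2030 and more than triples by 2040.
# ASSUMPTION: base year of ~2024 (not stated in the article).
base_year = 2024
double_by, triple_by = 2030, 2040

cagr_2030 = 2 ** (1 / (double_by - base_year)) - 1  # ~12.2%/year
cagr_2040 = 3 ** (1 / (triple_by - base_year)) - 1  # ~7.1%/year

print(f"to 2030: ~{cagr_2030:.1%}/yr; to 2040: ~{cagr_2040:.1%}/yr")
```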
"Utilities are not going to be able to move quickly enough to provide all this capacity," said Christine Weydig, vice president of transportation at AlphaStruxure, which designs and operates clean-energy projects. "The infrastructure is not there. Different solutions will be needed." Airports, she said, are looking into dramatically expanding the use of clean-power microgrids they can build on-site.
The Biden administration has made easing the grid bottleneck a priority, but it is a politically fraught process, and federal powers are limited. Building the transmission lines and transfer stations needed involves huge land acquisitions, exhaustive environmental reviews and negotiations to determine who should pay what costs.
The process runs through state regulatory agencies, and fights between states over who gets stuck with the bill and where power lines should go routinely sink and delay proposed projects. The amount of new transmission line installed in the United States has dropped sharply since 2013, when 4,000 miles were added. Now, the nation struggles to bring online even 1,000 new miles a year. The slowdown has real consequences not just for companies but for the climate. A group of scientists led by Princeton University professor Jesse Jenkins warned in a report that by 2030 the United States risks losing out on 80 percent of the potential emission reductions from President Biden's signature climate law, the Inflation Reduction Act, if the pace of transmission construction does not pick up dramatically.
While the proliferation of data centers puts more pressure on states to approve new transmission lines, it also complicates the task. Officials in Maryland, for example, are protesting a plan for $5.2 billion in infrastructure that would transmit power to huge data centers in Loudoun County, Va. The Maryland Office of People's Counsel, a government agency that advocates for ratepayers, called grid operator PJM's plan "fundamentally unfair," arguing it could leave Maryland utility customers paying for power transmission to data centers that Virginia aggressively courted and is leveraging for a windfall in tax revenue.
Tensions over who gets power from the grid and how it gets to them are only going to intensify as the supply becomes scarcer.
In Texas, a dramatic increase in data centers for crypto mining is touching off a debate over whether they are a costly drain on an overtaxed grid. An analysis by the consulting firm Wood Mackenzie found that the energy needed by crypto operations aiming to link to the grid would equal a quarter of the electricity used in the state at peak demand. Unlike data centers operated by big tech companies such as Google and Meta, crypto miners generally don't build renewable-energy projects with the aim of supplying enough zero-emissions energy to the grid to cover their operations.
The result, said Ben Hertz-Shargel, who authored the Wood Mackenzie analysis, is that crypto's drain on the grid threatens to inhibit the ability of Texas to power other energy-hungry operations that could drive innovation and economic growth, such as factories that produce zero-emissions green hydrogen fuel or industrial charging depots that enable electrification of truck and bus fleets.
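To put the Wood Mackenzie finding in rough absolute terms: the Texas grid's record peak demand has been on the order of 85 gigawatts (an outside approximation, not a figure from the article), so a quarter of peak would amount to roughly 21 gigawatts of requested crypto load, as sketched below.

```python
# Rough scale of the Wood Mackenzie finding for Texas.
# ASSUMPTION: ERCOT peak demand of roughly 85 GW (approximate;
# not stated in the article).
ercot_peak_gw = 85
crypto_share_of_peak = 0.25  # crypto requests vs. peak demand

crypto_gw = ercot_peak_gw * crypto_share_of_peak  # ~21 GW
print(f"~{crypto_gw:.0f} GW of crypto load seeking grid connections")
```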
But after decades in which power was readily available, regulators and utility executives across the country generally are not empowered to prioritize which projects get connected. It is first come, first served. And the line is growing longer. Some states, meanwhile, have passed laws protecting crypto mining's access to huge amounts of power.
"Lawmakers need to think about this," Hertz-Shargel said of allocating an increasingly limited supply of power. "There is a risk that strategic industries they want in their states are going to have a challenging time setting up in those places."
See the original post here:
Amid record high energy demand, America is running out of electricity - The Washington Post