Moore’s law – Wikipedia

Observation on the growth of integrated circuit capacity

Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empirical relationship linked to gains from experience in production.

The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and Intel (and former CEO of the latter), who in 1965 posited a doubling every year in the number of components per integrated circuit,[a] and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years, a compound annual growth rate (CAGR) of 41%. While Moore did not use empirical evidence in forecasting that the historical trend would continue, his prediction has held since 1975 and has since become known as a "law".
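To make the arithmetic behind that 41% figure explicit, here is a minimal illustrative sketch (the function name is ours, not from any source) that converts a doubling period into a compound annual growth rate:

```python
def cagr_from_doubling_period(years_to_double: float) -> float:
    """Annual growth rate implied by a quantity doubling every `years_to_double` years."""
    return 2 ** (1.0 / years_to_double) - 1.0

print(f"{cagr_from_doubling_period(2.0):.1%}")  # 41.4% -- Moore's revised 1975 two-year doubling
print(f"{cagr_from_doubling_period(1.0):.1%}")  # 100.0% -- the original 1965 yearly doubling
```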

Moore's prediction has been used in the semiconductor industry to guide long-term planning and to set targets for research and development, thus functioning to some extent as a self-fulfilling prophecy. Advancements in digital electronics, such as the reduction in quality-adjusted microprocessor prices, the increase in memory capacity (RAM and flash), the improvement of sensors, and even the number and size of pixels in digital cameras, are strongly linked to Moore's law. These ongoing changes in digital electronics have been a driving force of technological and social change, productivity, and economic growth.

Industry experts have not reached a consensus on exactly when Moore's law will cease to apply. Microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, slightly below the pace predicted by Moore's law.

In 1959, Douglas Engelbart studied the projected downscaling of integrated circuit (IC) size, publishing his results in the article "Microelectronics, and the Art of Similitude".[2][3][4] Engelbart presented his findings at the 1960 International Solid-State Circuits Conference, where Moore was present in the audience.[5]

In 1965, Gordon Moore, who at the time was working as the director of research and development at Fairchild Semiconductor, was asked to contribute to the thirty-fifth anniversary issue of Electronics magazine with a prediction on the future of the semiconductor components industry over the next ten years. His response was a brief article entitled "Cramming more components onto integrated circuits".[1][6][b] Within his editorial, he speculated that by 1975 it would be possible to contain as many as 65,000 components on a single quarter-square-inch (~1.6 square-centimeter) semiconductor.

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.[1]

Moore posited a log-linear relationship between device complexity (higher circuit density at reduced cost) and time.[9][10] In a 2015 interview, Moore noted of the 1965 article: "...I just did a wild extrapolation saying it's going to continue to double every year for the next 10 years."[11] One historian of the law cites Stigler's law of eponymy to introduce the fact that the regular doubling of components was known to many working in the field.[10]

In 1974, Robert H. Dennard at IBM recognized the rapid MOSFET scaling technology and formulated what became known as Dennard scaling, which describes that as MOS transistors get smaller, their power density stays constant such that the power use remains in proportion with area.[12][13] Evidence from the semiconductor industry shows that this inverse relationship between power density and areal density broke down in the mid-2000s.[14]

At the 1975 IEEE International Electron Devices Meeting, Moore revised his forecast rate,[15][16] predicting semiconductor complexity would continue to double annually until about 1980, after which it would decrease to a rate of doubling approximately every two years.[16][17][18] He outlined several contributing factors for this exponential behavior, among them larger die sizes, finer minimum dimensions, and "circuit and device cleverness".[9][10]

Shortly after 1975, Caltech professor Carver Mead popularized the term "Moore's law".[19][20] Moore's law eventually came to be widely accepted as a goal for the semiconductor industry, and it was cited by competitive semiconductor manufacturers as they strove to increase processing power. Moore viewed his eponymous law as surprising and optimistic: "Moore's law is a violation of Murphy's law. Everything gets better and better."[21] The observation was even seen as a self-fulfilling prophecy.[22][23]

The doubling period is often misquoted as 18 months because of a prediction by Moore's colleague, Intel executive David House. In 1975, House noted that Moore's revised law of doubling transistor count every 2 years in turn implied that computer chip performance would roughly double every 18 months[24] (with no increase in power consumption).[25] Mathematically, Moore's Law predicted that transistor count would double every 2 years due to shrinking transistor dimensions and other improvements. As a consequence of shrinking dimensions, Dennard scaling predicted that power consumption per unit area would remain constant. Combining these effects, David House deduced that computer chip performance would roughly double every 18 months. Also due to Dennard scaling, this increased performance would not be accompanied by increased power, i.e., the energy-efficiency of silicon-based computer chips roughly doubles every 18 months. Dennard scaling ended in the 2000s.[14] Koomey later showed that a similar rate of efficiency improvement predated silicon chips and Moore's Law, for technologies such as vacuum tubes.
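One common reconstruction of House's arithmetic is sketched below, under the assumption that each two-year generation doubles the transistor count and, via Dennard scaling, raises clock frequency by about 40%; the result lands in the neighborhood of the popularly quoted 18 months:

```python
import math

generation_months = 24                  # Moore's 1975 doubling period
perf_gain_per_generation = 2.0 * 1.4    # 2x transistors times ~1.4x frequency (Dennard scaling)

doubling_months = generation_months * math.log(2) / math.log(perf_gain_per_generation)
print(f"performance doubles roughly every {doubling_months:.0f} months")  # ~16, i.e. "roughly 18"
```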

Microprocessor architects report that since around 2010, semiconductor advancement has slowed industry-wide below the pace predicted by Moore's law.[14] Brian Krzanich, the former CEO of Intel, cited Moore's 1975 revision as a precedent for the current deceleration, which results from technical challenges and is "a natural part of the history of Moore's law".[26][27][28] The rate of improvement in physical dimensions known as Dennard scaling also ended in the mid-2000s. As a result, much of the semiconductor industry has shifted its focus to the needs of major computing applications rather than semiconductor scaling.[22][29][14] Nevertheless, leading semiconductor manufacturers TSMC and Samsung Electronics have claimed to keep pace with Moore's law[30][31][32][33][34][35] with 10 nm and 7 nm nodes in mass production[30][31] and 5 nm nodes in risk production as of 2019.[36][37]

As the cost of computer power to the consumer falls, the cost for producers to fulfill Moore's law follows an opposite trend: R&D, manufacturing, and test costs have increased steadily with each new generation of chips. Rising manufacturing costs are an important consideration for the sustaining of Moore's law.[38] This led to the formulation of Moore's second law, also called Rock's law, which is that the capital cost of a semiconductor fabrication plant also increases exponentially over time.[39][40]

Numerous innovations by scientists and engineers have sustained Moore's law since the beginning of the IC era. Some of the key innovations are listed below, as examples of breakthroughs that have advanced integrated circuit and semiconductor device fabrication technology, allowing transistor counts to grow by more than seven orders of magnitude in less than five decades.

Computer industry technology road maps predicted in 2001 that Moore's law would continue for several generations of semiconductor chips.[64]

One of the key challenges of engineering future nanoscale transistors is the design of gates. As device dimensions shrink, controlling the current flow in the thin channel becomes more difficult. Modern nanoscale transistors typically take the form of multi-gate MOSFETs, with the FinFET being the most common nanoscale transistor. The FinFET has gate dielectric on three sides of the channel. In comparison, the gate-all-around MOSFET (GAAFET) structure has even better gate control.

Microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, below the pace predicted by Moore's law.[14] Brian Krzanich, the former CEO of Intel, announced, "Our cadence today is closer to two and a half years than two."[96] Intel stated in 2015 that improvements in MOSFET devices have slowed, starting at the 22 nm feature width around 2012, and continuing at 14 nm.[97]

The physical limits to transistor scaling have been reached due to source-to-drain leakage, limited gate metals, and limited options for channel material. Other approaches are being investigated which do not rely on physical scaling. These include spintronics (exploiting the spin state of the electron), tunnel junctions, and advanced confinement of channel materials via nanowire geometry.[98] Spin-based logic and memory options are being developed actively in labs.[99][100]

The vast majority of current transistors on ICs are composed principally of doped silicon and its alloys. As silicon is fabricated into single-nanometer transistors, short-channel effects adversely change desired material properties of silicon as a functional transistor. Below are several non-silicon substitutes for use in the fabrication of small, nanometer-scale transistors.

One proposed material is indium gallium arsenide, or InGaAs. Compared to their silicon and germanium counterparts, InGaAs transistors are more promising for future high-speed, low-power logic applications. Because of intrinsic characteristics of III-V compound semiconductors, quantum well and tunnel effect transistors based on InGaAs have been proposed as alternatives to more traditional MOSFET designs.

Biological computing research shows that biological material has superior information density and energy efficiency compared to silicon-based computing.[108]

Various forms of graphene are being studied for graphene electronics; e.g., graphene nanoribbon transistors have shown great promise since their appearance in publications in 2008. (Bulk graphene has a band gap of zero and thus cannot be used in transistors because of its constant conductivity, an inability to turn off. The zigzag edges of the nanoribbons introduce localized energy states in the conduction and valence bands and thus a bandgap that enables switching when fabricated as a transistor. As an example, a typical GNR of width of 10 nm has a desirable bandgap energy of 0.4 eV.[109][110]) More research will need to be performed, however, on sub-50 nm graphene layers, as their resistivity increases and thus electron mobility decreases.[109]

In April 2005, Gordon Moore stated in an interview that the projection cannot be sustained indefinitely: "It can't continue forever. The nature of exponentials is that you push them out and eventually disaster happens." He also noted that transistors eventually would reach the limits of miniaturization at atomic levels:

In terms of size [of transistors] you can see that we're approaching the size of atoms which is a fundamental barrier, but it'll be two or three generations before we get that far, but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit. By then they'll be able to make bigger chips and have transistor budgets in the billions.[111]

In 2016 the International Technology Roadmap for Semiconductors, after using Moore's law to drive the industry since 1998, produced its final roadmap. It no longer centered its research and development plan on Moore's law. Instead, it outlined what might be called the "More than Moore" strategy, in which the needs of applications drive chip development rather than a focus on semiconductor scaling. Application drivers range from smartphones to AI to data centers.[112]

IEEE began a road-mapping initiative in 2016, "Rebooting Computing", named the International Roadmap for Devices and Systems (IRDS).[113]

Most forecasters, including Gordon Moore,[114] expect Moore's law will end by around 2025.[115][112][116] Although Moore's Law will reach a physical limitation, some forecasters are optimistic about the continuation of technological progress in a variety of other areas, including new chip architectures, quantum computing, and AI and machine learning.[117][118] Nvidia CEO Jensen Huang declared Moore's law dead in 2022;[119] several days later Intel CEO Pat Gelsinger declared that Moore's law is not dead.[120]

Digital electronics have contributed to world economic growth in the late twentieth and early twenty-first centuries.[121] The primary driving force of economic growth is the growth of productivity,[122] and Moore's law factors into productivity. Moore (1995) expected that "the rate of technological progress is going to be controlled from financial realities".[123] The reverse could and did occur around the late 1990s, however, with economists reporting that "Productivity growth is the key economic indicator of innovation."[124] Moore's law describes a driving force of technological and social change, productivity, and economic growth.[125][126][122]

An acceleration in the rate of semiconductor progress contributed to a surge in U.S. productivity growth,[127][128][129] which reached 3.4% per year in 1997–2004, outpacing the 1.6% per year during both 1972–1996 and 2005–2013.[130] As economist Richard G. Anderson notes, "Numerous studies have traced the cause of the productivity acceleration to technological innovations in the production of semiconductors that sharply reduced the prices of such components and of the products that contain them (as well as expanding the capabilities of such products)."[131]

The primary negative implication of Moore's law is that obsolescence pushes society up against the Limits to Growth. As technologies continue to rapidly "improve", they render predecessor technologies obsolete. In situations in which security and survivability of hardware or data are paramount, or in which resources are limited, rapid obsolescence often poses obstacles to smooth or continued operations.[132]

Because of the intensive resource footprint and toxic materials used in the production of computers, obsolescence leads to serious harmful environmental impacts. Americans throw out 400,000 cell phones every day,[133] but this high level of obsolescence appears to companies as an opportunity to generate regular sales of expensive new equipment, instead of retaining one device for a longer period of time, leading to industry using planned obsolescence as a profit centre.[134]

An alternative source of improved performance is in microarchitecture techniques exploiting the growth of available transistor count. Out-of-order execution and on-chip caching and prefetching reduce the memory latency bottleneck at the expense of using more transistors and increasing the processor complexity. These increases are described empirically by Pollack's Rule, which states that performance increases due to microarchitecture techniques approximate the square root of the complexity (number of transistors or the area) of a processor.[135]
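As a rough illustration (assuming the square-root form stated above; the numbers are examples, not measurements), Pollack's Rule can be sketched as:

```python
def pollack_speedup(transistor_ratio: float) -> float:
    """Approximate speedup from spending `transistor_ratio` times more transistors on microarchitecture."""
    return transistor_ratio ** 0.5

print(f"{pollack_speedup(2.0):.2f}x")   # ~1.41x from doubling the transistor budget
print(f"{pollack_speedup(1.45):.2f}x")  # ~1.20x, the upper end of the 45% -> 10-20% example in the next paragraph
```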

For years, processor makers delivered increases in clock rates and instruction-level parallelism, so that single-threaded code executed faster on newer processors with no modification.[136] Now, to manage CPU power dissipation, processor makers favor multi-core chip designs, and software has to be written in a multi-threaded manner to take full advantage of the hardware. Many multi-threaded development paradigms introduce overhead, and will not see a linear increase in speed versus the number of processors. This is particularly true while accessing shared or dependent resources, due to lock contention. This effect becomes more noticeable as the number of processors increases. There are cases where a roughly 45% increase in processor transistors has translated to roughly a 10–20% increase in processing power.[137]

On the other hand, manufacturers are adding specialized processing units to deal with features such as graphics, video, and cryptography. For one example, Intel's Parallel JavaScript extension not only adds support for multiple cores, but also for the other non-general processing features of their chips, as part of the migration in client side scripting toward HTML5.[138]

Moore's law has affected the performance of other technologies significantly: Michael S. Malone wrote of a Moore's War following the apparent success of shock and awe in the early days of the Iraq War. Progress in the development of guided weapons depends on electronic technology.[139] Improvements in circuit density and low-power operation associated with Moore's law also have contributed to the development of technologies including mobile telephones[140] and 3-D printing.[141]

Several measures of digital technology are improving at exponential rates related to Moore's law, including the size, cost, density, and speed of components. Moore wrote only about the density of components, "a component being a transistor, resistor, diode or capacitor",[123] at minimum cost.

Transistors per integrated circuit: The most popular formulation is of the doubling of the number of transistors on ICs every two years. At the end of the 1970s, Moore's law became known as the limit for the number of transistors on the most complex chips. The graph at the top shows this trend holds true today. As of 2017, the commercially available processor possessing the highest number of transistors is the 48-core Centriq with over 18 billion transistors.[142]

This is the formulation given in Moore's 1965 paper.[1] It is not just about the density of transistors that can be achieved, but about the density of transistors at which the cost per transistor is the lowest.[143] As more transistors are put on a chip, the cost to make each transistor decreases, but the chance that the chip will not work due to a defect increases. In 1965, Moore examined the density of transistors at which cost is minimized, and observed that, as transistors were made smaller through advances in photolithography, this number would increase at "a rate of roughly a factor of two per year".[1]

Dennard scaling: This posits that power usage would decrease in proportion to area (both voltage and current being proportional to length) of transistors. Combined with Moore's law, performance per watt would grow at roughly the same rate as transistor density, doubling every 1–2 years. According to Dennard scaling, transistor dimensions would be scaled by 30% (0.7x) every technology generation, thus reducing their area by 50%. This would reduce the delay by 30% (0.7x) and therefore increase operating frequency by about 40% (1.4x). Finally, to keep the electric field constant, voltage would be reduced by 30%, reducing energy by 65% and power (at 1.4x frequency) by 50%.[c] Therefore, in every technology generation transistor density would double, circuits become 40% faster, while power consumption (with twice the number of transistors) stays the same.[144] Dennard scaling came to an end in 2005–2010, due to leakage currents.[14]
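The per-generation figures quoted above can be checked with a short sketch (an illustration of the stated scaling factors, not a device model):

```python
k = 0.7                                    # linear dimensions and supply voltage scale by ~0.7x

area_per_transistor  = k * k               # ~0.49 -> area halves, so density doubles
frequency            = 1 / k               # delay ~0.7x, so operating frequency ~1.4x
energy_per_switch    = k * (k * k)         # C*V^2 with C ~ k and V ~ k -> ~0.34 (about 65% less)
power_per_transistor = energy_per_switch * frequency   # ~0.49 (about 50% less)
power_per_chip       = power_per_transistor * 2        # twice the transistors -> ~0.98, roughly constant

print(f"area {area_per_transistor:.2f}, frequency {frequency:.2f}x, energy {energy_per_switch:.2f}, "
      f"power/transistor {power_per_transistor:.2f}, power/chip {power_per_chip:.2f}")
```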

The exponential processor transistor growth predicted by Moore does not always translate into exponentially greater practical CPU performance. Since around 2005–2007, Dennard scaling has ended, so even though Moore's law continued for several years after that, it has not yielded dividends in improved performance.[12][145] The primary reason cited for the breakdown is that at small sizes, current leakage poses greater challenges, and also causes the chip to heat up, which creates a threat of thermal runaway and therefore further increases energy costs.[12][145][14]

The breakdown of Dennard scaling prompted a greater focus on multicore processors, but the gains offered by switching to more cores are lower than the gains that would be achieved had Dennard scaling continued.[146][147] In another departure from Dennard scaling, Intel microprocessors adopted a non-planar tri-gate FinFET at 22 nm in 2012 that is faster and consumes less power than a conventional planar transistor.[148] The rate of performance improvement for single-core microprocessors has slowed significantly.[149] Single-core performance was improving by 52% per year in 1986–2003 and 23% per year in 2003–2011, but slowed to just seven percent per year in 2011–2018.[149]

Quality-adjusted price of IT equipment: The price of information technology (IT), computers and peripheral equipment, adjusted for quality and inflation, declined 16% per year on average over the five decades from 1959 to 2009.[150][151] The pace accelerated, however, to 23% per year in 1995–1999, triggered by faster IT innovation,[124] and later slowed to 2% per year in 2010–2013.[150][152]

While quality-adjusted microprocessor price improvement continues,[153] the rate of improvement likewise varies, and is not linear on a log scale. Microprocessor price improvement accelerated during the late 1990s, reaching 60% per year (halving every nine months) versus the typical 30% improvement rate (halving every two years) during the years earlier and later.[154][155] Laptop microprocessors in particular improved 25–35% per year in 2004–2010, and slowed to 15–25% per year in 2010–2013.[156]
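The halving times quoted here follow directly from the annual improvement rates; the conversion is sketched below (illustrative only):

```python
import math

def halving_time_years(annual_improvement: float) -> float:
    """Years for a price to halve, given a constant fractional improvement per year."""
    return math.log(0.5) / math.log(1.0 - annual_improvement)

print(f"{halving_time_years(0.30):.1f} years")   # ~1.9 years, i.e. "halving every two years"
print(f"{halving_time_years(0.60):.2f} years")   # ~0.76 years, i.e. roughly every nine months
```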

The number of transistors per chip cannot explain quality-adjusted microprocessor prices fully.[154][157][158] Moore's 1995 paper does not limit Moore's law to strict linearity or to transistor count, "The definition of 'Moore's Law' has come to refer to almost anything related to the semiconductor industry that on a semi-log plot approximates a straight line. I hesitate to review its origins and by doing so restrict its definition."[123]

Hard disk drive areal density: A similar prediction (sometimes called Kryder's law) was made in 2005 for hard disk drive areal density.[159] The prediction was later viewed as over-optimistic. Several decades of rapid progress in areal density slowed around 2010, from 30–100% per year to 10–15% per year, because of noise related to smaller grain size of the disk media, thermal stability, and writability using available magnetic fields.[160][161]

Fiber-optic capacity: The number of bits per second that can be sent down an optical fiber increases exponentially, faster than Moore's law; this trend is known as Keck's law, in honor of Donald Keck.[162]

Network capacity: According to Gerald Butters,[163][164] the former head of Lucent's Optical Networking Group at Bell Labs, there is another version, called Butters' Law of Photonics,[165] a formulation that deliberately parallels Moore's law. Butters' law says that the amount of data coming out of an optical fiber is doubling every nine months.[166] Thus, the cost of transmitting a bit over an optical network decreases by half every nine months. The availability of wavelength-division multiplexing (sometimes called WDM) increased the capacity that could be placed on a single fiber by as much as a factor of 100. Optical networking and dense wavelength-division multiplexing (DWDM) are rapidly bringing down the cost of networking, and further progress seems assured. As a result, the wholesale price of data traffic collapsed in the dot-com bubble. Nielsen's Law says that the bandwidth available to users increases by 50% annually.[167]
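For comparison, the two networking growth rates just mentioned can be put on a common footing (assuming simple exponential growth; this is an illustration, not sourced data):

```python
import math

butters_annual_growth  = 2 ** (12 / 9) - 1             # doubling every 9 months -> ~152% per year
nielsen_doubling_years = math.log(2) / math.log(1.5)    # +50% per year -> doubles every ~1.7 years

print(f"Butters' law: ~{butters_annual_growth:.0%} capacity growth per year")
print(f"Nielsen's law: user bandwidth doubles every ~{nielsen_doubling_years:.1f} years")
```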

Pixels per dollar: Similarly, Barry Hendy of Kodak Australia has plotted pixels per dollar as a basic measure of value for a digital camera, demonstrating the historical linearity (on a log scale) of this market and the opportunity to predict the future trend of digital camera price, LCD and LED screens, and resolution.[168][169][170][171]

The great Moore's law compensator (TGMLC), also known as Wirth's law and generally referred to as software bloat, is the principle that successive generations of computer software increase in size and complexity, thereby offsetting the performance gains predicted by Moore's law. In a 2008 article in InfoWorld, Randall C. Kennedy,[172] formerly of Intel, introduced this term using successive versions of Microsoft Office between the year 2000 and 2007 as his premise. Despite the gains in computational performance during this time period according to Moore's law, Office 2007 performed the same task at half the speed on a prototypical year-2007 computer as compared to Office 2000 on a year-2000 computer.

Library expansion was calculated in 1945 by Fremont Rider to double in capacity every 16 years, if sufficient space were made available.[173] He advocated replacing bulky, decaying printed works with miniaturized microform analog photographs, which could be duplicated on-demand for library patrons or other institutions. He did not foresee the digital technology that would follow decades later to replace analog microform with digital imaging, storage, and transmission media. Automated, potentially lossless digital technologies allowed vast increases in the rapidity of information growth in an era that now sometimes is called the Information Age.

Carlson curve is a term coined by The Economist[174] to describe the biotechnological equivalent of Moore's law, and is named after author Rob Carlson.[175] Carlson accurately predicted that the doubling time of DNA sequencing technologies (measured by cost and performance) would be at least as fast as Moore's law.[176] Carlson Curves illustrate the rapid (in some cases hyperexponential) decreases in cost, and increases in performance, of a variety of technologies, including DNA sequencing, DNA synthesis, and a range of physical and computational tools used in protein expression and in determining protein structures.

Eroom's law is a pharmaceutical drug development observation which was deliberately written as Moore's Law spelled backwards in order to contrast it with the exponential advancements of other forms of technology (such as transistors) over time. It states that the cost of developing a new drug roughly doubles every nine years.

Experience curve effects: Each doubling of the cumulative production of virtually any product or service is accompanied by an approximately constant percentage reduction in the unit cost. The acknowledged first documented qualitative description of this dates from 1885.[177][178] A power curve was used to describe this phenomenon in a 1936 discussion of the cost of airplanes.[179]

Edholm's law: Phil Edholm observed that the bandwidth of telecommunication networks (including the Internet) is doubling every 18 months.[180] The bandwidth of online communication networks has risen from bits per second to terabits per second. The rapid rise in online bandwidth is largely due to the same MOSFET scaling that enables Moore's law, as telecommunications networks are built from MOSFETs.[181]

Haitz's law predicts that the brightness of LEDs increases as their manufacturing cost goes down.

Swanson's law is the observation that the price of solar photovoltaic modules tends to drop 20 percent for every doubling of cumulative shipped volume. At present rates, costs go down 75% about every 10 years.
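The two Swanson's-law figures above are mutually consistent if cumulative module shipments double roughly every 19 months; the sketch below derives that implied doubling rate (an inference from the quoted numbers, not a quoted figure):

```python
import math

drop_per_doubling = 0.20   # price falls 20% for each doubling of cumulative shipped volume
decade_cost_ratio = 0.25   # "costs go down 75%" per decade -> 25% of the starting cost remains

doublings_per_decade = math.log(decade_cost_ratio) / math.log(1.0 - drop_per_doubling)
print(f"{doublings_per_decade:.1f} doublings per decade")                            # ~6.2
print(f"cumulative volume doubling every ~{120 / doublings_per_decade:.0f} months")  # ~19
```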


What is Moore's Law? | Is Moore's Law Dead? | Synopsys

The slowing of Moore's law has prompted many to ask, "Is Moore's law dead?" This, in fact, is not the case. While Moore's law is still delivering exponential improvements, the results are being delivered at a slower pace. The pace of technology innovation is NOT slowing down, however. Rather, the explosion of hyperconnectivity, big data, and artificial intelligence applications has increased the pace of innovation and the need for Moore's law-style improvements in delivered technology.

For many years, scale complexity drove Moore's law and the semiconductor industry's exponential technology growth. As the ability to scale a single chip slows, the industry is finding other methods of innovation to maintain exponential growth.

This new design trend is driven by systemic complexity. Some aspects of this new approach to design have been dubbed "more than Moore". This term refers primarily to 2.5D and 3D integration techniques.

The complete landscape is far bigger and presents the opportunity for higher impact, however. At the 2021 SNUG World conference of worldwide Synopsys Users Group members, the chairman and co-CEO of Synopsys, Aart de Geus, presented a keynote address. In his presentation, de Geus observed that Moore's law is now blending with new innovations that leverage systemic complexity. He coined the term "SysMoore" as a shorthand way to describe this new design paradigm.

These trends and resultant terminology are summarized below. The SysMoore era will fuel semiconductor innovation for the foreseeable future. With it comes a wide range of design challenges that must be addressed.


Liverpool John Moores University – Wikipedia

University in Liverpool, England

Liverpool John Moores University (abbreviated LJMU) is a public research university in the city of Liverpool, England. The university can trace its origins to the Liverpool Mechanics' School of Arts, established in 1823.[3] This later merged to become Liverpool Polytechnic. In 1992, following an Act of Parliament, the Liverpool Polytechnic became what is now Liverpool John Moores University.[4] It is named after Sir John Moores, a local businessman and philanthropist, who donated to the university's precursor institutions.

The university had 25,050 students in 2019/20, of which 20,105 are undergraduate students and 4,945 are postgraduate,[2] making it the 30th largest university in the UK by total student population.

It is a member of the University Alliance, the Northern Consortium and the European University Association.

Founded as a small mechanics institution (Liverpool Mechanics' School of Arts) in 1823, the institution grew over the centuries by converging and amalgamating with different colleges, including the F. L. Calder School of Domestic Science[5] and the City of Liverpool C. F. Mott Training College, before eventually becoming Liverpool Polytechnic in 1970.[6] The university also has a long history of providing training, education and research to the maritime industry, dating back to the formation of the Liverpool Nautical College in 1892.

The institution then became a university under the terms of the Further and Higher Education Act 1992, under the new title of "Liverpool John Moores University". This new title was approved by the Privy Council on 15 September 1992. The university took its name from Sir John Moores, the founder of the Littlewoods empire. Moores was a great believer in the creation of opportunity for all, which embodies the ethos of LJMU in providing educational routes for people of all ages and from all backgrounds. This belief led Sir John Moores to invest in the institution and its facilities, such as the John Foster Building (housing the Liverpool Business School), designed by and named after leading architect John Foster.[6] With the institution's origins dating back as far as 1823, many of the university's buildings are similarly old, with aesthetically pleasing Georgian and Victorian buildings found on a few of the campuses.[1]

LJMU now has more than 27,000 students[7] from over 100 countries world-wide, 2,400 staff and 250 degree courses.[8] LJMU was awarded the Queen's Anniversary Prize in 2005.[9]

Currently, Liverpool John Moores University is receiving more applications than previously seen[citation needed]; according to data in 2009, the total number of applications submitted to LJMU was 27,784.[10]

On 28 March 2022, former student and founder of Mowgli, Nisha Katona was installed as Chancellor of the university.[11] Previously, in 2008, astrophysicist and Queen lead guitarist Brian May was appointed the fourth Chancellor of Liverpool John Moores University. He replaced outgoing Chancellor Cherie Blair, wife of former Prime Minister Tony Blair. Honorary fellows in attendance at the ceremony included astronomer Sir Patrick Moore and actor Pete Postlethwaite.[12] May was succeeded as Chancellor by judge Sir Brian Leveson in 2013.

LJMU is a founding member of the Northern Consortium, an educational charity owned by 11 universities in northern England.

The university is separated into two campuses in Liverpool: the City Campus and the Mount Pleasant Campus.

Between the two campuses is the Copperas Hill Site, opened in summer 2021, containing many faculties moved from the former IM Marsh Campus, and home to the Student Life and LJMU Sports Buildings. Its location between the two sites has been described by the university to help connect both of its campuses together, and is not regarded to be part of either. It is however closer to the Mount Pleasant Campus and separated from the City Campus by the A5047, and Liverpool Lime Street railway station.[13]

There are currently two libraries operated by LJMU, one for each campus.[14]

There is an LRC present in the Learning Commons of the Student Life Building on the Copperas Hill site between the two campuses.

Students of the university can use any library in term-time and some non-term time periods within the library's opening hours. The Student Life Building is open 24/7 in term time. Students need their student identification card for entry to all buildings.

There are more than 68,500 books in the Libraries' collections, with 1,630 work spaces available for students 24 hours a day. In addition to this there are over 16,000 e-books and 5,000 e-journals available.[15] It is a member of the Libraries Together: Liverpool Learning Partnership (evolved from Liverpool Libraries Group) which formed in 1990. Under which, a registered reader at any of the member libraries can have access rights to the other libraries within the partnership.[16]

The Tom Reilly Building houses the School of Sports and Exercise Sciences and the School of Natural Sciences and Psychology, which are both part of the Faculty of Science.[17] Some 8,000 students use the building, which is located at LJMU's City Campus on Byrom Street. The five-storey, 6,493 m2 (69,890 sq ft) building was completed in November 2009[17] and opened in March 2010 by Liverpool F.C. captain Steven Gerrard.[18] The building provides sports and science facilities including appetite laboratories, psychology testing labs, neuroscience labs, an indoor 70-metre running track, force plates, a caren disc, physiology suites, a DEXA scanner, a driving simulator and a chronobiology lab.[17]

The university is organised into five faculties, each of which is split into schools or centres. Most of the faculties are based at a particular campus site; however, with many joint honours degrees and some conventional degrees, the faculties overlap, meaning students' degrees can be drawn from more than one faculty.

LJMU is highly ranked for teaching and research in Sports and Exercise Sciences.[19][20] The Higher Education Funding Council for England (HEFCE) awarded LJMU £4.5 million over five years for the establishment of a Centre for Excellence in Teaching and Learning (CETL)[citation needed]. The CETL award recognises LJMU's record for Physical Education, Dance, Sport and Exercise Sciences. LJMU is the only United Kingdom university to be awarded an Ofsted Grade A in Physical Education, and it is also the premier institution for both teaching and research in Sport and Exercise Sciences.[20]

Liverpool Business School (LBS) is located in the Redmonds Building on the Mount Pleasant Campus and has over 2,500 students and 100 academics.[21][self-published source?]

LBS offers undergraduate, postgraduate (including an Executive MBA) and research based programmes.[21][self-published source?] Research areas include International Banking, Economics and Finance, Sustainable Enterprise, Public Service Management, Development of Modern Economic Thought, Performance Management, Marketing, Project Management, and Market Research.[22][self-published source?]

In the 2001 Research Assessment Exercise (RAE), LJMU reported notable research strengths in general engineering and sports-related sciences. By the 2008 RAE, LJMU was the top-performing post-92 university for Anthropology, Electrical and Electronic Engineering, General Engineering, Physics (Astrophysics) and Sports-Related Studies. According to the UK Research Assessment Exercise 2014 (RAE 2014), every unit of assessment LJMU submitted was rated as at least 45% internationally excellent or better.[23] In 2012, the university's scientists published notable research suggesting that the dinosaurs' extinction may have been caused by increased methane production from the dinosaurs themselves, with some informally saying that dinosaurs "farted" their way to extinction.[24]

Liverpool John Moores University was included in the new 2013 Times Higher Education 100 Under 50, ranking 72nd out of 100. The list aims to show the rising stars of the global academy that are under 50 years old.[31]

First Destination Survey results show that 89% of LJMU graduates are in employment or undertaking postgraduate study within six months of graduating.[1]

Students at the university are represented by the John Moores Students' Union.

Representation for all students is central and is conducted by executive officers elected annually. In most cases, these students will be on a sabbatical from their studies. The election process is normally contested in mid-April, with successful candidates assuming office the following academic year.

Liverpool John Moores University has BUCS-registered teams in badminton, basketball, cricket, football, cycling, hockey, netball, rugby league, rugby union, tennis, volleyball, swimming, and American football. Many of the sports teams compete in BUCS competitions. Liverpool Students' Union has 15 BUCS sports, from which 36 teams run, catering for over 800 athletes. In recent years, LJMU students have competed for BUCS representative squads, in national finals and at World University Championships.[32] In addition, the Students' Union also runs intramural sports leagues.

The university also enjoys success at national and world level. Gymnast Beth Tweddle studied at LJMU and has achieved national, Commonwealth, European, and World medals whilst also competing at the Olympic Games.

Every year the university's sports teams compete for 'The Varsity Cup' in the inter-university derby, Liverpool John Moores University vs. the University of Liverpool. The competing sports include badminton, basketball, hockey, football, netball, volleyball, swimming, tennis, and the snowriders racing team.


Liverpool John Moores University : Rankings, Fees & Courses Details …

Liverpool John Moores University (LJMU) can trace its roots back to 1823 and the foundation of the Liverpool Mechanics and Apprentices Library. We've come a long way since then, becoming a university in 1992, and we are now ranked 60th in The Times Higher Education World University Rankings 2018.

Throughout our history we have championed education for all; from our earliest students in the nineteenth century through to today's skilled graduates, who are driving forward twenty-first century innovations and economic success, both in the UK and overseas.

We take our name from one of the UK's most successful businessmen, Sir John Moores, who turned his love of football into a business empire that was worth millions. Born into a working-class family, he proved that, with vision and hard work, anyone can achieve success.

So if you have talent and are willing to work hard and grasp every opportunity that comes your way, we are the University that can help you achieve your ambitions.


Moore's law | computer science | Britannica

Moore's law, a prediction made by American engineer Gordon Moore in 1965 that the number of transistors per silicon chip doubles every year.

For a special issue of the journal Electronics, Moore was asked to predict developments over the next decade. Observing that the total number of components in these circuits had roughly doubled each year, he blithely extrapolated this annual doubling to the next decade, estimating that microcircuits of 1975 would contain an astounding 65,000 components per chip. In 1975, as the rate of growth began to slow, Moore revised his time frame to two years. His revised law was a bit pessimistic; over roughly 50 years from 1961, the number of transistors doubled approximately every 18 months. Subsequently, magazines regularly referred to Moore's law as though it were inexorable: a technological law with the assurance of Newton's laws of motion.

What made this dramatic explosion in circuit complexity possible was the steadily shrinking size of transistors over the decades. Measured in millimetres in the late 1940s, the dimensions of a typical transistor in the early 2010s were more commonly expressed in tens of nanometres (a nanometre being one-billionth of a metre), a reduction factor of over 100,000. Transistor features measuring less than a micron (a micrometre, or one-millionth of a metre) were attained during the 1980s, when dynamic random-access memory (DRAM) chips began offering megabyte storage capacities. At the dawn of the 21st century, these features approached 0.1 micron across, which allowed the manufacture of gigabyte memory chips and microprocessors that operate at gigahertz frequencies. Moore's law continued into the second decade of the 21st century with the introduction of three-dimensional transistors that were tens of nanometres in size.


Moore's Law in 2022: What's the status quo? – Power & Beyond


28.02.2022, updated on 27.07.2022. From Luke James


"Moore's Law is dead!" This is a line of thought championed by many prominent individuals in the fields of electrical and power engineering. But it's quite a controversial one; just as many people believe Moore's Law is still true today in 2022 as believe that it's dead and no longer valid.


The debate over whether Moore's Law is dying (or already dead) has been going on for years. It has been discussed by pretty much everyone. But before we can give an answer to that, let's first clarify the meaning of Moore's Law.

Moore's Law stems from an observation that Gordon Moore, co-founder and chairman emeritus of Intel, made in 1965. At the time, he said that the number of transistors in a dense integrated circuit had doubled roughly every year and would continue to do so for the next 10 years. In 1975, he revised his observation to say that this would occur every two years indefinitely.

Moore's Law is the principle that the speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain.

Moore's observation became the driving force behind the semiconductor technology revolution that led to the proliferation of computers and other electronic devices.

Moore's Law is based on empirical observations made by Gordon Moore. The yearly doubling of the number of transistors on a microchip was extrapolated from observed data in 1965. Over time, the details of Moore's Law were amended to reflect the true growth of transistor density. First, the doubling interval was increased to two years and then decreased to around 18 months. The exponential nature of Moore's Law continued and created decades of opportunity for the semiconductor industry and the electronics that use them.

The issue for Moore's Law is the inherent complexity of semiconductor process technology, and these complexities have been growing. Transistors are now three-dimensional, and the small feature size of today's advanced process technologies has required multiple exposures to reproduce these features on silicon wafers. This has added extreme complexity to the design process and has slowed down Moore's Law.


This slowing down has led many to ask, "Is Moore's Law dead?" The simple answer to this is no, Moore's Law is not dead. While it's true that chip densities are no longer doubling every two years (thus, Moore's Law isn't happening anymore by its strictest definition), Moore's Law is still delivering exponential improvements, albeit at a slower pace. The trend is very much still here.

Intel's CEO Pat Gelsinger believes that Moore's Law is far from obsolete. As a goal for the next 10 years, he announced in 2021 not only to uphold Moore's Law, but to outpace it. There are many industry veterans who agree with this. Mario Morales, a program vice president at IDC, said in an interview with TechRepublic that he believes Moore's Law is still relevant in theory. "If you look at what Moore's Law has enabled, we're seeing an explosion of more computing across the entire landscape," he said. "It used to be computing was centered around mainframes and then it became clients and now edge and endpoints, but they're getting more intelligent, and now they're doing AI inferencing, and you need computing to do that. So, Moore's Law has been able to continue to really push computing to the outer edge."

While the consensus is that Moore's Law is slowing down and that it might soon be augmented, it is still driving improvements in processing technology and the amount of progress that follows these improvements.

If it were dead, it simply couldn't do this.


FDA Gives First Go Ahead for Lab Grown Meat Product

The FDA has approved a lab grown meat product from Upside Foods for human consumption, which now only needs USDA approval before being sold to customers.

Meat and Greet

Behold, ethical omnivores: the US Food and Drug Administration (FDA) has given a key go-ahead to what could be the first lab grown meat product bound for human consumption in the US.

The decision, a first for cultivated meat in the US, paves the way for Californian startup Upside Foods to start selling its lab-grown chicken product domestically — meaning that now, it only needs approval from the US Department of Agriculture (USDA) before the ersatz chicken can hit restaurant menus.

"The world is experiencing a food revolution and the [FDA] is committed to supporting innovation in the food supply," FDA officials said in a statement. "The agency evaluated the information submitted by Upside Foods as part of a pre-market consultation for their food made from cultured chicken cells and has no further questions at this time about the firm’s safety conclusion."

Upside Foods' products were evaluated via a process in which manufacturers divulge the production process to the agency for review, along with a sample. If everything looks good after inspection, the FDA then sends back a "no further questions" letter to the company.

"We are thrilled at FDA's announcement," said Upside director of communications David Kay in an email to Reuters. "This historic step paves the way for our path to market."

Going Protein

Lab meat like Upside's isn't a plant-based imitation like popular vegan alternatives such as Beyond Burgers. Instead, it's made from real animal cells grown in bioreactors, sparing the lives of actual livestock.

But while at a cellular level the meat may be the same, customers will definitely notice a difference in price. For now, cultivating meat remains an extremely expensive process, so pending USDA approval notwithstanding, it could still be a while before you see it hit the shelves of your local grocer.

To let eager, early customers try out the lab meat, Upside, which already announced its collaboration with Michelin star chef Dominique Crenn last year, will be debuting its chicken at specific upscale restaurants.

"We would want to bring this to people through chefs in the initial stage," CEO Uma Valeti told Wired. "Getting chefs excited about this is a really big deal for us. We want to work with the best partners who know how to cook well, and also give us feedback on what we could do better."

While the FDA's thumbs-up only applies to a specific product of Upside's, it's still a historic decision, signalling a way forward for an industry that's rapidly accruing investment.

Updated to clarify details regarding the FDA's evaluation of the product.


Celebrities Are Officially Being Sued by FTX Retail Investors

The first civil suit against the crypto exchange FTX was just filed, naming FTX, Sam Bankman-Fried, and 11 of FTX's many celebrity ambassadors.

Welp, that didn't take long. The first civil suit against the still-imploding crypto exchange FTX was just filed in a Florida court, accusing FTX, disgraced CEO Sam Bankman-Fried, and 11 of the exchange's many celebrity ambassadors of preying on "unsophisticated" retail investors.

The list of celeb defendants is impressive — honestly, it reads more like an invite list to a posh award show than a lawsuit.

Geriatric quarterback Tom Brady and soon-to-be-ex-wife Gisele Bündchen lead the pack, followed by basketball players Steph Curry and Udonis Haslem, as well as the Golden State Warriors franchise; tennis star Naomi Osaka; baseballers Shohei Ohtani and David Ortiz; and quarterback Trevor Lawrence.

Also named is comedian Larry David — who starred in that FTX Super Bowl commercial that very specifically told investors that even if they didn't understand crypto, they should definitely invest — and investor Kevin O'Leary of "Shark Tank" fame.

"The Deceptive and failed FTX Platform," reads the suit," "was based upon false representations and deceptive conduct."

"Many incriminating FTX emails and texts... evidence how FTX’s fraudulent scheme was designed to take advantage of unsophisticated investors from across the country," it continues. "As a result, American consumers collectively sustained over $11 billion dollars in damages."

Indeed, a number of FTX promos embraced an attitude similar to the cursed Larry David commercial. In one, Steph Curry tells viewers that with FTX, there's no need to be an "expert," while a Naomi Osaka promotion pushed the idea that crypto trading should be "accessible," "easy," and "fun."

It's also worth noting that this isn't the first suit of its kind. Billionaire Mark Cuban, also of "Shark Tank" fame, was named in a class action lawsuit launched against the bankrupt lender Voyager in August, while reality TV star Kim Kardashian was recently made to pay a roughly $1.2 million fine for hawking the "EthereumMAX" token without disclosing that she was paid to do so.

The FTX suit, however, appears to be the most extensive — and high-profile — of its kind. And while a fine for a million or two is basically a one dollar bill to this tax bracket, $11 billion, even if split amongst a group of 11 exorbitantly wealthy celebs, is a more substantial chunk of change.

Of course, whether anyone actually ever has to pay up remains to be seen. Regardless, it's still a terrible look, and real people got hurt. If there's any defense here, though? At least they didn't promise to be experts.


Sam Bankman-Fried Admits the "Ethics Stuff" Was "Mostly a Front"

In Twitter DMs, FTX founder Sam Bankman-Fried appeared to admit that his much-publicized embrace of ethics was largely a front.

Effecting Change

The disgraced former head of the crypto exchange FTX, Sam Bankman-Fried, built his formidable public persona on the idea that he was a new type of ethical crypto exec. In particular, he was a vocal proponent of "effective altruism" — the vague-but-noble concept of using data to make philanthropic giving as targeted and helpful as possible.

But in a direct message, Vox's Kelsey Piper asked Bankman-Fried if the "ethics stuff" had been "mostly a front."

Bankman-Fried's reply: "Yeah."

"I mean that's not *all* of it," he wrote. "But it's a lot."

Truth Be Told

If the concept of becoming rich to save the world strikes you as iffy, you're not alone — and it appears that even Bankman-Fried himself knows it.

When Piper observed that Bankman-Fried had been "really good at talking about ethics" while actually playing a game, he responded that he "had to be" because he'd been engaged in "this dumb game we woke Westerners play where we say all the right shibboleths and everyone likes us."

Next time you're thinking of investing in crypto, maybe it's worth taking a moment to wonder whether the person running the next exchange might secretly be thinking the same thing.


NASA Drops Stunning New James Webb Image of a Star Being Born

The James Webb Space Telescope just released an image of a star being born, and it gives Lady Gaga and Bradley Cooper a run for their money.

Birth Canal

The James Webb Space Telescope's latest mind-bending image just dropped — and this one is, in a word, splendid.

As NASA notes in a blog post about the finding, the telescope's Near-Infrared Camera (NIRCam) was put to incredible use when capturing the "once-hidden features" of the beginnings of a star.

Known as "protostars," celestial objects like this one — found inside an uber-absorbent "dark nebula" cloud — are not yet stars, but will be soon. In short, the Webb telescope captured imagery of a star being born.

As NASA notes, the fledgling star itself is hidden within the tiny "neck" of the spectacular, fiery hourglass shape in the image, behind a disk "about the size of our solar system," and the colorful light seen above and below that neck is cast by the protostar's birth.

Countdown to a new star ?

Hidden in the neck of this “hourglass” of light are the very beginnings of a new star — a protostar. The clouds of dust and gas within this region are only visible in infrared light, the wavelengths that Webb specializes in: https://t.co/DtazblATMW pic.twitter.com/aGEEBO9BB8

— NASA Webb Telescope (@NASAWebb) November 16, 2022

Stellar Anatomy

While this is not the first time space telescopes have observed star birth, Webb's latest capture does provide an incredible look at the phenomenon.

"The surrounding molecular cloud is made up of dense dust and gas being drawn to the center, where the protostar resides," the post reads. "As the material falls in, it spirals around the center. This creates a dense disk of material, known as an accretion disk, which feeds material to the protostar."

Some of that material, NASA notes, consists of "filaments of molecular hydrogen that have been shocked as the protostar ejects material away from it," though the stellar fetus keeps most of the infalling gas for itself. It continues to feed on that material, growing more massive and compressing further until its core temperature rises to the point that it kickstarts nuclear fusion.

This gorgeous peek at that process is extraordinary to witness — and yet another testament to the power of the mighty James Webb.

More on Webb: NASA Fixes Months-Long Issue With Webb Telescope

"Elon" Plummets in Popularity as a Baby Name for Some Reason

According to BabyCenter's data, the name "Elon" has cratered in popularity over the last year.

Big Baby

Tesla and SpaceX CEO Elon Musk's name has clearly lost its luster among the parents of newborns.

According to BabyCenter's review of the data, the name "Elon" has cratered in popularity over the last year, dropping from 120 babies per million in 2021 to just 90 babies per million and falling 466 spots in the popularity rankings.

The name had seen a meteoric rise over the last seven or so years, but is currently falling out of favor big time, plummeting back down to 2019 levels.

The read? It seems like Musk's public reputation has been taking a significant hit.

Name Game

There are countless reasons why Musk could be a less popular public figure than he was three years ago.

Since the start of the COVID-19 pandemic especially, Musk has emerged as a controversial figure, speaking out against vaccinations and lockdowns. He has also become synonymous with an unhealthy work culture, firing practically anybody standing in his way and forcing his employees to work long hours.

The fiasco surrounding Musk's chaotic takeover of Twitter has likely only further besmirched his public image.

For reference, other baby names that have fallen out of fashion include "Kanye" — almost certainly in response to the travails of rapper Kanye West, who's had a years-long relationship with Musk — which fell a whopping 3,410 spots over the last year.

More on Elon Musk: Sad Elon Musk Says He's Overwhelmed In Strange Interview After the Power Went Out

NASA Tells Astronauts That Tweeting Isn’t As Important as Staying Alive

NASA's astronaut social media handbook just dropped — and they've got some staunch guidelines for safely tweeting on the ISS.

Stayin' Alive

NASA's astronaut social media handbook just dropped — and they've got some staunch guidelines for space tweeting.

As part of a public records request, NASA released to Vox an almost entirely unredacted copy of its current social media handbook for astronauts, and it offers a fascinating look into the agency's policies for the online astronauts it sends to space.

Overall, it's a reasonable document. One particularly interesting detail? It advises astronauts to please lay off posting when their lives are in jeopardy. Good advice for us all!

Socialing

In a 2018 memo from the Johnson Space Center included in the records provided to Vox, NASA notes that along with not posting for personal or financial gain or exposing state secrets, "social media efforts should always be considered secondary to the safety of the crew and vehicle."

In another section of the guidelines, a slide reminds astronauts that "social media is voluntary and should be considered secondary to safety of mission and crew cohesion."

Politicking

Beyond bodily safety, political discretion is also repeatedly advised in the guidelines — an important detail, given the past and current tensions between the ISS' main players, the United States and Russia.

While some have criticized NASA for doing a bit too much social networking — the agency operates a whopping 700 social media accounts, including on Reddit, Twitch, and LinkedIn — it clearly takes a backseat to onboard safety.

Given how much can go wrong on both a mortal and interpersonal level while floating above the Earth, that's definitely a good thing.

More on the ISS: Amazing Video Shows What the ISS Would Look Like If It Flew at the Height of a Jetplane

Former Facebook Exec Says Zuckerberg Has Surrounded Himself With Sycophants

Conviction is easy if you're surrounded by a bunch of yes men — which Mark Zuckerberg just might be. And $15 billion down the line, that may not bode well.

In just about a year, Facebook-turned-Meta CEO Mark Zuckerberg's metaverse vision has cost his company upwards of $15 billion, cratering value and — at least in part — triggering mass company layoffs. That's a high price tag, especially when the Facebook creator has shockingly little to show for it, both in actual technology and public interest.

Indeed, it seems that every time Zuckerberg excitedly explains what his currently-legless metaverse will one day hold, he's met with crickets — and a fair share of ridicule — at the town square. Most everyone finds themselves looking around and asking themselves the same question: who could this possibly be for, other than Zucko himself?

That question, however, doesn't really seem to matter to the swashzuckling CEO, who's either convinced that the public wants and needs his metaverse just as much as he does, or simply committed to the belief that one day people will finally get it. After all, he's bet his company on this thing and needs the public to engage for it to stay financially viable long-term.

And sure, points for conviction. But conviction is easy if you're surrounded by a bunch of yes men — which, according to Vanity Fair, the founder unfortunately is. And with $15 billion down the line, that may not bode well for the Silicon Valley giant.

"The problem now is that Mark has surrounded himself with sycophants, and for some reason he's fallen for their vision of the future, which no one else is interested in," one former Facebook exec told Vanity Fair. "In a previous era, someone would have been able to reason with Mark about the company's direction, but that is no longer the case."

Given that previous reports have revealed that some Meta employees have taken to marking metaverse documents with the label "MMA" — "Make Mark Happy" — the revelation that he's limited his close circle to people who only agree with him isn't all that shocking. He wants the metaverse, he wants it bad, and he's put a mind-boggling amount of social and financial capital into his AR-driven dream.

While the majority of his many thousands of employees might disagree with him — Vanity Fair reports that current and former metamates have written things like "the metaverse will be our slow death" and "Mark Zuckerberg will single-handedly kill a company with the metaverse" on the Silicon Valley-loved Blind app — it's not exactly easy, or even really possible, to confront the fact that you may have made a dire miscalculation this far down the road, financially speaking.

And if you keep a close circle of people who simply agree with you, you may never really have to confront that potential for failure. At least not for a while.

The truth is that Zuckerberg successfully created a thing that has impacted nearly every single person on this Earth. Few people can say that. And while it can be argued that the thing he built has, at its best, created some real avenues for connection, that same creation also seems to have led to his own isolation, in life and at work.

How ironic it is that he's marketed his metaverse on that same promise of connection, only to become more disconnected than ever.

READ MORE: "Mark Has Surrounded Himself with Sycophants": Zuckerberg's Big Bet on the Metaverse Is Backfiring [Vanity Fair]

More on the Meta value: Stock Analyst Cries on Tv Because He Recommended Facebook Stock

Panicked Elon Musk Reportedly Begging Engineers Not to Leave

According to former Uber engineer Gergely Orosz, far fewer Twitter engineers than Musk expected agreed to stay on.

Elon Musk's Twitter operations are still in free fall.

Earlier this week, the billionaire CEO sent an email to staff telling them that they "need to be extremely hardcore" and work long hours at the office, or quit and get three months severance, as The Washington Post reports.

Employees had until 5 pm on Thursday to click "yes" and be part of Twitter moving forward or take the money and part ways. The problem for Musk? According to former Uber engineer Gergely Orosz, who has had a close ear to Twitter's recent inner turmoil, "far fewer than expected [developers] hit 'yes.'"

So many employees called Musk's bluff, Orosz says, that Musk is now "having meetings with top engineers to convince them to stay," in an embarrassing reversal of his public-facing bravado earlier this week.

Twitter has already been rocked by mass layoffs, which cut the workforce roughly in half. Rather than being notified, employees simply had their access to email and work computers revoked without warning.

Even that process was bungled, with some employees immediately being asked to return after Musk's crew realized it had sacked people it needed.

According to Orosz's estimations, Twitter's engineering workforce may have been cut by a whopping 90 percent in just three weeks.

Musk has been banging the war drums in an active attempt to weed out both those who aren't willing to abide by his strict rules and those willing to stand up to him.

But developers aren't exactly embracing that kind of tyranny.

"Sounds like playing hardball does not work," Orosz said. "Of course it doesn't."

"From my larger group of 50 people, 10 are staying, 40 are taking the severance," one source reportedly told Orosz. "Elon set up meetings with a few who plan to quit."

In short, developers are running for the hills — and besides, they're likely to find far better work conditions pretty much anywhere else.

"I am not sure Elon realizes that, unlike rocket scientists, who have relatively few options to work at, [developers] with the experience of building Twitter only have better options than the conditions he outlines," Orosz argued.

Then there's the fact that Musk has publicly lashed out at engineers, mocking them and implying that they were leading him on.

Those who spoke out against him were summarily fired.

That kind of hostility in leadership — Musk has shown an astonishing lack of respect — clearly isn't sitting well with many developers, who have taken up his offer to collect three months of severance and leave.

"I meant it when I called Elon's latest ultimatum the first truly positive thing about this Twitter saga," Orosz wrote. "Because finally, everyone who had enough of the BS and is not on a visa could finally quit."

More on Twitter: Sad Elon Musk Says He's Overwhelmed In Strange Interview After the Power Went Out

NASA Orders Press Not to Photograph Launch Site After Moon Mission Takes Off

NASA apparently barred the press from photographing the Artemis moon rocket launch when it lifted its Orion capsule off to space earlier this week. 

No Photos, Please

NASA barred the press from photographing the launch site of its Space Launch System after it boosted the agency's Artemis I Moon mission into space earlier this week.

Multiple space reporters said on Twitter that the agency had sent them a message telling them they were prohibited from photographing the Artemis 1 launch tower after the liftoff.

"NASA did not provide a reason," Eric Berger, Ars Technica's senior space editor, tweeted. The reporter added that according to his sources, the ban was apparently an attempt to save face after the launch damaged the tower.

"So now sources are saying that yes, Launch Complex-39B tower was damaged during the Artemis I launch on Wednesday morning," Berger tweeted. "Basically, there were leaks and damage where there weren't supposed to be leaks and damage."

Damaging Reports

Later, Washington Post space reporter Christian Davenport posted a statement from NASA that seemed to corroborate Berger's sources, though he emphasized that there was "no word on damage" to the launch pad.

"Because of the current state of the configuration, there are [International Traffic in Arms Regulations license] restrictions and photos are not permitted at this time," the statement given to Davenport read. "There also is a launch debris around the pad as anticipated, and the team is currently assessing."

Whatever NASA's reasoning, it's pretty clear that the agency doesn't want unapproved photos of its expensive and overdue Space Launch System rocket going out to the public. NASA loves positive publicity, it seems — but not negative.

More on the Artemis 1 launch: NASA Says It's Fine That Some Pieces May Have Fallen Off Its Moon Rocket During Launch

Celebrities’ Bored Apes Are Hilariously Worthless Now

The value of Bored Ape Yacht Club NFTs has absolutely plummeted, leaving celebrities with six figure losses, a perhaps predictable conclusion.

Floored Apes

The value of Bored Ape Yacht Club NFTs has absolutely plummeted, leaving celebrities with six-figure losses, in a perhaps predictable conclusion to a bewildering trend.

Earlier this year, for instance, pop star Justin Bieber bought an Ape for a whopping $1.3 million. Now that the NFT economy has essentially collapsed in on itself, as Decrypt points out, it's worth a measly $69,000.

Demand Media

NFTs, which represent exclusive ownership rights to digital assets — but usually, underwhelmingly, just JPGs and GIFs — have absolutely plummeted in value, spurred by the ongoing crypto crisis and a vanishing appetite.

Sales volume of the blockchain knickknacks has also bottomed out. NFT sales declined for six straight months this year, according to CryptoSlam.

According to NFT Price Floor, the value of the cheapest available Bored Ape dipped to just 48 ETH, well below $60,000, this week. So far in November, the floor price has fallen 33 percent.

Meanwhile, the crypto crash is only accelerating the trend, with the collapse of major cryptocurrency exchange FTX leaving its own mark on NFT markets.

Still Kicking

Despite the looming pessimism, plenty of Bored Apes are still being sold. In fact, according to Decrypt, around $6.5 million worth of Apes were moved on Tuesday alone, an increase of 135 percent day over day.

Is the end of the NFT nigh? Bored Apes are clearly worth a tiny fraction of what they once were, indicating a massive drop off in interest.

Yet many other much smaller NFT marketplaces are still able to generate plenty of hype, and millions of dollars in sales.

In other words, NFTs aren't likely to die out any time soon, but they are adapting to drastically changing market conditions — and leaving celebrities with deep losses in their questionable investments.

READ MORE: Justin Bieber Paid $1.3 Million for a Bored Ape NFT. It’s Now Worth $69K [Decrypt]

More on NFTs: The Latest Idea to Make People Actually Buy NFTs: Throw in a House

Ticketmaster May Have Finally Met Its Match: Furious Swifties

The notorious ticket selling service Ticketmaster botched the pre-sale of tickets for Taylor Swift's upcoming tour. Now, everyone's calling for its head.

The notorious ticket peddling service Ticketmaster has never been a fan favorite, and anyone who's ever bought a concert ticket there can attest to why. Preposterous prices, slimy junk fees, and terrible customer service are just a few of its mundane evils. In spite of how universally reviled it is, Ticketmaster has persisted as the king of the box office. But now, it's facing its worst PR nightmare in years — and that's saying something. Why? It made the fatal error of pissing off Taylor Swift fans, or "Swifties."

Swift's "Eras Tour," which will have her perform at over 50 venues in the US alone, is set to be one of the biggest music events on the planet. Her fiercely loyal fanbase — probably the largest of any single artist and easily the most vocal online — has been waiting since 2018 for her next headlining tour. So, looking to guarantee a spot, many of them signed up for Ticketmaster's Verified Fans program, a system that was supposed to allow only a select group of around 1.5 million real fans — as opposed to scalper bots — to buy tickets ahead of time.

It didn't work. Ticketmaster CEO Michael Rapino told The Hollywood Reporter that around 14 million users, some of them bots, rushed to buy pre-sale tickets this week, and it pretty much broke the service. Parts of the website immediately crashed, leaving millions either waiting for hours or suffering through a miserable, glitchy experience — only for some to be told they couldn't buy a ticket anyway even though they were verified. In total, Ticketmaster was barraged with 3.5 billion system requests, which is nearly half the population of the Earth and four times its previous peak.

Even with all the difficulties, it did manage to sell around two million tickets — but it's unclear how many of those went to actual, verified Swifties and how many went to scalpers.

And we suspect that Ticketmaster has made way more than that in the form of enemies. Search its name on social media right now, and you'll find swarms of complaints from ardent Swifties and Ticketmaster haters crawling out of the woodwork.

To make matters worse, the maligned seller abruptly informed fans via Twitter that it would be canceling the sale of tickets to the general public originally planned for Friday, "due to extraordinarily high demands on ticketing systems and insufficient remaining ticket inventory to meet that demand."

With Ticketmaster shutting its doors, vulturous resellers who gobbled up tickets during the presale pandemonium remain the only alternative for fans, selling tickets at outrageous prices as high as $28,000, Reuters reports.

Exceptionally crummy service isn't exactly a scandal in itself, but the magnitude of Ticketmaster's mishandling of the situation — and the blatant scalping it's enabled — has brought significant attention to the company's nefarious practices and its stranglehold on the market.

Now, politicians are jumping on the Swifties' grievances to call for Ticketmaster's head.

"Daily reminder that Ticketmaster is a monopoly, [its] merger with LiveNation should never have been approved, and they need to be [reined] in," said Rep. Alexandria Ocasio-Cortez (D-NY), in a tweet. "Break them up."

"It's no secret that Live Nation-Ticketmaster is an unchecked monopoly," echoed Rep. David N. Cicilline (D-RI), the chair of the House Subcommittee on Antitrust, Commercial, and Administrative Law.

"The merger of these companies should never have been allowed in the first place," Cicilline added, stating that he's joining others to call on the Department of Justice (DOJ) to "investigate LiveNation’s efforts to jack up prices and strangle competition."

Ticketmaster was already a behemoth in the 90s when Pearl Jam — then one of the biggest bands in the world — tried to take them on. Eddie Vedder and his bandmates certainly made the concert corporation sweat for a time, but since then, it's only grown. In 2010, it merged with LiveNation, once its largest competitor and now Ticketmaster's parent company. Critics, like AOC and Cicilline, argue that this merger was in blatant violation of antitrust laws.

Setting aside its monopolistic behavior and its habit of bullying artists and venues into giving in to its tyrannical demands, consumers don't have to dig very far to realize Ticketmaster is ripping them off. Buy a ticket there and it could charge you a significant portion of the ticket price in service and other junk fees.

Another culprit? Its dynamic pricing model, infamously used in other industries like airlines and hotels, in which prices are continuously adjusted in real time based on demand. As a result, ticket prices are not made public before a sale begins. In theory, dynamic pricing is meant to make predatory resellers obsolete by keeping prices competitive. But really, it's just a good excuse for Ticketmaster to match its prices with those of ludicrous resellers and pocket the extra cash.
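To make that mechanic concrete, here's a minimal sketch of demand-based price adjustment in Python. It's purely illustrative: Ticketmaster has never published its actual pricing algorithm, and every function name and number below is a hypothetical assumption.

```python
# Toy sketch of demand-based dynamic pricing (hypothetical; not Ticketmaster's
# actual, unpublished algorithm). The list price is nudged up or down each
# sales interval based on how demand compares to remaining inventory.

def adjust_price(price: float, demand: int, remaining: int,
                 base_price: float, sensitivity: float = 0.05) -> float:
    """Return the next list price given current demand and inventory."""
    if remaining <= 0:
        return price                       # sold out; nothing left to reprice
    pressure = demand / remaining          # >1 means more shoppers than seats
    price *= 1 + sensitivity * (pressure - 1)
    return max(price, base_price)          # never drop below face value


price, remaining = 100.0, 5000             # $100 face value, 5,000 seats left
for demand in [20000, 12000, 3000, 500]:   # shoppers per sales interval
    price = adjust_price(price, demand, remaining, base_price=100.0)
    sold = min(demand, remaining) // 10    # assume roughly 10% of shoppers buy
    remaining -= sold
    print(f"demand={demand:>6}  remaining={remaining:>5}  price=${price:,.2f}")
```

The design point is the one critics object to: whenever demand outruns inventory, the list price drifts toward whatever the resale market would have charged anyway, and the platform keeps the difference.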

Furthermore, at least one 2018 investigation by CBC found that Ticketmaster was quietly recruiting professional scalpers into its reseller program, and turned a blind eye to them using hundreds of fake accounts to sell tickets.

Bearing all that in mind, you'd think Swift would speak up about the most recent fiasco over her tour.

And for a while, she didn't, driving fans frantic over her silence — which she's finally broken.

On Friday, Swift spoke out in a carefully worded statement on her Instagram.

"Well, it goes without saying that I’m extremely protective of my fans," she began. "It’s really difficult for me to trust an outside entity with these relationships and loyalties, and excruciating for me to just watch mistakes happen with no recourse."

Swift is clearly alluding to Ticketmaster here, and euphemistically summed up the situation as there being "a multitude of reasons why people had such a hard time trying to get tickets" — though she never specifically names the corporation.

Diplomatic as the words may be, they've dropped at the perfect moment, because The New York Times reports that the DOJ has opened an antitrust investigation over LiveNation's ownership of Ticketmaster (though at press time, official confirmation is still pending).

Could this be the beginning of the end of the company's unfettered dominance? Maybe. Ticketmaster and LiveNation only seem to get stronger the more bad PR they get. So taking them down? It'll take more than online outrage. However, with Swift looking poised to join the fight alongside the DOJ, maybe this time around the concert conglomerate will get a run for its money.

More on Taylor Swift: Taylor Swift Reportedly Threatened Microsoft Over Racist Chatbot

Experts Baffled by Why NASA’s “Red Crew” Wear Blue Shirts

Red Crew, Blue Crew

Had it not been for the heroics of three members of NASA's specialized "Red Crew," NASA's absolutely massive — and incredibly expensive — Space Launch System (SLS) likely wouldn't have made it off the ground this week.

During the launch, the painfully delayed Mega Moon Rocket sprang a hydrogen leak. The Red Crew ventured into the dangerous, half-loaded launch zone to fix it live. Incredible work indeed, although in spite of their heroics, keen-eyed observers did notice something strange about the so-called Red Crew: they, uh, don't wear red?

"How is it we spent $20B+ on this rocket," tweeted Chris Combs, a professor at the University of Texas San Antonio, "but we couldn't manage to get some RED SHIRTS for the Red Team."

Alas, the rumor is true. Red shirts seemed to be out of the budget this year — perhaps due to the ungodly amount of money spent on the rocket that these guys could have died while fixing — with the Red Crew-mates donning dark blue shirts instead. Per the NYT, they also drove white cars, which feels like an additional miss.

A leftover from last night that’s still bothering me:

how is it we spent $20B+ on this rocket but we couldn’t manage to get some RED SHIRTS for the Red Team pic.twitter.com/FO10Y6mg3H

— Chris Combs (@DrChrisCombs) November 16, 2022

Packing Nuts

For their part, the Red Crew didn't seem to care all that much, at least not in the moment. They were very much focused on needing to "torque" the "packing nuts," as they reportedly said during a post-launch interview on NASA TV. In other words, they were busy with your casual rocket science. And adrenaline, because, uh, risk of death.

"All I can say is we were very excited," Red Crew member Trent Annis told NASA TV, according to the NYT. "I was ready to get up there and go."

"We were very focused on what was happening up there," he added. "It's creaking, it's making venting noises, it's pretty scary."

In any case, shoutout to the Red Crew. The Artemis I liftoff is historic, and wouldn't have happened if they hadn't risked it all. They deserve a bonus, and at the very least? Some fresh new shirts.

READ MORE: When NASA'S moon rocket sprang a fuel leak, the launch team called in the 'red crew.' [The New York Times]

More on the Artemis I launch: Giant Nasa Rocket Blasts off Toward the Moon

Startup Says It’s Building a Giant CO2 Battery in the United States

Italian startup Energy Dome has designed an ingenious battery that uses CO2 to store energy, and it only needs non-exotic materials like steel and water.

Italian Import

Carbon dioxide has a bad rep for its role in driving climate change, but in an unexpected twist, it could also play a key role in storing renewable energy.

The world's first CO2 battery, built by Italian startup Energy Dome, promises to store renewables on an industrial scale, which could help green energy rival fossil fuels in terms of cost and practicality.

After successfully testing the battery at a small scale plant in Sardinia, the company is now bringing its technology to the United States.

"The US market is a primary market for Energy Dome and we are working to become a market leader in the US," an Energy Dome spokesperson told Electrek. "The huge demand of [long duration energy storage] and incentive mechanisms like the Inflation Reduction Act will be key drivers for the industry in the short term."

Storage Solution

As renewables like wind and solar grow, one of the biggest infrastructural obstacles is the storage of the power they produce. Since wind and solar sources aren't always going to be available, engineers need a way to save excess power for days when it's less sunny and windy out, or when there's simply more demand.

One obvious solution is to use conventional battery technology, like lithium-ion batteries, to store the energy. The problem is that building giant batteries from lithium and other scarce minerals is expensive, not to mention wasteful — and such batteries can be prone to degradation over time.

Energy Dome's CO2 batteries, on the other hand, use mostly "readily available materials" like steel, water, and of course CO2.

In Charge

As its name suggests, the battery works by taking CO2, stored in a giant dome, and compressing it into a liquid using excess energy generated from a renewable source. That compression produces heat, which is stored alongside the now-liquefied CO2, "charging" the battery.

To discharge power, the stored heat is used to vaporize the liquid CO2 back into a gas, powering a turbine that feeds back into the power grid. Crucially, the whole process is self-contained, so no CO2 leaks back into the atmosphere.

The battery could be a game-changer for renewables. As of now, Energy Dome plans to build batteries that can store up to 200 MWh of energy. But we'll have to see how it performs as it gains traction.
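To make the round trip concrete, here's a toy energy-balance sketch in Python of the charge/discharge cycle described above. The class, the numbers, and the 75 percent round-trip efficiency are illustrative assumptions for the sketch, not Energy Dome's published specifications.

```python
# Toy energy-balance model of a closed-loop CO2 battery: charging banks surplus
# renewable energy (compressing gaseous CO2 into a liquid and storing the heat
# of compression); discharging uses that heat to re-gasify the CO2 and spin a
# turbine. All figures here are illustrative assumptions, not Energy Dome specs.

from dataclasses import dataclass

@dataclass
class CO2Battery:
    capacity_mwh: float             # usable storage capacity
    round_trip_efficiency: float    # fraction of charged energy recovered (assumed)
    stored_mwh: float = 0.0

    def charge(self, surplus_mwh: float) -> float:
        """Absorb surplus renewable energy; return how much was accepted."""
        accepted = min(surplus_mwh, self.capacity_mwh - self.stored_mwh)
        self.stored_mwh += accepted
        return accepted

    def discharge(self, requested_mwh: float) -> float:
        """Deliver energy back to the grid, degraded by round-trip losses."""
        drawn = min(requested_mwh / self.round_trip_efficiency, self.stored_mwh)
        self.stored_mwh -= drawn
        return drawn * self.round_trip_efficiency


dome = CO2Battery(capacity_mwh=200, round_trip_efficiency=0.75)  # 200 MWh scale
dome.charge(150)                         # bank a midday solar surplus
delivered = dome.discharge(80)           # meet roughly 80 MWh of evening demand
print(f"Delivered {delivered:.0f} MWh; {dome.stored_mwh:.1f} MWh still stored")
```

The closed loop is the selling point: the CO2 itself never leaves the system, so the only real cost of each cycle is the energy lost to inefficiency, plus the steel, water, and dome needed to hold everything in.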

More on batteries: Scientists Propose Turning Skyscrapers Into Massive Gravity Batteries

Elon Musk Locks Twitter Employees Out of Office, Then Asks Them to Meet Him on the 10th Floor

Elon Musk's ownership of Twitter is somehow going even worse than expected amid reports that he's locked employees out of the company's office buildings.

Worst Case Scenario

Elon Musk's Twitter-buying experiment is somehow going even worse than expected, amid reports that he's locked employees out of the company's office buildings.

As reported by Platformer's Zoë Schiffer, an email sent to Twitter staff yesterday evening informed them out of the blue that they wouldn't be able to get into their offices for the rest of the week.

"We're hearing this is because Elon Musk and his team are terrified employees are going to sabotage the company," Schiffer wrote. "Also, they're still trying to figure out which Twitter workers they need to cut access for."

Then, the saga somehow got even stranger today when Musk emailed staff asking them to come to the 10th floor of Twitter's headquarters — which, remember, they'd just been told they were locked out of — for a meeting.

Ultimatums

All told, the aura of chaos surrounding Twitter since Musk's acquisition late last month has deepened to a comical degree.

News of the office closure, you'll recall, comes not long after Musk issued an ultimatum to the staff who survived his first purge of the company's employees, in which he said that if "tweeps" didn't come into the office, they would effectively be tendering their resignations.

Just before the office closure announcement, Musk gave his new employees another apparent threat: that if they are not prepared "to be extremely hardcore" and work long in-office hours, they can cut and run with three months severance.

Unsurprisingly, many Twitter employees have chosen the latter — a move that some described to CNN's Oliver Darcy as a "mass exodus."

And in the face of all this contradiction and whiplash, who could blame them?

More on Musk: Panicked Elon Musk Reportedly Begging Engineers Not to Leave
