LockBit ransomware borrows tricks to keep up with REvil and Maze – Naked Security

Ransomware operators are always on the lookout for a way to take their ransomware to the next level. That's particularly true of the gang behind LockBit. Following the lead of the Maze and REvil ransomware crime rings, LockBit's operators are now threatening to leak the data of their victims in order to extort payment. And the ransomware itself also includes a number of technical improvements that show LockBit's developers are climbing the ransomware learning curve, and have developed an interesting technique to circumvent Windows User Account Control (UAC).

Because of recent dynamics in the ransomware world, we suspect that this privilege-escalation technique will pop up in other ransomware families in the future. We've seen a surge in impostor ransomware: essentially rebranded variants of already-existing families. Hardly a day goes by without a new brand of ransomware appearing. It has become surprisingly easy to clone ransomware and release it, with small modifications, under a different umbrella.

Before we jump into the synopsis of LockBit, let's take a moment to look at how ransomware is developed in general. Many families follow a common timeline in the techniques and procedures their developers implement at each stage. This appears to stem from the learning curve involved in creating ransomware, and the iteration of the malware as the developer builds his or her knowledge of the craft.

Each ransomware seems to have an infancy phase, where the developer implements TTPs hastily just so the product can come out and start gaining a reputation. In this phase, the simplest ideas are implemented first: strings are usually plain text, the encryption uses only a single thread, and LanguageID checks are in place to avoid encrypting computers in CIS countries and attracting unwanted attention from CIS law enforcement agencies.

About two months into the ransomware operation, the developer starts implementing more sophisticated elements. They may introduce multi-threading, establish a presence in underground forums, obfuscate or encrypt strings in the binary, and add a skip list/kill list for services and processes.

Around four months into the ransomware's life, things get more serious. The business model may now switch to Ransomware-as-a-Service (RaaS), putting an affiliate program in place. Oftentimes, binaries are cryptographically signed with valid, stolen certificates, and the developer may start implementing UAC bypasses. This appears to be the stage the LockBit group is entering.

As with most ransomware, LockBit maintains a forum topic on a well-known underground web board to promote their product. Ransomware operators maintain a forum presence mainly to advertise the ransomware, discuss customer inquiries and bugs, and to advertise an affiliate program through which other criminals can lease components of the ransomware code to build their own ransomware and infrastructure.

In January, LockBit's operators created a new thread in the web board's marketplace forum, announcing the LockBit Cryptolocker Affiliate Program and advertising the capabilities of their malware. The post claims the new version has been in development since September 2019, and emphasizes the performance of the encryptor and its lower use of system resources, which helps it evade detection.

LockBit's post states that "we do not work in the CIS," meaning that the ransomware will not target victims in Russia and other Commonwealth of Independent States countries. This comes as no surprise: as we have seen previously, CIS authorities don't bother investigating these groups unless they operate against targets in their own jurisdiction.

That does not mean that the LockBit group won't do business with other CIS-based gangs. In fact, they won't work with English-speaking developers without a Russian-speaking guarantor to vouch for them.

In this most recent evolution of LockBit, the malware now drops a ransom note that threatens to leak data the malware has stolen from victims: "!!! We also download huge amount of your private data, including finance information, clients personal info, network diagrams, passwords and so on. Don't forget about GDPR."

If the threat were carried out, it could result in real-world sanctions against the ransomware victims from regulators or privacy authorities; for example, for violating the European Union's General Data Protection Regulation (GDPR), which makes companies responsible for securing sensitive customer data in their possession.

An increasing number of ransomware gangs use extortion, threatening to release private data (which might include sensitive customer information, trade secrets, or embarrassing correspondence) to incentivize victims to pay the ransom, even if backups have prevented any data loss. The data-leak threat has become a signature of the REvil and Maze ransomware gangs; the Maze group has gone as far as publicly publishing chunks of data from victims who fail to pay by the deadline, taking down the dumps when they are finally paid.

From a first glance at the recent LockBit sample with a reverse-engineering tool, we can tell that the program was written primarily in C++ with some additions made in Assembler. For example, a few anti-debug techniques read fs:[30h] to manually check the PEB (Process Environment Block) for the BeingDebugged flag, instead of calling IsDebuggerPresent().

The first thing the ransomware does at execution is check whether the sample was launched with any command-line parameters. Usually, such a check is used to detect sandbox environments: contemporary malware often requires specific parameters on the command line, to prevent analysis by automated sandboxes, which typically execute samples without parameters. But the LockBit sample we examined does the opposite: it won't execute if any parameter is supplied on the command line. If there are no arguments, LockBit hides its console output, where the malware prints debug messages, and proceeds to do its job.

This could be intended to detect whether the sample was executed in a sandbox environment. But it's possible that the malware author made a mistake in the implementation of the check (and meant it the other way around), or that this behavior is just a placeholder and future versions will introduce different logic.
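The inverted gate described above can be sketched in a few lines; the function name is hypothetical, since the real binary performs this check in its entry-point code:

```python
def should_run(argv):
    # Mimic LockBit's apparently inverted anti-sandbox gate: refuse to
    # run if ANY command-line parameter is present. argv[0] is the
    # program path itself; anything beyond it counts as a parameter.
    return len(argv) == 1

print(should_run(["lockbit.exe"]))           # True: hide console, proceed
print(should_run(["lockbit.exe", "-test"]))  # False: exit immediately
```

Most sandbox-aware malware inverts this condition, which is why the behavior reads like a mistake or a placeholder.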

LockBit's author also used several techniques to make the code harder to reconstruct. The Portable Executable (PE) binary shows signs of being heavily optimized, as well as some efforts by the group to cover their coding tracks, or at least get rid of some of the low-hanging fruit that reverse-engineering tools look for, such as unencrypted text strings.

Those heavy optimizations also increase LockBit's performance. The binary makes heavy use of Intel's SSE instruction set and architecture-specific features to boost its speed. That includes the use of multiple XMM registers to store and decrypt the service names, process names and other strings, unique to the ransomware, that are used to interact with the operating system.

These string variables get decrypted on the fly with a 1-byte XOR key unique to each string: the first hex byte of every variable.

Almost all the functions contain a small routine that loops around and is in charge of decrypting hidden strings. In this case, we can see how the original MSSQLServerADHelper100 service name gets de-obfuscated: the malware applies a one-byte XOR key of 0x0A to recover the plaintext service name.
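The decryption loop amounts to a single-byte XOR pass. The sketch below assumes, for illustration, that the key byte is stored in front of the obfuscated bytes; the exact blob layout in the binary may differ:

```python
def decrypt_string(blob: bytes) -> str:
    # The first byte of each obfuscated variable doubles as the XOR key.
    key = blob[0]
    return bytes(b ^ key for b in blob[1:]).decode("ascii")

# Obfuscate the known service name with the 0x0A key seen in the sample,
# then recover it the way the in-binary routine would.
plain = "MSSQLServerADHelper100"
blob = bytes([0x0A]) + bytes(b ^ 0x0A for b in plain.encode("ascii"))
print(decrypt_string(blob))  # MSSQLServerADHelper100
```

A per-string key this short offers no cryptographic strength; its only job is to defeat naive string dumps.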

To ensure that it can do the most damage possible, LockBit has a procedure to check whether its process has Administrator privileges. And if it doesn't, it uses a technique that is growing in popularity among malware developers: a Windows User Account Control (UAC) bypass.

Leveraging OpenProcessToken, it queries the current process token with a TOKEN_QUERY access mask. It then calls CreateWellKnownSid to create a security identifier (SID) matching the administrators group (WinBuiltinAdministratorsSid), giving the malware a reference it can compare against. Finally, it checks whether the current process's privileges amount to Administrator rights, with a call to CheckTokenMembership.

If the current process does not have Admin privileges, the ransomware tries to sidestep Windows UAC with a bypass. In order for that to succeed, a Windows COM object needs to auto-elevate to Admin-level access first.

To make this possible, LockBit calls a procedure called supMasqueradeProcess upon process initialization. Using supMasqueradeProcess allows LockBit to conceal its process information by injecting into a process running in a trusted directory. And what better target is there for that than explorer.exe?

The source code for the masquerade procedure can be found in a GitHub repository.

With the help of IDA Pro's COM helper tool, we see two CLSIDs (globally unique identifiers that identify COM class objects) referenced by LockBit's code. CLSIDs, represented as 128-bit hexadecimal numbers within a pair of curly braces, are stored under the Registry path HKEY_LOCAL_MACHINE\Software\Classes\CLSID.

Looking these up reveals that the two CLSIDs belong to IColorDataProxy and ICMLuaUtil, both undocumented COM interfaces that are prone to UAC bypass.

Masquerading as explorer.exe, LockBit calls CoInitializeEx to initialize the COM library, with the COINIT_MULTITHREADED and COINIT_DISABLE_OLE1DDE flags to set the concurrency model. The CLSID values are then moved and aligned onto the stack, and the next function call (lockbit.413980) uses them.

Lockbit.413980 hosts the COM elevation moniker, which allows applications running under User Account Control (UAC) to activate COM classes with elevated privileges, via the following format: Elevation:Administrator!new:{guid}.

The malware adds the two previously seen CLSIDs to the moniker and executes them.
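For illustration, the moniker string for such an auto-elevating class can be assembled as below. The CLSID shown is the one commonly documented in public UAC-bypass write-ups for ICMLuaUtil, not necessarily the exact value used by this sample:

```python
def elevation_moniker(clsid: str) -> str:
    # COM elevation moniker convention: Elevation:Administrator!new:{guid}
    return f"Elevation:Administrator!new:{{{clsid}}}"

print(elevation_moniker("3E5FC7F9-9A51-4367-9063-A120244FBEC7"))
# Elevation:Administrator!new:{3E5FC7F9-9A51-4367-9063-A120244FBEC7}
```

On Windows, a string of this shape would be handed to CoGetObject, which activates the named class in an elevated surrogate if the caller appears to be a trusted process.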

Now, the privilege has been successfully elevated with the UAC bypass and the control flow is passed back to the ransomware. We also notice two events and a registry key change during the execution:

LockBit enumerates the currently running processes and started services via the API calls CreateToolhelp32Snapshot, Process32First, Process32Next and finally OpenProcess, and compares the names against an internal service and process list. If a process matches one on the list, LockBit will attempt to terminate it via TerminateProcess.

The procedure to kill a service is a bit different. The malware first connects to the Service Control Manager via OpenSCManagerA. It then checks whether a service from the list exists via OpenServiceA. If the targeted service is present, it determines its state by calling QueryServiceStatusEx. Based on the status returned, it calls ControlService with the parameter SERVICE_CONTROL_STOP (0x00000001) to stop the service. But before that, another function (0x40F310) cycles through all services that depend on the target, enumerated via EnumDependentServicesA, so that dependencies are stopped too.
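The dependency-aware stop order can be sketched as follows; the service names and the dependency map are hypothetical, purely for illustration:

```python
def stop_order(target, dependents):
    # Stop every service that depends on `target` first (the set that
    # EnumDependentServicesA would report), then the target itself,
    # so no dependent service can hold the target open.
    return dependents.get(target, []) + [target]

deps = {"MSSQLSERVER": ["SQLSERVERAGENT", "SQLWriter"]}
print(stop_order("MSSQLSERVER", deps))
# ['SQLSERVERAGENT', 'SQLWriter', 'MSSQLSERVER']
```

Stopping dependents first matters because the Service Control Manager refuses to stop a service while active dependents remain.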

The services that the malware tries to stop include anti-virus software (to avoid detection) and backup solution services. (Sophos is not affected by this attempt.) Other services are stopped because they might lock files on the disk, and might make it more difficult for the ransomware to easily acquire handles to filesstopping them improves LockBits effectiveness.

Some of the services of note that the ransomware attempts to stop, in the order they are coded into the ransomware, are:

In addition to the list of services to kill, LockBit also carries a list of things not to encrypt, including certain folders, specific files and files with certain extensions that are important to the operating system, since disabling the operating system would make it difficult for the victim to receive and act upon the ransom note. These are stored in obfuscated lists within the code (shown below). A function within LockBit uses the FindFirstFileExW and FindNextFileW API calls to read through the file and folder names on the targeted disk, and then a simple lstrcmpiW call compares the hardcoded list with those names.


Recently, we have seen ransomware groups taking more advanced concepts and applying them to their craft. One of these advanced concepts applied in LockBit is the use of Input/Output Completion Ports (IOCPs).

IOCPs are a model for processing multiple asynchronous I/O requests efficiently, using a queue serviced by a pool of threads. They allow processes to handle many concurrent asynchronous I/O operations quickly, without having to create a new thread for each I/O request.
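A rough, portable analog of the pattern: a fixed pool of worker threads pulls work items from one shared queue instead of spawning a thread per request. This is a sketch of the queuing model only, not of the Win32 IOCP API itself:

```python
import queue
import threading

def run_pool(items, workers=4):
    q = queue.Queue()
    results = []
    lock = threading.Lock()

    def worker():
        # Each worker blocks on the shared queue, like a thread parked
        # on GetQueuedCompletionStatus, and processes items as they land.
        while True:
            item = q.get()
            if item is None:          # sentinel: no more work
                break
            with lock:
                results.append(item.upper())  # stand-in for "encrypt"

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for it in items:
        q.put(it)
    for _ in threads:
        q.put(None)                   # one sentinel per worker
    for t in threads:
        t.join()
    return results

print(sorted(run_pool(["a.doc", "b.xls", "c.ppt"])))
# ['A.DOC', 'B.XLS', 'C.PPT']
```

The payoff is the same as with real completion ports: thread-creation cost is paid once, and throughput is bounded by the worker count rather than by request count.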

That capability makes them well-suited to ransomware. The sole purpose of ransomware is to encrypt as many valuable files as possible, rendering the user's data useless. REvil (Sodinokibi) ransomware also uses IOCPs to achieve higher encryption performance.

LockBit's aim was to be much faster than any other multi-threaded locker. The group behind the ransomware claims to have used the following methods to boost the performance of their file encryption:

Once a file is marked for encryption (meaning it did not match any entry on the skip list), a LockBit routine checks whether the file already has a .lockbit extension. If it does not, the malware encrypts the file and appends the .lockbit extension to the filename.
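The selection logic can be sketched like this; the skip lists below are illustrative stand-ins, not LockBit's actual hardcoded lists:

```python
SKIP_EXTENSIONS = {".lockbit", ".exe", ".dll", ".sys"}  # illustrative only
SKIP_FOLDERS = {"windows", "boot"}                       # illustrative only

def target_name(path: str):
    """Return the post-encryption name for a file the locker would
    process, or None if the file is skipped."""
    lower = path.lower()
    if any(f"/{folder}/" in lower for folder in SKIP_FOLDERS):
        return None                  # OS-critical location
    if any(lower.endswith(ext) for ext in SKIP_EXTENSIONS):
        return None                  # already .lockbit, or OS-critical type
    return path + ".lockbit"

print(target_name("C:/Users/bob/report.docx"))          # C:/Users/bob/report.docx.lockbit
print(target_name("C:/Users/bob/report.docx.lockbit"))  # None
print(target_name("C:/Windows/system32/kernel32.dll"))  # None
```

The .lockbit extension check doubles as re-encryption protection: a second pass over an already-hit disk leaves marked files alone.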

Lockbit relies on LoadLibraryA and GetProcAddress to load bcrypt.dll and import the BCryptGenRandom function. If the import succeeds, it passes the BCRYPT_USE_SYSTEM_PREFERRED_RNG flag, which tells the API to use the system-preferred random number generator algorithm. If loading bcrypt.dll fails, it instead invokes CryptAcquireContextW and CryptGenRandom against the Microsoft Base Cryptographic Provider v1.0 to generate 32 bytes of random data to use as a seed.

Also, at this stage, the hardcoded ransom note, Restore-My-Files.txt, gets de-obfuscated and the ransomware drops the .txt file in every directory that contains at least one encrypted file.

LockBit creates two registry values containing key blobs under the following registry hive: HKEY_CURRENT_USER\Software\LockBit

The two registry values are:

HKEY_CURRENT_USER\Software\LockBit\full
HKEY_CURRENT_USER\Software\LockBit\Public

These registry values correlate with the Victim ID, file markers, and the unique Tor URL ID that LockBit builds for each system it takes down.

Let's take the unique Tor URL from the ransom note:

In this example, the 16-byte unique ID is at the end of the URL, http://lockbitks2tvnmwk[.]onion/?A0C155001DD0CB01AE0692717A2DB14A

The file marker (0x10 bytes long) is divided into two sections:

A0C155001DD0CB01: the first 8 bytes of the file marker, which are also the first 8 bytes of the unique Tor URL ID.

D4EA7A79A0835006: the second 8 bytes, which are the same for all encrypted files in a given run.

The value of the full registry key (0x500 bytes long, starting with 1A443C7179498278B40DC082FCF8DE26 in this example) is also present in every encrypted file, just before the file marker.
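The relationship between the Tor URL ID and the per-file marker can be demonstrated with a simple split, using the example ID from the ransom note above:

```python
def split_victim_id(url_id: str):
    """Split the 16-byte (32 hex chars) Tor URL ID from the ransom note:
    its first 8 bytes reappear as the first half of each file marker."""
    assert len(url_id) == 32, "expected 32 hex characters"
    return url_id[:16], url_id[16:]

first, second = split_victim_id("A0C155001DD0CB01AE0692717A2DB14A")
print(first)   # A0C155001DD0CB01 -> shared with the per-file marker
print(second)  # AE0692717A2DB14A
```

This overlap lets an analyst tie encrypted files on disk back to the victim-specific payment URL.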

For a successful ransomware hit-and-run, the goal is to encrypt as many files as possible. So, naturally, LockBit scans for network shares and other attached drives with the help of the following API calls.

First, the malware enumerates the available drive letters with a call to GetLogicalDrives, then cycles through the drives it finds and uses GetDriveTypeW to determine whether each one is a network share, by comparing the result with 0x4 (DRIVE_REMOTE).
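GetLogicalDrives returns a bitmask in which bit 0 stands for A:, bit 1 for B:, and so on. Decoding it is straightforward; this sketch only reproduces the bitmask logic, not the Windows call itself:

```python
import string

def drives_from_bitmask(mask: int):
    # Decode a GetLogicalDrives-style bitmask: bit 0 = A:, bit 1 = B:, ...
    return [letter + ":" for i, letter in enumerate(string.ascii_uppercase)
            if mask & (1 << i)]

# 0b1101 -> bits 0, 2 and 3 set -> A:, C:, D:
print(drives_from_bitmask(0b1101))  # ['A:', 'C:', 'D:']
```

Each decoded letter would then be passed to GetDriveTypeW to sort local disks from network shares.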

Once it finds a networked drive, it calls WNetGetConnectionW to get the name of the share, then recursively enumerates all the folders and files on the share using the WNetOpenEnumW, WNetEnumResourceW API calls.

The ransomware can also enter network shares that might require user credentials. LockBit uses the WNetAddConnection2W API call with the parameters lpUserName = 0 and lpPassword = 0, which (counterintuitively) means the credentials of the currently logged-in user are used to connect to the given share. It can then enumerate the share using the NetShareEnum API call.

In an attempt to ensure that LockBit would not be kept from finishing its job by a system shutdown, the developers of this ransomware implemented a small routine that uses a call to ShutdownBlockReasonCreate.

The developers didn't try to conceal the ransomware as the cause of the shutdown block: the ransomware sets the shutdown-blocking message to "LockBit Ransom". Computer users would also see the message "LockBit Ransom" under the process name.

SetProcessShutdownParameters is also called to set the shutdown order level of the ransomware's process to 0, the lowest level, so that the process stays alive as long as possible before a shutdown terminates it.

If the system is shut down, the malware also has the capability to persist after a reboot. LockBit creates a registry key called XO1XADpO01 to restart itself under HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Run.

LockBit prevents multiple ransomware instances on a single system by way of a hardcoded mutex: Global\{BEF590BE-11A6-442A-A85B-656C1081E04C}. Before LockBit starts encrypting, it checks that the mutex does not already exist by calling OpenMutexA, and calls ExitProcess if it does.
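A portable analog of the named-mutex check: atomically create a marker, and if it already exists, treat that as another running instance (the equivalent of OpenMutexA succeeding, followed by ExitProcess). The marker file and function name are illustrative:

```python
import os
import tempfile

def acquire_single_instance(marker_path):
    try:
        # O_CREAT | O_EXCL fails atomically if the marker already exists,
        # just as CreateMutexA/OpenMutexA signal a pre-existing mutex.
        fd = os.open(marker_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True      # we are the first instance
    except FileExistsError:
        return False     # another instance holds the marker: bail out

marker = os.path.join(tempfile.gettempdir(), "single-instance-demo.lock")
if os.path.exists(marker):   # clean slate for the demo
    os.remove(marker)
print(acquire_single_instance(marker))  # True: first instance
print(acquire_single_instance(marker))  # False: already running
os.remove(marker)
```

On Windows the kernel tears the mutex down when the process exits; with a lock file the process must clean up after itself, as the demo does.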

As soon as the ransomware is mapped into memory and the encryption process finishes, the sample will execute the following command to maintain a stealthy operation:

The ping command at the front is used because the sample can't delete itself while its executable is locked. Once ping terminates, the rest of the command can delete the executable.

We clearly see an evolution of the applied technique here: in earlier versions, the sample was missing a Del procedure at the end, so the ransomware would not delete itself.

In the recent version, the crooks decided to use fsutil to zero out the initial binary, perhaps to throw off forensic analysis efforts. After the file is zeroed out, the now-null file is also deleted, making double-sure the malware is not forensically recoverable.

As we noted earlier, LockBit's developers wanted to avoid having their ransomware hit victims in Commonwealth of Independent States (CIS) countries. The mechanism the ransomware uses to achieve this calls GetUserDefaultLangID and looks for specific language identifier constants in the region format setting for the current user. If the current user's language setting matches any of the values below, the ransomware exits and does not start the encryption routine.
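The gate can be sketched as a simple membership test. The LANGIDs below are hedged examples of CIS locales (Russian is 0x0419, Ukrainian 0x0422, Belarusian 0x0423); the sample's exact list is not reproduced here:

```python
CIS_LANGIDS = {0x0419, 0x0422, 0x0423}  # Russian, Ukrainian, Belarusian

def should_exit(user_langid: int) -> bool:
    """Mirror the GetUserDefaultLangID gate: bail out before
    encrypting if the user's region format matches a CIS locale."""
    return user_langid in CIS_LANGIDS

print(should_exit(0x0419))  # True: exit, no encryption
print(should_exit(0x0409))  # False: en-US, proceed
```

Note that this keys off the region format setting, so changing that single setting is enough to flip the outcome.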

To get the affected user's attention, the malware (as is typical) creates and displays a ransom-note wallpaper. A set of API calls is involved in this process, listed below.

The created wallpaper gets stored under %APPDATA%\Local\Temp\A7D8.tmp.bmp.

In the meantime, the malware also sets a few registry keys so that the wallpaper is not tiled, and the image is stretched out to fill the screen:

HKEY_CURRENT_USER\Control Panel\Desktop
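The values below are the standard Windows settings for a stretched, untiled wallpaper; the exact values written by the sample are assumed to match, shown here as a registry fragment:

```
[HKEY_CURRENT_USER\Control Panel\Desktop]
; "2" = stretch the image to fill the screen; "0" = do not tile
"WallpaperStyle"="2"
"TileWallpaper"="0"
```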

LockBit leverages a service list very similar to that of MedusaLocker ransomware. It comes as no surprise that crooks copy these lists so they don't have to reinvent the wheel.

The unique Registry run key and ransom note filename written by LockBit (XO1XADpO01 and Restore-My-Files.txt) have also been seen in use by Phobos, and by a Phobos impostor ransomware. This would suggest a connection between these families, but without further evidence that is hard to justify.

A recent Twitter post demonstrates what the future may look like for LockBit. In a recent LockBit attack, the MBR was overwritten with roughly 2,000 bytes; the infected machine would not boot up unless a password was supplied. The hash of this sample is currently not known.

The e-mail address used for extortion, ondrugs@firemail.cc, has also been seen with STOP ransomware, an uncanny connection. The groups behind them might be related.

There is also speculation that the application DiskCryptor was combined with the ransomware to add this extra lockdown layer. The MAMBA ransomware also used this technique, leveraging DiskCryptor to lock the victim's machine. DiskCryptor is currently detected as AppC/DCrpt-Gen by Sophos Anti-Virus.

A list of the indicators of compromise (IoCs) for this post has been published to the SophosLabs GitHub.


Endpoint Encryption Market Analysis by Size, Share, Top Key Manufacturers, Demand Overview, Regional Outlook And Growth Forecast to 2026 – Cole of…


Global Endpoint Encryption Market Segmentation

This market was divided into types, applications and regions. The growth of each segment provides an accurate calculation and forecast of sales by type and application, in terms of volume and value, for the period between 2020 and 2026. This analysis can help you develop your business by targeting niche markets. Market share data are available at global and regional levels. The regions covered by the report are North America, Europe, the Asia-Pacific region, the Middle East and Africa, and Latin America. Research analysts understand the competitive forces and provide competitive analysis for each competitor separately.

To get Incredible Discounts on this Premium Report, Click Here @ https://www.marketresearchintellect.com/ask-for-discount/?rid=175596&utm_source=COD&utm_medium=888

Global Endpoint Encryption Market Regions and Countries Level Analysis

The regional analysis is a very complete part of this report. This segmentation highlights Endpoint Encryption sales at regional and national levels. This data provides a detailed and accurate analysis of volume by country and an analysis of market size by region of the world market.

The report provides an in-depth assessment of growth and other aspects of the market in key countries such as the United States, Canada, Mexico, Germany, France, the United Kingdom, Russia, Italy, China, Japan, South Korea, India, Australia, Brazil and Saudi Arabia. The chapter on the competitive landscape of the global market report contains important information on market participants such as business overview, total sales (financial data), market potential, global presence, Endpoint Encryption sales and earnings, market share, prices, production locations and facilities, products offered and applied strategies. This study provides Endpoint Encryption sales, revenue, and market share for each player covered in this report for the period between 2016 and 2020.

Why choose us:

We offer state of the art critical reports with accurate information about the future of the market.

Our reports have been evaluated by some industry experts in the market, which makes them beneficial for the company to maximize their return on investment.

We provide a full graphical representation of information, strategic recommendations and analysis tool results to provide a sophisticated landscape and highlight key market players. This detailed market assessment will help the company increase its efficiency.

The dynamics of supply and demand shown in the report offer a 360-degree view of the market.

Our report helps readers decipher the current and future constraints of the Endpoint Encryption market and formulate optimal business strategies to maximize market growth.

Have Any Query? Ask Our Expert @ https://www.marketresearchintellect.com/need-customization/?rid=175596&utm_source=COD&utm_medium=888

Table of Contents:

Study Coverage: It includes study objectives, years considered for the research study, growth rate and Endpoint Encryption market size of type and application segments, key manufacturers covered, product scope, and highlights of segmental analysis.

Executive Summary: In this section, the report focuses on analysis of macroscopic indicators, market issues, drivers, and trends, competitive landscape, CAGR of the global Endpoint Encryption market, and global production. Under the global production chapter, the authors of the report have included market pricing and trends, global capacity, global production, and global revenue forecasts.

Endpoint Encryption Market Size by Manufacturer: Here, the report concentrates on revenue and production shares of manufacturers for all the years of the forecast period. It also focuses on price by manufacturer and expansion plans and mergers and acquisitions of companies.

Production by Region: It shows how the revenue and production in the global market are distributed among different regions. Each regional market is extensively studied here on the basis of import and export, key players, revenue, and production.

About Us:

Market Research Intellect provides syndicated and customized research reports to clients from various industries and organizations with the aim of delivering functional expertise. We provide reports for all industries including Energy, Technology, Manufacturing and Construction, Chemicals and Materials, Food and Beverage and more. These reports deliver an in-depth study of the market with industry analysis, market value for regions and countries and trends that are pertinent to the industry.

Contact Us:

Mr. Steven Fernandes

Market Research Intellect

New Jersey ( USA )

Tel: +1-650-781-4080



Amazon pushes the button on Keyspaces: Cassandra lookalike to boost its NoSQL credentials – The Register

Amazon has announced the general availability of a serverless NoSQL database in Amazon Keyspaces, with more than a passing resemblance to the open-source Apache Cassandra.

First touted as Amazon Managed Apache Cassandra Service (AMAC) at AWS re:Invent 2019 last December, "Cassandra-compatible" Keyspaces allows users to build applications using Cassandra Query Language (CQL) code. They can employ Apache 2.0-licensed Cassandra drivers and use the developer tools they already use with Cassandra.

Soooo, why not just use Cassandra, since it is free after all?

The answer comes from the original launch of AMAC.

Amazon's rationale is that managing large Cassandra clusters can be difficult and take a lot of time. "Set-up, configuration, and maintenance of the underlying infrastructure require a strong understanding of the entire application stack, including the Apache Cassandra open source software," Amazon said.

Meanwhile, scaling down from peak workloads is complex.

Bezos' juggernaut claims that Keyspaces takes all that pain away. Provisioning, patching, and maintenance are all taken care of. In addition, Amazon Keyspaces is serverless, and users only pay for the resources they use.

"You can build applications that serve thousands of requests per second with virtually unlimited throughput and storage," Amazon commented.

The firm promised "consistent, single-digit-millisecond performance at any scale" with Keyspaces, along with a 99.99 per cent availability service level agreement within an AWS Region. Users can also manage access to tables by using AWS Identity and Access Management, connect their resources to a virtual private cloud (VPC), and keep applications running with integrated logging and monitoring.

Cassandra is one of the most popular NoSQL databases, with a loyal user base which may welcome using the tools they have become accustomed to on a database that is easier to provision and manage in AWS. If Amazon lives up to its promise, Keyspaces might be that database.

But the Cassandra faithful will point to other developments that also help the deployment of the open-source database into the cloud. In March, DataStax announced the availability of DSE 6.8 with Kubernetes Operator, the idea being to make Cassandra easier to deploy and manage, with Kubernetes at least.

Cassandra has gained popularity as a database that can manage heavy workloads and help users perform real-time analytics, so it has been popular for web data and IoT use cases. But it is not ACID-compliant, so it isn't appropriate for applications where consistency is paramount.



Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access – Forbes

Zapata's quantum coders, ready for a hot & noisy ride.

We're on the road to quantum computing. But these massively powerful machines are still in somewhat embryonic prototype stages, and we still have several key challenges to overcome before we can start to build more of them.

As a quantum reminder: traditional computers compute on the basis of binary 1s and 0s, so all values and mathematical logic are essentially established from a base of those two values. Quantum bits (known as qubits) can be 1 or 0, or in a superposition anywhere in between, and the value expressed can differ depending on the angle from which the qubit is viewed; with that massively greater breadth, we can create a lot more algorithmic logic and computing power.

One of the main challenges associated with building quantum computing machines is the heat they generate. Scientists have been working with different semiconducting materials, such as so-called quantum dots, to help overcome the heat challenge. The issue is that qubits are special, qubits are powerful, but qubits are also fragile... and heat is one of their sworn enemies.

Another core challenge is noise.

As computations pass through the quantum gates that make up the quantum circuits in our new super quantum machines, they are subject to a lot of noise disturbance (think of an engine revving louder as it speeds up), which is why we have come to define and accept the term Noisy Intermediate-Scale Quantum (NISQ) for this class of applications.

As beautifully clarified by theoretical physicist John Preskill in a 2018 paper: "Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing."

The fact that we know about the heat and noise challenges hasn't stopped companies like Strangeworks, D-Wave Systems, ColdQuanta and others (including usual suspects Intel, IBM and Microsoft) forging on with development in the quantum space. Joining that list is Boston-headquartered Zapata Computing, Inc. The company describes itself as the quantum software company for near-term/NISQ-based quantum applications, empowering enterprise teams. Near-term in this case means, well, now: quantum workloads we can actually run on quantum devices of about 100-300 qubits.

Zapata's latest quantum leap (pun absolutely intended) is an early access program for Orquestra, its platform for quantum-enabled workflows. The company claims to have provided a software- and hardware-interoperable enterprise quantum toolset; again, quantum tools we can actually use in modern-day enterprise IT departments.

"Using Zapata's unified Quantum Operating Environment, users can build, run and analyze quantum and quantum-inspired workflows. This toolset will empower enterprises and institutions to make their quantum mark on the world, enabling them to develop quantum capabilities and foundational IP today while shoring up for derivative IP for tomorrow," says CEO Christopher Savoie. "It is a new computing paradigm, built on a unified enterprise framework that spans quantum and classical programming and hardware tools. With Orquestra, we are accelerating quantum experiments at scale."

Zapata's Early Access Program for Orquestra is aimed at users with backgrounds in software engineering, machine learning, physics, computational chemistry or quantum information theory who are working on the most computationally complex problems.

Orquestra is agnostic across the entire software and hardware stack. It offers an extensible library of open source and Zapata-created components for writing, manipulating and optimizing quantum circuits and running them across quantum computers, quantum simulators and classical computing resources. It comes equipped with a versatile workflow system and Application Programming Interfaces (APIs) to connect all modes of quantum devices.

"We developed Orquestra to scale our own work for our customers and then realized the quantum community needs it, too. Orquestra is the only system for managing quantum workflows," said Zapata CTO Yudong Cao. "The way we design and deploy computing solutions is changing. Orquestra's interoperable nature enables extensible and modular implementations of algorithms and workflows across platforms and unlocks fast, fluid repeatability of experiments at scale."

So we're on a journey. The journey is the road from classical to quantum, and the best advice is to insist on an interoperable vehicle (as Zapata has provided here) and to take a modular and extensible approach. In car-analogy terms, that means breaking the journey into bite-size chunks and making sure you have enough gas for the long haul. The quantum software parallel is obvious enough not to need spelling out.

Even when quantum evolves to become more ubiquitously available, many people think it will still be largely delivered as a cloud computing Quantum-as-a-Service (QaaS) package, but understanding the noisy overheated engine room in the meantime makes for a fascinating movie preview.

Read the original here:
Quantum Computing Is Hot And Noisy, But Zapata Opens Early Access - Forbes

Wiring the Quantum Computer of the Future: a Novel Simple Build with Existing Technology – Analytics Insight


The basic units of a quantum computer can be rearranged in 2D to solve typical design and operation challenges

Efficient quantum computing is expected to enable advancements that are impossible with classical computers. Scientists in Japan and Sydney have collaborated to propose a novel two-dimensional design that can be constructed using existing integrated-circuit technology. This design solves typical problems facing the current three-dimensional packaging for scaled-up quantum computers, bringing the future one step closer.

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason. Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the quantum bits, or qubits. These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that the qubits are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices. When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.

A group of scientists from Tokyo University of Science, Japan, RIKEN Centre for Emergent Matter Science, Japan, and the University of Technology Sydney, led by Prof Jaw-Shen Tsai, proposes a unique solution to this qubit-accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a square lattice array of qubits and stretched out each column in the 2D plane. They then folded each successive column on top of the previous one, forming a dual one-dimensional array called a bi-linear array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D. In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D structures such as airbridges at the points of overlap are enough, and the system overall remains in 2D. As you can imagine, this simplifies its construction considerably.
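The folding move is easier to see in code. The sketch below is purely illustrative (the function name and the exact fold pattern are assumptions, not the authors' layout): it rearranges a square lattice of qubit indices into two rows so that every qubit ends up on an edge, which is the property the bi-linear array exploits.

```python
def fold_to_bilinear(n):
    """Illustrative rearrangement: fold the columns of an n x n qubit
    lattice into two rows (a 'bi-linear' array) so that every qubit
    sits on an edge and is directly reachable by planar wiring."""
    lattice = [[row * n + col for col in range(n)] for row in range(n)]
    top, bottom = [], []
    for col in range(n):
        column = [lattice[row][col] for row in range(n)]
        # alternate successive columns between the two rows, reversing
        # every other one to mimic an accordion-style fold
        if col % 2 == 0:
            top.extend(column)
        else:
            bottom.extend(reversed(column))
    return top, bottom

top, bottom = fold_to_bilinear(4)
# all 16 qubits of the 4 x 4 lattice now lie in one of the two edge rows
assert sorted(top + bottom) == list(range(16))
```

In a two-row layout there is no "interior" qubit, so no multi-plane 3D wiring is needed; only the few crossings between formerly adjacent columns must be bridged locally.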

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. Results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed that their architecture solves several problems that plague 3D structures: they are difficult to construct, there is crosstalk (signal interference between waves transmitted across two wires), and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated-circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."

###

Reference

Title of original paper: Pseudo-2D superconducting quantum computing circuit for the surface code: the proposal and preliminary tests

Journal: New Journal of Physics

DOI: 10.1088/1367-2630/ab7d7d

Tokyo University of Science (TUS) is a well-known and respected university, and the largest science-specialized private research university in Japan, with four campuses in central Tokyo and its suburbs and in Hokkaido. Established in 1881, the university has continually contributed to Japan's development in science through inculcating the love for science in researchers, technicians, and educators.

With a mission of "Creating science and technology for the harmonious development of nature, human beings, and society," TUS has undertaken a wide range of research from basic to applied science. TUS has embraced a multidisciplinary approach to research and undertaken intensive study in some of today's most vital fields. TUS is a meritocracy where the best in science is recognized and nurtured. It is the only private university in Japan that has produced a Nobel Prize winner and the only private university in Asia to produce Nobel Prize winners within the natural sciences field.

Website:https://www.tus.ac.jp/en/mediarelations/

Dr Jaw-Shen Tsai is currently a Professor at the Tokyo University of Science, Japan. He began research in physics in 1975 and continues to hold interest in areas such as superconductivity, the Josephson effect, quantum physics, coherence, qubits, and artificial atoms. He has 160+ research publications to his credit and serves as the lead author of this paper. He has also won several awards, including Japan's Medal of Honor with Purple Ribbon.

Professor Jaw-Shen Tsai

Department of Physics

Tokyo University of Science

Tsutomu Shimizu

Public Relations Divisions

Tokyo University of Science

Email: mediaoffice@admin.tus.ac.jp

Website: https://www.tus.ac.jp/en/mediarelations/


View original post here:
Wiring the Quantum Computer of the Future: aNovel Simple Build with Existing Technology - Analytics Insight

Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

In recent years, big tech companies like IBM, Microsoft, Intel and Google have been working in relative silence on something that sounds great: quantum computing. The main problem is that it is difficult to know what exactly it is and what it can be useful for.

Some questions are easy to settle. For example, quantum computing is not going to give you more FPS on your graphics card at the moment. Nor will you be able to simply swap your computer's CPU for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but that it is divided into small packages, or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. But over the years other physicists developed it further and came to surprising conclusions, two of which will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classical computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and with which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.
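The counting argument above can be checked directly. This minimal snippet (purely illustrative) enumerates the 2^n states of n classical bits and the basic logical operations on single bits:

```python
from itertools import product

n = 3
# n classical bits occupy exactly one of 2**n possible states at a time
states = list(product([0, 1], repeat=n))
assert len(states) == 2 ** n  # 8 states for n = 3

# the basic logical operations on single bits
a, b = 1, 0
assert (a & b) == 0   # AND
assert (a | b) == 1   # OR
assert (1 - a) == 0   # NOT
```

The key limitation is that the register holds exactly one of those 2^n states at any moment; superposition is precisely what relaxes that.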

Well, superposition and entanglement allow us to relax these limitations: with superposition, we can store far more than just 2^n states in n quantum bits (qubits), and entanglement maintains certain relations between qubits in such a way that operations on one qubit necessarily affect the rest.

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though we can hold many more states in n qubits, in practice we can only read out 2^n different ones. As we saw in a Genbeta article on the foundations of quantum computing: a qubit is not simply worth 1 or 0 like a normal bit; it can be 1 with 80% probability and 0 with 20%. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities each value had of coming out are lost, because measuring the qubit modifies it.
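The 80/20 example can be simulated classically. The toy model below is an illustration, not a real quantum API: amplitudes whose squared magnitudes (the Born rule) give an 80% chance of reading 1, with each readout yielding only a single classical bit.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

# Toy single qubit from the text: choose amplitudes for |0> and |1>
# whose squares give the 80% / 20% split.
amp0, amp1 = 0.2 ** 0.5, 0.8 ** 0.5
assert abs(amp0 ** 2 + amp1 ** 2 - 1.0) < 1e-12  # a valid, normalized state

def measure():
    # A readout yields one classical bit; the 80/20 amplitude
    # information is destroyed in the process.
    return 1 if random.random() < amp1 ** 2 else 0

ones = sum(measure() for _ in range(10_000))
print(ones / 10_000)  # close to 0.8, but each individual readout is just 0 or 1
```

The only way to see the 80/20 split is statistically, over many identically prepared qubits; a single measurement tells you almost nothing, which is the practical face of Holevo's bound.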

This discrepancy between the information kept by the qubits and what we can read led Benioff and Feynman to demonstrate that a classical computer would not be able to simulate a quantum system without a disproportionate amount of resources, and to propose models for a quantum computer that was able to do that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which allowed two quite relevant algorithms to be developed: quantum annealing in 1989 and Shor's algorithm in 1994. The first finds minimum values of functions, which, put like that, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, that minimum value will tell us how to configure the neural network to be as efficient as possible.

The second, Shor's algorithm, helps us decompose a number into its prime factors much more efficiently than we can achieve on a normal computer. Again, put like that, it doesn't sound at all interesting. But if I tell you that RSA, one of the most widely used algorithms for protecting and encrypting data on the Internet, is based on the fact that factoring numbers is extremely slow (adding a bit to the key roughly doubles the time a brute-force attack takes), then things change. A quantum computer with enough qubits would render many encryption systems completely obsolete.

So far, quantum computing is a field that hasn't seen much real-world application. To give us an idea: with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers smaller than 2^20 = 1,048,576, which as you can imagine is not very impressive.
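As a back-of-the-envelope check of that bound, and of the kind of work a classical factoring attacker faces, here is a naive trial-division routine (illustrative only; real attacks on RSA use far more sophisticated classical algorithms, and Shor's algorithm needs more machinery than a qubit count):

```python
# Twenty qubits caps the target numbers below 2**20:
assert 2 ** 20 == 1_048_576

def trial_division(n):
    """Naive classical factoring; the work grows steeply with the size
    of n, which is the asymmetry RSA relies on."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# even the largest 20-bit numbers fall instantly to a classical laptop...
assert trial_division(1_048_575) == [3, 5, 5, 11, 31, 41]
# ...whereas RSA moduli are on the order of 2048 bits
```

The contrast between a 20-bit toy and a 2048-bit RSA modulus is the whole point: today's qubit counts are nowhere near cryptographically relevant.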

Still, the field is evolving promisingly. In 1998 the first quantum computer was built (it had only two qubits and needed a nuclear magnetic resonance machine) to solve a toy problem: the so-called Deutsch-Jozsa problem. In 2001 Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of executing quantum annealing with 16 qubits. This year, the same company announced a 2,000-qubit quantum annealing computer. Meanwhile, the new IBM computers, although they have fewer qubits, are able to implement generic algorithms, not only quantum annealing. In short, it seems that the push is strong and that quantum computing will be increasingly applicable to real problems.

What might those applications be? As we mentioned before, the quantum annealing algorithm is very appropriate for machine learning problems, which makes the computers that implement it extremely useful, even though the only thing they can do is run that single algorithm. If systems that, for example, transcribe conversations or identify objects in images can be recast so as to train them on quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding the optimal treatment for a patient or studying the possible structures of complex molecules.
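Quantum annealing itself needs quantum hardware, but its classical cousin, simulated annealing, shows the shape of the idea: encode what you care about as a cost function and search for its minimum. In the sketch below everything is illustrative; `error` is a made-up stand-in for a network's error rate, not a real model.

```python
import math
import random

def anneal(cost, x0, steps=20_000, seed=0):
    """Classical simulated annealing: wander the search space, accepting
    uphill moves less and less often as the 'temperature' drops, and
    keep the best configuration seen."""
    rng = random.Random(seed)
    x, best = x0, x0
    for t in range(1, steps + 1):
        temp = 1.0 / t                      # cooling schedule
        cand = x + rng.uniform(-0.5, 0.5)   # propose a nearby configuration
        delta = cost(cand) - cost(x)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = cand                        # accept downhill, sometimes uphill
        if cost(x) < cost(best):
            best = x
    return best

# made-up stand-in for a network's error surface; its true minimum is x = 2
error = lambda x: (x - 2.0) ** 2 + 1.0
x_best = anneal(error, x0=-5.0)
assert abs(x_best - 2.0) < 0.2  # the search lands near the minimum
```

A quantum annealer pursues the same goal (the minimum of an encoded cost function) but explores the landscape with quantum effects such as tunneling rather than thermal hops.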

Generic quantum computers, which for now have fewer qubits, could run more algorithms. For example, they could be used to break much of the cryptography in use right now, as we discussed earlier (which explains why the NSA wanted a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they could be very useful as efficient simulators of quantum systems.

Unfortunately, algorithms and code written for classical computers cannot simply be run on quantum computers for a magical improvement in speed: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That, at first, greatly restricts the applications of quantum computers and will be a problem to overcome as those systems mature.

However, the main problem facing quantum computing is building the computers themselves. Compared to a normal computer, a quantum computer is an extremely complex machine: it operates at a temperature close to absolute zero (-273 °C), the qubits rest on superconducting supports, and the components needed to read and manipulate the qubits are not simple either.

What does a non-quantum "quantum computer" look like? As we explained before, the two relevant concepts of a quantum computer are superposition and entanglement; without them, there cannot be the speed improvements that quantum algorithms promise. If disturbances in the computer quickly collapse superposed qubits into classical states, or break the entanglement between qubits, what we have is not a quantum computer but an extremely expensive machine that runs only a handful of algorithms, is equivalent to a normal computer, and will probably give erroneous results.

Of the two properties, entanglement is the more difficult to maintain and to prove exists. The more qubits there are, the easier it is for one of them to become disentangled (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see that correct results come out to claim the qubits are entangled: looking for evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in their early days.

A priori, with the materials quantum computers are currently built from, miniaturization does not seem very feasible. But there is already research into new materials that could be used to create more accessible quantum computers. Who knows: fifty years from now we may be able to buy quantum CPUs to improve the speed of our computers.


See the original post:
Will Quantum Computing Really Change The World? Facts And Myths - Analytics India Magazine

Google’s top quantum computing brain may or may not have quit – Fudzilla

We will know when someone opens his office door

John Martinis, who had established Googles quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now-legendary quantum supremacy experiment working, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Googles quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem that the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Originally posted here:
Google's top quantum computing brain may or may not have quit - Fudzilla

On the Heels of a Light Beam – Scientific American

As a 16-year-old boy, Albert Einstein imagined chasing after a beam of light in the vacuum of space. He mused on that vision for years, turning it over in his mind, asking questions about the relation between himself and the beam. Those mental investigations eventually led him to his special theory of relativity. Such thought experiments, which Einstein referred to by the German term gedankenexperiment, continue to nourish the heart of physics today, especially in the field of quantum mechanics, which he helped to establish.

In quantum mechanics, things don't happen, theoretical physicist Stephen L. Adler tells our reporter Tim Folger, referring to the probabilistic nature of quantum reality.

Philosophically, this may be true, but it hasn't stopped researchers from testing quantum concepts. Using lasers to excite electrons into emitting photons, a group at Delft University of Technology in the Netherlands ruled out the existence of hidden variables, which Einstein believed were controlling so-called entangled particles (entanglement being one of the main tenets of quantum theory). Without these mysterious forces, bizarre dynamics could indeed be at work in the quantum world, defying our notions of space and time. Physicist Lee Smolin argues that the fabric of the cosmos is a vast collection of atomic interactions within an evolving network of relations, where causality among events is complex and irrespective of distance.

Despite the theoretical mysteries of quantum theory, its real-world applications are growing. Researchers are cooling atomic systems to near absolute zero for use as quantum simulators to study applications in superconductors and superfluids. Others are using tabletop experiments to monitor the gravitational fields around entangled objects (minuscule gold or diamond spheres, for example), looking for signs that gravity itself is quantized into discrete bits. At a larger scale, tools such as the Event Horizon Telescope, which recently took the first picture of a black hole, and gravitational-wave detectors could help resolve long-standing, vexing contradictions between quantum mechanics and general relativity.

These quantum insights are fueling tremendous innovation. A team of researchers in China successfully tested entanglement over a distance of 1,200 kilometers, paving the way for an unhackable quantum-communications network. Computer scientists are using quantum algorithms to enhance traditional systems, ratcheting up progress toward the heralded quantum computing era. Such applications are still immature, as Elizabeth Gibney reports, yet that's not stopping investors from pouring money into quantum start-ups.

Science historians have argued about whether Einstein accepted the elements of quantum theory that conflicted with his own theories. Who knows whether he could have imagined the applications his ideas engendered. In any case, the thought experiment continues.

See the original post:
On the Heels of a Light Beam - Scientific American

Eleven Princeton faculty elected to American Academy of Arts and Sciences – Princeton University

Princeton faculty members Rubén Gallo, M. Zahid Hasan, Amaney Jamal, Ruby Lee, Margaret Martonosi, Tom Muir, Eve Ostriker, Alexander Smits, Leeat Yariv and Muhammad Qasim Zaman have been named members of the American Academy of Arts and Sciences. Visiting faculty member Alondra Nelson also was elected to the academy.

They are among 276 scholars, scientists, artists and leaders in the public, nonprofit and private sectors elected this year in recognition of their contributions to their respective fields.

Gallo is the Walter S. Carpenter, Jr., Professor in Language, Literature, and Civilization of Spain and a professor of Spanish and Portuguese. He joined the Princeton faculty in 2002. His most recent book is Conversación en Princeton (2017) with Mario Vargas Llosa, who was teaching at Princeton when he received the Nobel Prize in Literature in 2010.

Gallo's other books include Proust's Latin Americans (2014); Freud's Mexico: Into the Wilds of Psychoanalysis (2010); Mexican Modernity: the Avant-Garde and the Technological Revolution (2005); New Tendencies in Mexican Art (2004); and The Mexico City Reader (2004). He is currently working on Cuba: A New Era, a book about the changes in Cuban culture after the diplomatic thaw with the United States.

Gallo received the Gradiva award for the best book on a psychoanalytic theme and the Modern Language Association's Katherine Singer Kovacs Prize for the best book on a Latin American topic. He is a member of the board of the Sigmund Freud Museum in Vienna, where he also serves as research director.


Hasan is the Eugene Higgins Professor of Physics. He studies fundamental quantum effects in exotic superconductors, topological insulators and quantum magnets to make new discoveries about the nature of matter, work that may have future applications in areas such as quantum computing. He joined the faculty in 2002 and has since led his research team to publish many influential findings.

Last year, Hasan's lab led research that discovered that certain classes of crystals with an asymmetry like biological handedness, known as chiral crystals, may harbor electrons that behave in unexpected ways. In 2015, he led a research team that first observed Weyl fermions, which, if applied to next-generation electronics, could allow for a nearly free and efficient flow of electricity in electronics, and thus greater power, especially for computers.

In 2013, Hasan was named a fellow of the American Physical Society for the experimental discovery of three-dimensional topological insulators, a new kind of quantum matter. In 2009, he received a Sloan Research Fellowship for groundbreaking research.


Jamal is the Edwards S. Sanford Professor of Politics and director of the Mamdouha S. Bobst Center for Peace and Justice. She has taught at Princeton since 2003. Her current research focuses on the drivers of political behavior in the Arab world, Muslim immigration to the U.S. and Europe, and the effect of inequality and poverty on political outcomes.

Jamal also directs the Workshop on Arab Political Development and the Bobst-AUB Collaborative Initiative. She is also principal investigator for the Arab Barometer project, which measures public opinion in the Arab world. She is the former president of the Association of Middle East Women's Studies.

Her books include Barriers to Democracy (2007), which won the 2008 APSA Best Book Award in comparative democratization, and Of Empires and Citizens, which was published by Princeton University Press (2012). She is co-editor of Race and Arab Americans Before and After 9/11: From Invisible Citizens to Visible Subjects (2007) and Citizenship and Crisis: Arab Detroit after 9/11 (2009).


Lee is the Forrest G. Hamrick Professor in Engineering and professor of electrical engineering. She is an associated faculty member in computer science. Lee joined the Princeton faculty in 1998. Her work at Princeton explores how the security and performance of computing systems can be significantly and simultaneously improved by hardware architecture. Her designs of secure processor architectures have strongly influenced industry security offerings and also inspired new generations of academic researchers in hardware security, side-channel attacks and defenses, secure processors and caches, and enhanced cloud computing and smartphone security.

Her research lies at the intersection of computer architecture, cybersecurity and, more recently, the branch of artificial intelligence known as deep learning.

Lee spent 17 years designing computers at Hewlett-Packard, and was a chief architect there before coming to Princeton. Among many achievements, Lee is known in the computer industry for her design of the HP Precision Architecture (HPPA or PA-RISC), which powered HP's commercial and technical computer product families for several decades and was widely regarded as introducing key forward-looking features. In the '90s she spearheaded the development of microprocessor instructions for accelerating multimedia, which enabled video and audio streaming, leading to ubiquitous digital media. Lee is a fellow of the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers.

Margaret Martonosi, the Hugh Trumbull Adams '35 Professor of Computer Science, specializes in computer architecture and mobile computing with an emphasis on power efficiency. She was one of the architects of the Wattch power modeling infrastructure, a tool that was among the first to allow computer scientists to incorporate power consumption into early-stage computer systems design. Her work helped demonstrate that power needs can help dictate the design of computing systems. More recently, Martonosi's work has also focused on architecture and compiler issues in quantum computing.

She currently serves as head of the National Science Foundation's Directorate for Computer and Information Science and Engineering, one of seven top-level divisions within the NSF. From 2017 until February 2020, she directed Princeton's Keller Center for Innovation in Engineering Education, a center focused on enabling students across the University to realize their aspirations for addressing societal problems. She is an inventor who holds seven U.S. patents and has co-authored two technical reference books on power-aware computer architecture. In 2018, she was one of 13 co-authors of a National Academies consensus study report on progress and challenges in quantum computing.

Martonosi is a fellow of the Association for Computing Machinery (ACM) and the Institute of Electrical and Electronics Engineers (IEEE). Among other honors, she has received a Jefferson Science Fellowship, the IEEE Technical Achievement Award, and the ACM SIGARCH Alan D. Berenbaum Distinguished Service Award. She joined the Princeton faculty in 1994.

Muir is the Van Zandt Williams, Jr. Class of '65 Professor of Chemistry and chair of the chemistry department. He joined Princeton in 2011 and is also an associated faculty member in molecular biology.

He leads research in investigating the physiochemical basis of protein function in complex systems of biomedical interest. By combining tools of organic chemistry, biochemistry, biophysics and cell biology, his lab has developed a suite of new technologies that provide fundamental insight into how proteins work. The chemistry-driven approaches pioneered by Muirs lab are now widely used by chemical biologists around the world.

Muir has published over 150 scientific articles and has won a number of honors for his research. He received a MERIT Award from the National Institutes of Health and is a fellow of the American Association for the Advancement of Science and the Royal Society of Edinburgh.


Nelson is the Harold F. Linder Chair in the School of Social Science at the Institute for Advanced Study and a visiting lecturer with the rank of professor in sociology at Princeton. She is president of the Social Science Research Council and is one of the country's foremost thinkers in the fields of science, technology, social inequality and race. Her groundbreaking books include "The Social Life of DNA: Race, Reparations, and Reconciliation after the Genome" (2016) and "Body and Soul: The Black Panther Party and the Fight Against Medical Discrimination" (2011). Her other books include "Genetics and the Unsettled Past: The Collision of DNA, Race, and History" (with Keith Wailoo of Princeton and Catherine Lee) and "Technicolor: Race, Technology, and Everyday Life" (with Thuy Linh Tu). In 2002 she edited "Afrofuturism," a special issue of Social Text.

Nelson's writings and commentary have also reached the broader public through a variety of outlets. She has contributed to national policy discussions on inequality and on the implications of new technology for society.

She is an elected fellow of the American Academy of Political and Social Science, the Hastings Center and the Sociological Research Association. She serves on several advisory boards, including those of the Andrew W. Mellon Foundation and the American Association for the Advancement of Science.

Ostriker, professor of astrophysical sciences, studies the universe. Her research is in the area of theoretical and computational astrophysics, and the tools she uses are powerful supercomputers and algorithms capable of simulating the birth, life, death and reincarnation of stars in their galactic homes. Ostriker and her fellow researchers build computer models using fundamental physical laws, ones that govern gravity, fluid dynamics and electromagnetic radiation, to follow the evolution of conditions found in deep space.

Ostriker, who came to Princeton in 2012, and her team have explored the formation of superbubbles, giant fronts of hot gas that billow out from a cluster of supernova explosions. More recently, she and her colleagues turned their focus toward interstellar clouds.

The research team uses computing resources through the Princeton Institute for Computational Science and Engineering and its TIGER and Perseus research computing clusters, as well as supercomputers administered through NASA. In 2017, Ostriker received a Simons Investigator Award.

Photo by Nick Donnoli, Office of Communications

Smits is the Eugene Higgins Professor of Mechanical and Aerospace Engineering, Emeritus. His research spans the field of fluid mechanics, including fundamental turbulence, supersonic and hypersonic flows, bio-inspired flows, sports aerodynamics, and novel energy-harvesting concepts.

He joined the Princeton faculty in 1981 and transferred to emeritus status in 2018. Smits served as chair of the Department of Mechanical and Aerospace Engineering for 13 years and was director of the Gas Dynamics Laboratory on the Forrestal Campus for 33 years. During that time, he received several teaching awards, including the President's Award for Distinguished Teaching.

Smits has written more than 240 articles and three books, and edited seven volumes. He was awarded seven patents and helped found three companies. He is a member of the National Academy of Engineering and a fellow of the American Physical Society, the American Institute of Aeronautics and Astronautics, the American Society of Mechanical Engineers, the American Association for the Advancement of Science, and the Australasian Fluid Mechanics Society.

Yariv is the Uwe Reinhardt Professor of Economics. An expert in applied theory and experimental economics, her research interests concentrate on game theory, political economy, psychology and economics. She joined the faculty in 2018. Yariv also is director of the Princeton Experimental Laboratory for the Social Sciences.

She is a member of several professional organizations and is lead editor of American Economic Journal: Microeconomics, a research associate with the Political Economy Program of the National Bureau of Economic Research, and a research fellow with the Industrial Organization Programme of the Centre for Economic Policy Research.

She is also a fellow of the Econometric Society and the Society for the Advancement of Economic Theory, and has received numerous grants for her research and awards for her many publications.

Zaman, who joined the Princeton faculty in 2006, is the Robert H. Niehaus '77 Professor of Near Eastern Studies and Religion and chair of the Department of Near Eastern Studies.

He has written on the relationship between religious and political institutions in medieval and modern Islam, on social and legal thought in the modern Muslim world, on institutions and traditions of learning in Islam, and on the flow of ideas between South Asia and the Arab Middle East. He is the author of "Religion and Politics under the Early Abbasids" (1997), "The Ulama in Contemporary Islam: Custodians of Change" (2002), "Ashraf Ali Thanawi: Islam in Modern South Asia" (2008), "Modern Islamic Thought in a Radical Age: Religious Authority and Internal Criticism" (2012), and "Islam in Pakistan: A History" (2018). With Robert W. Hefner, he is also the co-editor of "Schooling Islam: The Culture and Politics of Modern Muslim Education" (2007); with Roxanne L. Euben, of "Princeton Readings in Islamist Thought" (2009); and, as associate editor, with Gerhard Bowering et al., of the "Princeton Encyclopedia of Islamic Political Thought" (2013). Among his current projects is a book on South Asia and the wider Muslim world in the 18th and 19th centuries.

In 2017, Zaman received Princeton's Graduate Mentoring Award. In 2009, he received a Guggenheim Fellowship.

The mission of the academy: Founded in 1780, the American Academy of Arts and Sciences honors excellence and convenes leaders from every field of human endeavor to examine new ideas, address issues of importance to the nation and the world, and work together to cultivate every art and science which may tend to advance the interest, honor, dignity, and happiness of a free, independent, and virtuous people.

Read the original here:
Eleven Princeton faculty elected to American Academy of Arts and Sciences - Princeton University

The Economic Impact of Coronavirus on Value of Quantum Computing Market Predicted to Surpass US$ by the of 20702019-2019 – Jewish Life News

Persistence Market Research recently published a market study that sheds light on the growth prospects of the global Quantum Computing market during the forecast period (20XX-20XX). The report also includes a detailed analysis of the impact of the novel COVID-19 pandemic on the future prospects of the Quantum Computing market, and it provides a thorough evaluation of the latest trends, market drivers, opportunities, and challenges within the global Quantum Computing market to help clients arrive at beneficial business decisions.

The recently published research report sheds light on critical aspects of the global Quantum Computing market, such as the vendor landscape, competitive strategies, and market drivers and challenges, along with a regional analysis. It helps readers draw suitable conclusions and clearly understand the current and future scenario and trends of the global Quantum Computing market. The study serves as a compilation of useful guidelines for players to understand and define their strategies more efficiently in order to stay ahead of their competitors. The report profiles leading companies of the global Quantum Computing market along with emerging new ventures that are influencing the global market with their latest innovations and technologies.

Request Sample Report @ https://www.persistencemarketresearch.co/samples/14758

The recently published study includes information on the key segmentation of the global Quantum Computing market on the basis of type/product, application and geography (country/region). Each segment included in the report is studied in relation to different factors such as market size, market share, value, growth rate and other quantitative information.

The competitive analysis included in the global Quantum Computing market study allows readers to understand the differences between players and how they operate relative to one another on a global scale. The study gives deep insight into the current and future trends of the market, along with the opportunities for new players in the process of entering the global Quantum Computing market. Market dynamics such as drivers and restraints are explained thoroughly and in the simplest possible terms. Companies can also find several recommendations to improve their business on a global scale.

Readers of the Quantum Computing market report can also extract several key insights, such as the market size of various products and applications along with their market share and growth rate. The report also includes forecast data for the next five years, historical data for the past five years, and the market shares of several key segments.

Request Report Methodology @ https://www.persistencemarketresearch.co/methodology/14758

Global Quantum Computing Market by Companies:

The company profile section of the report offers insights such as the market revenue and market share of key players in the global Quantum Computing market. Key companies listed in the report are:

Company Profiles

Global Quantum Computing Market by Geography:

For any queries get in touch with Industry Expert @ https://www.persistencemarketresearch.co/ask-an-expert/14758

Some of the Major Highlights of TOC covers in Quantum Computing Market Report:

Chapter 1: Methodology & Scope of Quantum Computing Market

Chapter 2: Executive Summary of Quantum Computing Market

Chapter 3: Quantum Computing Industry Insights

Chapter 4: Quantum Computing Market, By Region

Chapter 5: Company Profile

And Continue

See more here:
The Economic Impact of Coronavirus on Value of Quantum Computing Market Predicted to Surpass US$ by the of 20702019-2019 - Jewish Life News