Billionaire Mark Cuban Reveals Reason He Owns Bitcoin, Names Catalyst That Will Push BTC to New Heights – The Daily Hodl

A Shark Tank investor is laying out what he thinks must happen for Bitcoin (BTC) to take off as the better version of gold.

In a new interview with the Bankless podcast, billionaire Mark Cuban says BTC faces a problem of relative lack of utility and resulting lack of mainstream popularity.

Part of the challenge for Bitcoin is, even with the Lightning Network, improving and becoming more popular. There's not enough utility, but that's the same problem gold has. Nobody needs gold jewelry. People like gold jewelry and want it, but they don't need it. So there's a utility issue with Bitcoin but, that said, like gold, it's a good store of value.

The billionaire then explains why he keeps BTC in his portfolio and what he thinks is bound to push Bitcoin to new highs.

It's a better store of value when you're trying to be diverse in your portfolio, and that's why I own it. Because I think even though I have no idea where it'll go, I have no idea how low it will go, but I do know that once there are applications with smart contracts, not on Bitcoin but across crypto, that regular people use and figure out, just like streaming was a better version of getting audio and video than traditional media was, when there's better reason to use crypto applications, that's when it takes off.

Cuban also says Bitcoin works as a type of digital gold.

I always analogize Bitcoin as it's a better version of gold. It's a digital version of gold, and I truly believe that.


Featured Image: Shutterstock/Sergey Nivens/Nikelser Kate

See the rest here:
Billionaire Mark Cuban Reveals Reason He Owns Bitcoin, Names Catalyst That Will Push BTC to New Heights - The Daily Hodl

One of the most prominent crypto hedge funds just defaulted on a $670 million loan – CNBC

Bitcoin rallied to a record high of nearly $69,000 at the height of the 2021 crypto frenzy. In 2022, it's moved in the opposite direction.

Nurphoto | Getty Images

Prominent crypto hedge fund Three Arrows Capital has defaulted on a loan worth more than $670 million. Digital asset brokerage Voyager Digital issued a notice on Monday morning, stating that the fund failed to repay a loan of $350 million in the U.S. dollar-pegged stablecoin, USDC, and 15,250 bitcoin, worth about $323 million at today's prices.

3AC's solvency crunch comes after weeks of turmoil in the crypto market, which has erased hundreds of billions of dollars in value. Bitcoin and ether are both trading slightly lower in the last 24 hours, though well off their all-time highs. Meanwhile, the overall crypto market cap sits at about $950 billion, down from around $3 trillion at its peak in Nov. 2021.

Voyager said it intends to pursue recovery from 3AC (Three Arrows Capital). In the interim, the broker emphasized that the platform continues to operate and fulfill customer orders and withdrawals. That assurance is likely an attempt to contain fear of contagion through the wider crypto ecosystem.

"We are working diligently and expeditiously to strengthen our balance sheet and pursuing options so we can continue to meet customer liquidity demands," said Voyager CEO Stephen Ehrlich.

As of Friday, Voyager said it had approximately $137 million in U.S. dollars and owned crypto assets. The company also noted that it has access to a $200 million cash and USDC revolver, as well as a 15,000 bitcoin ($318 million) revolver from Alameda Ventures.

Last week, Alameda (FTX founder Sam Bankman-Fried's quantitative trading firm) committed $500 million in financing to Voyager Digital, a crypto brokerage. Voyager has already pulled $75 million from that line of credit.

"The default of 3AC does not cause a default in the agreement with Alameda," the statement said.

3AC did not immediately respond to CNBC's request for comment.

Three Arrows Capital was established in 2012 by Zhu Su and Kyle Davies.

Zhu is known for his incredibly bullish view of bitcoin. He said last year the world's largest cryptocurrency could be worth $2.5 million per coin. But in May this year, as the crypto market began its meltdown, Zhu said on Twitter that his "supercycle price thesis was regrettably wrong."

The onset of a new so-called "crypto winter" has hurt digital currency projects and companies across the board.

Three Arrows Capital's problems appeared to begin earlier this month, after Zhu tweeted a rather cryptic message that the company is "in the process of communicating with relevant parties" and is "fully committed to working this out."

There was no follow-up about what the specific issues were.

But the Financial Times reported after the tweet that U.S.-based crypto lenders BlockFi and Genesis liquidated some of 3AC's positions, citing people familiar with the matter. 3AC had borrowed from BlockFi but was unable to meet the margin call.

A margin call is a situation in which an investor has to commit more funds to avoid losses on a trade made with borrowed cash.
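To make the mechanics concrete, the collateral shortfall behind a margin call can be sketched as below. This is a simplified illustration with hypothetical figures, not 3AC's actual positions or any lender's real margin formula:

```python
def margin_shortfall(position_value, loan_amount, maintenance_margin):
    """Extra collateral the borrower must post, or 0.0 if the position is safe.

    maintenance_margin is the fraction of the position's value that must be
    covered by the borrower's own equity (position value minus the loan).
    """
    equity = position_value - loan_amount
    required_equity = position_value * maintenance_margin
    return max(0.0, required_equity - equity)

# Hypothetical: $500M of crypto bought with a $350M loan, 25% maintenance margin.
# Equity is $150M against a $125M requirement, so no call is triggered...
print(margin_shortfall(500e6, 350e6, 0.25))  # 0.0
# ...but if the market drops the position to $400M, equity falls to $50M
# against a $100M requirement, and $50M more must be posted.
print(margin_shortfall(400e6, 350e6, 0.25))  # 50000000.0
```

The key point is that a falling market shrinks equity twice over: the position is worth less while the loan stays the same, which is why a broad sell-off triggers cascading margin calls.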

Then the so-called algorithmic stablecoin terraUSD and its sister token luna collapsed.

3AC had exposure to Luna and suffered losses.

"The Terra-Luna situation caught us very much off guard," 3AC co-founder Davies told the Wall Street Journal in an interview earlier this month.

Three Arrows Capital is still facing a credit crunch exacerbated by the continued pressure on cryptocurrency prices. Bitcoin hovered around the $21,000 level on Monday and is down about 53% this year.

Meanwhile, the U.S. Federal Reserve has signaled further interest rate hikes in a bid to control rampant inflation, which has taken the steam out of riskier assets.

3AC, which is one of the biggest crypto-focused hedge funds, has borrowed large sums of money from various companies and invested across a number of different digital asset projects. That has sparked fears of further contagion across the industry.

"The issue is that the value of their [3AC's] assets as well has declined massively with the market, so all in all, not good signs," Vijay Ayyar, vice president of corporate development and international at crypto exchange Luno, told CNBC.

"What's to be seen is whether there are any large, remaining players that had exposure to them, which could cause further contagion."

Already, a number of crypto firms are facing liquidity crises because of the market slump. This month, lending firm Celsius, which promised users super high yields for depositing their digital currency, paused withdrawals for customers, citing "extreme market conditions."

Another crypto lender, Babel Finance, said this month that it is "facing unusual liquidity pressures" and halted withdrawals.

CNBC's Ryan Browne contributed to this report.

Read more:
One of the most prominent crypto hedge funds just defaulted on a $670 million loan - CNBC

BIS Says Crypto Weaknesses Have Materialized Following Market Sell-Off Featured Bitcoin News – Bitcoin News

The Bank for International Settlements (BIS), the global body for central banks, claims the weaknesses in crypto that were pointed out before have pretty much materialized. BIS General Manager Agustin Carstens opined: "You just cannot defy gravity ... At some point, you really have to face the music."

The Bank for International Settlements (BIS) has warned that a perceived danger of decentralized digital money is materializing.

The BIS explained in its Annual Economic Report, published Tuesday, that the crypto market sell-off and the collapse of cryptocurrency terra (LUNA) and algorithmic stablecoin terrausd (UST) are indicators of a structural problem in crypto.

"Structural flaws make the crypto universe unsuitable as the basis for a monetary system: it lacks a stable nominal anchor, while limits to its scalability result in fragmentation. Contrary to the decentralisation narrative, crypto often relies on unregulated intermediaries that pose financial risks," the BIS report reads.

Agustin Carstens, the BIS general manager, said in an interview with Reuters Tuesday that any form of money ultimately lacks credibility without a government-backed authority that can use reserves funded by taxes. He opined:

"I think all these weaknesses that were pointed out before have pretty much materialized."

The BIS executive continued: "You just cannot defy gravity ... At some point, you really have to face the music."

Carstens does not believe that the crypto market meltdown will cause a systemic crisis in the way that bad loans triggered the global financial crash. He detailed:

"Based on what we know, it should be quite manageable. But there are a lot of things that we don't know."

The BIS executive proceeded to talk about central bank digital currencies (CBDCs). In a report published in May, the BIS said that nine out of 10 central banks worldwide are exploring their own digital currencies.

"This is a topic that has been on the G20 agenda for quite some time," Carstens further told the news outlet, adding that there is a good chance for this to move forward. He pointed out that some countries have already conducted real-life trials with their central bank digital currency.

Carstens believes there will be international standards for CBDCs in the next couple of years, noting that 12 months is probably too short.

This week, the BIS Innovation Hub announced that its Eurosystem Centre projects will explore cryptocurrency markets. Citing that "the collapse of many stablecoins and decentralized finance (DeFi) lending platforms has highlighted the difficulty in assessing their risks and economic potential," the BIS described: "The project's goal is to create an open-source market intelligence platform to shed light on market capitalizations, economic activity, and risks to financial stability."


A student of Austrian Economics, Kevin found Bitcoin in 2011 and has been an evangelist ever since. His interests lie in Bitcoin security, open-source systems, network effects and the intersection between economics and cryptography.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

Read this article:
BIS Says Crypto Weaknesses Have Materialized Following Market Sell-Off Featured Bitcoin News - Bitcoin News

Appsmith Raises $41 Million to Scale Open Source Platform that Speeds Critical Internal App Development – Business Wire

SAN FRANCISCO--(BUSINESS WIRE)--Appsmith, the first open-source low-code software that helps developers build internal tools, today announced that it has raised $41 million in Series B funding led by global software investor Insight Partners, with participation from Accel, Canaan, OSS Capital, and angel investors including Jeff Hammerbacher, co-founder of Cloudera, and Abhinav Asthana, co-founder of Postman.

Every enterprise needs to create custom internal applications, a slow, repetitive, expensive process that requires work to build the user interface, write integrations, code the business logic, manage access controls, and ultimately deploy the app. Often, these internal applications take a backseat to customer-facing applications. Appsmith enables software engineers to build internal apps often 10 times faster.

"Software development is becoming critical to more and more businesses, but there are simply not enough developers to meet our growing global needs," said Nikhil Sachdev, managing director at Insight Partners. "Low-code tools like Appsmith are compelling options because they enable companies to deliver more developer productivity and faster time to market. With its open-source approach and great momentum achieved in a short amount of time, Appsmith is clearly a favorite among developers."

The funding follows dramatic growth in users since the $10.5 million Series A round in October 2021 led by Canaan and including Bessemer Venture Partners, bringing the total funding to $51.5 million. The company plans to continue building out its open source software while delivering support services as well as a paid version of the software with additional security features needed by enterprises.

"This funding round was done to support the demand that we're seeing for our open-source platform to build internal apps. These apps are mission critical but not customer-facing, so they often get deprioritized," said Abhishek Nayak, co-founder and CEO of Appsmith. "As a result, companies of all sizes from many different industries rely on Appsmith as an easy way to build custom apps to solve this problem."

Since being introduced less than two years ago, Appsmith's open source software has seen more than 10 million downloads, earned more than 18,000 stars (up 12x since 2021) on GitHub, and is used at over 10,000 organizations. Last year, the company introduced more than 150 enhancements, including major features like the JS Editor, version control, and 30-plus new widgets and variations. In total, 184 features were released. A full list can be viewed here. Appsmith also now has more than 18,000 community members with 192 contributors.

Appsmith is the first open-source low-code software that helps developers build internal and CRUD (create, read, update, and delete) applications quickly, usually within hours. Software engineers build the user interface with pre-built components, code the business logic by connecting application programming interfaces (APIs) along with any database, then test and deploy a web application where users are authenticated using a dashboard. Companies dedicate anywhere from 10% to 40% of their engineering resources to these internal tools, such as a customer support dashboard.

About Appsmith

Appsmith was founded in 2019 with the mission to enable backend engineers to build internal web apps quickly with a low code approach. Taking an open source software approach provides anyone with access to the software and the opportunity to get involved in the community. The company has offices in San Francisco and Bengaluru, India. For more information visit https://www.appsmith.com

Read the original post:
Appsmith Raises $41 Million to Scale Open Source Platform that Speeds Critical Internal App Development - Business Wire

GPT-3 Training Programmers for the Present (and the Future) – hackernoon.com

I wrote a paper in Spanish and asked "someone else" to rewrite it

When I asked it to translate the article, it decided the title was not good enough.

So it changed it to present AND future

Scary, isn't it?

Let's move on.

So this is the original paper rewritten by GPT-3:

// Translate this into English and convert it to a technical article:

The current market is looking for programmers to stack bricks (1) using their trendy languages. Such professionals are trained in specific tools that have very short life cycles and then become obsolete.

There are currently alternatives to carry out computer developments using intelligent people, capable of making their own decisions and participating in the creative process.

The programmers of the future will have to be excellent and declarative (2): professionals who know about algorithms and complexity, and who have know-how.

Most developers in the world are making changes to old systems or have to use complex preexisting libraries or frameworks that they have little control over and few opportunities to modify. (3)

In today's software industry, it is very unlikely to find the need to develop from scratch, with a completely blank page.

The professional lifespan of a programmer ages along with the fashionable language. This period of time is usually less than a decade, so the industry discards professionals as obsolete before ten years of training. (4)

In the last decades, languages like Visual Basic, C++, Java, PHP, Perl, JavaScript, Ruby, Python, and Go were fashionable as supposed silver bullets (5).

Some of them are already ceasing to be used, leaving their place for new fashions. (6)

The general concepts (which Frederick Brooks calls essential (7)) are better absorbed in our first academic stage. We must focus on forming these concepts so that professionals can easily migrate from one technology (which Brooks calls accidental) to another technology (also accidental and, therefore, quickly obsolete).

By 2021 there are already numerous alternatives to artificial intelligence and machine learning, capable of carrying out low-level programming and algorithmic tasks (8) (9).

Modern virtual machines (those that interpret the high-level language in which we program and that exist, for example, in web browsers) optimize the code for us. (10)

As technology advances, low-level programmers will no longer be required in the same way that no one is looking for librarians today (11). Similar to what happens with other automatable and obsolete professions. (12)

Our profession is still based on teaching low-level imperative programming as if we were facing the problems of the 1960s or 1970s when the specialized hardware was punch cards and open tape machines.

We educate our students to carry out absurd and obsolete algorithmic optimizations, very close to the language of machines and very far from the modeling of entities of the problem that we are representing in our computer solution.

We train professionals to understand how a computer works and to lower themselves to speak in its language, forming low-level mental models and trying to reason in the algorithmic way in which a Turing machine processes.

However, today we can develop semantic models with several layers of abstraction above the machine; as we approach declarative mental models, we move away from the computable abstractions of machines. The most declarative programming languages allow us to reason and infer rules about our models. Against this trend, our students write their instructions based on ones and zeros.

Moore's law (14) tells us about the speed of change of the hardware and its exponential growth in processing power. With respect to software, we grow linearly, limited by the realities of many decades ago and the inertia of trainers. In analogy with Malthusian economic theory (15), we need to catch up with the hardware curve to be able to release all our potential before the machines do it for us.

The lack of abstraction and understanding of reality is a symptom of an inertial mentality, based on the time restrictions of processors and storage of the 60s and 70s. These restrictions are limited today to very specific domains and are not representative of the problems we have to solve today as software engineers in most of our jobs. (16)

We need to train generalists, but not theorists. People who understand the basics and foundations of software engineering and can adapt to accidental implementations based on current tools, but also to future ones that we do not yet know today.

Concepts such as software design, life cycle models, teamwork and building a shared theory (17), continuous integration and deployment or architecture, are much more important than learning to minimize CPU usage, to use React, the Vue framework, or the Keta library.

"It is very difficult to make predictions, especially about the future". This phrase, attributed to the great Niels Bohr, tells us that we should be careful when trying to anticipate the future.

Paraphrasing 2004 Turing Award winner (18) Alan Kay: "The best way to predict the future is to invent it".

We do not know for sure what the path of machine automation will be, but we can predict that the most creative and engineering part of our profession should be one of the last bastions to fall before automation.

The next generation of software developers will have to focus their skills on design, modeling the real world, and creating abstractions that evolve along with the problem domain to avoid building obsolete and legacy software (19).

This strategic approach is not limited only to development careers. In data science there are similar problems: data scientists focused on optimizing existing algorithms instead of training generalist professionals, able to understand the academic and commercial problems of our country with a battery of solutions and good criteria to determine which to choose in each situation. They should be trained to model the real problem and evaluate different possible solutions.

Data science professionals have an incredible variety of tools to adjust their models. We are still far from finding the second "super algorithm" (20) capable of solving specific problems with a generic solution. Our future professionals have the only general-purpose computer that can reasonably solve various specific problems: Their brilliant minds.

Our responsibility is to feed and stimulate those brains with real problems and not with toy solutions that computers solve (already in 2020) much more efficiently.

Paradoxically, wasting our brains by asking them to perform low-level optimizations for machines is the worst way to optimize our most expensive resource.

Historically, we have privileged the development of very specific and theoretical tools. This is very common and desirable in science because theoretical advances usually precede concrete implementations and uses by decades.

However, in software development, discoveries and findings mostly occur in the private sector before academia. Our professionals must be trained in methodology and concepts above accidental trendy technologies and languages. This generates tension with the market, which wants "implementers expert in a tool" and then discards them when the tool completes its fashion cycle of 5 or 10 years.

Our professionals should not be disposable or recyclable. We must train them in techniques and they must keep themselves constantly updated, as happens in other professions such as medicine, physics, or biotechnology.

In addition to "soft" skills such as construction and teamwork (since software arises from a collective activity) (17), we must teach design and prototyping techniques to validate our high-level solutions.

As for software, it is imperative to teach solution design, focusing on the behavior of our models and, paraphrasing Donald Knuth (21), the historical author of most of the algorithms we use today, avoiding premature optimizations because we want to play a game that machines dominate much better than us.

Training talent is an accessible option for any country with a good academic level, such as Argentina.

Investing in training excellent software engineers is a strategic decision and an opportunity for take-off that has already been exploited by many other countries such as Estonia, Ireland, Israel and India. The Sadosky Foundation is currently working in this direction. (22)

In Argentina, we have excellent teachers, a good level of English, an unbeatable time zone for dialogue with the USA and Europe, and a culture compatible with the most developed countries. We need to prioritize information technologies and, within them, train intelligent and declarative engineers rather than mechanized programmers and low-level optimizers.

Our professionals must have basic knowledge of programming, algorithms, complexity, and databases.

Above all, they must learn to make designs based on continuous integration and continuous deployment models, with automated tests, using agile techniques such as Test-Driven Development. (23)

The software produced must be declarative and based on the desired behavior (and specified in the automated functional tests); we must stop thinking in the reigning paradigm of the 60s and 70s, based on data types and file and string manipulations, to focus on high-level models that accompany the simulation of any aspect of the real world that we want to represent to solve a certain problem. (24)

The techniques of design based on behavior are agnostic with respect to the accidental technology of fashion and this allows an engineer trained with these concepts 30 years ago to be able to make concrete developments even today.

Unfortunately, that situation is not replicated for programmers who mastered some fashionable language that now has practically no use, which leaves them without good options in the labor market. The paradox is that a trade with full employment discards these professionals for not having been able to adapt. (25)

Change today is even more vertiginous. Technologies last much less, and obsolescence stalks us unless we are intelligent and broad, and have the appropriate training.

This is an opinion piece. As future work to support the current thesis, we should carry out a quantitative study including figures on employee turnover (25), average time in each job according to age and studies completed, etc.

To do this we must use techniques related to the social sciences under a multidisciplinary approach.

The future has already arrived. We don't have much idea of what a programmer's job will be like in 5 or 10 years, but we have strong indications that it will not be related to writing algorithms based on data structures. We must train professionals who quickly understand a real-life problem and know how to build simulators with very small conceptual leaps, so that they can evolve alongside the problems we are solving today.

Part of the ideas in this article was born from the teaching work in the Software Engineering course in the Computer Science degree at the Faculty of Exact Sciences, University of Buenos Aires.

The translation is accurate and beautiful.

I have checked it with Grammarly and the Hemingway app and found just a few grammatical mistakes.

As an example:

Este es un artículo de opinión. Como trabajo futuro para apoyar la presente tesis deberíamos realizar un trabajo cuantitativo incluyendo cifras de rotación de empleados (25), tiempo promedio en cada trabajo según la edad y los estudios realizados, etc. Para ello debemos utilizar técnicas relacionadas a las ciencias sociales bajo un enfoque multidisciplinario.

Translates in GPT-3 as:

This is an opinion piece. As future work to support the current thesis, we should carry out a quantitative study including figures on employee turnover (25), average time in each job according to age and studies undertaken, etc. To do this we must use techniques related to the social sciences from a multidisciplinary perspective.

While Google Translate yields:

This is an opinion article. As future work to support this thesis, we should carry out quantitative work including employee turnover figures (25), average time in each job according to age and studies carried out, etc. For this we must use techniques related to the social sciences under a multidisciplinary approach.

I used the text-davinci-002 model with Temperature 0 (Randomness) and a Maximum Length of 2,000 tokens.
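For readers who want to try the same setup, a minimal sketch using the legacy openai Python package is below. The prompt prefix mirrors the one shown earlier in the article; the `build_prompt` helper and the environment-variable guard are my own assumptions, not the author's code:

```python
# Sketch of the setup described above: the article's prompt prefix plus the
# Spanish source text, sent to text-davinci-002 at temperature 0.
# Requires the (legacy) openai package and an OPENAI_API_KEY to actually run
# the completion; the prompt builder works without network access.
import os

PROMPT_PREFIX = "Translate this into English and convert it to a technical article:\n\n"

def build_prompt(spanish_text):
    """Prepend the translation instruction to the Spanish source text."""
    return PROMPT_PREFIX + spanish_text

if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    import openai
    resp = openai.Completion.create(
        model="text-davinci-002",
        prompt=build_prompt("Este es un artículo de opinión."),
        temperature=0,    # Randomness = 0, as in the article
        max_tokens=2000,  # Maximum Length used by the author
    )
    print(resp["choices"][0]["text"])
```

Temperature 0 makes the output (nearly) deterministic, which is why the same prompt reproduces the same translation.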

What is your opinion on this article?

I think that this article makes some valid points about the future of programming and the need for programmers to be adaptable to change. I agree that the industry is moving towards more automated solutions and that this will eventually lead to a decrease in the need for programmers. However, I think that there will always be a need for programmers who are able to understand and solve complex problems.

(Scary Again!!)

I will definitely keep using GPT-3 when writing technical articles, to improve them.


Read more from the original source:
GPT-3 Training Programmers for the Present (and the Future) - hackernoon.com

OpenSSL 3.0.5 awaits release to fix potential worse-than-Heartbleed flaw – The Register

The latest version of OpenSSL v3, a widely used open-source library for secure networking using the Transport Layer Security (TLS) protocol, contains a memory corruption vulnerability that imperils x64 systems with Intel's Advanced Vector Extensions 512 (AVX512).

OpenSSL 3.0.4 was released on June 21 to address a command-injection vulnerability (CVE-2022-2068) that was not fully addressed with a previous patch (CVE-2022-1292).

But this release itself needs further fixing. OpenSSL 3.0.4 "is susceptible to remote memory corruption which can be triggered trivially by an attacker," according to security researcher Guido Vranken. We're imagining two devices establishing a secure connection between themselves using OpenSSL and this flaw being exploited to run arbitrary malicious code on one of them.

Vranken said that if this bug can be exploited remotely (and it's not certain it can be) it could be more severe than Heartbleed, at least from a purely technical point of view.

However, Vranken notes several mitigating factors, including the continued use of the 1.1.1 tree of the library rather than the v3 tree; the fork of libssl into LibreSSL and BoringSSL; the short amount of time 3.0.4 has been available; and the fact that the error only affects x64 systems with AVX512, available on certain Intel chips released between 2016 and early 2022.
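Since only the 3.0.4 release on AVX512-capable hardware is affected, an administrator can check exposure by parsing the installed version banner (as printed by `openssl version`). A minimal sketch follows; the helper names are illustrative, not part of any OpenSSL tooling:

```python
import re

def openssl_version_tuple(version_banner):
    """Extract (major, minor, patch) from an 'openssl version' banner string."""
    m = re.search(r"OpenSSL (\d+)\.(\d+)\.(\d+)", version_banner)
    return tuple(map(int, m.groups())) if m else None

def affected_by_avx512_bug(version):
    # Only the 3.0.4 release carries this AVX512 buffer overflow; the 1.1.1
    # tree and earlier 3.0.x releases are not affected by this particular bug
    # (3.0.3 instead still carries the CVE-2022-2068 command-injection flaw).
    return version == (3, 0, 4)

print(affected_by_avx512_bug(openssl_version_tuple("OpenSSL 3.0.4 21 Jun 2022")))  # True
print(affected_by_avx512_bug(openssl_version_tuple("OpenSSL 1.1.1p 21 Jun 2022")))  # False
```

The version check is only half the story: the machine must also expose AVX512 (on Linux, visible as `avx512` flags in /proc/cpuinfo) for the vulnerable code path to be reachable.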

Intel this year began disabling AVX512 support on Alder Lake, its 12th Gen Intel Core processors.

The bug, an AVX512-specific buffer overflow, was reported six days ago. It has been fixed, but OpenSSL 3.0.5 has not yet been released.

Meanwhile, Linux distributions like Gentoo have not yet rolled out OpenSSL 3.0.4 as a result of this bug and a test build failure bug. So they include OpenSSL 3.0.3, with its command injection flaw.

In the GitHub Issues thread discussing the bug, Tomáš Mráz, software developer at the OpenSSL Foundation, argues the bug shouldn't be classified as a security vulnerability.

"I do not think this is a security vulnerability," he said. "It is just a serious bug making [the] 3.0.4 release unusable on AVX512 capable machines."

Xi Ruoyao, a PhD student at Xidian University, also said he disagreed with the policy of calling every heap buffer overflow a security flaw. Vim, he said, started doing so this year and the result has been something like ten "high severity" vim CVEs every month without any proof-of-concept exploit code.

"I think we shouldn't mark a bug as 'security vulnerability' unless we have some evidence showing it can (or at least, may) be exploited," he wrote, adding that nonetheless 3.0.5 should be released as soon as possible because it's very severe.

Alex Gaynor, software resilience engineer with the US Digital Service, however, argues to the contrary.

"I'm not sure I understand how it's not a security vulnerability," responded Gaynor. "It's a heap buffer overflow that's triggerable by things like RSA signatures, which can easily happen in remote contexts (e.g. a TLS handshake)."

Gaynor urged releasing the fix quickly. "I think this issue qualifies as a CRITICAL within OpenSSL's vulnerability severity policy, and it makes it effectively impossible for users to upgrade to 3.0.4 to obtain its security fixes," he said.

See original here:
OpenSSL 3.0.5 awaits release to fix potential worse-than-Heartbleed flaw - The Register

Exploiting symmetries: Speeding up the computational study of solid solutions – EurekAlert

Image: Atomic substitution with La atoms: Ce8Pd24Sb → (Ce5,La3)Pd24Sb. The crystal structure was obtained from the ICSD database (CollCode: 83378). The space group is 221-Pm3m, and the crystal structures are depicted using VESTA.

Credit: Kousuke Nakano from JAIST.

Ishikawa, Japan -- Symmetry is a prevalent feature of nature at all scales. For example, our naked eyes can easily identify symmetries in the bodily shape of countless organisms. Symmetry is also very important in the fields of physics and chemistry, especially in the microscopic realm of atoms and molecules. Crystals, which are highly ordered materials, can even have multiple types of symmetry at the same time, such as rotational symmetry, inversion symmetry, and translational symmetry.

Lately, alongside rapid progress in computer science, researchers have developed computational methods that seek to predict the physical properties of crystals based on their electronic structure. In practice, however, pure and perfectly symmetric crystals are seldom used. This is because a crystal's properties can be tuned as desired by alloying it with other materials or randomly substituting certain atoms with other elements, i.e., doping.

Accordingly, materials scientists are seeking computationally efficient approaches to analyze such alloys and substituted crystals, also known as solid solutions. The supercell method is one such approach and is widely used to model crystal structures with random substitutions of different atoms. The symmetry of crystals, however, is actually a problem when using this technique. In crystals, there can be many substitution patterns that are physically equivalent to others if we simply translate or rotate them. Finding these symmetric substitution patterns is not very meaningful, and thus their calculation when using the supercell method is a waste of time.

In a recent study, a team of researchers led by Assistant Professor Kousuke Nakano from Japan Advanced Institute of Science and Technology (JAIST) found a solution to this problem. They developed an open-source software package called Suite for High-throughput generation of models with atomic substitutions implemented by Python, or SHRY, which can generate symmetry-distinct substitution patterns in solid solutions and alloys [https://github.com/giprayogo/SHRY]. This work, which was published in the ACS Journal of Chemical Information and Modeling, was co-authored by doctoral student Genki I. Prayogo, Dr. Andrea Tirelli, Professor Ryo Maezono, and Associate Professor Kenta Hongo.

The team approached the problem from the angle of group theory. It turns out that searching for atomic substitution patterns in crystals is analogous to the problem of finding coloring patterns on the vertices of graphs under certain restrictions. This allows one to reformulate the original problem of finding non-symmetric atomic substitutions in crystals as exploring search trees depicting the coloring of vertices in graphs.
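The graph-coloring analogy above can be illustrated with a toy brute-force sketch. This is a hypothetical illustration, not SHRY's actual algorithm or API: it enumerates "colorings" of a 4-site ring (1 = substituted atom, 0 = host atom) and keeps one representative per symmetry orbit.

```python
from itertools import product

# Symmetries of a 4-site ring (rotations plus reflections), written as
# permutations of the site indices.
SYMMETRIES = [
    (0, 1, 2, 3), (1, 2, 3, 0), (2, 3, 0, 1), (3, 0, 1, 2),  # rotations
    (3, 2, 1, 0), (0, 3, 2, 1), (2, 1, 0, 3), (1, 0, 3, 2),  # reflections
]

def distinct_substitutions(n_substituted):
    """One representative per symmetry orbit of colorings with exactly
    `n_substituted` sites replaced (1 = substituted, 0 = host)."""
    seen, reps = set(), []
    for coloring in product((0, 1), repeat=4):
        if sum(coloring) != n_substituted:
            continue
        # Canonical form: lexicographically smallest image under the group.
        canon = min(tuple(coloring[p[i]] for i in range(4)) for p in SYMMETRIES)
        if canon not in seen:
            seen.add(canon)
            reps.append(canon)
    return reps

# With two substituted sites, only two physically distinct patterns remain:
# substituents on adjacent sites versus on opposite sites.
print(distinct_substitutions(2))
```

Where this toy version compares every pattern against the whole symmetry group, the canonical-augmentation technique used by SHRY reaches the same set of representatives while pruning the search tree using only local information.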

However, the way in which the search tree is explored is crucial. A simple, naïve approach in which all possible branches are searched and directly compared is impossible; the time and calculations required grow uncontrollably for large systems. This happens because deciding whether to explore further down a branch requires information about all other branches besides the one being explored, which is technically referred to as non-local information.

To avoid this issue, the researchers implemented in SHRY a technique called canonical augmentation. "This method can decide whether a tree branch should be explored more deeply or not based solely on local information," explains Dr. Nakano. "Most importantly, theorems from group theory guarantee that only distinct substitution patterns will be extracted, without over- or under-exploring the tree structure in terms of symmetry." The team verified that their algorithm was error-free by testing it thoroughly with data from a database of crystal structures.

It is worth noting that SHRY was written in Python 3, one of the most popular cross-platform programming languages, and uploaded to GitHub, a leading project-sharing online platform. "SHRY can be used as a stand-alone program or imported into another Python program as a module," highlights Dr. Nakano. "Our software also uses the widely supported Crystallographic Information File (CIF) format for both the input and output of the sets of substituted crystal structures." The team plans to keep improving SHRY's code based on feedback from other users, boosting its speed and capabilities.

Overall, the software developed in this study could help scientists identify potential atomic substitutions in solids, which is the most common strategy used to tune the properties of materials for practical applications. SHRY will help speed up research and develop substituted crystals with unprecedented functionalities and superior characteristics.

###

Reference

Title of original paper:

SHRY: Application of Canonical Augmentation to the Atomic Substitution Problem

Journal:

Journal of Chemical Information and Modeling

DOI:

10.1021/acs.jcim.2c00389

About Japan Advanced Institute of Science and Technology, Japan

Founded in 1990 in Ishikawa prefecture, the Japan Advanced Institute of Science and Technology (JAIST) was the first independent national graduate school in Japan. Now, after 30 years of steady progress, JAIST has become one of Japan's top-ranking universities. JAIST has multiple satellite campuses and strives to foster capable leaders with a state-of-the-art education system where diversity is key; about 40% of its alumni are international students. The university has a unique style of graduate education based on a carefully designed coursework-oriented curriculum to ensure that its students have a solid foundation on which to carry out cutting-edge research. JAIST also works closely with both local and overseas communities by promoting industry-academia collaborative research.

About Assistant Professor Kousuke Nakano from Japan Advanced Institute of Science and Technology, Japan

Dr. Kousuke Nakano obtained B.Sc. and M.Sc. degrees in Engineering from Kyoto University, Japan, in 2012 and 2014, respectively. He then joined JAIST, where he obtained a Ph.D. in computer and information science in 2017. Since 2019, he has worked there as an Assistant Professor, researching the topics of first-principles quantum Monte Carlo simulations, density functional theory, machine learning for materials informatics, and the synthesis of novel inorganic compounds using solid-state reactions. He has over 40 publications to his name on these topics, and his h-index is 14 with over 700 citations (Google Scholar, Jun. 2022).

Funding information

This work was financially supported by JST SPRING (Grant Number JPMJSP2102), MIUR Progetti di Ricerca di Rilevante Interesse Nazionale (PRIN) Bando 2017 (Grant Number 2017BZPKSZ), the HPCI System Research Project (Project IDs: hp210019,hp210131, and jh210045), MEXT-KAKENHI (JP16H06439, JP17K17762, JP19K05029, JP19H05169, JP21K03400, JP21H01998, JP22H02170, JP19H04692, and JP21K03400), the U.S. Air Force Office of Scientific Research (Award Number FA2386-20-1-4036, AFOSR-AOARD/FA2386-17-1-4049; FA2386-19-1-4015), JSPS Bilateral Joint Projects (JPJSBP120197714), JSPS Overseas Research Fellowships, a Grant-in-Aid for Early-Career Scientists (Grant Number JP21K17752), and a Grant-in-Aid for Scientific Research (C) (Grant Number JP21K03400).

Journal of Chemical Information and Modeling

SHRY: Application of Canonical Augmentation to the Atomic Substitution Problem

9-Jun-2022

See the original post:
Exploiting symmetries: Speeding up the computational study of solid solutions - EurekAlert

Building An Insights And Analytics Newsletter: From Proof Of Concept To Feature Release – Forbes

As software engineers at Forbes, we're always building innovative features and working on prototypes for potential projects. Some proof-of-concept initiatives remain in the testing state, but some end up being shared with a larger audience. Bertie Bulletin, a monthly email containing user stats, was a project that started as an engineering initiative and turned into a monthly email sent to both active Forbes Staff and Contributors who write for the Forbes website.

Currently, writers have access to their user and story stats through Bertie, Forbes' content management system, where they can write, edit, and publish their articles. Bertie Bulletin was created from a suggestion to better equip writers with the knowledge they need to understand their stats, as well as provide them with a record of historical stats. Bertie Bulletin is similar to a bank statement that helps writers keep track of their performance from month to month. We included the top-performing stories for the previous month, based on how many page views each story received, audience stats, and a referral breakdown indicating where readers are coming from. While these emails have some of the data that already exists in the stats dashboard, they also extend to include insights. Insights are comparisons of data, such as this month or year versus the previous month or year.


As we built Bertie Bulletin, we made sure to fully utilize pre-existing projects in the Forbes Engineering universe. The initial approach used Directed Acyclic Graphs, or DAGs, which are written in Python; therefore, so was Bertie Bulletin. To create the mailing list, we consulted with our API team, who utilized cloud functions to generate the mailing lists with the given parameters. We used Mailgun, an email delivery service used by other Forbes projects, to store mailing lists and email templates as well as trigger emails. Our codebase made calls to the Forbes stats API to fetch the numbers and generate insights, which were then stored as recipient variables in Mailgun.
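As a rough sketch of that recipient-variables hand-off (all names, addresses, and stats here are invented for illustration, not Forbes' actual code): Mailgun's batch-sending feature expects a JSON map from each email address to that recipient's substitution variables.

```python
import json

def build_recipient_variables(stats_by_email):
    """Serialize per-writer insights into the JSON map Mailgun expects
    for its `recipient-variables` batch-sending field."""
    return json.dumps({
        email: {
            "page_views": stats["page_views"],
            "top_story": stats["top_story"],
        }
        for email, stats in stats_by_email.items()
    })

# Hypothetical stats fetched from a stats API for one writer.
stats = {
    "writer@example.com": {"page_views": 1200, "top_story": "AI in 2022"},
}

# The payload that would accompany the send request to the email service.
payload = {
    "from": "Bertie Bulletin <bulletin@example.com>",
    "to": list(stats),
    "subject": "Your monthly stats",
    "recipient-variables": build_recipient_variables(stats),
}
print(payload["recipient-variables"])
```

One API call can then personalize the bulletin for every address on the mailing list, since the delivery service substitutes each recipient's own variables into the template.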

A lot of research went into figuring out which templating engine made sense to use. The initial email was set up in a single HTML file, though it wasn't pretty to look at or easy to edit. Our list of wants included an engine that would allow for the use of template inheritance and control structures. In other words, we could break the email down into sections with conditionals that would then be compiled. This led us to Jinja, which describes itself as "a fast, expressive, extensible templating engine," making it ideal for our purposes.

Setting up the Jinja flow for Bertie Bulletin required three different steps: creating the Jinja files containing our HTML elements, using built-in methods to render a single HTML file, and uploading that file to Mailgun to use as the email template.
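A minimal sketch of the first two steps, using the jinja2 library with an in-memory loader and illustrative template contents (not the real Bertie Bulletin markup), might look like this:

```python
from jinja2 import Environment, DictLoader

# Illustrative templates: a base layout plus a child template demonstrating
# template inheritance ({% extends %}) and a control structure ({% for %}).
templates = {
    "base.html": "<html><body>{% block stats %}{% endblock %}</body></html>",
    "bulletin.html": (
        "{% extends 'base.html' %}"
        "{% block stats %}"
        "<ul>{% for story in top_stories %}<li>{{ story }}</li>{% endfor %}</ul>"
        "{% endblock %}"
    ),
}

env = Environment(loader=DictLoader(templates))

# Render the child template into a single HTML string.
html = env.get_template("bulletin.html").render(top_stories=["Story A", "Story B"])
print(html)
```

The rendered HTML string would then be uploaded to Mailgun as the stored email template, completing the third step.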

Our Minimum Viable Product (MVP) had a couple of must-haves for our writers. This included top stories, audience stats, total published stories, and information on page views. This first version of the email was released in February 2022, rendering January stats data. With each iteration of these emails, we included new features to enable writers to visualize their information in a more palatable manner, such as an audience stats donut chart and a traffic source types referral table, to name a few.

Screenshot of Bertie Bulletin donut chart and referral breakdown designs.

The process to deploy Bertie Bulletin emails can be split into a few broad steps. Initially, the email template is generated and uploaded, and the mailing list is sanitized. The next steps retrieve each writers data from the stats API and transform it into useful pieces of information. Lastly, each users data is updated and the email is triggered.

In order to figure out the best deployment strategy for Bertie Bulletin, we once again turned to Forbes Engineering. We consulted with our DevOps team and they suggested using Argo-Workflows, a container-native engine that enables the orchestration of parallel jobs on Kubernetes, an open-source platform for working with containers. Each step was containerized into a reliable package of the execution environment and applied into a self-contained image. The advantage of using containers was that each one would be small and do a defined unit of work which could then be scaled horizontally and completed in parallel, thus finishing the batch work quicker. Optimization was important because we expected to send a couple thousand emails every month. With Argo's fully featured UI, we were able to see each containerized step as a stage in the Argo workflow and visualize its progress. If a step errored out for any reason, Argo could retry the step and provide us with logs to better debug the issue.

As we reflected on the process of creating Bertie Bulletin, we realized that the advice from Jackie Ha's previous article still applied at any level.


Be Inquisitive: In the beginning, we had a lot of conversations with product owners to hash out details. What did the MVP look like? What were the specific requirements at each stage?

Know When To Ask For Help: With a large project like Bertie Bulletin, it was inevitable (like Thanos) that we would run into the unknown (like Elsa). To overcome these roadblocks, we consulted with various people throughout the process for guidance on the best way to execute.

Learn How To Debug: Because Bertie Bulletin had so many moving parts, if something went wrong we needed to be able to pinpoint exactly where the error came from. This meant we had to figure out how to debug effectively and efficiently.

Proofread Your PRs: Whether you are on team unified view or team split view, it's good practice to review your own PR in GitHub before sending it out to others. The visual comparison can help you catch a sneaky print() or syntax errors easily missed within your code editor.

Taking this project from prototype to production was a strenuous effort that required collaboration across many teams. After many hours of pair programming, screen sharing, and debugging sessions, being able to finally see the end result live and hearing positive feedback from the recipients made the process worthwhile.

Go here to read the rest:
Building An Insights And Analytics Newsletter: From Proof Of Concept To Feature Release - Forbes

Calendar of events, awards and opportunities – ASBMB Today

Every week, we update this list with new meetings, awards, scholarships and events to help you advance your career. If you'd like us to feature something that you're offering to the bioscience community, email us with the subject line "For calendar." ASBMB members' offerings take priority, and we do not promote products/services. Learn how to advertise in ASBMB Today.

As we do each year, we'll be hosting a Twitter chat for Pride Month. It will be at 2 p.m. Eastern on June 27 and will feature ASBMB staffers, members and representatives of allied organizations. We hope you can join us! Follow us at @ASBMB.

This webinar will feature the ins and outs of the Early Career Reviewer Program at the National Institutes of Health's Center for Scientific Review, which gives emerging investigators an inside look at the scientific peer-review process. Elyse Schauwecker, a scientific review officer at CSR, will talk about the benefits of participating, eligibility, the application process and recent changes. There will also be time to ask Schauwecker questions about the program and other CSR opportunities for early-career scientists. Anita Corbett of Emory University, a member of the ASBMB Public Affairs Advisory Committee, will moderate. Register.

The National Cancer Institute's Frederick National Laboratory for Cancer Research is the only national laboratory dedicated to biomedical research. FNLCR is conducting a survey to determine how familiar researchers are with the lab and the services, tools and resources it offers to the scientific community. Take the survey.

This five-day conference will be held Aug. 14-18 in person in Cambridge, Massachusetts, and online. It will be an international forum for discussion of the remarkable advances in cell and human protein biology revealed by ever-more-innovative and powerful mass spectrometric technologies. The conference will juxtapose sessions about methodological advances with sessions about the roles those advances play in solving problems and seizing opportunities to understand the composition, dynamics and function of cellular machinery in numerous biological contexts. In addition to celebrating these successes, we also intend to articulate urgent, unmet needs and unsolved problems that will drive the field in the future. The registration deadline is July 1. Learn more.

The Journal of Science Policy & Governance, the United Nations Educational, Scientific and Cultural Organization and the Major Group for Children and Youth announced in February a call for papers for a special issue on "open science policies as an accelerator for achieving the sustainable development goals." The deadline for submissions is July 10. To help authors prepare their submissions, the group will be hosting a series of webinars (April 8 & 29, May 20, and June 10) and a science policy paper-writing workshop (March 26-27). Read the call for submissions and learn more about the events.

This in-person meeting will be held Sept. 29 through Oct. 2 in Snowbird, Utah. Sessions will cover recent advances and new technologies in RNA polymerase II regulation, including the contributions of non-coding RNAs, enhancers and promoters, chromatin structure and post-translational modifications, molecular condensates, and other factors that regulate gene expression. Patrick Cramer of the Max Planck Institute will present the keynote address on the structure and function of transcription regulatory complexes. The deadline for oral presentation abstracts is July 14. The deadline for poster presentation abstracts is Aug. 18. Learn more.

Head to beautiful Denver, Colorado, for a summer experience as a PRIDE (Programs to Increase Diversity Among Individuals Engaged in Health-Related Research) scholar. PRIDE is an initiative of the National Heart, Lung and Blood Institute that trains junior faculty from underrepresented backgrounds and/or with disabilities to advance their scientific careers and make them more competitive for external research funding. The University of Colorado PRIDE (led by Sonia C. Flores, who also leads the ASBMB Minority Affairs Committee) is one of nine national PRIDE sites. Its focus is on the "impact of ancestry and gender on omics of lung and cardiovascular diseases" (which is why it's called PRIDE-AGOLD). The program consists of two consecutive summer institutes (two and one week, respectively) that offer comprehensive formal instruction on multi-omics, data sciences and bioinformatics, with an emphasis on interpretations based on ancestry and/or gender; career development and grant-writing tools; pairing with expert mentors; and pilot funds to develop a small research project. Learn more.

Most meetings on epigenetics and chromatin focus on transcription, while most meetings on genome integrity include little attention to epigenetics and chromatin. This conference in Seattle will bridge this gap to link researchers who are interested in epigenetic regulation and chromatin with those who are interested in genome integrity. The oral and poster abstract deadline and early registration deadline is Aug. 2. The regular registration deadline is Aug. 29. Learn more.

For Discover BMB, the ASBMB's annual meeting in March in Seattle, we're seeking two types of proposals:

In May, the Howard Hughes Medical Institute launched a roughly $1.5 billion program to "help build a scientific workforce that more fully reflects our increasingly diverse country." The Freeman Hrabowski Scholars Program will fund 30 scholars every other year, and each appointment can last up to 10 years. That represents up to $8.6 million in total support per scholar. HHMI is accepting applications from researchers "who are strongly committed to advancing diversity, equity, and inclusion in science." Learn more.

Save the date for the ASBMB Career Expo. This virtual event aims to highlight the diversity of career choices available to modern biomedical researchers. No matter your career stage, this expo will provide a plethora of career options for you to explore while simultaneously connecting you with knowledgeable professionals in these careers. Each 60-minute session will focus on a different career path and will feature breakout rooms with professionals in those paths. Attendees can choose to meet in a small group with a single professional for the entire session or move freely between breakout rooms to sample advice from multiple professionals. Sessions will feature the following five sectors: industry, government, science communication, science policy and other. The expo will be held from 11 a.m. to 5 p.m. Eastern on Nov. 2. Stay tuned for a link to register!

The ASBMB provides members with a virtual platform to share scientific research and accomplishments and to discuss emerging topics and technologies with the BMB community.

The ASBMB will manage the technical aspects, market the event to tens of thousands of contacts and present the digital event live to a remote audience. Additional tools such as polling, Q&A, breakout rooms and post-event Twitter chats may be used to facilitate maximum engagement.

Seminars are typically one to two hours long. A workshop or conference might be longer and even span several days.

Prospective organizers may submit proposals at any time. Decisions are usually made within four to six weeks.

Propose an event.

If you are a graduate student, postdoc or early-career investigator interested in hosting a #LipidTakeover, fill out this application. You can spend a day tweeting from the Journal of Lipid Research's account (@JLipidRes) about your favorite lipids and your work.

The International Union of Biochemistry and Molecular Biology is offering $500 to graduate students and postdocs displaced from their labs as a result of natural disaster, war or "other events beyond their control that interrupt their training." The money is for travel and settling in. Learn more and spread the word to those who could use assistance.

The Center for Open Bioimaging Analysis maintains the open-source software CellProfiler and ImageJ. COBA has partnered with Bioimaging North America and the Royal Microscopical Society to create a survey to assess the needs of the community for software and training materials. Take the survey.

Read more here:
Calendar of events, awards and opportunities - ASBMB Today

What is quantum computing? – TechTarget

Quantum computing is an area of study focused on the development of computer-based technologies centered around the principles of quantum theory. Quantum theory explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses quantum bits to perform specific computational tasks, all at a much higher efficiency than their classical counterparts. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases; for example, quantum computing excels at tasks like simulations.

The quantum computer gains much of its processing power through the ability of its bits to be in multiple states at one time. It can perform tasks using a combination of 1s, 0s, and both a 1 and a 0 simultaneously. Current research centers in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory. In addition, developers have begun gaining access to quantum computers through cloud services.

Quantum computing began with finding its essential elements. In 1981, Paul Benioff at Argonne National Labs came up with the idea of a computer that operated with quantum mechanical principles. It is generally accepted that David Deutsch of Oxford University provided the critical idea behind quantum computing research. In 1984, he began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, publishing a breakthrough paper a few months later.

Quantum Theory

Quantum theory's development began in 1900 with a presentation by Max Planck to the German Physical Society, in which Planck introduced the idea that energy and matter exist in individual units. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

The Essential Elements of Quantum Theory:

Further Developments of Quantum Theory

Niels Bohr proposed the Copenhagen interpretation of quantum theory. This theory asserts that a particle is whatever it is measured to be, but that it cannot be assumed to have specific properties, or even to exist, until it is measured. This relates to a principle called superposition. Superposition claims that when we do not know what the state of a given object is, it is actually in all possible states simultaneously -- as long as we don't look to check.

To illustrate this theory, we can use the famous analogy of Schrödinger's Cat. First, we have a living cat and place it in a lead box. At this stage, there is no question that the cat is alive. Then we throw in a vial of cyanide and seal the box. We do not know if the cat is alive or if it has broken the cyanide capsule and died. Since we do not know, the cat is both alive and dead, according to quantum law -- in a superposition of states. It is only when we break open the box and see what condition the cat is in that the superposition is lost, and the cat must be either alive or dead.

The principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

A Comparison of Classical and Quantum Computing

Classical computing relies on principles expressed by Boolean algebra, usually operating with a 3- or 7-mode logic gate principle. Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. In addition, there is still a limit as to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply.

The quantum computer operates with a two-mode logic gate: XOR and a mode called QO1 (the ability to change 0 into a superposition of 0 and 1). In a quantum computer, a number of elemental particles such as electrons or photons can be used. Each particle is given a charge, or polarization, acting as a representation of 0 and/or 1. Each particle is called a quantum bit, or qubit. The nature and behavior of these particles form the basis of quantum computing and quantum supremacy. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition

Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser. If only half a unit of laser energy is used and the particle is isolated from all external influences, the particle then enters a superposition of states, behaving as if it were in both states simultaneously.

Each qubit utilized could take a superposition of both 0 and 1. This means the number of computations a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer comprised of 500 qubits would have the potential to do 2^500 calculations in a single step. For reference, 2^500 is a number greater than the estimated count of atoms in the known universe. These particles all interact with each other via quantum entanglement.

In comparison to classical computing, quantum computing counts as true parallel processing. Classical computers today still only truly do one thing at a time; in classical computing, parallel processing simply means using two or more processors.

Entanglement

Particles (such as qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle -- up or down -- gives away the spin of the other in the opposite direction. In addition, due to the superposition, the measured particle has no single spin direction before being measured. The spin state of the particle being measured is determined at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction. The reason why this happens is not yet explained.

Quantum entanglement allows qubits that are separated by large distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously. This is because each qubit represents two values. As more qubits are added, the capacity expands exponentially.
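The 2-qubit register described above can be sketched numerically with a toy statevector simulation (a classical model for illustration, not a real quantum device): applying a Hadamard gate to each of two qubits starting in the 00 state leaves non-zero amplitude on all four basis configurations at once.

```python
import numpy as np

# Single-qubit Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Start in |00>: a 2-qubit register is a vector of four amplitudes.
state = np.zeros(4)
state[0] = 1.0

# Apply H to each qubit via the Kronecker product H ⊗ H.
state = np.kron(H, H) @ state

# All four basis states |00>, |01>, |10>, |11> now carry equal amplitude,
# approximately [0.5, 0.5, 0.5, 0.5], and the probabilities still sum to 1.
print(state)
```

Each added qubit doubles the length of the amplitude vector, which is the exponential capacity growth described above.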

Quantum Programming

Quantum computing offers the ability to write programs in a completely new way. For example, a quantum computer could incorporate a programming sequence that would be along the lines of "take all the superpositions of all the prior computations." This would permit extremely fast ways of solving certain mathematical problems, such as factorization of large numbers.

The first quantum computing program appeared in 1994, when Peter Shor developed a quantum algorithm that could efficiently factorize large numbers.
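The number-theoretic core of Shor's approach can be sketched classically (a toy illustration: the quantum algorithm finds the period exponentially faster, while here it is found by brute force):

```python
from math import gcd

def period(a, N):
    """Smallest r > 0 with a^r ≡ 1 (mod N), found by brute force.
    Assumes gcd(a, N) == 1."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Recover a nontrivial factor of N from the period of a^x mod N."""
    r = period(a, N)
    if r % 2 == 1:
        return None  # odd period: retry with a different base a
    candidate = gcd(pow(a, r // 2) - 1, N)
    return candidate if 1 < candidate < N else None

# 15 = 3 * 5; with base a = 7 the period is 4, yielding the factor 3.
print(shor_classical(15, 7))
```

Once the period r of a^x mod N is known, a nontrivial factor of N falls out of a greatest-common-divisor computation; finding r is the only step that needs the quantum speedup.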

The Problems - And Some Solutions

The benefits of quantum computing are promising, but there are huge obstacles to overcome still. Some problems with quantum computing are:

There are many problems to overcome, such as how to handle security and quantum cryptography. Long-term quantum information storage has also been a problem in the past. However, breakthroughs in the last 15 years and in the recent past have made some form of quantum computing practical. There is still much debate as to whether this is less than a decade away or a hundred years into the future. However, the potential that this technology offers is attracting tremendous interest from both the government and the private sector. Military applications include the ability to break encryption keys via brute-force searches, while civilian applications range from DNA modeling to complex material science analysis.

Read the original here:
What is quantum computing? - TechTarget