QunaSys Raises $10M in its Series B Funding Led by VGI to Expand Overseas Markets Further as a Japan-Based Quantum Computer Software Startup – PR…

TOKYO, March 28, 2022 /PRNewswire/ -- QunaSys Inc. ("QunaSys"), one of the world's leading developers of innovative chemistry algorithms focused on accelerating the practical application of quantum technology, announced today that it has raised $10 million in its Series B funding round, led by JIC Venture Growth Investments ("VGI"), with participation from ANRI, Fujitsu Ventures Fund LLC, Global Brain, HPC Systems Inc., JST SUCCESS Program, MUFJ Capital, Shinsei Corporate Investment Limited, and Zeon Corporation. Simultaneously, QunaSys announced that it has reached an agreement with Zeon Corporation, Fujitsu Limited, and HPC Systems Inc. for a capital tie-up and business alliance. The investment marks an important step forward for QunaSys since its previous financing round in 2019.

Since 2019, QunaSys has grown its R&D and business development activities and achieved record business growth. In July 2020, QunaSys established QPARC, a Japanese consortium to study the industrial applicability of quantum computers. Since then, more than 50 companies have joined QPARC, and the consortium has explored quantum computing use cases such as new energy analysis, molecular structure optimization, and sustainable material manufacturing with members including ENEOS Holdings and JSR Corporation.

In October 2021, QunaSys launched "Qamuy", a cloud-accessible quantum computing development platform on which more than 3.3 million jobs have now been executed. In anticipation of market adoption of quantum computing in the coming years, QunaSys aims to establish Qamuy as the global de facto standard software for the major hardware devices.

"Although quantum computer hardware is being developed around the world, for quantum computers to be widely used it is essential to have appropriate algorithms to meet the challenges, and software that serves as an interface for users to master quantum computers. We have invested in QunaSys because we believe that QunaSys will be an indispensable company for the future spread of quantum computers in Japan." Yuki Kuwabara, Principal, JIC Venture Growth Investments Co., Ltd.

QunaSys is collaborating with Europe-based consortiums to boost quantum computing: it is working with the Pistoia Alliance on the development of quantum computing in the pharma industry, and with the Quantum Flagship program on learning programs that re-train industry workers in quantum computing for chemistry.

"It has been four years since we established QunaSys, and since then, with the help of our talented members and companies at the forefront of their industries, we have been working towards the practical application of quantum computers. This fundraising will help accelerate the development of more usable quantum computing chemistry software and expand our current business overseas by opening a European base." Tennin Yan, QunaSys Inc. CEO.

Explore career opportunities at QunaSys and join the quantum computing revolution! >> https://qunasys.com/en/careers

About QunaSys Inc.

QunaSys is the world's leading developer of innovative algorithms in chemistry focused on accelerating the development of quantum technology applicability. QunaSys enables maximization of the power of quantum computing through advanced joint research that addresses cutting-edge technologies: providing Qamuy, one of the most powerful quantum chemical calculation cloud software packages available; fostering collaboration through its QPARC industry consortium; and working with research institutions from both academia and government sectors. QunaSys software runs on multiple platforms with applicability in all chemical-related industries to encourage the adoption of quantum computing.

*All company and product names mentioned herein are trademarks or registered trademarks of QunaSys, Inc. or their respective companies.

CONTACT: QunaSys Inc., E-mail: [emailprotected]

SOURCE QunaSys Inc.


IonQ Named As One Of TIME’s 100 Most Influential Companies – Business Wire

COLLEGE PARK, Md.--(BUSINESS WIRE)--Today, IonQ (NYSE: IONQ), a leader in trapped-ion quantum computing, was named to TIME's annual TIME100 Most Influential Companies list. This ranking highlights 100 companies that are making an extraordinary impact across the globe.

"IonQ is honored to be a part of this year's TIME100 Most Influential Companies list," said Peter Chapman, President and CEO of IonQ. "TIME's recognition of quantum computing's impact, and its specific recognition of IonQ's role as an industry leader, underscores the viability and promise of what's possible as we usher in the era of quantum computing."

IonQ was selected for inclusion in the New Frontiers category. To assemble the list, TIME solicited nominations from every sector, ranging from health care and entertainment to transportation and technology, from its editors and correspondents around the world, as well as from industry experts. TIME editors then evaluated each nominee on factors including company relevance, impact, innovation, leadership and success.

IonQ's inclusion in the TIME100 is the company's latest achievement in a year of significant momentum in business and technical breakthroughs. On Monday, IonQ announced the company's Q4 2021 and full-year 2021 earnings. During this announcement, the team highlighted remarkable technology progress and announced that it achieved more than triple its original bookings projection. In October, IonQ became the first publicly traded pure-play quantum computing company.

As for technical breakthroughs, this year IonQ introduced IonQ Aria, the world's most powerful quantum computer based on standard application-oriented industry benchmarks; discovered a new family of quantum gates that can accelerate quantum algorithms and can only be conducted on IonQ and Duke University systems; became the first company to use barium ions as qubits to further enable advanced quantum computing architectures; and secured a public-private manufacturing partnership with Pacific Northwest National Laboratory (PNNL) to produce barium qubits. IonQ is also the only quantum hardware company with computers accessible on all three major cloud providers, and its computers are being used to tackle problems ranging from financial modeling to electric vehicle battery chemistry and risk management.

See the complete TIME100 Most Influential Companies 2022 list here: time.com/100companies

About IonQ

IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's latest-generation quantum computer, IonQ Aria, is the world's most powerful quantum computer, and IonQ has defined what it believes is the best path forward to scale.

IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.

About TIME

TIME is a global media brand that reaches a combined audience of more than 100 million around the world. A trusted destination for reporting and insight, TIME's mission is to tell the stories that matter most, to lead conversations that change the world and to deepen understanding of the ideas and events that define our time. With unparalleled access to the world's most influential people, the immeasurable trust of consumers globally, an unrivaled power to convene, TIME is one of the world's most recognizable media brands with renowned franchises that include the TIME100 Most Influential People, Person of the Year, Firsts, Best Inventions, World's Greatest Places and premium events including the TIME100 Summit and Gala, TIME100 Health Summit, TIME100 Next and more.

IonQ Forward-Looking Statements

This press release contains certain forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Some of the forward-looking statements can be identified by the use of forward-looking words. Statements that are not historical in nature, including the words "anticipate," "expect," "suggests," "plan," "believe," "intend," "estimates," "targets," "projects," "should," "could," "would," "may," "will," "forecast" and other similar expressions, are intended to identify forward-looking statements. These statements include those related to IonQ Aria's technical achievements, future potential, and status in the quantum computing industry. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: market adoption of quantum computing solutions and IonQ's products, services and solutions; the ability of IonQ to protect its intellectual property; changes in the competitive industries in which IonQ operates; changes in laws and regulations affecting IonQ's business; IonQ's ability to implement its business plans, forecasts and other expectations, and identify and realize additional partnerships and opportunities; and the risk of downturns in the market and the technology industry including, but not limited to, as a result of the COVID-19 pandemic and the recent incursion into Ukraine. The foregoing list of factors is not exhaustive.
You should carefully consider the foregoing factors and the other risks and uncertainties described in the "Risk Factors" section of IonQ's Annual Report on Form 10-K for the year ended December 31, 2021 and other documents filed by IonQ from time to time with the Securities and Exchange Commission. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and IonQ assumes no obligation and does not intend to update or revise these forward-looking statements, whether as a result of new information, future events, or otherwise. IonQ does not give any assurance that it will achieve its expectations.


How Kronos Could Help the US Win the Fusion and Quantum Computing Race With China – GlobeNewswire

WASHINGTON, March 28, 2022 (GLOBE NEWSWIRE) -- Major world governments are increasingly focusing on fusion energy research as a potential foundation for gaining the economic and military advantage in the twenty-first century, and perhaps beyond. In this emerging arena of supercharged competition, the quantum computing systems, algorithms, and tokamak design plans developed by Kronos Fusion Energy Defense Systems could be a key factor in winning a significant edge for the USA over its economic and political rival, China.

Fusion energy, known theoretically since 1920, promises potentially near-limitless energy generation, free from polluting or radioactive byproducts. With rising petroleum costs and the looming specter of global warming, developing workable fusion technology is more urgent than ever. The first country to make the breakthrough to practical fusion will become the world's energy leader, gaining a decisive advantage in commerce, defense, and space exploration that could last for generations.

With immense government backing and funding, most recently reinforced in China's 14th Five-Year Plan, Chinese scientists seemingly lead the world with the $900 million Experimental Advanced Superconducting Tokamak (EAST). The EAST recently set records by maintaining stable plasma at 120 million degrees for more than 1.5 minutes. China budgeted hundreds of millions more to operate and upgrade the EAST reactor, while funding the training of over 1,000 new fusion physicists.

China's vigorous fusion program is matched by its commitment to developing quantum computing resources. Centered on the recently founded Chinese National Laboratory for Quantum Information Sciences, the program has received billions of dollars in funding. China currently holds 2.5 times more patents than America in deep learning, a cornerstone of advanced quantum computing, while aggressively pursuing further developments. Chinese President Xi Jinping even describes these technological sectors as the "main battleground" between the USA and China.

Currently, the edge in these economically and strategically vital technologies arguably belongs to the PRC. However, Kronos offers the potential to redress the balance by bringing quantum computing and fusion energy together into a single powerful project. By harnessing the ability of quantum devices, neural networks, and machine learning to crunch immense quantities of data, while testing a multidimensional array of thousands of problems and learning and adapting in real time, the simulations Kronos has developed should enable building fusion tokamaks 4,000% more effective than current reactors.

Kronos believes the lightning-fast development and analysis cycle provided by its algorithms will empower the U.S. to leapfrog twenty years ahead of China in fusion energy generation. Its quantum computing systems will not only enable developing precise, efficient fusion reactor designs, compact fusion engines for spacecraft, and other fusion technology, but demonstrate the viability of quantum learning as a breakthrough tool of economic and scientific success. Kronos' cutting-edge "proof-of-concept" will potentially attract robust public and private investment to the wider quantum research sector, putting the USA on course to achieve superiority not only in tokamak design but also in quantum computing research.

PR Contact: Erin Pendleton - pr@kronosfusionenergy.com

This content was issued through the press release distribution service at Newswire.com.


End of open source: Dispelling the myth – ITProPortal

Following the Log4j vulnerability disclosure in December 2021 and the recent case of a developer sabotaging his own JavaScript libraries, colors.js and faker.js, the state of open source software has been called into question.

With a high-profile meeting at the White House on open source, and Executive Orders from US President Biden, some have even suggested it is the end of open source. While it might be tempting to view a major vulnerability as an indication of open source somehow being deficient, the reality is far from that.

Open source software is neither more nor less secure than commercial software. In fact, most commercial software either includes or runs on open source technologies. Open source simply means that the software is developed in a manner where the source code is available to anyone who wants it.

What we have been seeing with the Log4j response from the Apache Log4j team is exactly what we would expect to see: a team that takes the software it produces seriously and is responsive to the needs of its install base. Considering that they are volunteers, such a response is indicative of the pride of ownership we often see within open source communities.

Rather than instigating the end of open source, an incident like Log4j is likely to improve open source development as a whole, much in the same way that Heartbleed improved the development practices of both open and closed source development teams. So, if open source is here to stay, what should organisations be doing moving forward to more efficiently identify and mitigate vulnerabilities?

Well, the idea of identifying and mitigating vulnerabilities requires us to define some roles up-front. Most people expect their software suppliers - that is to say the people who produce the software they depend upon - to test that software. The outcome of that testing would be a set of findings highlighting the weaknesses in the software that the supplier produces. In an ideal world, each of those weaknesses would be resolved prior to the software shipping.

In the real world, however, some of those weaknesses will be fixed, some will be marked as "no plan to fix," and some will optimistically be fixed in a future release. What the list of weaknesses is, and which ones were fixed, is not something a supplier typically divulges. Moreover, there is no one tool that can find all weaknesses: some only work if you have the source code, while others require a running application.

You will note that no mention has been made of the word "vulnerability" so far, as it has a special and simple meaning. In software, a vulnerability is simply a weakness that can be exploited, or that has a reasonable chance of exploitation.

Most, but not all, vulnerabilities are disclosed via a centralised system known as the National Vulnerability Database, or simply the NVD. While the NVD has roots in the US, and is maintained by the US Government, the contents of the NVD are available to all and replicated in multiple countries. From a governance perspective, monitoring for changes in the contents of the NVD is a good way of staying on top of new vulnerability disclosures.
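As a concrete sketch of that kind of monitoring, the snippet below builds a query against the NVD's public REST API (version 2.0, which supports `lastModStartDate`/`lastModEndDate` filters per the NVD documentation) and extracts CVE identifiers from the response shape that API returns. Treat it as an illustrative starting point rather than production polling code; the sample payload is abridged.

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def recent_cves_url(hours=24, now=None):
    """Build an NVD API 2.0 query for CVEs modified in the last `hours`."""
    now = now or datetime.now(timezone.utc)
    start = now - timedelta(hours=hours)
    params = {
        "lastModStartDate": start.isoformat(timespec="seconds"),
        "lastModEndDate": now.isoformat(timespec="seconds"),
    }
    return NVD_API + "?" + urlencode(params)

def cve_ids(payload):
    """Extract CVE identifiers from an NVD API 2.0 JSON response."""
    return [item["cve"]["id"] for item in payload.get("vulnerabilities", [])]

# Abridged example of the response shape; fetch the URL with any HTTP client.
sample = {"vulnerabilities": [{"cve": {"id": "CVE-2021-44228"}},
                              {"cve": {"id": "CVE-2014-0160"}}]}
print(cve_ids(sample))  # ['CVE-2021-44228', 'CVE-2014-0160']
```

A scheduled job running a query like this, and diffing the returned IDs against the previous run, is the simplest form of the governance monitoring described above.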

The problem is that the NVD updates more slowly than media coverage, so with major vulnerabilities like Log4Shell, Heartbleed and Dirty Cow, the team discovering the vulnerability might create a branded name for it in an effort to broaden awareness of the issue. But creating a governance policy that monitors for media coverage of these cyber-events is certainly not great practice.

If media coverage as an input to vulnerability management is a bad idea, and the NVD is a bit slow to provide all details, what is the best governance policy then? That comes from a type of security tool known as Software Composition Analysis, or SCA. An SCA tool looks at either the source code for an application, or the executable or libraries that define the application, and attempts to determine which open source libraries were used to create that application.

The listing of those libraries is known as an SBOM, or Software Bill of Materials. Assuming the SCA software does its job properly, a governance policy can then be created that maps the NVD data to the SBOM, so you know what to patch. Except that there is still that latent NVD data to account for.
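To make the SBOM-to-NVD mapping concrete, here is a minimal sketch. The component names, advisory records, and "fixed version" fields are hypothetical stand-ins for a real SBOM and a real vulnerability feed, and the naive dotted-version comparison is a simplification that real SCA tools go well beyond.

```python
# Hypothetical inputs: a minimal SBOM (component + version) and advisory
# records in the spirit of NVD entries. Data is illustrative only.
sbom = [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "commons-text", "version": "1.10.0"},
]

advisories = [
    {"cve": "CVE-2021-44228", "component": "log4j-core",
     "affected_below": "2.15.0"},
    {"cve": "CVE-2022-42889", "component": "commons-text",
     "affected_below": "1.10.0"},
]

def version_tuple(v):
    """Naive dotted-version parse; real versioning schemes are far messier."""
    return tuple(int(part) for part in v.split("."))

def vulnerable_components(sbom, advisories):
    """Map advisories onto the SBOM: flag components below the fixed version."""
    findings = []
    for comp in sbom:
        for adv in advisories:
            if (adv["component"] == comp["name"] and
                    version_tuple(comp["version"]) < version_tuple(adv["affected_below"])):
                findings.append((comp["name"], comp["version"], adv["cve"]))
    return findings

print(vulnerable_components(sbom, advisories))
# [('log4j-core', '2.14.1', 'CVE-2021-44228')]
```

In practice an SBOM would arrive in a standard format such as SPDX or CycloneDX rather than hand-written dictionaries, but the join logic is the same: component identity plus affected-version range.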

Some of the more advanced SCA tools solve that problem by creating advisories that proactively alert users when there is an NVD entry pending, with the details of that entry augmented by the SCA vendor. Some of the most advanced tools also invest in testing or validating which versions of the software are impacted by the vulnerability disclosure.

Nevertheless, while SCA software can close the gap between disclosure and identification, it should be noted that it does have a fundamental limitation. If the SCA software has not scanned all of your applications, then at best it can only flag new vulnerability disclosures for a subset of your applications.
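That coverage gap can be made visible with nothing more than set arithmetic over two inventories, both hypothetical here: everything IT knows about, and everything that has been through SCA scanning and so has an SBOM on file.

```python
# Hypothetical inventories: all applications IT knows about, versus the
# subset that has been SCA-scanned and therefore has an SBOM on file.
all_applications = {"billing", "crm", "intranet", "payroll", "build-farm"}
scanned_applications = {"billing", "crm"}

# Anything outside SCA coverage is a blind spot: new vulnerability
# disclosures cannot be mapped onto software we have no SBOM for.
blind_spots = sorted(all_applications - scanned_applications)
coverage = len(scanned_applications) / len(all_applications)

print(f"SCA coverage: {coverage:.0%}")   # SCA coverage: 40%
print("No SBOM for:", ", ".join(blind_spots))
```

Trivial as it is, a report like this is often the first governance artifact worth producing: it turns "we use SCA" into a measurable statement about how much of the estate SCA actually sees.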

From a governance policy perspective, it then becomes an IT function to identify all software, and a procurement function to ensure that all software, including updates and free downloads, comes under an SBOM and that the SBOM is validated using SCA software. Since software is available in both source and binary formats, it is critical that governance teams heading down this path select SCA software that can effectively process software in all forms and formats. Such a governance policy would assist in identifying new vulnerability disclosures and their impact on the business, but would leave the matter of effective mitigation to a different policy, since mitigation requires application testing.

Ensuring the security of one's own technology is one thing, but the beauty of open source is that it is built to be collaborative.

To paraphrase Abraham Lincoln, open source is technology "of the people, by the people, and for the people." The modern open source movement was founded on the principle that if you did not like the way the code was working, you were free to modify it and address whatever gaps in functionality were perceived to exist.

Part of the problem that we face today is a sentiment that has consumers or users of open source projects behaving as if the open source project is a commercial software vendor.

If you look at the issues list of any reasonably popular open source project on GitHub, you will see feature requests and comments about when certain problems might be resolved. Such issue reports and complaints about serviceability carry an implicit expectation that a product manager is on the receiving end of those requests, and that they will be added to a roadmap and eventually be released, all for free.

In reality, gaps in functionality, and even perceived bugs, represent opportunities not to request free programming services but to contribute to the future success of code that is significantly important to the person complaining.

Yes, some people won't know the programming language used by the project, but expecting other people to prioritise a complaint from an unknown third party over changes that solve problems for active contributors is not realistic. As much as anything, open source functions through the altruism of contributors.

Over recent years we have heard core contributors to popular open source projects express frustration about the profits made by large businesses from the use of their software. While it is easy to relate to someone putting their energy into a project only to have a third party profit from the effort, the reality is that if a third party is profiting from the efforts of an open source development team, then it should be contributing to that team's future success.

If they don't, then they run the risk not only that the code in question might change in ways they didn't expect, but also that when security issues are identified and resolved, they might face delays in applying those fixes. After all, if a business isn't taking the time to engage with the teams creating the software that powers it, then it likely does not know where all the software powering its business originates, and cannot reliably patch it.

Finding vulnerabilities in open source is not a problem in itself, but the detection of software defects representing weaknesses that could be exploited is an important topic. While open source and closed source software have an equal potential for security issues, with open source it is possible for anyone to identify those issues. With that in mind, organisations must take proactive steps - ones that do not rely on media coverage - to monitor the latest vulnerabilities.

Equally important, they must play a contributing role in the open source projects they benefit from; otherwise they might fall victim to unexpected code changes or delayed awareness of critical patches.

Tim Mackey is Principal Security Strategist at Synopsys.


The wild world of non-C operating systems – The Register

Believe it or not, not everything is based on C. There are current, shipping, commercial OSes written before C was invented, and now others in both newer and older languages that don't involve C at any level or layer.

Computer hardware is technology, yet very few people can design their own processor or build a graphics card. But software is a form of culture. Open source is created by volunteers, even if they end up getting paid jobs doing it. Even rejecting open source is a choice: paying for Windows or macOS instead reflects a preference.

This is especially visible when it comes to text editors, and even more so about programming languages. People get passionate about this stuff. So statements such as "C isn't a programming language any more" can be upsetting. Most people live and work in the cultures that are Unix and Windows and if they are all you've ever known, or know best, then it's easy to think they are the whole world.

But that doesn't make it true.

C fans tend to regard BCPL as an unimportant transitional step, but it was used in two celebrated OSes. One was the TRIPOS operating system, later used as the original basis of AmigaOS. The core system software of the original GUI workstation, the Xerox Alto, was also written in BCPL, although much of it was later rewritten in Mesa. The Alto OS survived into the late 1990s as GlobalView, which Xerox sold as a high-end desktop publishing tool.

In the 1960s, ALGOL was huge. It's the granddaddy of most modern imperative languages.

Burroughs Corporation designed a series of mainframes, the Burroughs Large Systems, around the pioneering idea of writing the OS and all applications in a high-level language, ALGOL. The first machine, the B5000, was launched in 1961. Burroughs merged with Sperry UNIVAC in 1986 to form Unisys. The Unisys Clearpath MCP OS is a direct descendant of the B5000's MCP or Master Control Program. (Yes, the same name as the big baddie in Tron.)

MCP was the first commercial OS to support virtual memory and shared libraries. Its purely stack-based design inspired Chuck Moore to develop Forth, HP to design the HP3000 mid-range computers, and influenced Alan Kay in the development of Smalltalk at Xerox PARC.

The current version of ClearPath MCP is 20.0, released in May 2021.

Swiss boffin Niklaus Wirth worked at Xerox PARC for two one-year sabbaticals. Today he's best known for inventing Pascal, which happened to catch on as an applications language, notably as Borland's Delphi. An earlier implementation, the UCSD p-System, was a complete OS, one of the three IBM offered for the original PC in 1981.

Pascal was just one stage in a series of "Wirthian" languages. The successor to Pascal was Modula, but it was quickly superseded by Modula-2, based on Wirth's time working with Mesa on the Alto at Xerox PARC. Modula-2 was specifically designed for OSes as well as apps.

After his first sabbatical in Palo Alto in 1976-1977, Wirth returned home to ETH Zürich, where he and his team designed and built the Lilith workstation as a cheaper replacement for the $32,000 Alto. Its object-oriented OS, Medos-2, was entirely built in Modula-2.

Meanwhile, in the USSR, a team at the Soviet Academy of Sciences in Novosibirsk, inspired by Wirth's work but limited by COCOM import restrictions, built their own OS called Excelsior in Modula-2, which ran on a series of workstations called Kronos.

Wirth returned to Xerox PARC in 1984-1985, and after that trip, back at ETH he designed the Ceres workstation, with an OS implemented in a new language, Oberon, whose text-based tiling-window interface inspired the successor to Unix, Plan 9 from Bell Labs.

(The influence went both ways. A team at Xerox PARC developed Pascal into the Euclid language, and a variant was even used to develop a Unix-compatible OS called Tunis, the Toronto University System.)

Oberon has been called the "overlooked jewel" [PDF] of computer science. It has multiple descendants today. Wirth came out of retirement in 2013 to help Project Oberon, which runs a modernized version of the OS on modern FPGA-based hardware. Another version, Native Oberon, runs on x86-32 PCs and under QEMU.

If you want to try it without installing, there's also a JavaScript version that runs in your browser. Its UI is efficient but quite strange, because it was designed in the late 1980s and predates almost every other graphical desktop except the original Mac.

Later researchers at ETH built a language called Active Oberon, and using that, a newer OS with a slightly more conventional zooming GUI called "Bluebottle". Originally the new Oberon OS was called AOS, but so are multiple other projects, so now it's called A2.

Most of these systems are relatively obscure, though. C and its descendants and derivatives remain far more mainstream. Even if Linus Torvalds is famously not fond of it, "the world is built on C++." It's not usually thought of as a language for implementing OS kernels, but two of the best-loved OSes ever were nonetheless written in it.

The Mac OS X might-have-been BeOS was entirely written in C++, and so is its modern FOSS re-implementation Haiku. It took the Haiku project 17 years to get to its first beta version, but it's now on Beta 3. Unlike its ancestor, these days it has a lot of ported FOSS apps from Linux and even some limited ability to run Windows programs.

The C++ OS that was by far the biggest commercial success was Symbian. Psion built it from scratch in the late 1990s, and for a while it was the dominant smartphone OS. We did an epic three-part feature, "Symbian, The Secret History" (part 2 and part 3) a decade ago.

But Nokia failed to capitalize on it, and Apple's iOS and Android killed it off. It eventually went open source, although not without the odd hitch.

Today, C++ is being used to build the Genode OS Framework. The latest version, release 22.02, came out just last month.

In terms of popularity, depending on which survey you look at, Rust is stalking or eclipsing C++, and there are several projects to build OSes in Rust, notably including the somewhat Minix-like Redox OS, the more experimental Theseus, and the embedded Tock.

Even Microsoft has experimented with Singularity, partly implemented in C# and other .NET-based languages, as is the FOSS COSMOS.

BCPL, C, C++, Rust, and the Pascal family all inherit from the design of ALGOL. All are imperative programming languages. There are other, different schools of language design. One of the oldest and best-loved programming languages around is Lisp, designed by John McCarthy in 1958.

Symbolics, the company which owned the first ever dotcom domain, built an entire OS in Lisp, called Genera. The last version, OpenGenera, ran on an emulator on DEC Alpha workstations, and today you can run it on Linux, but sadly the inheritors of the defunct company won't open-source it.

However, there was also a family of Lisp-based graphical operating systems that ran on Xerox's later Star workstations, written in a dialect called Interlisp. This is now FOSS and being actively modernized.

There are also several modern efforts to build OSes in Lisp, including Mezzano, Interim, and ChrysaLisp, the latter by one of the lead developers of Tao Systems' TAOS.

It would also be unfair not to include Urbit. This is a completely from-scratch effort that has reinvented some very Lisp-like technology, although the project's links to cryptocurrencies, and some of the politics of its founder Curtis Yarvin, make the author wary.

This is not intended to be a comprehensive list. There are too many such efforts to count, and that's even with intentionally excluding early OSes that were partly or wholly written in assembly language. There were several Java-based OSes, including Sun JavaOS and smartphone OS SavaJe. There are still stranger things out there, such as House, implemented in the purely functional Haskell.

This is just an overview, but I hope it makes a point: that OSes do not begin and end with C. If you can name a language that can run on bare metal, someone somewhere has probably tried to implement an OS in it, and some of them have become a lot more successful than you might expect.

Read more:
The wild world of non-C operating systems - The Register

10 Podcasts on Programming to Learn How to Be a Better Developer – Business Insider

With so many free resources from websites to YouTube tutorials available to help teach new skills, it's a great time to be a developer.

Whether you're learning how to code, a self-taught programmer, or an industry veteran, podcasts in particular can be a convenient tool for enhancing your skills and technical knowledge. But being a software developer isn't just about complicated code and programming languages: developers create and maintain products that influence our everyday lives, so it's important to understand the human impact software creates, too.

While many shows help break down the technical, there are several popular podcasts that discuss that impact and touch on often overlooked aspects of development. Many are also hosted by developers themselves, who provide behind-the-scenes looks at careers in programming. Software developers also use podcasts to share career tips, thoughts on current tech news, and what it's like to be a tech worker during a huge moment of change within the industry.

Based on feedback from developers on social media, Insider compiled a list of podcasts every aspiring software developer can add to their queue to improve their programming skills.

Go here to see the original:
10 Podcasts on Programming to Learn How to Be a Better Developer - Business Insider

New Spring4Shell Zero-Day Vulnerability Confirmed: What it is and how to be prepared – Security Boulevard

On March 29, 2022, a Chinese cybersecurity research firm leaked an attack that could impact most enterprise Java applications, globally. An investigation of the issue showed that the root cause was a vulnerability in the widely used, free, community-developed, open-source programming framework called Spring Core.

The Spring Framework is the foundation for most enterprise applications written in the Java programming language. Our recent data showed Spring Core as being used by 74% of Java applications. Specifically, Spring provides the plumbing of enterprise applications so that teams can focus on application-level business logic, without unnecessary ties to specific deployment environments.

As of Wednesday, March 30, the Contrast Security Labs team had confirmed the zero-day vulnerability using a public proof of concept (PoC). The vulnerability, dubbed Spring4Shell, can lead to remote code execution (RCE).

Spring translates the body and parameters of an HTTP request and turns them into a domain object for developers to use. This makes their lives easier.
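Spring's data binding is built on the JavaBeans property model: request parameter names are matched to setter methods discovered via introspection. A minimal, self-contained sketch of that mechanism (the Greeting class and its message property are hypothetical; real Spring binding goes through WebDataBinder and also handles type conversion):

```java
import java.beans.Introspector;
import java.beans.PropertyDescriptor;

// Hypothetical domain object, used only to illustrate property binding.
public class BindingSketch {
    public static class Greeting {
        private String message;
        public String getMessage() { return message; }
        public void setMessage(String message) { this.message = message; }
    }

    public static void main(String[] args) throws Exception {
        Greeting target = new Greeting();
        // A request like POST /greet?message=hello arrives as a name/value pair...
        String paramName = "message", paramValue = "hello";
        // ...and the binder finds the matching writable property and sets it.
        for (PropertyDescriptor pd :
                Introspector.getBeanInfo(Greeting.class).getPropertyDescriptors()) {
            if (pd.getName().equals(paramName) && pd.getWriteMethod() != null) {
                pd.getWriteMethod().invoke(target, paramValue);
            }
        }
        System.out.println(target.getMessage()); // hello
    }
}
```

The security-relevant point is that the same traversal can walk nested properties, which is why the paths an attacker can reach must be restricted.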

In the process of building an object graph to give to the developer, Spring takes special care not to let attackers control any parts of the Class, ProtectionDomain, and ClassLoader of the instance being created. Unfortunately, changes to the Class object in Java 9 meant the checks Spring performed were no longer enough.

The code in question is shown here:

https://github.com/spring-projects/spring-framework/blob/b595dc1dfad9db534ca7b9e8f46bb9926b88ab5a/spring-beans/src/main/java/org/springframework/beans/CachedIntrospectionResults.java#L288

PropertyDescriptor[] pds = this.beanInfo.getPropertyDescriptors();
for (PropertyDescriptor pd : pds) {
    if (Class.class == beanClass && ("classLoader".equals(pd.getName()) || "protectionDomain".equals(pd.getName()))) {
        // Ignore Class.getClassLoader() and getProtectionDomain() methods - nobody needs to bind to those
        continue;
    }

This code attempts to prevent binding from overriding these object graph paths:

However, because the Class object now exposes a getModule() method, attackers can now take this slightly different path:
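The difference between the two paths can be seen with a few lines of plain Java. This is a harmless demonstration that the indirect route reaches the same ClassLoader object, not the exploit itself:

```java
// On Java 9+, Class exposes getModule(), and Module re-exposes getClassLoader().
// A binder check that only blocks the "classLoader" property of Class misses
// the equivalent "module.classLoader" route.
public class ModulePathDemo {
    public static void main(String[] args) {
        Class<?> c = ModulePathDemo.class;

        // Direct path: class.classLoader (what Spring's check blocks)
        ClassLoader direct = c.getClassLoader();

        // Indirect path: class.module.classLoader (what the check misses)
        ClassLoader viaModule = c.getModule().getClassLoader();

        System.out.println(direct == viaModule); // same loader, different route
    }
}
```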

The introduction of Class#getModule() couldn't have been directly foreseen when this code was written, although we could have a spirited debate about the robustness of this style of check.

The consequences of handing users control of properties of the ClassLoader depend on the features of the ClassLoader being exploited.

The exploit and PoC being passed around show an attacker exploiting features of Tomcat 9's WebAppClassLoaderBase. The exploit works in a few stages:

Java 9 was released in July of 2017, so this vulnerability has been exploitable in production apps and APIs for five years.

The video below shows the exploit in a few quick requests. The exploit posts a payload to the index of a basic Spring Boot application. It takes advantage of the missing binding configuration and creates a malicious JSP on the filesystem in a web-accessible directory. From there, a request is sent with the id command to query the identity of the current user; it returns uid=0(root) gid=0(root) groups=0(root), showing that in this case the application is running as root.

There are a few requirements for an application to be vulnerable:

All of the above, plus you must be running Tomcat (the affected version ranges are unknown as yet, but they certainly include 9), because the exploit takes advantage of Tomcat's ClassLoader and logging facility to write a malicious, backdoor JSP.

It's prudent to assume that exploits will be coming that take advantage of different class loaders or other environment contexts, and that any Spring application satisfying the conditions in the first section should be treated as vulnerable.

For now, Contrast Labs recommends:

1) For all who are using Spring Core and binding to non-basic types such as POJOs, set the allowed fields to specify the only bindings you wish your application to use.
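A widely circulated sketch of this style of mitigation is a global @ControllerAdvice that restricts binding. This is a Spring configuration fragment adapted from public guidance at the time; the field patterns are illustrative, and an explicit setAllowedFields allowlist, as recommended above, is stronger than a denylist:

```java
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.web.bind.WebDataBinder;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.InitBinder;

// Applies to every controller in the application and refuses to bind
// request parameters to Class/ClassLoader-related property paths.
@ControllerAdvice
@Order(Ordered.LOWEST_PRECEDENCE)
public class BinderControllerAdvice {

    @InitBinder
    public void setDisallowedFields(WebDataBinder dataBinder) {
        String[] denylist = {"class.*", "Class.*", "*.class.*", "*.Class.*"};
        dataBinder.setDisallowedFields(denylist);
    }
}
```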

2) For Contrast customers, ensure Contrast Protect is enabled on your Spring applications (especially those on JDK 9+). As you can see in the video below, when Protect is configured properly, it blocks the attack.

Spring4Shell with Protect Video:

To best protect your applications, these are the settings you should enable in your Protect-monitored applications in blocking mode.

(a) Command Injection in blocking mode

Example when Command Injection is in Blocking mode.

(b) For more visibility into attacks targeting your environment, enable CVE Shields for CVE-2014-0112 and CVE-2014-0114. (These specific CVE shields are for Struts issues; however, due to the similar nature of the payloads, they provide visibility into attacks through Probes.)

Example when CVE shields are enabled

We have published communication to our Support Portal letting customers know we are researching implications and we will let them know exactly how they can fix this issue in their systems. Our team is currently researching the ability to exploit this vulnerability outside of a Tomcat environment.

As always, Contrast will continue to monitor the situation with Spring4Shell. The security of our customers is of utmost importance to us. If you have any questions, concerns, or would like to discuss this issue further, please don't hesitate to reach out to us at [emailprotected].

*This blog article will include updates about Spring4Shell as they become available.

David Lindner & Arshan Dabirsiaghi

Read more:
New Spring4Shell Zero-Day Vulnerability Confirmed: What it is and how to be prepared - Security Boulevard

Bonitasoft introduces new tooling for developers to ease testing and deployment of automation projects – CIO Dive

SAN FRANCISCO

The Bonita Digital Process Automation platform, intentionally designed for developers, offers a brand-new test kit and improved deployment technologies

March 31, 2022, San Francisco, CA: Bonitasoft, the leading open-source digital process automation company, today announced significant new developer-focused additions to its flagship Bonita digital process automation platform. With the new Bonita Test Toolkit and other technologies delivered with Bonita 2022.1, development teams now have the tooling they need to test, deploy, and continuously deliver business process automation projects more easily. More and better testing supports sustainable automation, providing good stability for Bonita projects and allowing the development team to have more confidence in changes.

The Bonita Test Toolkit accelerates testing and deployment of complex, core, and critical automation projects. Projects can be tested efficiently at each step (as well as end-to-end) with the tooling provided to write and execute tests locally, in the Bonita Studio development environment, or in any remote non-production environment. The Bonita Test Toolkit can be executed from an IDE, from a command line, or within a continuous integration pipeline. It is also integrated into the Bonita Continuous Delivery add-on to include testing in the delivery pipeline.

Along with the new Bonita Test Toolkit and Bonita 2022.1, Bonitasoft is also releasing a new Docker image, more robust security with stable handling of user authorizations across updates, and an updated Bonita Update Tool.

Bonita 2022.1 also comes with an updated Docker image. Its health check covers all Bonita levels, so if something goes wrong it is immediately visible and can be managed. Users can now either take the Bonita Docker image with its own database schema or use their own database configured for Bonita. The new Bonita Docker image also includes its own LDAP synchronizer, all adding up to a smoother, easier deployment.

The Bonita 2022.1 platform also includes better performance and robustness in its Runtime engine, with robust security vulnerabilities handling. Dynamic permission authorizations are activated by default. Any custom code added for permissions remains intact during platform updates, so the development team needs no time-consuming rework.

The Bonita Update tool permits easy, automated migration of Bonita process applications from one Bonita version to another, and it comes with improved documentation.

"With these and other new and improved capabilities in and with Bonita, we are providing the tools we promised to developers to ease testing and deployment for enterprise-critical applications," said Miguel Valdes Faura, Bonitasoft CEO and co-founder. "Our strategy is to empower developers, and the release of Bonita 2022.1 is specifically intended to support their needs."

The Bonita Community open-source edition includes all capabilities required to develop and deploy process automation projects, and can be downloaded here.

###

About Bonitasoft

Bonitasoft fully supports digital operations and modernization of information systems with Bonita, an open-source and extensible platform for automation and optimization of business processes. The Bonita platform accelerates development and production with a clear separation between visual programming and coding capabilities. Bonita integrates with existing information systems, orchestrates heterogeneous systems and provides deep visibility across all enterprise processes.

Go here to read the rest:
Bonitasoft introduces new tooling for developers to ease testing and deployment of automation projects - CIO Dive

University Recreation Tennis Center Set to Open – University of Arkansas Newswire

Photo Submitted

Students test out the new tennis courts

University Recreation's (UREC) new tennis courts are scheduled to open on Thursday, March 31, at 11 a.m. The UREC Tennis Center is located at 1357 W. Indian Trail off Martin Luther King Blvd. in Fayetteville. The UREC Tennis Center features 12 tennis courts (four courts with backboards to allow for individual play) and a service facility that includes an equipment checkout center and restrooms. Pickleball can be played on the courts. Tennis racquets, pickleball racquets and balls can be checked out for use. Participants are encouraged to bring their own tennis balls.

"There have been so many positive experiences being a part of the Tennis Club at the University of Arkansas the past four years. Our freshman year we played on two courts, where we all took turns playing. We are so excited that this facility will allow us the opportunity to grow even more as a team," said Clare Kellough and Craig Owens, 2021-2022 UREC Tennis Club co-presidents.

"We are excited to open the UREC Tennis Center; this project will bring tennis back on campus for recreational users and expand options for programming and educational opportunities," said Becky Todd, executive director of University Recreation. "We look forward to welcoming the campus community to this beautiful facility." A grand opening celebration is set for Tuesday, April 12.

Read more here:
University Recreation Tennis Center Set to Open - University of Arkansas Newswire

Top 10 Algorithms Helping the Superintelligent AI Growth in 2022 – Analytics Insight

Superintelligent AI is not here yet, but these top 10 algorithms are extensively working towards its growth.

Superintelligence, roughly defined as an AI algorithm that can solve all problems better than people, will be a watershed for humanity and tech. Even the best human experts have trouble making predictions about highly probabilistic, wicked problems. And yet those wicked problems surround us. We are all living through an immense change in complex systems that impact the climate, public health, geopolitics, and basic needs served by the supply chain. Even though the actual concept of superintelligent AI is yet to be materialized, several algorithms are working to help in its growth. Here are the top 10 algorithms and tools building a future for the growth of superintelligent AI.

This is the beginning of a superintelligent AI system that translates natural language to code. Codex is the model that powers GitHub Copilot, which was built and launched in partnership with GitHub a month ago. Proficient in more than a dozen programming languages, Codex can now interpret simple commands in natural language and execute them on the user's behalf, making it possible to build a natural language interface to existing applications.

CLEVER (Combining Levels of Bug Prevention and Resolution techniques) was created in a joint effort between Ubisoft and Mozilla developers. Clever-Commit is an AI coding assistant that combines data from the bug tracking system and the codebase and helps look for mistakes and bugs in code. The coding partner is currently used inside Ubisoft for game development purposes. It is one of the best AI coding systems aiming for superintelligent AI.

AlphaCode was tested against challenges curated by Codeforces, a competitive coding platform that shares weekly problems and issues rankings for coders similar to the Elo rating system used in chess. These challenges are different from the sort of tasks a coder might face while making, say, a commercial app.
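For context, the Elo expected-score formula referenced above is simple to state: a player's expected score against an opponent is a logistic function of the rating difference. A quick sketch follows (this is the standard chess formula; Codeforces' actual rating algorithm is its own variant):

```java
// Standard Elo expected score: E_A = 1 / (1 + 10^((R_B - R_A) / 400)).
public class EloSketch {
    static double expectedScore(double ratingA, double ratingB) {
        return 1.0 / (1.0 + Math.pow(10.0, (ratingB - ratingA) / 400.0));
    }

    public static void main(String[] args) {
        System.out.println(expectedScore(1500, 1500)); // equal ratings: 0.5
        System.out.println(expectedScore(1900, 1500)); // 400-point favorite: ~0.909
    }
}
```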

Built on AI and machine learning, Embold is an intelligent, multi-layered analyzer for software projects that looks toward the growth of superintelligent AI. It assesses the state of product quality, identifies issues, suggests resolutions, and recommends code review for the specific issue. It analyzes source code using techniques like natural language processing (NLP), machine learning, and a set of algorithms in order to find design issues, bugs, and so on.

Tabnine's Public Code AI algorithm is the foundation for all its code completion tools, and it's the perfect algorithm set for the emergence of superintelligent AI. The Free, Advanced, and Business level solutions train on trusted open-source code with permissive licenses. Tabnine's AI Assistant anticipates your coding needs, providing code completions for you and your development team that boost your productivity.

mabl is a Software-as-a-Service (SaaS) provider with a unified DevTestOps platform for AI- and machine-learning-based test automation. Its key features include self-healing tests, AI-driven regression testing, visual anomaly detection, secure testing, data-driven functional testing, cross-browser testing, test output reporting, integration with popular tools, and much more.

Augmented Coding is a set of tools that leverage the power of AI to enhance the coding process, making it easier for developers to meet compliance needs around documentation, reuse existing code, and retrieve code within the IDE. It is one of the best AI coding systems available in the market today.

Pylint is a Python source code analyzer that searches for programming mistakes, helps enforce a coding standard, and more. This quality checker for Python incorporates several features, such as coding-standard checks (for example, line length), error detection, and refactoring help by detecting duplicated code, among others. It is one of the best AI coding systems that are going to be a vital element in the growth of superintelligent AI.

Sketch2Code is a web-based solution that uses artificial intelligence to transform a hand-drawn user interface design from an image into valid HTML markup. It works by first detecting the design patterns, understanding the hand-drawn sketch or text, understanding the structure, and then generating valid HTML code that matches the detected layout and the identified design elements. It is one of the best AI coding systems available in the market today.

AI-assisted development. IntelliCode saves you time by putting what you're most likely to use at the top of your completion list. IntelliCode recommendations are based on thousands of open-source projects on GitHub, each with over 100 stars. When combined with the context of your code, the completion list is tailored to promote common practices. It is one of the best AI coding systems that are as good as human programmers.


More:
Top 10 Algorithms Helping the Superintelligent AI Growth in 2022 - Analytics Insight