The IRS Takes on Cryptocurrency-Funded Terrorists – BankInfoSecurity.com

The IRS Criminal Investigation Cyber Crimes Unit is waging a battle against the use of cryptocurrency for financing terrorists and other money-laundering activities. Agents Chris Janczewski and Jon Gebhart describe recent cryptocurrency-related takedowns.

In a joint interview with Information Security Media Group, Janczewski and Gebhart discuss these takedowns and their investigative work.

Janczewski is a special agent with the Internal Revenue Service - Criminal Investigation's Cyber Crimes Unit in Washington. His investigative experience includes international money laundering cases relating to child exploitation, terrorism, state-sponsored hacks, darknet markets, virtual currency, stolen identity and tax refund fraud, and the Department of Justice's Organized Crime Drug Enforcement Task Force Program.

Gebhart is also a special agent in the same IRS unit. He conducts investigations into allegations of tax fraud, money laundering and violations of the Bank Secrecy Act, focusing on those involving virtual currency. A certified public accountant, he held a variety of positions in accounting and finance prior to joining the IRS.

Read more from the original source:
The IRS Takes on Cryptocurrency-Funded Terrorists - BankInfoSecurity.com

Cryptocurrency provider tightens compliance in face of widespread fraud – JD Supra

CEP Magazine (October 2020)

LocalBitcoins, a Finnish peer-to-peer bitcoin marketplace plagued by criminal transactions, has spent the past year bolstering its compliance functions[1] after a study found that the marketplace processed the most criminal transactions of any cryptocurrency platform in the world.

CipherTrace, a blockchain forensics firm, released its spring 2020 report on criminal transactions across global cryptocurrency platforms,[2] which found that Finnish exchanges led the world for the third straight year, with just over 12% of all dark web cryptocurrency transactions flowing through their platforms. The next-highest share flowed through Russian exchanges, which processed more than 5% of all criminal transactions.

The results led LocalBitcoins to invest in automatic screening and vetting software, as well as to implement protocols to align itself with Europe's anti-money laundering directive[3] (AMLD5), which was published in May 2018. AMLD5 requires financial firms to implement strict know-your-customer protocols and extends to cryptocurrency platforms.

The CipherTrace study found that overall criminal transactions across all cryptocurrency platforms have decreased as more platforms implement stricter controls and compliance programs.

Read more:
Cryptocurrency provider tightens compliance in face of widespread fraud - JD Supra

Cryptocurrency Chief Predicts Next Amazon Or Apple To Be A Blockchain Company And Based In Asia And Not The US – Digital Market News

Ben Weiss, the chief of operations at CoinFlip, the world's largest Bitcoin ATM operator, makes a bold prediction: the next tech giants on the scale of Amazon or Apple will be companies built on blockchain. He further predicts that these blockchain companies will set up base in Asia, not the United States.

Weiss's prediction is based on his analysis of the US market. He stated in an interview on Tuesday that the United States does not yet have the regulatory clarity the cryptocurrency industry needs in order to innovate and grow. Without such an environment, information and support, he argued, the blockchain industry cannot hope to boom in the US; the nation is not ready to welcome a major cryptocurrency boom should one arrive in the near future.

He further compared the state of the US cryptocurrency market to Asia's. According to Weiss, there is far more regulatory clarity in Asia, while the US lacks the well-defined rules and regulations that could increase confidence among digital cash users and in the cryptocurrency market in general. Weiss suggested that Singapore in particular could become the world's next big blockchain hub, housing blockchain giants equivalent to Amazon and Apple.

The Bitcoin market has boomed in 2020: Bitcoin has rallied toward $13,000, easily crossing $12,000 and pushing higher. Weiss expects Bitcoin to hit $13,500 by the end of the year. He adds that the US government has ignored Bitcoin's 46% growth this year, and that if the cryptocurrency market continues to grow at this scale, the government will eventually be forced to address it.

Regulation of blockchain companies and the wider cryptocurrency scene is necessary to safeguard consumers and inspire innovation. More and more institutional investors are realizing the importance of holding bitcoin, but there is little support for innovation by retail and institutional investors.

Despite the massive growth of digital cash, a great deal of uncertainty still looms over the industry. The US has yet to issue guidelines or policies on cryptocurrency, even ahead of the presidential election in November.

Weiss also believes presidential candidates need to take a stance on the blockchain industry. A pro-cryptocurrency candidate has a strong chance of winning youth votes for the next 15 to 16 years, Weiss states. So far, however, none of the presidential candidates has given bitcoin that support.

View post:
Cryptocurrency Chief Predicts Next Amazon Or Apple To Be A Blockchain Company And Based In Asia And Not The US - Digital Market News

Cryptocurrency startup Paxful streamlines its operations with Dynamics 365 Business Central – Microsoft

Paxful, a global peer-to-peer cryptocurrency marketplace and startup headquartered in the United States, pursues its mission to give people a simple, fair, and highly secure platform for trading cryptocurrencies. To drive international business growth, its accounting department based in Estonia needed reliable software for running complex financial operations across the company's global divisions. Microsoft Dynamics 365 Business Central was the spot-on solution to automate processes, speed up reporting, and simplify invoicing.

Paxful offers peer-to-peer transactions based on digital currencies. "Our vision is to help people all over the world to take control of their finances using cryptocurrencies. Especially people who live in remote, underserved areas where personal finance management options are not readily available," says Darja Antropova, Chief Accountant at Paxful.

However, the startup's legacy accounting system couldn't keep up with its rapid expansion and began showing weaknesses, for example in quickly calculating foreign exchange rates. Paxful turned to local software integrator Columbus Eesti to implement Microsoft Dynamics 365 Business Central. Using software that Columbus Eesti, a local Microsoft partner, developed to seamlessly integrate Business Central with third-party solutions, the team brought the system live within two weeks and began migrating business data. "After 15 days, we were submitting the first VAT declarations. And within one or two months, we had transferred all the information necessary to be fully operational. It was spectacular," says Antropova.

"I used to submit tax declarations manually, which took about one day per month. With Dynamics 365 Business Central, I can just export a file and send it to the tax authority website, which just takes 10 minutes," explains Antropova.

The company's previously siloed platforms for financial operations, such as payroll software and an invoicing solution, are now aligned in a single, coherent workspace. "We can import all the transactions from payroll software to Dynamics 365 Business Central in one click. And it takes about 10 seconds to transfer a month's worth of information into the system. Same with the e-invoicing system," Antropova says. "With the previous document management system, it took up to 24 hours for an invoice to reach the accounting software. All in all, we save roughly 100 hours of work per year."

"I worked with different solutions before. But none produced reports as quickly as Dynamics 365 Business Central," observes Antropova. The accounting department was able to redirect time and resources to focus on higher-value work.

Paxful can also scale and add features to its accounting platform depending on its needs. "Once you are in the Microsoft environment, you have so many different possibilities to digitize your processes," notes Antropova. For now, Business Central is used primarily by the Estonian office, but the startup plans to broaden its use to branches overseas. "If we open new entities around the world, teams there could easily start using Dynamics 365 Business Central, as everything is consolidated and constantly upgraded within the same space," concludes Antropova.

See more here:
Cryptocurrency startup Paxful streamlines its operations with Dynamics 365 Business Central - Microsoft

Singapore-Based Cryptocurrency Investment Firm Novum Launches a Free Trading Bot – CryptoHero – Business Wire

SINGAPORE--(BUSINESS WIRE)--Novum Group, a Singapore-based blockchain advisory and investment firm, launched CryptoHero, a free cryptocurrency trading bot that is simple and intuitive for crypto beginners to set up and run.

"The 24/7 opening hours of the crypto market have necessitated the use of trading automation for investors who trade cryptocurrencies. Understanding this, we created CryptoHero to help investors automate trades right from their phone. Run it, manage it, anytime, anywhere," said Irwin Chee, co-founder of CryptoHero.

"We are excited about the launch of CryptoHero. CryptoHero will give many of us who are time-starved the ability to trade the crypto markets via automated trading bots," added Christopher Low, chairman of Novum Group.

CryptoHero supports familiar technical indicators such as Bollinger Bands and the Relative Strength Index, and will constantly add new ones. The CryptoHero portfolio view also gives access to trade history, individual performance, and profit and loss, from a top-level overview down to each bot or trade. Users select trading pairs, direction, the technical indicators that trigger entries and exits, and risk management settings, and the bot trades on a connected exchange such as Binance, Huobi or OKEx.
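CryptoHero's internal trading logic is proprietary, so the following Python sketch is purely illustrative of how indicator-driven entry and exit signals of this kind are typically computed. The thresholds (RSI below 30 / above 70, 2-standard-deviation Bollinger Bands) and the pandas-based helpers are assumptions for this example, not CryptoHero's actual parameters.

```python
# Illustrative sketch only: CryptoHero's real rules are not public.
# Computes two classic indicators and derives naive buy/sell signals.
import pandas as pd

def rsi(close: pd.Series, period: int = 14) -> pd.Series:
    """Relative Strength Index over `period` bars."""
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(period).mean()
    loss = (-delta.clip(upper=0)).rolling(period).mean()
    return 100 - 100 / (1 + gain / loss)

def bollinger(close: pd.Series, period: int = 20, k: float = 2.0):
    """Lower and upper bands: SMA plus/minus k standard deviations."""
    mid = close.rolling(period).mean()
    std = close.rolling(period).std()
    return mid - k * std, mid + k * std

def signals(close: pd.Series) -> pd.Series:
    """'buy' when oversold and under the lower band, 'sell' when
    overbought and over the upper band, 'hold' otherwise."""
    r = rsi(close)
    lower, upper = bollinger(close)
    out = pd.Series("hold", index=close.index)
    out[(r < 30) & (close < lower)] = "buy"
    out[(r > 70) & (close > upper)] = "sell"
    return out
```

A bot built on signals like these would re-evaluate them on each new candle from the connected exchange and place orders subject to the user's risk settings.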

For beginners, paper trade functions allow you to test bot trading using real-time data without risking actual cryptocurrencies. You're able to backtest your strategies to get historical performances with key metrics and indicators that can help forecast future performance or shape expectations.

CryptoHero is cloud-based and runs 24/7 to execute trades promptly through connected exchanges and notify you whenever a trade is completed.

A free account on CryptoHero allows you to use up to three bots with access to all indicators. Premium plans with more bots and features are available at $9.99 per month. Until December 2020, new users get one month of Premium free when they sign up, with no credit card required.

CryptoHero is currently available on the web and iOS, with an Android version due in 2021.

Read more from the original source:
Singapore-Based Cryptocurrency Investment Firm Novum Launches a Free Trading Bot - CryptoHero - Business Wire

FCA crypto ban is a setback for the UK in race to lead growing digital assets marketplace, says GDF – Wealth Adviser

The Financial Conduct Authority's (FCA) recent decision to ban the sale of derivatives and exchange traded notes (ETNs) linked to cryptoassets to retail customers is a huge setback for the UK in maintaining its dominant position as a global fintech hub. The FCA's decision has left many in the cryptoasset sector questioning the regulator's willingness to collaborate with them and listen to the views of key market participants.

These are the views of Global Digital Finance (GDF), an industry membership body that promotes the adoption of best practices for cryptoassets and digital finance technologies through the development of conduct standards in a shared engagement forum with market participants, policymakers and regulators. Over 100 global organisations are members of GDF and over 350 industry professionals from around the world have worked on developing the GDF codes of conduct, the only global standard in this emerging sector.

The trade body also questions the FCA's decision to ban these products when no similar steps have been taken in Europe, the US or Asia.

It is also critical of the regulator for ignoring its own research findings and the overwhelming majority of responses to its consultation on the cryptoasset investment sector. A survey conducted by the FCA and published this year noted that the majority of cryptoasset owners are generally knowledgeable about the product, are aware of the lack of regulatory protection afforded and understand the risk of price volatility.

Lavan Thasarathakumar, head of regulatory affairs at Global Digital Finance, says: "The 2,681 participants in the FCA's own survey offer firm evidence that its policy statement to ban the sale of certain cryptocurrency-related products is perhaps misguided and leaves one wondering why the FCA disregarded its own evidence-based foundation."

In addition, a 2019 FCA consultation seeking industry input on the suitability of offering crypto derivatives to retail clients revealed that an overwhelming 97 per cent of the consultation respondents disagreed with the FCA's proposal to ban these products.

Jeffrey Bandman, board member at Global Digital Finance and former director at the US CFTC, says: "Other regulators, notably the US CFTC, have been safely overseeing regulated crypto derivatives markets for nearly three years with products that offer a reliable basis for valuation. These markets are accessible to retail as well as professional investors. Given the strong ties and coordination among global agencies, it is surprising a forward-looking regulator such as the FCA did not find itself able to adapt these safeguards to the UK market."

GDF also points out that Germany's regulator BaFin recently approved a bitcoin exchange traded product. BTCetc Bitcoin ETP (BTCE) is an exchange traded cryptocurrency (ETC) that tracks the price of bitcoin. It is 100 per cent physically backed by bitcoin, and for every unit of BTCE, there is bitcoin stored in regulated, institutional-grade custody. BTCE was the first cryptocurrency ETP admitted to Xetra to be cleared centrally.

Lawrence Wintermeyer, executive co-chair of Global Digital Finance, says: "In stark contrast to other global regulatory trends with cryptoassets, the FCA's ban puts the UK out on its own in terms of taking a prohibitive stance. This is an unfortunate move following the UK Government's snub to fintech companies by initially excluding them from its Coronavirus Business Interruption Loan Scheme (CBILS). This surprising exclusion damaged the Government's credibility as a champion of fintech following more than a decade of promoting fintech competition as an antidote to the concentration risk of incumbent UK banks following the Financial Crisis. The FCA's decision to ban the sale of certain investment products linked to cryptocurrencies is yet another setback for the UK in trying to strengthen its position as a leading market for fintech and the digital asset markets.

"Some may wish to argue the moot point that the FCA's ban is good for retail customers, good for the financial services market, and good for the UK. We would most certainly disagree with this. What is unarguable is that digital is global, and that digital finance is global. The effectiveness of jurisdictional bans of this nature is questionable in a world where customers can find the products and services they choose on the internet, wherever these products and services come from, and this choice often drives customers offshore."

Read the original post:
FCA crypto ban is a setback for the UK in race to lead growing digital assets marketplace, says GDF - Wealth Adviser

Is It Time to Leave Open Source Behind? – Built In

Once there was a group of people who got tired of the status quo, so those people started their own community based on a commitment to freedom and equality. That community got bigger and bigger, until what was once revolutionary became widely accepted.

But there were problems. Big business had too much control, some people warned. Others thought the community's founding documents had become outdated: they served the needs of the people who wrote them, not the community in its current form.

This story could be about the United States, but it isn't. It's about open-source software.

What started as a small, homogeneous online community fed up with proprietary software has exploded into a mainstream framework that powers the tech giants in your stock portfolio and the mobile phone in your hand. Now, the open-source community is much bigger and (slightly) more diverse, but its inner workings remain largely the same.

And a lot of people think that's okay. The philosophical bedrock of free and open-source software, no hidden source code and no limitations on use, could be as legitimate today as it was when it was written.

Many others disagree. Big companies profit from the work of underpaid and overtaxed project maintainers, they argue. Some organizations take open-source tools and use them for unethical ends, and developers can't stop them. Real freedom, #EthicalSource activists like Coraline Ehmke claim, requires limitations.

So, if a large faction of open-source participants aren't happy with the state of things, why don't they just leave?

Read This, Too: Is Your Open-Source Code Fueling Human Rights Abuses?

* * *

This story is the fourth in a series on cultural battles facing the open-source community. You can read the first article, on ethics and licensure, here; the second article, on governance, here; and the third article, on the rights of end users, here.

"It's really, really hard to leave," Don Goodman-Wilson told me.

Goodman-Wilson is an engineer, open-source advocate, philosopher and former academic. His disenchantment with open source didn't happen overnight. Rather, it was a long process of noticing and questioning some assumptions he'd taken for granted.

"It's something that had been a long time coming for me," he said. "I was, very slowly, attending talks and feeling doubts rise within me over the years."

Now, he's joined with Ehmke and other #EthicalSource proponents to call for changes. Could they abandon traditional definitions of open source and make common-pool software their own way? Sure. But people have tried that, and it didn't go great.

Take King Games, which in May decided to list its game development engine Defold on GitHub for community collaboration.

"We are immensely proud to announce that @king_games has released the Defold game engine as open source on GitHub and transferred Defold to the Defold Foundation," Defold Engine tweeted.

Gamers rejoiced; then the fallout came.

"Can we discuss the license choice? I had missed this initially and thought it was [the open-source license] Apache 2.0, but I see now that it's custom," one user replied. "It means that it's not open source as per the [Open Source Initiative's] open-source definition."

That's because King, which presumably didn't want other gaming studios to take and profit from its code, released Defold under a modified license that prevented commercial reuse. That violates the definitions of free and open-source software as per the Free Software Foundation's four freedoms and the Open Source Initiative's (OSI's) open-source definition.

So, Defold Engine tweeted again five hours later: "We are humbled by the positive reactions to the news we shared earlier today but also sorry for misrepresenting the license under which we make the source code available. Defold is a free and open game engine with a permissive license, and we invite the community to contribute."

But that didn't do the trick.

"The use of the words 'open' and 'free,' and the 'derived from [open-source license] Apache' makes me upset," one user replied. "It is a blatant attempt to use someone else's good name."

So, Defold Engine tweeted again. And again. And again.

"Some thoughts on the open source discussions yesterday," the first tweet in a nine-tweet thread read. "There was no ill intent on our part when we said that Defold is open source. The source code is available on GitHub for anyone to play around with and hopefully contribute to. This is what we meant, nothing else."

Comments on that thread appear to have been disabled.

It's a familiar scene, Goodman-Wilson said: a person or organization fiddles with an open-source license and is met with righteous anger. That's what happened when Ehmke introduced the Hippocratic License, which prohibits the use of software for human rights abuses, although plenty voiced their support as well.

Open source's strength lies in its community. Without community buy-in, options are limited for people looking to expand or reimagine what open source means.

Reputation is another barrier to exit for open-source participants, Goodman-Wilson said.

Today, open source is often touted as a resume-builder, or a stepping stone to high-paying jobs with tech companies. For developers, that means creating a high-profile project, or even contributing to one, might mean the difference between writing your own ticket and languishing in software obscurity.

What makes a project high-profile is, invariably, adoption rates. The more people use your software, the more successful it's considered.

"You want that [adoption rate] number to go up and to the right, because you've been told over and over again that is the metric for success. And if you can't show that metric, then your project is not successful," Goodman-Wilson said.

That creates what Goodman-Wilson views as a problematic incentive: To boost adoption rates, developers must take care to appeal to corporate interests.

Corporations are notoriously risk-averse. OSI worked hard to bring them into the open-source fold, and their involvement has largely been limited to projects with standard, approved licenses. If developers built some software and slapped on a modified license with caveats for ethics or commercial use, corporations would balk. By sticking with OSI-approved licenses, developers greatly improve their chances of getting their software into corporate tech stacks.

That means higher adoption, more repute and, potentially, more money. Split with OSI, and those benefits of open-source involvement all but disappear.

What happens when an open-source developer creates a successful project with a relatively high adoption rate? They might end up with a job offer. Or, they might get stuck maintaining that codebase for little or no pay.

When Goodman-Wilson was working on GitHub's developer relations team, the company organized a series of meetings for open-source project maintainers to discuss their experiences and make recommendations for improvements. The last one was held in 2019 in Berlin.

"Those conversations were eye-opening. Holy crap. A lot of the complaints were around like, 'I feel taken advantage of. I feel like my time is being given freely to people who do not value it,' typically large corporations," Goodman-Wilson said. "Based on those conversations, it felt like [open source] had come full circle and was now a system that, although initially intended to overturn power hierarchies in the tech world, actually ended up reinforcing them."

The accompanying report named frequent and widespread burnout as a cause for concern, as maintainers cited unmanageable volumes of work and problems with competing interests.

Maintainer burnout is one issue that arises when corporations can dip into the open-source pool with few limitations. But companies can also toss things into the pool.

Often, those contributions are extremely helpful. Tech entrepreneurs rely on open source to spin up new and innovative offerings. Google's release of Kubernetes as open source, for example, changed the game for cloud-native projects, and TensorFlow laid the foundation for accessible neural network technology.

Other times, the effects are mixed. React, for instance, is a Facebook-maintained open-source library that's served as a powerful recruiting tool: as React grew in popularity, Facebook engineering grew in esteem. But React has also been accused of harboring toxic community members and attitudes, leading to the departure of several prominent contributors.

Despite some systemic flaws and personal risk, the desire for industry success and peer repute drives developers to stick with open source. It also drives them to build software that will get them noticed.

Like Avatarify, a program by developer Ali Aliev that uses artificial intelligence to superimpose one face onto another during video capture. Avatarify grabbed attention because it is the first software to create semi-convincing real-time deepfakes. Check out this demo, in which Elon Musk bombs a Zoom meeting.

"It's really cool, in some very sad sense of the word cool," Goodman-Wilson said.

The implications of technology like this are complicated. On one hand, it is really cool. Combined with a convincing audio deepfake to mask the impostors voice, perhaps a person really could convince their friends that a celebrity had joined their Zoom call. Or they could make and release a video of a real politician saying fake things. They could spread false information. Or incite violence.

It's fair to say that, in the wrong hands, a tool like Avatarify goes from fun to scary. And, because Aliev released it under a traditional open-source license, anyone could take and use its technology.

"[Aliev] gained reputation from doing it, so he was incentivized to work on this release in open source," Goodman-Wilson said. "On the other hand, now we've got state actors that would love to have this sort of tool available to them. So, knowing that there are oppressive, unjust organizations that can dip into the pool of open source and take from it what they need is actually deeply terrifying to a lot of developers."

What Goodman-Wilson is describing has actually happened. Developers who oppose war, for instance, have been alarmed to learn that the U.S. Air Force and Navy use Kubernetes, an open-source project, to run combat aircraft and warships. For developers outside the U.S., these connections may be particularly disturbing.

While giving a talk in Amsterdam to a group of developers who worked on JavaScript extension TypeScript, Goodman-Wilson presented a U.S. Air Force recruiting website with a TypeScript dependency. The website is a sort of drone flight simulator, and visitors fly through an abstracted city, shooting at blips of light that represent insurgents.

"A lot of people in the room were from the Netherlands and unknowingly had their code used by this Air Force recruiting site, and the horror in the room was palpable," Goodman-Wilson said. "The last thing that they expected was to be working on a language extension and find that it was being used to recruit drone pilots."

Read This, Too: Open-Source Governance, Meet Feminist Economics

"There's this huge disconnect between what we think we're doing when we're contributing to open source, which is, quote, unquote, making the world a better place, and the reality of the incentive and access structure behind open source, which is such that, who knows if what you're building is being turned into a weapon?" Goodman-Wilson said.

But is it a developer's fault if a totally separate entity uses something they helped build for unethical ends? Won't bad actors get their hands on the tools they need, Hippocratic License or no Hippocratic License?

Yes, to both, Goodman-Wilson told me. Organizations that hurt people will always get the software they need, but with formal ethical boundaries around open-source resources, they'd have to pay for that software rather than taking it for free. From a moral perspective, that distinction matters, he argued, because open-source developers would no longer share responsibility for abuses.

"If we think of ethics as a causal relationship, moral actions are ones whose outcomes we can influence," he said. If a dictator in a faraway country uses a tool we've never heard of to aid in human rights abuses, we shouldn't feel responsible. But if an organization uses a piece of software we helped build to conduct drone strikes on civilians, we might feel some sense of responsibility.

"To the extent that I want to take responsibility for my own actions and decisions, I might want to find ways to cut down that causal chain," he said. "Even if they'll just take that software from somewhere else, at least I have cut off one avenue of access that links back to me. Then you convince enough people to do that, and, as a movement, you begin to cut off more and more avenues."

For Goodman-Wilson, that movement looks like #EthicalSource and Hippocratically licensed software. But cutting off access for some while maintaining the spirit that made open source special, access for all, is profoundly difficult.

It's a balance Goodman-Wilson and other open-source activists are continually trying to strike.

The story of open source feels like the story of communities.

They start small and single-minded. But as they grow, factions form and power dynamics arise. New people show up, bringing new ideas. And eventually, the community is faced with a decision: Should we evolve, or hold fast to the principles we started with?

Ehmke, Goodman-Wilson and others are asking for evolution, and they've encountered plenty of obstacles. So far, the #EthicalSource movement has been limited to a tweet here, a presentation there, and many behind-the-scenes conversations. Potential allies are afraid to put their reputations and career prospects at risk, Goodman-Wilson said, which limits the movement's scope.

"What do we need to do to create an atmosphere where people aren't afraid to speak out?" he said. "I don't know the answer to that, but that's a question a lot of us are asking. And I would really like more people to ask."

For now, #EthicalSource will continue to promote unapproved models and licenses and hope that open source's governing bodies come around. But its proponents might not wait forever.

"I've certainly never built a political movement before, but I think a lot of us are starting to see this as a political movement that needs to be built, instead of just throwing some good arguments out there and seeing what sticks," Goodman-Wilson told me.

In the end, open-source participants are free to choose where they stand. Their decisions will affect each and every one of us.

Read This, Too: The Rules of Open Source No Longer Apply

Read this article:

Is It Time to Leave Open Source Behind? - Built In

Three best practices for responsible open source usage in the COVID-19 era – Help Net Security

COVID-19 has forced developer agility into overdrive, as the tech industry's quick push to adapt to changing dynamics has accelerated digital transformation efforts and necessitated the rapid introduction of new software features, patches, and functionalities.

During this time, organizations across both the private and public sector have been turning to open source solutions as a means to tackle emerging challenges while retaining the rapidity and agility needed to respond to evolving needs and remain competitive.

Since well before the pandemic, software developers have leveraged open source code as a means to speed development cycles. The ability to leverage pre-made packages of code rather than build software from the ground up has enabled them to save valuable time. However, the rapid adoption of open source has not come without its own security challenges, which developers and organizations must address.

Here are some best practices developers should follow when implementing open source code to promote security:

First and foremost, developers should create and maintain a record of where open source code is being used across the software they build. Applications today are usually designed using hundreds of unique open source components, which then reside in their software and workspaces for years.

As these open source packages age, there is an increasing likelihood of vulnerabilities being discovered in them and publicly disclosed. If the use of components is not closely tracked against the countless new vulnerabilities discovered every year, software leveraging these components becomes open to exploitation.

Attackers understand all too well how often teams fall short in this regard, and software intrusions via known open source vulnerabilities are a highly common source of breaches. Tracking open source code usage, along with vigilance around updates and vulnerabilities, will go a long way in mitigating security risk.
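As a concrete illustration of such a record, here is a minimal Python sketch that snapshots every package installed in a Python environment into a JSON inventory. It is a stand-in for real software composition analysis or SBOM tooling (CycloneDX or SPDX generators, for example); the file name and JSON layout are arbitrary choices for this example.

```python
# A minimal sketch of an open source component inventory for a
# Python environment. Real deployments would use dedicated SBOM
# tooling; this only illustrates the record-keeping idea.
import json
from datetime import datetime, timezone
from importlib import metadata

def build_inventory(path: str = "components.json") -> None:
    """Record every installed distribution with its version so it
    can later be checked against newly disclosed vulnerabilities."""
    components = sorted(
        {(dist.metadata["Name"], dist.version)
         for dist in metadata.distributions()}
    )
    record = {
        "generated": datetime.now(timezone.utc).isoformat(),
        "components": [{"name": n, "version": v} for n, v in components],
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2)

if __name__ == "__main__":
    build_inventory()
```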

Aside from tracking vulnerabilities in the code that's already in use, developers must do their research on open source components before adopting them in the first place. While an obvious first step is ensuring that there are no known vulnerabilities in the component in question, other factors focused on the longevity of the software being built should be considered as well.

Teams should carefully consider the level of support offered for a given component. It's important to get satisfactory answers to questions such as how actively the project is maintained, how quickly reported vulnerabilities are patched, and whether the component is likely to still be supported for the lifetime of the software being built.

It's no secret that COVID-19 has altered developers' working conditions. In fact, 38% of developers are now releasing software monthly or faster, up from 27% in 2018. But this increased pace often comes paired with unwanted budget cuts and organizational changes. As a result, the imperative to do more with less has become a rallying cry for business leaders. In this context, it is indisputable that automation across the entire IT security portfolio has skyrocketed to the top of the list of initiatives designed to improve operational efficiency.

While already an important asset for achieving true DevSecOps agility, automated scanning technology has become near-essential for any organization attempting to stay secure while leveraging open source code. Manually tracking and updating open source vulnerabilities across an organization's entire software suite is hard work that only increases in difficulty with the scale of an organization's software deployments. And what was inefficient in normal times has become unfeasible in the current context.

Automated scanning technologies alleviate the burden of open source security by handling processes that would otherwise take up precious time and resources. These tools are able to detect and identify open source components within applications, provide detailed risk metrics regarding open source vulnerabilities, and flag outdated libraries for developers to address. Furthermore, they provide detailed insight into thousands of public open source vulnerabilities, security advisories and bugs, to ensure that when components are chosen they are secure and reputable.

Finally, these tools help developers prioritize and triage remediation efforts once vulnerabilities are identified. Equipped with the knowledge of which vulnerabilities present the greatest risk, developers are able to allocate resources most efficiently to ensure security does not get in the way of timely release cycles.
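To make the scanning step concrete, here is a hedged Python sketch that checks each component recorded in the inventory above against the public OSV vulnerability database (osv.dev). The endpoint and response shape follow OSV's published query API; the inventory file name and the PyPI-only ecosystem are assumptions for this example, and real tools add batching, caching, and severity-based prioritization.

```python
# A sketch of automated vulnerability checking: each component from
# the inventory file is queried against the public OSV database.
# Rate limiting and retry logic are omitted for brevity.
import json
import requests

OSV_URL = "https://api.osv.dev/v1/query"

def check_component(name: str, version: str, ecosystem: str = "PyPI") -> list:
    """Return the list of known OSV vulnerabilities for one package."""
    resp = requests.post(OSV_URL, json={
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }, timeout=10)
    resp.raise_for_status()
    return resp.json().get("vulns", [])

def scan(inventory_path: str = "components.json") -> None:
    with open(inventory_path) as f:
        inventory = json.load(f)
    for comp in inventory["components"]:
        for v in check_component(comp["name"], comp["version"]):
            # OSV entries carry an id (e.g., GHSA-... or PYSEC-...)
            # and usually a human-readable summary.
            print(f'{comp["name"]} {comp["version"]}: '
                  f'{v["id"]} {v.get("summary", "")}')

if __name__ == "__main__":
    scan()
```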

When it comes to open source security, vigilance is the name of the game. Organizations must be sure to reiterate the importance of basic best practices to developers as they push for greater speed in software delivery.

While speed has long been understood to come at the cost of software security, this type of outdated thinking cannot persist, especially when technological advancements in automation have made such large strides in eliminating this classically understood tradeoff. By following the above best practices, organizations can be more confident that their COVID-19 driven software rollouts will be secure against issues down the road.

Read more from the original source:

Three best practices for responsible open source usage in the COVID-19 era - Help Net Security

DeepMind open-sources the FermiNet, a neural network that simulates electron behaviors – VentureBeat

In September, Alphabet's DeepMind published a paper in the journal Physical Review Research detailing Fermionic Neural Network (FermiNet), a new neural network architecture that's well-suited to modeling the quantum state of large collections of electrons. The FermiNet, which DeepMind claims is one of the first demonstrations of AI for computing atomic energy, is now available in open source on GitHub and ostensibly remains one of the most accurate methods to date.

In quantum systems, particles like electrons don't have exact locations. Their positions are instead described by a probability cloud. Representing the state of a quantum system is challenging, because probabilities have to be assigned to possible configurations of electron positions. These are encoded in the wavefunction, which assigns a positive or negative number to every configuration of electrons; the wavefunction squared gives the probability of finding the system in that configuration.

The space of possible configurations is enormous: represented as a grid with 100 points along each dimension, the number of electron configurations for the silicon atom would be larger than the number of atoms in the universe. Researchers at DeepMind believed that AI could help in this regard. They surmised that, given neural networks have historically fit high-dimensional functions in artificial intelligence problems, they could be used to represent quantum wavefunctions as well.

Above: Simulated electrons sampled from the FermiNet move around a bicyclobutane molecule.

By way of refresher, neural networks contain neurons (mathematical functions) arranged in layers that transmit signals from input data and slowly adjust the synaptic strength, i.e., the weights, of each connection. That's how they extract features and learn to make predictions.

Because electrons are a type of particle known as fermions, which include the building blocks of most matter (e.g., protons, neutrons, quarks, and neutrinos), their wavefunction has to be antisymmetric. (If you swap the positions of two electrons, the wavefunction gets multiplied by -1, meaning that if two electrons are on top of each other, the wavefunction and the probability of that configuration will be zero.) This led the DeepMind researchers to develop a new type of neural network that is antisymmetric with respect to its inputs, the FermiNet, which has a separate stream of information for each electron. In practice, the FermiNet averages together information from across streams and passes this information to each stream at the next layer. This way, the streams have the right symmetry properties to create an antisymmetric function.
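The role of antisymmetry can be seen in a toy example. The sketch below is not DeepMind's code: the "orbitals" are arbitrary stand-ins for the learned per-electron streams. It builds a matrix of orbital values, one row per orbital and one column per electron, and takes its determinant. Swapping two electrons swaps two columns, which flips the determinant's sign, exactly the behavior the wavefunction requires.

```python
# Toy illustration (not DeepMind's code) of determinant-based
# antisymmetry: exchanging two electrons flips the sign of psi.
import numpy as np

rng = np.random.default_rng(0)

def orbital_matrix(electrons: np.ndarray) -> np.ndarray:
    """Entry (i, j) is 'orbital' i evaluated at electron j's position.
    These fixed sinusoids are hypothetical stand-ins for the learned
    per-electron streams of a real network."""
    n = len(electrons)
    return np.array([[np.sin((i + 1) * x.sum()) for x in electrons]
                     for i in range(n)])

electrons = rng.normal(size=(4, 3))           # 4 electrons in 3D space
psi = np.linalg.det(orbital_matrix(electrons))

swapped = electrons.copy()
swapped[[0, 1]] = swapped[[1, 0]]             # exchange two electrons
psi_swapped = np.linalg.det(orbital_matrix(swapped))

assert np.isclose(psi_swapped, -psi)          # antisymmetry holds
```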

Above: The FermiNet's architecture.

The FermiNet picks a random selection of electron configurations, evaluates the energy locally at each arrangement of electrons, and adds up the contributions from each arrangement. Since the wavefunction squared gives the probability of observing an arrangement of particles in any location, the FermiNet can generate samples from the wavefunction directly. In effect, the inputs used to train the neural network are generated by the neural network itself.
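A toy version of that sampling-and-averaging loop, again not DeepMind's code, looks like the following: a Metropolis random walk draws positions with probability proportional to the wavefunction squared, and the energy estimate is the average "local energy" over the samples. The trial wavefunction here is a simple Gaussian for a 1D harmonic oscillator, chosen so the result can be checked against the known ground-state energy of 0.5 (atomic units).

```python
# Toy variational Monte Carlo sketch. With trial parameter a != 1,
# the estimate lands slightly above the true energy of 0.5, as the
# variational principle requires.
import numpy as np

rng = np.random.default_rng(1)
a = 0.9                                     # trial parameter (a = 1 is exact)

def log_psi(x: float) -> float:
    return -0.5 * a * x**2                  # log of psi(x) = exp(-a x^2 / 2)

def local_energy(x: float) -> float:
    # E_L = -0.5 * psi''/psi + 0.5 x^2, with psi''/psi = a^2 x^2 - a.
    return -0.5 * (a**2 * x**2 - a) + 0.5 * x**2

x, energies = 0.0, []
for step in range(50000):
    proposal = x + rng.normal()             # random-walk proposal
    # Accept with probability psi^2(proposal) / psi^2(x).
    if np.log(rng.random()) < 2 * (log_psi(proposal) - log_psi(x)):
        x = proposal
    if step > 1000:                         # discard burn-in samples
        energies.append(local_energy(x))

print("estimated energy:", np.mean(energies))   # ~0.503, just above 0.5
```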

"We think the FermiNet is the start of great things to come for the fusion of deep learning and computational quantum chemistry. Most of the systems we've looked at so far are well-studied and well-understood. But just as the first good results with deep learning in other fields led to a burst of follow-up work and rapid progress, we hope that the FermiNet will inspire lots of work on scaling up and many ideas for new, even better network architectures," DeepMind wrote in a blog post. "We have just scratched the surface of computational quantum physics, and look forward to applying the FermiNet to tough problems in material science and condensed matter physics as well. Mostly, we hope that by releasing the source code used in our experiments, we can inspire other researchers to build on our work and try out new applications we haven't even dreamed of."

The release of the FermiNet code comes after DeepMind demonstrated its work on an AI system that can predict the movement of glass molecules as they transition between liquid and solid states. (Both the techniques and trained models, which were also made available in open source, could be used to predict other qualities of interest in glass, DeepMind said.) Beyond glass, the researchers asserted the work yielded insights into general substance and biological transitions, and that it could lead to advances in industries like manufacturing and medicine.

Read more here:

DeepMind open-sources the FermiNet, a neural network that simulates electron behaviors - VentureBeat

A new way to think about your favorite games code – The Verge

It's surprisingly hard to archive a video game. Cartridges decay, eventually; discs become unreadable as their plastic degrades. Source code is lost to corporate mergers and acquisitions. But what's most dangerous to preserving game history isn't a physical or corporate consideration: it's the prevailing attitude that games are playful, evanescent, and therefore not worth archiving.

Obviously that's not true, and games deserve critical historical consideration, the kind that other, older mediums get. Frank Cifaldi and Kelsey Lewin, co-directors of the Video Game History Foundation, are two of the people leading that charge. I spoke with them a little while ago about preserving video game history, and their new program, the Video Game Source Project, which takes as its footing the idea that there's no better way to study a video game than to access its raw material.

"There's only so much you can learn from studying the final product," they say, because studying the final iteration of a creative project leaves out the hows and whys that brought it to life in the first place. And Lewin and Cifaldi have started with a classic: LucasArts' The Secret of Monkey Island, which is celebrating its 30th anniversary this month.

"First of all, [the project is] just kind of a call to arms to everyone that this stuff is really important and useful. And, at least when it comes to the older material, is rapidly dying," says Cifaldi, pointing out that most game companies don't have source code archives. "But I think, most importantly, we want to normalize the availability and study of video games' source material, because right now, video game source is just a very proprietary trade secret."

Which is true! And in modern gaming, cloning is a big deal, even leaving out the issues with source code. "But from our perspective, it's like, if you haven't been doing anything with this game for 10, 20 years, why, why the lockdown?" he says.

It's a great question, one that makes me think a lot about the traditional stories of video game archiving, or rather, I should say, one story in particular, the one about E.T. and the landfill. See, the game E.T. The Extra-Terrestrial was a 1982 adventure game movie tie-in, developed for the Atari in a blazing 36 days. When it hit store shelves that Christmas, it was an unprecedented flop; Atari buried the unsold inventory somewhere deep in the New Mexico desert, where it was dug up in 2014 by a documentary crew. The next year, the game entered the Smithsonian's collection, and they produced an episode of a podcast in 2019 to retell the legend.

It's a wonderful story, especially when you consider Atari's fortunes in the aftermath. (Spoiler: they weren't great.) But the problem is, it's basically the only one people know. Cifaldi and Lewin's real goal is to bring more fascinating stories about this kind of history to the public, and to preserve the raw materials that make them possible. (The Foundation's blog is excellent, at least when it comes to great stories.)

This isn't without controversy. The Nintendo Gigaleak, as it's been called, happened earlier this summer and exposed a rich trove of new data about classic games. It also exposed a moral dilemma: if the riches in the leak were obtained illegally, as they likely were, does that change how historians think about and use what they learn from it? That answer is an individual calculus, of course. But on the other hand: if Nintendo's secrets were less closely held, the leak wouldn't be quite as monumental.

I don't know that there's a tidy answer here. What's clear, though, is that historical research into games should be something companies expect and prepare for. The kind of work the Video Game History Foundation does is important and necessary, even if the industry doesn't quite appreciate it. They just want a more open world.

"And just to be clear, I mean, I don't think we expect a world where everyone's just like, 'Great, let's open source everything,'" says Lewin. "It's unrealistic," agrees Cifaldi.

"But what is realistic is normalizing that someone could actually study this, just for historical purposes, that they should be able to look at this and learn things from it and tell the stories that they find within it," Lewin says.

The Secret of Monkey Island is a seminal game, not least for Cifaldi, who cites it as "maybe my favorite game, depending on which day you ask me." He says it taught him what games could be like: that they could have funny, memorable worlds and characters. The Foundation received the game the way they receive a lot of them, which is to say, on the sly. (Cifaldi identifies this as another problem the Foundation is out to solve: making it safe for people to donate games and such to the archive, even if they don't own the rights.)

When they got in touch with Lucasfilm about making content around the game, however, the studio was supportive. "I mean, they're the guys that make Star Wars, right, they understand," says Cifaldi. "They understand that fans really enjoy this behind-the-scenes material, and that it directly benefits them if people are talking about it."

This month, the Video Game History Foundation plans to reveal what it's learned about The Secret of Monkey Island to fans, historians, and everyone else who might be interested in the hidden corners of a 30-year-old video game. "We are able to reconstruct deleted scenes from the games that no one's ever seen before, because that data literally isn't on the disk that you get, because it's not compiled into the game," says Cifaldi. (The Foundation has also gotten Ron Gilbert, the creator of the game, to join them in a livestream happening on October 30th.)

Cifaldi and Lewin's perspective can be summed up pretty simply: they want to expand the kinds of stories we can tell about video games, as both fans and historians. "We've only really had crumbs of games' development through, you know, finding an unfinished version that was maybe sent to a magazine to preview, or through seeing what accidentally got compiled into the final game," says Cifaldi.

That, he says, has tainted how fans view the development process, as something perhaps linear, instead of as a gradual pileup of creative decisions. "I think a really interesting thing to come out of this conversation with Ron is being able to sort of show that when we find this unused character in the source code, it's not, like, this was a character with a fleshed-out biography," Cifaldi says. "Sometimes an unused pirate is just an unused pirate."

"It kind of sometimes tends to put false importance on something," Lewin agrees. If you only have two clues about something, you can come up with a wild variety of scenarios that those two clues fit into. If you have 20 or 30 clues, on the other hand, the realm of possibility narrows. "If you saw it in an earlier build of the game, you might be led to believe that it's something that it absolutely wasn't," Lewin says.

To take a real example from The Secret of Monkey Island: the collapsing bridge. As Cifaldi explains it, there are frames of animation for a bridge collapsing in the original artwork for the game, but there's no code that calls for it. It's just there. So they took it to Gilbert, the creator, and asked about it. "Ron's like, 'Oh, yeah, that's not anything that was ever in the game. I'm not really sure why that's there,'" says Cifaldi.

They also had access to Gilbert's sketchbook from when he was making the game, which contained the raw ideas that eventually made it into the finished product. "There is a page that just says, 'booby trap on bridge?'. And I think that's, like, all it ever was," Cifaldi continues. "Like, the game wasn't designed enough, but artists need to be working on something. So it's like, I don't know, work on a booby-trapped bridge, and maybe we'll revisit it, and they never did. It's not a cut puzzle; it doesn't mean anything other than it was an idea that didn't quite make it."

That's part of the creative process, Cifaldi says. "You're collaborating, there's a lot of people involved, and you try ideas out, you rough draft them, and then they might get cut before you even try to use them." A collapsing bridge is just a collapsing bridge.

In this, demystifying the game development process, the Video Game History Foundation is something of a pioneer, one that's actively writing the rules for archiving this kind of art as it goes. Its source project is a holistic examination of how the games you love actually get made, which is as important as the games themselves. In our conversation, Cifaldi likened his work to archaeology.

"If you're able to access raw source code for a game, bare minimum, you can understand how the systems work and talk to each other and things like that," he says. "But if you're a good historian, it's a dig. When you're looking at a mummy, you don't have access to that person when they were alive or whatever, right? But you can find clues that help you understand who they were, and what their social status was, and things like that," he says. And those clues eventually become a story, one we might tell, years later, on a podcast for the Smithsonian.

"It's just kind of a new idea in the world to have source material for any kind of software in an archive. And I think it's going to be a rough road ahead," Cifaldi says. "But this has to start somewhere. And it's starting now."

Link:

A new way to think about your favorite games code - The Verge