Three things central bankers can learn from Bitcoin – MIT Technology Review

For central bankers, the game changed last summer when Facebook unveiled its proposal for Libra. Many have responded by seriously exploring whether and how they should issue their own digital money.

Arguably, though, the more fundamental change is more than a decade old. It was Bitcoin that first made it possible to transfer digital value without the need for an intermediary, a model that competes directly with the traditional financial system. The network's resilience against attackers suggests there is another way of setting up the system.

Last weekend at the MIT Bitcoin Expo held on campus in Cambridge, Massachusetts, I sat down with experts familiar with central banking as well as cryptocurrency. We discussed the practical concerns central bankers should be considering as they begin to design their own digital money systems. One common theme: central bankers have plenty to learn from Bitcoin.


Security can be achieved through resilience.

The US Federal Reserve has no current plans to issue a central bank digital currency (CBDC). But if it ever did, nine out of the top 10 requirements would pertain to security, said Bob Bench, director of applied fintech research at the Boston Fed. Because "the second that thing goes live," he said, "it's the most attacked program in the world."

Bitcoin, with its mix of transparency, cryptography, and economic incentives, has something to teach central bankers about data security, according to Robleh Ali, a research scientist at the MIT Media Lab's Digital Currency Initiative. "It's a system that exists in a very hostile environment, and it's proved to be resilient to that," said Ali. It's also a fundamentally different way of achieving security compared with how it is done in the traditional system: "Rather than try to hide the data behind walls, it's trying to make the system so it's inherently resilient."

Keep it simple.

CBDCs can be thought of as third-generation digital currencies, said Ali. If Bitcoin is the first generation, Ethereum and other so-called smart-contract platforms, which include relatively complicated programming languages, can be seen as the second generation. While it may be tempting to add even more bells and whistles to a CBDC system, that would be the wrong approach, Ali said, because the more complexity you have, the more opportunities you give attackers to break in. "What you want in the third generation is a much simpler system even than Bitcoin," he said. "It's more about taking things away than adding things, and I think in terms of making it secure, that should be the mindset."

Privacy is going to be very tricky.

Ali said he expects not all central banks that choose to issue digital currency will use the same system, but many will likely pursue a hybrid between blockchain-based cryptocurrencies like Bitcoin and more traditional, centralized systems.

Such permissioned blockchain systems, also called distributed ledger technologies, could give central banks new tools, like the ability to program the currency to perform specific functions, said Sonja Davidovic, an economist at the International Monetary Fund. For instance, it may let banks automate their responses to certain kinds of economic changes and give central bankers more precise control over the money supply. They would also have much more detailed visibility into the goings-on in their respective economies. There's a problem, however, said Davidovic: "We haven't really seen yet how privacy could be protected."

Bitcoin privacy is tricky. Though users are pseudonymous, its public accounting ledger, called the blockchain, makes all transactions traceable. How would a blockchain-based CBDC system keep transaction data private? How would it represent people on the blockchain? Unless the system allows only small transactions, users will have to identify themselves somehow in order to comply with anti-money-laundering rules. How will their identity data be protected from theft, fraud, or even government surveillance?

In the cryptocurrency world, so-called privacy coins like Zcash and Monero, which use advanced cryptographic techniques to hide blockchain transaction data from public view, have arisen as alternatives to Bitcoin. But even if central banks are able to do something similar, it still might be possible to construct profiles of people based on their metadata, said Davidovic: "I'm not entirely sure that this is a problem that technology alone can solve."


Only 6% of the ad industry is happy with the digital advertising ecosystem – AdNews

Just 6% of the industry is satisfied with the current digital advertising ecosystem, according to a survey by Industry Index.

The figures come as the Australian Competition and Consumer Commission (ACCC) kicks off two inquiries into online advertising, with one focusing on the adtech industry.

The survey was conducted in partnership with TV advertising solutions company MadHive and AdLedger, a nonprofit research consortium which has members such as Publicis Media, GroupM and OMG.

More than 100 brand marketers, agencies and digital publishers were surveyed, with 6% saying they're satisfied with the current digital advertising ecosystem. Another 92% believe there is a need for industry-wide standardisation.

"Digital advertising is still suffering from the same issues of transparency, fraud and fragmentation," Christiana Cacciapuoti, executive director at AdLedger, says.

"And it's because we just keep slapping band-aids on a fundamentally broken system, when we need to be developing a new infrastructure that's driven by innovative technologies."

The Australian watchdog is calling for feedback from the industry as it begins its inquiry into the sector, which it has described as opaque. It's expected to hand down its interim report by December.

"The adtech ecosystem absolutely needs to be changed," Alysia Borsa, executive vice president and chief business and data officer at Meredith Corporation, says.

"There is a lack of transparency which leads to fraud, which leads to low quality, which leads to poor performance. It's a really bad cycle."

The survey also found that 83% of respondents believe cryptography can be used to create transparencies and efficiencies, most often agreeing that cryptography could address problems associated with fraud (66%) and improve the ability to track results (66%).

"Sooner or later, the industry is going to realise that this dysfunctional relationship has got to end, and the only way to fix it is with next-generation technologies," MadHive CEO Adam Helfgott says.

"And with blockchain and cryptography already weeding out fraud and solving similar issues on OTT, it's only a matter of time till the industry stands together and overhauls the system."




‘Crypto’ is dead, long live the digital transaction era – CoinGeek

Using the word "crypto" to describe Bitcoin is an inaccuracy that comes from early misunderstandings and a false connection to distinctly different systems. As we move into an era where Bitcoin will begin to power the digital transaction economy, it's time to let go and use more appropriate terms to describe this industry. So, "digital currency" it is.

On that note, "mining" also needs to go. Transaction processors play a far more important role than simply digging up new coins. Mining only applies to the disappearing block subsidy, which is only there to prime the pump; the true function of the nodes is transaction processing, and this takes over as the subsidy dies. No more coal mine stock photos or gold mining puns in articles; let's give them the respect they deserve.

After 11 years of Bitcoin, we've grown accustomed to seeing these expressions. But something never felt quite right about them. They seemed forced, bolted on, a rushed attempt to describe a new industry using words people could somehow relate to. And over those years, "crypto" and "cryptocurrency" have also acquired the stigma of anonymity, crime, and get-rich-quick schemes like ICOs.

"Digital transactions" and "transaction processors": get used to those expressions, because you'll be seeing them a lot more often in the future.

Bitcoin is not like this

Bitcoin creator Dr. Craig Wright described the need for more accurate descriptors, saying:

"Cryptocurrency is linked to a lot of discredited systems (right back to eCash) that are linked to black market and illegal use cases (drugs, money laundering etc). Next, Bitcoin is not encrypted. Blind cash systems such as Digicash used cryptographic constructs directly and are associated with encrypted transactions that cannot be traced. Bitcoin is not like this."

"Bitcoin is sent in clear text. The hash is an index to the identity exchange between peers (people) but nothing is secret. It is private but not hidden."

There's another, er, key difference. Dr. Wright continued:

"With encrypted systems, the loss of the key means the inability to access data. Bitcoin is not encrypted. The transaction is published publicly. So, nothing is going to stop recovery if the nodes agree (nodes enforce rules and courts issue rules)."

"Even the NSA cannot force access to an encrypted file when the private key is destroyed, but, again, Bitcoin is clear text and public. So, losing a key doesn't mean the transaction is lost."

"I used digital signature algorithms in Bitcoin, but this is still not cryptography. It is a system that uses the same maths in a new manner."

He also noted that Bitcoin was never a new technology. In his 1996 piece "The Wild, Wild Web," Gregory Spears described "web IPOs," which used tokens to raise funds. We now call these ICOs, but the difference is that a blockchain stops the Ponzi schemes from deleting logs. Dr. Wright said:

"Token security offers and capital raising dates back decades. Bitcoin was never a new technology in this area."

In the Nineties

"Cryptocurrency" is a word that comes from Dr. Wright's favorite decade, the 1990s. Early allusions to the term run alongside "untraceable" and "anonymous" money, which has shown itself to be undesirable. Interestingly, some of the first published references come from those concerned with investigating its uses. There's the NSA's "How to Make a Mint: the Cryptography of Anonymous Electronic Cash" from 1996, also referenced in The American Law Review in 1997.

It's a difficult word to say, and one that causes outsiders' eyes to glaze over. If they can relate to it at all, their minds subconsciously connect it to more negative aspects of the past like scams, bubbles, and crime.

Writers also know it's a difficult word to type. If you don't believe us, search the web for "crypocurrency" and "crytocurrency" and see how many results you get.

We know you'll check the whitepaper, so here it is

For the record, Satoshi Nakamoto's 2008 Bitcoin whitepaper does not refer to "cryptocurrency" or "miners" the way they're commonly used in today's discourse. There are just two sentences that allude to these terms. This one:

"What is needed is an electronic payment system based on cryptographic proof instead of trust"

And this one:

"The steady addition of a constant amount of new coins is analogous to gold miners expending resources to add gold to circulation."

Bitcoin is based on cryptography, but it is not itself a form of cryptography and it doesn't function the same way. Contrary to popular belief, it is not secured by cryptography either. Bitcoin has an economic model secured by incentives and costs: transaction processors do their job for profit, and hacking or taking over the network is prohibitively expensive.

Cryptographers themselves have never liked the word "crypto" for digital transaction networks. That abbreviation describes their field, and many have expressed displeasure at it being reapplied to just one specific application. And "miners"? Like Satoshi said, it's only meant to be an analogy, a comparison to allow newcomers to more easily grasp what's going on.

We're happy for the real cryptographers to have their word back, and restore it to its proper meaning.

Bitcoin and the digital transaction industry are mature now. The industry is ready to move on to greater challenges, and in doing so must discard the expressions that narrowed its mission so much. It gives us great pleasure to announce that "crypto" is dead; long live the digital transaction era.



Disruptive Defenses Are The Key To Preventing Data Breaches – Forbes

A report from DLA Piper states that more than 160,000 data breach notifications have been reported across 28 nations in the European Union since the General Data Protection Regulation (GDPR) went into effect in May 2018 -- an average of more than 260 data breaches per day. And, when you consider that since California's 2004 law on privacy breaches, over 9,000 breaches have been recorded and 11.5 billion records have been exposed, it is evident that threats against sensitive data are unprecedented.

Is this the new normal? Can there be any expectation of security and privacy when even the most stringent of data privacy regulations appear to have little effect?

Companies, government agencies and consumers must change their behavior if they expect to stem this tide. They must adopt disruptive defenses to make it extremely hard for attackers to compromise data.

What is a disruptive defense? It is an uncommon defense, based on existing industry standards, that raises application security to higher levels than what is currently used by most applications.

There are six disruptive defenses that, when deployed, create significant barriers to attackers. They are as follows:

1. Eliminate shared-secret authentication schemes.

These include passwords, one-time PINs, knowledge-based authentication, etc. They should be replaced with public key cryptography-based authentication that uses cryptographic hardware to protect keys.

Public key cryptography authentication (also known as "strong authentication") does not store secrets on the server. The secret remains with the user, stored in special hardware available on business desktops, laptops, modern mobile phones, smartcards and security keys. This is a modern authentication standard that eliminates passwords and is supported by all major operating systems as well as browsers. According to NIST, it provides the "highest assurance" among currently available authentication technologies. Eliminating a 1960s authentication scheme on a 21st-century application should be the first defensive step for every web application.
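To make the challenge-response idea concrete, here is a minimal sketch in Python using the pyca/cryptography library. The enrollment/login split, the variable names, and the choice of Ed25519 are illustrative assumptions on our part, not a prescription from the article; a real deployment would keep the private key inside a TPM, secure enclave, or security key.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrollment: the key pair is generated on the user's device (ideally inside
# secure hardware); only the PUBLIC key is registered with the server,
# so there is no shared secret for an attacker to steal server-side.
device_key = Ed25519PrivateKey.generate()
registered_public_key = device_key.public_key()

# Login: the server sends a fresh random challenge, the device signs it,
# and the server verifies the signature against the registered public key.
challenge = os.urandom(32)
signature = device_key.sign(challenge)

try:
    registered_public_key.verify(signature, challenge)
    print("user authenticated")
except InvalidSignature:
    print("authentication rejected")
```

Because the server stores nothing secret, a database breach yields only public keys, which is precisely what makes this defense disruptive to attackers.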

2. Ensure the provenance of a transaction before it is committed.

This is accomplished through the use of a digital signature on a transaction, applied by the user using the same technology for strong authentication. Not only does this establish an authoritative source for the transaction (since only the user could have applied that digital signature with their consent), but it provides the business with a transaction confirmation, which is becoming increasingly necessary in many business environments.
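A hedged sketch of what such transaction signing could look like, reusing the same Ed25519 setup as the previous example; the transaction fields and the canonical-JSON encoding are illustrative assumptions:

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The user's enrolled key (see the previous sketch) now signs a transaction.
user_key = Ed25519PrivateKey.generate()

transaction = {"from": "alice", "to": "bob", "amount": 250, "nonce": 7}
# Canonical serialization so that signer and verifier hash identical bytes.
payload = json.dumps(transaction, sort_keys=True, separators=(",", ":")).encode()

tx_signature = user_key.sign(payload)  # applied on the user's device, with consent

# Before committing, the business verifies provenance and archives the
# (transaction, signature) pair as a non-repudiable confirmation.
user_key.public_key().verify(tx_signature, payload)
```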

3. Preserve the confidentiality of sensitive data within the application layer.

This excludes the practice of encrypting data within the database, operating system or disk drive. Encrypting sensitive data has become mandatory via multiple recent regulations. But application developers fool themselves when they use data at rest (data in a database, operating system or disk drive) encryption instead of ensuring that only authorized applications can decrypt sensitive data. Systems are rarely at rest; they're working 24 hours a day and decrypting data for attackers when a legitimate user's password-based credential is compromised. By combining disruptive defenses No. 1 and 3, applications will ensure unauthorized users never get to see decrypted data.
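Here is a minimal sketch of application-layer encryption using AES-GCM from the pyca/cryptography library; the record contents, the table name used as associated data, and the in-memory key handling are illustrative assumptions (a production system would fetch the key from an HSM or KMS):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The data-encryption key lives with the application, never with the
# database, OS or disk layer.
key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

record = b'{"card_number": "4111-1111-1111-1111"}'
nonce = os.urandom(12)  # must be unique per encryption under the same key

# Encrypt inside the application BEFORE the record reaches the database;
# the associated-data tag binds the ciphertext to its storage context.
ciphertext = aead.encrypt(nonce, record, b"customers-table")

# Only the authorized application holding the key can reverse this, even
# while the system is running 24 hours a day.
plaintext = aead.decrypt(nonce, ciphertext, b"customers-table")
assert plaintext == record
```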

4. Preserve the integrity of a transaction through its lifetime.

This is accomplished, once again, by a digital signature, but it is applied by the application itself. While a digital signature acquired at the source of a transaction guarantees authenticity, transactions are modified in many applications. When data within the transaction changes, a new digital signature must be applied by the application to preserve the integrity of the modified transaction. Verifying the digital signatures of a transaction from its origin to its current state assures applications that unauthorized changes have not been made to data.
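One plausible way to implement such signature chaining, sketched under the same illustrative Ed25519 assumptions as the earlier examples:

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def canonical(tx):
    return json.dumps(tx, sort_keys=True, separators=(",", ":")).encode()

# Origin: the user signs the initial transaction (defense No. 2).
user_key = Ed25519PrivateKey.generate()
original = {"from": "alice", "to": "bob", "amount": 250}
user_sig = user_key.sign(canonical(original))

# Later, the application legitimately modifies the transaction, so it signs
# the NEW state together with the previous signature, chaining the history.
app_key = Ed25519PrivateKey.generate()
modified = dict(original, status="settled")
app_sig = app_key.sign(canonical(modified) + user_sig)

# An auditor verifies each link from origin to current state; any
# unauthorized change to the data or an earlier signature breaks the chain.
user_key.public_key().verify(user_sig, canonical(original))
app_key.public_key().verify(app_sig, canonical(modified) + user_sig)
```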

5. Use cryptographic hardware wherever cryptographic keys are stored and used.

Cryptography represents the last bastion of defense when protecting sensitive data. As such, cryptographic keys are the only objects standing between an attacker and a major headache for your company. While convenient, keys stored in files are protected only by passwords and are subject to the same attacks that compromise user passwords. By using cryptographic hardware -- present in all modern systems -- applications create major barriers to attacks. While it may be argued that cryptographic hardware is also subject to attacks, evidence shows that these attacks are neither scalable nor common, as attackers would need access to the physical computer on which your cryptographic keys are stored to be able to compromise the keys.

6. Ensure cloud applications access cryptographic services from within a secure zone.

While the cloud offers many business benefits, attempting to access cryptographic services from within a public cloud's virtual machine is a recipe for disaster -- the credentials necessary to authenticate to the cryptographic services are vulnerable to compromise in a public virtual machine (as some recent breaches highlight), enabling attackers to use legitimate credentials to command key management systems to decrypt sensitive data for the attacker. Using an application architecture that guarantees access to cryptographic services only from a secure zone eliminates that risk completely.

All these disruptive defenses are based on industry standards and, in most cases, have been around for decades. Security-conscious professionals recognize that protecting data by focusing on system security is a secondary defense, and that network defenses are generally nonproductive and should be minimized, because the use of disruptive defenses assumes an attacker is already on the network. The objective, now, is to protect data even in the presence of this threat.


Xilinx announces addition to ACAP platform with Versal Premium – Electronics Weekly

Versal is an adaptive compute acceleration platform (ACAP), a heterogeneous compute device with capabilities that the firm claims far exceed those of conventional silicon architectures.

Added to the Premium series are 112Gbps PAM4 transceivers, multi-hundred-gigabit Ethernet and Interlaken connectivity, cryptography, and PCIe Gen5 with built-in DMA, supporting both CCIX and CXL.

Developed on TSMC's 7nm process, Versal Premium combines vector and scalar processing elements, coupled to programmable logic and tied together with a network-on-chip (NoC) which provides memory-mapped access to all three processing element types.

The scalar engines are built from the dual-core Arm Cortex-A72, the adaptable engines are made up of programmable logic and memory cells and the intelligent engines are an array of very long instruction word (VLIW) and single instruction, multiple data (SIMD) processing engines and memories.

The device has over 120TB/s of on-chip memory bandwidth which, coupled with the customizable memory hierarchy, is designed to reduce data movement and remove key bottlenecks, while pre-engineered connectivity and cores allow integration into existing cloud infrastructure.

Versal Premium is designed for high bandwidth networks operating in thermally and spatially constrained environments, as well as for cloud providers. It delivers up to 9Tb/s of serial bandwidth and 5Tb/s of throughput with Ethernet, with flexibility to support various data rates and protocols, the firm says.

Its cryptography engines provide up to 1.6Tb/s of encrypted line rate throughput and support for AES-GCM-256/128, MACsec and IPsec.

The firm suggests the device will find applications in 5G communications, aerospace and defense, along with ADAS.

"The Versal Premium series takes ACAPs to the next level, delivering breakthrough networked hard IP integration enabling the development of single-chip 400G and 800G solutions," says Kirk Saban, vice president of product and platform marketing at Xilinx.

"Targeting next-generation networks and cloud deployments, Versal Premium delivers superior bandwidth and compute density in a scalable platform that is readily programmable by hardware and software developers alike for optimized acceleration and reduced TCO."


5 Actionable Takeaways from Ponemon and KeyFactor’s 2020 PKI Study – Hashed Out by The SSL Store

Looking for the latest stats and info about public key infrastructure? Look no further

74%. That's how many organizations report not knowing how many keys and certificates they have. This unsettling statistic was reported in the latest data from The Impact of Unsecured Digital Identities, a new public key infrastructure (PKI)-focused research study by the Ponemon Institute and KeyFactor.

Last year, KeyFactor and the Ponemon Institute joined forces to publish a study on public key infrastructure. This year's publication is chock-full of goodies and valuable insights on PKI as a whole. In early March, Chris Hickman, chief security officer at KeyFactor, and Larry Ponemon, chairman and founder of the Ponemon Institute, shared key insights from the study during a webinar. And in this year's report, they included something new: the Critical Trust Index. This 16-question core competency measurement aims to help businesses measure their certificate management capabilities, the effectiveness of their PKI efforts, and their agility and growth.

It's a great study, one we'll definitely quote cybersecurity statistics from throughout the year. But what makes it so good? The items highlighted in the study are the ones we see every day from our clients across multiple industries, both good and bad.

So, what can the results of this study tell you, and how can it help you make informed decisions for your own PKI? And who was involved with the study?

Let's hash it out.

The study, sponsored by our friends at KeyFactor, was independently conducted by the Ponemon Institute, both of which are well-known names within the industry.

The data in the study comes from the survey responses of 603 IT and infosec professionals from across North America. The majority of the respondents (61%) reported their positions as supervisor or above, and another 30% indicated that they are at the staff/technician level. The majority are from large enterprises, with 64% of the respondents indicating that they work for organizations with at least 5,001 employees.

The participants were asked to respond to a series of questions relating to cybersecurity threats, strategies, budgets, certificate management, compliance, and the financial impacts relating to several of these areas.

From a 30,000-foot perspective, the current mechanisms for securing and managing digital certificates and cryptographic keys are lacking. Many companies lack the personnel and technical resources, budgets, procedures, or policies to effectively support public key infrastructure. As such, this leaves organizations open to significant risks from a variety of cybersecurity threats the world over.

But no matter how challenging it can be, IT security and information security practitioners alike know that public key infrastructure is critical to organizations. After all, PKI helps organizations to increase trust with end users and clients (their web browsers) alike through authentication and encryption. As certificate lifespans shrink and threats continue to evolve, the risk that your organization will be impacted increases with them.

But how important is PKI in the eyes of the C-suite executives above them? Let's find out as we glean insights about this topic and others relating to the PKI ecosystem.

Perception and reality are frequently two different things, and this is particularly the case regarding how PKI tasks and IT security challenges are handled. Probably the biggest takeaway the study highlights is the tremendous gap in confidence between the technical guardians within an organization and those among the executive leadership above them when answering the same questions.

"In that data alone, it showed us very significantly how the problems of managing these types of critical assets in the organization, from the practitioners to the executives, differ when asked the same questions," Hickman said in the webinar on the study.

Their observation made them question why there's such a difference in the landscape between these different ranks within an organization. Executives tend to be significantly more optimistic in their responses than their staff/technician counterparts, averaging 6.2 on a 1-10 scale versus an average confidence rating of 3.7 for staff/technicians. This is particularly true concerning issues relating to managing critical assets.

These responses demonstrate why challenges might exist within organizations: leaders think issues are being handled or resolved, while practitioners are struggling to keep up with the never-ending demands.

As with any organization and its tasks, communication is key. There needs to be clear communication and transparency about the situation. If there are deficiencies, insufficient resources, or other challenges, everyone needs to be on the same page.

Don't sugarcoat things. Be open and honest about PKI and IT security-related issues that exist within your organization. Make your leadership aware of any challenges and offer recommendations and solutions to address the issues. Most importantly: learn to speak their language.

One suggestion from Hickman and Ponemon shared during the webinar comes from Gartner:

"Security leaders that successfully reposition X.509 certificate management to a compelling business story, such as digital business and trust enablement, will increase program success by 60%, up from less than 10% today."

Essentially, executives want to know the bottom-line costs involved and how circumstances will affect the operation and organization as a whole. Don't speak technical mumbo-jumbo. Give them what they want while still pushing for the resources you need by changing how you frame the situation.

Listen to your experts. Listen to understand and not to reply. Recognize that they're humans and that the industry and cyber threats are continually changing. The threats we face today aren't necessarily the same as those we'll face in the future. Be flexible and open to change. If you want to protect your organization, don't put off investing in your cybersecurity infrastructure and resources until tomorrow. Commit to making those changes today.

According to the report, 60% of respondents believe they have more than 10,000 certificates in use across their organization. That's a lot of cats to herd. Interestingly, though, the respondents aren't all that confident in their estimates: 74% indicate that they have no clue how many certificates and keys they are actually using.

So, what do all of these statistics have in common? A lack of certainty (and clarity), for one. That's because these organizations lack visibility into their PKI certificate management. Essentially, they don't know what certificates and keys they have, where they're installed, when they expire, or who's responsible for them.

This lackadaisical approach is kind of like trying to run a restaurant without any clue about who's responsible for what and how it's all getting done. For a restaurant to work, you need to know who's ordering the supply deliveries, who's making the food, whether the food that's available to serve to customers meets certain quality and hygienic standards (it hasn't expired), and who's serving it.

If you don't know these things because you lack visibility within your operation, then, frankly, you're not going to be in business for very long.

Honestly, this finding that organizations have a lack of visibility into their PKI doesn't strike me as surprising. After all, a lack of visibility is an ongoing issue for many organizations within the industry as a whole and was also an issue in their previous study from 2018. But what does surprise me a little is that the organizations are willing to admit that they lack this visibility and that it continues to be an ongoing issue.

According to their data, 55% of surveyed organizations said they had four or more certificate outages over the last two years! And 73% said that their organizations still experience unplanned downtime and outages due to mismanaged digital certificates.

So, what can be done to help you address this lack of visibility and poor certificate management within your organization?

Here at Hashed Out, we're all about helping our readers avoid common PKI certificate management mistakes. One of the things we always emphasize is the importance of having visibility over your PKI. An issue that many admins have is that they're trying to manage their keys and certificates using manual methods such as Excel spreadsheets. This is not only clunky and cumbersome, but it leads to a variety of issues.

One such example is shadow IT certificates. If you're not the only person in charge of installing, renewing, and managing X.509 digital certificates, then some certificates can get installed that you don't know about. And certificates that you may have installed yourself may fall through the cracks and expire without your knowledge. And you can't effectively manage what you don't know you have.

Using a reliable and reputable certificate management solution can help you to avoid this issue. The best certificate management tools enable you to discover, inventory, monitor, and automatically renew every key and certificate across your environment.

This provides you with full visibility of your public key infrastructure. Considering that many organizations believe they have at least 10,000 certificates, you can see how trying to manually manage these assets is virtually impossible.
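As a small illustration of what automated discovery replaces, here is a minimal Python sketch that checks how many days remain on a host's TLS certificate; the hostnames are placeholders, and a real management tool would also walk internal networks, certificate transparency logs, and private CAs.

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host, port=443):
    """Fetch a host's TLS certificate and return days until it expires."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

# The inventory should come from automated discovery, not a spreadsheet.
for host in ["example.com", "www.example.org"]:
    days = days_until_expiry(host)
    flag = "  << renew soon" if days < 30 else ""
    print(f"{host}: {days} days left{flag}")
```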


We get it. Everyone's busy and, frankly, there just aren't enough hours in the day to handle every task that comes our way. But that doesn't change the importance of having a specific team or department that's responsible for handling essential tasks.

Despite this need, study respondents indicate that digital certificate budgets and responsibility ownership are lacking. The tasks, responsibilities, and budgetary requirements associated with certificate management are oftentimes spread among various departments within different organizations. Essentially, there's no clear center of excellence for cryptography.

Their findings also report that only about a third (38%) of organizations claim to have the human resources dedicated to their PKI deployment. Part of this might be because of stagnant cybersecurity budgets in comparison to the industry's growing costs, or it could be related to the challenges companies report facing in terms of hiring and retaining talent.

Organizations represented in the KeyFactor/Ponemon Institute study reported spending only 16% of their budgets on PKI. That's approximately $3 million of the reported average IT security annual budget of $19.4 million! And they also discovered that the responsibilities and ownership are frequently spread among other departments.

In the U.S., we're experiencing some of the lowest unemployment levels in more than two decades. The U.S. Bureau of Labor Statistics (BLS) reports that as of January 2020, the unemployment rate is 2.0% for college grads and 3.8% for high school graduates. We're experiencing the lowest unemployment rate in IT security and technology, which is literally at 0%, according to Cybersecurity Ventures.

While this is great for jobseekers, it's not as great for organizations looking to hire them. Why? Because it implies that there's a greater demand for skilled workers than there are people looking for jobs. This means that businesses and organizations are competing for talent. So, what can you do to combat growing workloads when you have static resources?

Some organizations are turning to automation and the use of artificial intelligence (AI). Automation can help reduce the load on your staff and augment their capabilities by eliminating menial tasks from their workloads. Examples include predictive analytics, language processing, authentication, and log analysis to identify anything unusual. Using AI helps to free up your employees so they can focus some of their attention on higher-level priorities and tasks.

One example of automation in PKI is a certificate management solution. You can use this tool to gain visibility into your PKI and discover shadow certificates. It's also invaluable in terms of helping your team effectively manage all aspects of the certificate lifecycle and avoid certificate expirations, which Gartner estimates can cost an average of $300,000 per hour.

SSL/TLS certificates are a must for any ecommerce business (or any website, really, that wants to rank on Google and other search engines). And as more organizations readily adopt PKI solutions, it means there are more keys and digital certificates to manage. Using certificate management tools and other automation solutions can help you to not only streamline your operations and make them more effective, but also to control rising operational costs.

While certificate outages are a major cause of concern, the responses received during the study indicate that failed audits due to insufficient key management practices, rogue or compromised certificate authorities (CAs), and misuse of code signing certificates and keys are even bigger areas of concern. This is true both in terms of financial costs as well as compliance.

The seriousness of failed audits and compliance headed up the rankings (4.1 on a 1-10 scale, where 1 is considered the least serious problem and 10 the most serious). In particular, survey respondents are worried about insufficient or unenforced key management policies and practices. The next most serious issue related to man-in-the-middle (MitM) and phishing attack vulnerabilities that stem from CA compromise.

We mentioned earlier that nearly three-quarters (73%) of respondents indicate that they experience unplanned outages and downtime due to mismanaged digital certificates. These occurrences are more frequent than unplanned outages that result from certificate expiration. What makes these numbers even more dire is that disruptive outages are expected to keep increasing rather than decreasing. According to the report:

"59 percent of respondents say the misuse of keys and certificates by cybercriminals is increasing the need to better secure these critical assets. Yet, more than half (54 percent) of respondents are concerned about their ability to secure keys and certificates throughout all stages of their lifecycle from generation to revocation."

If you're using a private CA, it's not really surprising when things go sour. One of the best things you can do to avoid issues relating to rogue or compromised certificate authorities is to work with established, reputable commercial CAs who provide managed PKI services. It would be best to stay away from free PKI certificate providers because they lack the support and resources that commercial digital certificate providers have at their disposal.

The final insight we'll share from the survey is that respondents' concerns stemming from post-quantum cryptography are decreasing for now. The KeyFactor and Ponemon report says:

"Only 47 percent of respondents are concerned about the impact that quantum computing will have on their key and certificate management practices, but we expect this number will rise as recent advances in quantum technology bring us closer to the potential breaking point of the keys and algorithms we rely upon today."

Essentially, there is and has been hype surrounding the topic for several years. But "until quantum computing is available at the commercial level, we'll overestimate the potential negative impacts rather than highlight its positive impacts on security," Ponemon said.

Hickman says that quantum computing is our future reality; it's just a matter of when, not if, it will become a thing. That's why the industry's work on post-quantum algorithms is critical (see our previous blog post highlighting DigiCert's work on post-quantum cryptography) and why organizations need to start planning now.

"Rarely have we seen something in this industry with the potential cataclysmic effect of quantum, and the disruptive nature that it will bring from a security standpoint," says Hickman, who emphasizes the importance of planning, which seems to be taking a back seat in terms of being considered a priority.

Hickman continues:

"Having a plan, understanding where your digital assets live, where your cryptography is deployed, having ways to manage that crypto is absolutely important. Things are going to happen along the way such as the deprecation of algorithms ... But you'll be able to reuse that same plan and actually validate it to make sure that you're ready for a post-quantum world."

From these survey responses, it's obvious that there's no one clear owner of PKI budgets and efforts, with multi-discipline and multi-functional teams involved. And there's also no one agreed-upon method that these surveyed organizations rely on to deal with these increasing crypto responsibilities. But it's obvious that having a governance process in place and clear visibility of your public key infrastructure are essential to improving a business's certificate management capabilities. Part of this entails establishing a cryptographic center of excellence if one doesn't already exist.

The increasing use of encryption technologies, digital certificates, etc. for compliance with regulations and policies dictates the need for better certificate management practices. And as operational costs continue to increase without a parallel increase in operating budgets to cover those costs, automation will become more important the closer we get to a PQC world.


Novel error-correction scheme developed for quantum computers – News – The University of Sydney

Dr Arne Grimsmo from Sydney Nano and the School of Physics. Photo: Stefanie Zingsheim

Scientists in Australia have developed a new approach to reducing the errors that plague experimental quantum computers; a step that could remove a critical roadblock preventing them scaling up to full working machines.

By taking advantage of the infinite geometric space of a particular quantum system made up of bosons, the researchers, led by Dr Arne Grimsmo from the University of Sydney, have developed quantum error correction codes that should reduce the number of physical quantum switches, or qubits, required to scale up these machines to a useful size.

"The beauty of these codes is they are platform agnostic and can be developed to work with a wide range of quantum hardware systems," Dr Grimsmo said.

"Many different types of bosonic error correction codes have been demonstrated experimentally, such as cat codes and binomial codes," he said. "What we have done in our paper is unify these and other codes into a common framework."

The research, published this week in Physical Review X, was jointly written with Dr Joshua Combes from the University of Queensland and Dr Ben Baragiola from RMIT University. The collaboration is across two leading quantum research centres in Australia, the ARC Centre of Excellence for Engineered Quantum Machines and the ARC Centre of Excellence for Quantum Computation and Communication Technology.

"Our hope is that the robustness offered by spacing things out in an infinite Hilbert space gives you a qubit that is very robust, because it can tolerate common errors like photon loss," said Dr Grimsmo from the University of Sydney Nano Institute and School of Physics.

Scientists in universities and at tech companies across the planet are working towards building a universal, fault-tolerant quantum computer. The great promise of these devices is that they could be used to solve problems beyond the reach of classical supercomputers in fields as varied as materials science, drug discovery and security and cryptography.

With Google last year declaring it has a machine that has achieved quantum supremacy (performing an arguably useless task, but one beyond the scope of a classical computer), interest in the field of quantum computing and engineering continues to rise.

But to build a quantum machine that can do anything useful will require thousands, if not millions, of quantum bits operating without being overwhelmed with errors.

And qubits are, by their very nature, error prone. The quantumness that allows them to perform a completely different type of computing operation means they are highly fragile and susceptible to electromagnetic and other interference.

Identifying, removing and reducing errors in quantum computation is one of the central tasks facing physicists working in this field.

Quantum computers perform their tasks by encoding information utilising quantum superposition, a fundamental facet of nature where the final outcome of a physical system is unresolved until it is measured. Until that point, the information exists in a state of multiple possible outcomes.

Dr Grimsmo said: "One of the most fundamental challenges for realising quantum computers is the fragile nature of quantum superpositions. Fortunately, it is possible to overcome this issue using quantum error correction."

"This is done by encoding information redundantly, allowing the correction of errors as they happen during a quantum computation. The standard approach to achieve this is to use a large number of distinguishable particles as information carriers. Common examples are arrays of electrons, trapped ions or quantum electrical circuits."

However, this creates a large network of physical qubits in order to operate a single logical qubit that does the processing work you require.

This need to create a large network of physical qubits to support the work of a single operating qubit is a non-trivial barrier towards constructing large-scale quantum machines.

Dr Grimsmo said: "In this work, we consider an alternative approach based on encoding quantum information in collections of bosons." The most common type of boson is the photon, a packet of electromagnetic energy and a massless light particle.

By trapping bosons in a particular microwave or optical cavity, they become indistinguishable from one another, unlike, say, an array of trapped ions, which are identifiable by their location.

"The advantage of this approach is that large numbers of bosons can be trapped in a single quantum system, such as photons trapped in a high-quality optical or microwave cavity," Dr Grimsmo said. "This could drastically reduce the number of physical systems required to build a quantum computer."

The researchers hope their foundational work will help build a roadmap towards fault tolerance in quantum computing.


What next in the world of post-quantum cryptography? – Ericsson

Research in quantum computers is advancing quickly and researchers recently claimed to have reached quantum supremacy, in other words, the ability of quantum computers to perform a calculation out of reach of even the most powerful classical supercomputers.

However, any claims that quantum computers are close to cracking any practically used cryptosystems are highly exaggerated. Such powerful quantum computers are very likely several decades away, if indeed they will ever be built. Many significant technical advances are still required before a large-scale, practical quantum computer can be achieved, and some commentators even doubt whether such a scenario will ever be possible.

What we do know, however, is that large-scale cryptography-breaking quantum computers are highly unlikely to develop during the next decade. Yet, in spite of this, systems which need very long-term protection, such as government systems with classified information or root certificates with very long lifetimes, must nevertheless start preparing to replace today's asymmetric algorithms.

In traditional cryptography, there are two forms of encryption: symmetric and asymmetric.

Most of today's computer systems and services such as digital identities, the Internet, cellular networks, and crypto currencies use a mixture of symmetric algorithms like AES and SHA-2 and asymmetric algorithms like RSA (Rivest-Shamir-Adleman) and elliptic curve cryptography.
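As a reminder of how these two families combine in practice, here is a hedged hybrid-encryption sketch in Python using the pyca/cryptography library: RSA-OAEP wraps a symmetric AES-GCM session key, mirroring the mixture described above. The message contents and parameter choices are our own illustrative assumptions, not a specific protocol.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Asymmetric part: the RSA key pair (the piece at risk from quantum computers).
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Symmetric part: an AES-256-GCM session key (far less affected by quantum attacks).
session_key = AESGCM.generate_key(bit_length=256)

# The asymmetric algorithm only wraps the small session key...
wrapped_key = rsa_key.public_key().encrypt(session_key, oaep)

# ...while the symmetric algorithm protects the bulk data.
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"secret message", None)

# The receiver unwraps the session key, then decrypts the payload.
unwrapped = rsa_key.decrypt(wrapped_key, oaep)
assert AESGCM(unwrapped).decrypt(nonce, ciphertext, None) == b"secret message"
```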

The asymmetric parts of such systems would very likely be exposed to significant risk if we experience a breakthrough in quantum computing in the coming decades.

In anticipation of such a quantum computing paradigm, cryptography is being developed and evolved by using so-called quantum-safe algorithms. They run on classical computers and are believed to withstand attacks from powerful quantum computers.

When we compare post-quantum cryptography with the currently used asymmetric algorithms, we find that post-quantum algorithms mostly have larger key and signature sizes and require more operations and memory. Still, they are very practical for everything except perhaps very constrained Internet of Things devices and radios.


The US National Institute of Standards and Technology (NIST) is currently standardizing stateless quantum-resistant signatures, public-key encryption, and key-establishment algorithms and is expected to release the first draft publications between 2022 and 2024. After this point, the new standardized algorithms will likely be added to security protocols like X.509, IKEv2, TLS and JOSE and deployed in various industries. The IETF crypto forum research group has finished standardizing two stateful hash-based signature algorithms, XMSS and LMS, which are also expected to be standardized by NIST. XMSS and LMS are the only post-quantum cryptographic algorithms that could currently be considered for production systems, e.g. for firmware updates.
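To give a feel for how hash-based signatures work, here is a toy Lamport one-time signature in Python. This is only a teaching sketch of the primitive underlying schemes like XMSS and LMS, which add Merkle trees and careful state management on top; it is not an implementation of either standard.

```python
import hashlib
import secrets

# Private key: 2 x 256 random 32-byte strings; public key: their SHA-256
# hashes. Signing reveals one preimage per bit of the message digest, so a
# key pair must never sign twice -- hence "one-time" and, in XMSS/LMS, state.

def keygen():
    sk = [[secrets.token_bytes(32) for _ in range(256)] for _ in range(2)]
    pk = [[hashlib.sha256(s).digest() for s in row] for row in sk]
    return sk, pk

def digest_bits(message):
    d = int.from_bytes(hashlib.sha256(message).digest(), "big")
    return [(d >> i) & 1 for i in range(256)]

def sign(sk, message):
    return [sk[bit][i] for i, bit in enumerate(digest_bits(message))]

def verify(pk, message, signature):
    return all(hashlib.sha256(s).digest() == pk[bit][i]
               for i, (s, bit) in enumerate(zip(signature, digest_bits(message))))

sk, pk = keygen()
sig = sign(sk, b"firmware-update-v1.2")
assert verify(pk, b"firmware-update-v1.2", sig)
```

Security rests only on the hash function's preimage resistance, which is why hash-based signatures are considered a conservative post-quantum choice.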

The US government is currently using the Commercial National Security Algorithm Suite for protection of information up to top secret. They have already announced that they will begin a transition to post-quantum cryptographic algorithms following the completion of standardization in 2024.

Why should the industry be taking note of this decision? Top secret information is often protected for 50 to 75 years, so the fact that the US government is not planning to finalize the transition to post-quantum cryptography until perhaps 2030 seems to indicate that they are quite certain that quantum computers capable of breaking P-384 and RSA-3072 will not be available for many decades.

When we turn our focus to symmetric cryptography as opposed to asymmetric cryptography, we see that the threat is even more exaggerated. In fact, even a quantum computer capable of breaking RSA-2048 would pose no practical threat to AES-128 whatsoever.

Grover's algorithm applied to AES-128 requires a serial computation of roughly 2^65 AES evaluations that cannot be efficiently parallelized. As quantum computers are also very slow (operations per second), very expensive, and quantum states are hard to transfer from a malfunctioning quantum computer, it seems highly unlikely that even clusters of quantum computers will ever be a practical threat to symmetric algorithms. AES-128 and SHA-256 are both quantum resistant according to the evaluation criteria in the NIST PQC (post-quantum cryptography) standardization project.
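For reference, the rough arithmetic behind estimates of this kind (our own back-of-the-envelope sketch, not Ericsson's derivation) looks like this:

```latex
% Grover search over an unstructured space of $N = 2^k$ keys needs about
% $(\pi/4)\sqrt{N}$ sequential oracle queries. For AES-128, $k = 128$:
\[
  \frac{\pi}{4}\sqrt{2^{128}} \;=\; \frac{\pi}{4}\cdot 2^{64} \;\approx\; 2^{63.65},
\]
% i.e. on the order of $2^{64}$ serial quantum AES evaluations; the overhead
% of implementing AES as a reversible quantum circuit pushes published
% estimates to roughly $2^{65}$, the figure quoted above.
```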

In addition to post-quantum cryptography running on classical computers, researchers in quantum networking are looking at quantum key distribution (QKD), which would theoretically be a provably secure way to do unauthenticated key exchange.

QKD is however not useful for any other use cases such as encryption, integrity protection, or authentication where cryptography is used today as it requires new hardware and is also very expensive compared to software-based algorithms running on classical computers.

In a well-written white paper, the UK government discourages the use of QKD, stating that it seems to introduce new potential avenues for attack, that the hardware dependency is not cost-efficient, that QKD's limited scope makes it unsuitable for future challenges, and that post-quantum cryptography is a better alternative. QKD will likely remain a niche product until quantum networks are needed for non-security reasons.


The calculation recently used to show quantum supremacy was not very interesting in itself and was contrived for that purpose. The claim was also criticized by competing researchers, who argue that the corresponding classical calculation could be done over a million times faster. Quantum computers able to solve any practical problems more cost-effectively than classical computers are still years away.

The quantum supremacy computer consists of 54 physical qubits (quantum bits), which after quantum error correction correspond to only a fraction of a single logical qubit. This is very far from quantum computers able to break any cryptographic algorithm used in practice, which would require several thousand logical qubits and hundreds of billions of quantum gates. Scaling up the number of qubits will not be easy, but some researchers believe that the number of qubits will follow a quantum equivalent of Moore's law called Neven's law. We will likely see undisputed claims of quantum supremacy in the coming years.

Since our earlier post in 2017 about post-quantum cryptography in mobile networks, the hype around quantum computers and the worries about their security impacts have become more nuanced, aligning with our previous analysis.

Recent reports from academia and industry now say that large-scale cryptography-breaking quantum computers are highly unlikely during the next decade. There is also general agreement that quantum computers do not pose a large threat to symmetric algorithms. Standardization organizations like IETF and 3GPP and various industries are now calmly awaiting the outcome of the NIST PQC standardization.

Quantum computers will likely be highly disruptive for certain industries, but they probably will not pose a practical threat to asymmetric cryptography for many decades and will likely never be a practical threat to symmetric cryptography. Companies that need to protect information or access for a very long time should start thinking about post-quantum cryptography. But as long as the US government protects top secret information with elliptic curve cryptography and RSA, they are very likely good enough for basically any other non-military use case.

Read our colleagues' earlier blog series on quantum computing, beginning with an introduction to quantum computer technology.

Read our earlier technical overview to cryptography in an all encrypted world in the Ericsson Technology Review.

Visit our future technologies page to learn how tomorrow's world is evolving.


Gilles Brassard honoured by the BBVA Foundation for his work in quantum computing – Quantaneo, the Quantum Computing Source

Gilles Brassard, a professor in the Department of Computer Science and Operations Research at Université de Montréal, along with Charles Bennett of IBM's New York State Research Center and Peter Shor of the Massachusetts Institute of Technology, has been awarded the BBVA Foundation's Frontiers of Knowledge Award in the basic sciences category for "outstanding contributions to the areas of computing and quantum communication."

The three researchers will receive the award June 2 in Bilbao, Spain, and will share the €400,000 that comes with it.

Professor Brassard is the seventh Canadian to receive the prize and the first ever in the basic sciences category (physics, chemistry, mathematics).

In 1984, Brassard and Bennett devised the first quantum cryptography technique, which makes it possible to encode messages in order to exchange information with absolute confidentiality. Then, in 1993, they laid the foundations for quantum teleportation in collaboration with four other researchers. The group proved that it was possible to transport information in subatomic particles, such as photons, from one place in the galaxy to another, without physically moving them. This principle is based on the rules of quantum theory, according to which a particle can simultaneously exist in several states.
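The 1984 technique is the celebrated BB84 protocol, named for Bennett and Brassard. As a loose illustration of its key-sifting step, here is a purely classical Python simulation with no eavesdropper and no real quantum states; the 32-photon run length and all variable names are our illustrative assumptions.

```python
import secrets

n = 32  # raw transmission length; real systems use far longer runs

# Alice picks random bits and random encoding bases (0 = rectilinear,
# 1 = diagonal); Bob measures each photon in his own random basis.
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# With no eavesdropper, Bob reads Alice's bit whenever the bases match;
# on a basis mismatch, quantum mechanics makes his outcome random.
bob_bits = [bit if a == b else secrets.randbelow(2)
            for bit, a, b in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: over a public channel they compare bases (not bits) and keep
# only the positions where the bases agreed.
sifted_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
              if a == b]

# In the real protocol they would now sacrifice a sample of the sifted key:
# an eavesdropper measuring in the wrong basis introduces ~25% errors.
print(f"sifted key length: {len(sifted_key)} of {n}")
```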

Industry is currently investing billions of dollars in quantum technologies, particularly in China and Europe, and the theoretical work of Brassard, Bennett and Shor has helped put this discipline on track.

This is the third major award that Brassard and Bennett have jointly won on the international scene. In 2018, the duo received the Wolf Prize in Physics from the President of Israel, a prize often seen as leading to a Nobel Prize in Physics. Last year in China, they were awarded the Micius Prize for their breakthroughs in quantum theory.


DARPA to Explore Fully Homomorphic Encryption in New Program – ExecutiveBiz

The Defense Advanced Research Projects Agency is pursuing an effort to develop hardware that allows for computing on encrypted data with continuous protection.

DARPA said Monday its Data Protection in Virtual Environments (DPRIVE) program aims to accelerate computations done with fully homomorphic encryption (FHE), an approach that protects encrypted data while still allowing for processing.

FHE uses lattice cryptography to block cyberattacks via complex, nearly unsolvable mathematical barriers. However, FHE computations generate noise that would eventually corrupt the data, and addressing this noise results in a large amount of computational overhead.

The DPRIVE program aims to reduce this overhead and accelerate FHE computations.
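To make "computing on encrypted data" concrete, here is a toy Python implementation of the Paillier cryptosystem, which is homomorphic for addition only. This is a teaching sketch with tiny primes; the lattice-based FHE schemes DPRIVE targets support arbitrary computation and carry far more overhead, which is exactly what the program aims to reduce.

```python
import math
import secrets

def keygen(p, q):
    """Paillier key generation with generator g = n + 1 (p, q prime)."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)  # modular inverse; valid because g = n + 1 (Python 3.8+)
    return n, lam, mu

def encrypt(n, m):
    n2 = n * n
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:  # r must be coprime to n
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(n, lam, mu, c):
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Toy parameters; real deployments need n of 2048+ bits.
n, lam, mu = keygen(61, 53)

c1 = encrypt(n, 12)
c2 = encrypt(n, 30)
c_sum = (c1 * c2) % (n * n)  # multiplying ciphertexts adds the plaintexts

assert decrypt(n, lam, mu, c_sum) == 42  # computed without decrypting the inputs
```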

"Today, DARPA is continuing to invest in the exploration of FHE, focusing on a re-architecting of the hardware, software and algorithms needed to make it a practical, widely usable solution," said Tom Rondeau, a program manager at DARPA.

The agency hosted a proposer's event on Monday to further inform interested parties on the program. DARPA also launched a presolicitation for DPRIVE and will continue to accept responses through June 2.
