2020 vision: Synopsys predictions – Gigabit Magazine – Technology News, Magazine and Website

Happy New Year! To kick off 2020, the leadership team at Synopsys share their predictions for the year to come.

Steve Cohen, Security Services Manager at Synopsys:

Focus: Cloud Security

In 2020, I believe we'll see the accelerated adoption of finer-grained objects to drive efficiencies. As developers adopt these finer-grained objects within their cloud applications, such as containers, microservices, micro-segmentation, and the like, security testing tools will need to be object-aware in order to identify the unique risks and vulnerabilities introduced by utilizing these objects.

I anticipate that new approaches to collecting security-related data may become necessary in the cloud. In addition to application logs, cloud API access will be seen as necessary. There will also be a growing focus on centralized logging in the upcoming year.

In addition to application security, the cloud management plane will become an additional security layer that needs addressing in 2020. Developers, for example, will require access to the management plane to deploy applications. Incorrect settings here could expose the application to security risks as sensitive information flows through it.

Reduced transparency around what's going on within a given application will likely be a growing trend. A cloud provider doesn't necessarily tell you what security controls exist for the PaaS services they expose to you. Businesses will therefore need to make some assumptions about their security considerations and stance.

In terms of data security and integrity in the cloud, there will be more of a need to have proper policies in place to prevent improper disclosure, alteration or destruction of user data. Policies must factor in the confidentiality, integrity and availability of user data across multiple system interfaces.

In 2020, the adoption of PaaS and serverless architecture will provide even more of an opportunity to dramatically reduce the attack surface within the cloud.

Tim Mackey, Principal Security Strategist at the Synopsys CyRC (Cybersecurity Research Centre):

Focus: General Cybersecurity

Cyber-attacks on 2020 candidates will become more brazen. While attacks on campaign websites have already occurred in past election cycles, targeted attacks on a candidate's digital identity and personal devices will mount.

With digital assistants operating in an always-listening mode, an embarrassing live-mic recording of a public figure will emerge. This recording may not be associated directly with a device owned by the public figure, but rather with them being a third party to the device; for example, the conversation could be captured as background noise.

With the high value of healthcare data to cybercriminals and a need for accurate healthcare data for patient care, a blockchain-based health management system will emerge in the US. Such a system could offer the dual value of protecting patient data from tampering while reducing the potential for fraudulent claims being submitted to insurance providers.

Emile Monette, Director of Value Chain Security at Synopsys:

Focus: General Cybersecurity

In the year to come, I anticipate that we'll see continued developments in software transparency (e.g., the NTIA Software Component Transparency efforts). Additionally, a continued need for software testing throughout the software development life cycle (SDLC) will persist as a focus in 2020, most assuredly a positive step in terms of firms understanding the criticality of proactive security maturity. I also have reason to believe we'll see increased efforts to secure the hardware supply chain; specifically, efforts to develop secure microelectronic design and fabrication will come into focus in the upcoming year.

Asma Zubair, Sr. Manager, IAST Product Management at Synopsys:

Focus: Endpoint Security

In 2020, we know that attackers will continue to exploit all the applications, endpoints, and networks they possibly can. This includes, but isn't limited to, web and mobile apps (internal or external), IoT devices in smart homes, and even the 5G network as it is being rolled out. Attackers will also continue to use the latest and greatest technologies (be it machine learning, AI, or freely available open source components) to carry out ever-more sophisticated attacks at even greater scale. At the same time, organizations will continue to struggle as they try to balance competing priorities: the need to improve security, reduce time to market, and complete projects within budget and time constraints.

As we look to what will change in the year to come, California's SB-327 IoT bill will take effect on January 1, 2020, requiring manufacturers to build reasonable security into their connected devices. This is a step in the right direction, as it will establish minimum standards and improve the security of IoT devices available on the market. I anticipate there will be more legislative activity in 2020, especially in the US. The California Consumer Privacy Act will also take effect on January 1, 2020, and I expect more states to follow suit. If done properly, regulations will bring about the accountability needed to improve the overall state of cybersecurity.

We saw several high-profile GDPR-related lawsuits, fines, and settlements in 2019. I wouldn't be at all surprised to see more of these hit the headlines in the coming year.

Organizations tend to focus a good deal of attention on their endpoint protection and network security, and this is indeed very important. But applications, another critical piece in the overall security puzzle, often don't get as much attention and therefore tend to become a weak link in terms of security. Organizations need to test their applications throughout the development process for security vulnerabilities using methods such as interactive application security testing (IAST), static application security testing (SAST), or dynamic application security testing (DAST). They must also actively work to address the vulnerabilities detected by these testing methods.

Kimm Yeo, Senior Manager at Synopsys:

Focus: Cellular/Wireless

The introduction of wireless broadband communication technologies such as 4G and LTE hasn't only affected consumer lifestyles. Such technology has also fueled the growth of ride-sharing business models. Although the adoption of LTE has been broad-based, deployed by over 600 carriers in 200 countries with over 3.2 billion subscribers worldwide (as of 2018), the enhanced user experience and convenience haven't come without a price. Several dozen new security flaws related to LTE have been identified through fuzz testing.

As both cellular and wireless technologies continue to advance to 5G, 6G and beyond, this will not only greatly reduce latency and improve the user experience, it will also open the door to new attack surfaces and attack strategies. It's extremely difficult to anticipate and prevent such malicious advances in the increasingly connected ecosystems and lifestyles in which we all live. However, this is something we should strive to improve upon in the not-so-distant future.

Dennis Kengo Oka, Senior Solution Architect at Synopsys:

Focus: Automotive

There are two major trends emerging. The first is the concept of CASE (connected, autonomous, shared, electric). As technologies such as 5G lead to increased connectivity alongside advances in proprietary and open source software (e.g., Automotive Grade Linux), we'll see targets move beyond the vehicle. Malicious actors will leverage new, evolving attack vectors in backend systems, mobile apps, infrastructure and services relating to automotive technologies.

The second major trend we'll see in 2020 is that of standardization and regulations such as ISO/SAE 21434 and UNECE WP.29 driving cybersecurity activities in the automotive industry. This will lead to changes in organizational teams and processes, including the addition of security gates such as static code analysis, open source risk management, fuzz testing, and penetration testing to implement security throughout the entire vehicle life cycle. An increased focus on automated test processes and toolchains will continue to emerge as well in the year to come.


Recruiting Developers: the importance of finding the right people – Techerati

Taking the time to make the right hires and carefully thinking through your recruiting strategy is one of the best investments your business will ever make

Just about every business today relies on people who write code. The problem is that hiring good developers is difficult. It may even be the most difficult thing a business will do.

The reason developer hiring is such an important topic (and something many businesses find challenging) is that unlike many other professions, good developers can be many times more productive than their peers.

If you are hiring a driver to get you from A to B, regardless of how fast the driver you hire is, the difference between a high-performing driver and any other driver will be fairly minimal: they will both get you from A to B within a reasonable amount of time. It is essentially impossible for a driver to get you from A to B 10 times or 100 times quicker than another driver.

But this is not true in the technology industry. A great developer may be many times more productive than other developers, and a poor developer may actually remove value from your organisation. In short, hiring developers is a high-stakes game because the productivity multiple between one developer and another may be significant and business-altering.

There are only two ways to reach developers: in-person and online. Regardless of your tactics, if you want to recruit good people you need to get their attention, and without question, the best way to do this is to be an active participant in the developer community.

For in-person recruiting, this might involve giving technical talks at programming conferences, hosting developer dinners, and participating in developer events, such as hackathons or community meetups.

If you're able to, having your existing technical talent present on new methods and tools they are using at programming events can be a great way to connect with like-minded developers working on similar problems, make friends, and build a reputation for both your business and your employees.

Similarly, hosting a relaxed dinner where you invite some of your top developers as well as other respected developers in your area can be a great way to make authentic connections and explore opportunities. I have met some truly great people hosting these types of intimate events. Supporting these activities by giving your existing developers time and resources so they can attend these types of events is an authentic and effective way to recruit great people to your business.

But as much as I love in-person developer events, it would be remiss to not mention more scalable, online ways to attract great developers.

Some of the most effective ways I've found to recruit great developers online are to publish technical articles and videos, answer questions on topics related to your business on popular developer sites like StackOverflow, and build and share open source software that other developers can use to solve problems.

Giving your top people time to share some of the interesting technical things they have learned on a company blog and YouTube channel can be incredibly effective. It can get the attention of developers working on similar problems, build developer awareness of your company and attract thousands of developers to your site over a number of years.

While it can be a lot of work, allowing your technical teams to publish some of the software they create as open source solutions can be very effective too. Not only will open sourcing some of the projects your teams work on attract external developers to your company, it often makes your engineering team work more effectively by forcing them to build reusable solutions to common problems.

These strategies will help you reach the right people, but after you have reached them, it is still up to you to win them over. That means understanding fair market rates, developer culture, and engineering management. If you can foster an environment in which great developers want to work, you will have a much easier time getting great people to join your company.

One common misconception I have heard from business owners is that if you hire great developers they will perform well. This is not true. All developers can perform well under certain conditions, but it is up to you to design a hiring process that ensures the developers you hire will flourish based on your engineering culture, management, company values and technology needs.

When you are designing a developer hiring process the first thing you need to know is that testing developers and finding a great fit is tricky. There is no perfect way to do it and you will never be able to guarantee you always hire the right people.

With that said, here are the things that I have found work well in a developer hiring process.

Ask developers in-depth questions about projects they have worked (or are working) on. Avoid just asking them what they are doing currently; instead, have them explain it to you in great depth. Ask them why they are doing things certain ways and how they might change things. Probe at a deeper level and you can gain a deeper understanding of how they think and where their expertise lies.

It is important to ask a candidate what their favourite project has been. I often have them walk me through it: what they liked about it, and what they disliked. This is a great way to figure out not only what the candidate knows, but also the types of projects they enjoy working on.

Instead of coding puzzles, give candidates a take-home project. Not only are coding puzzles a poor reflection of what candidates will actually be doing on the job, they also incentivise poor behaviour. Instead of making the interview process about a candidate's experience and depth of knowledge, coding-puzzle-style technical quizzes end up merely testing how well the candidate has memorised a series of common math problems, which is almost certainly not what you want to test for.

Instead of forcing a candidate to solve problems on a whiteboard, consider giving them a take-home project. What I like to do is ask candidates to build a very small application (which they should spend no more than four hours on); something similar to what they would be working on if they get the job. This way, the candidate has a chance to think through what they are working on without the performance pressure of an interview and can show you how they perform in a real-world scenario.

An added benefit of the take-home project is that if the candidate does come in for an onsite, you will have plenty to talk about using the take-home assignment as a basis for conversation. I like to ask candidates what they liked and disliked about the assignment and use those questions as a starting point to dive deeper into the technology choices and strategies they used.

Making sure every developer you hire understands your business challenges and how things can be improved is critical. Bringing on developers who will just take orders is a recipe for disaster, as your business will be unable to innovate effectively with this mindset. It is vital that the strongest members of your team have the same vision for fixing issues and pushing for change that you do.

When this is all done successfully, developers will be one of the strongest growth factors for your business. Taking the time to make the right hires and carefully thinking through your recruiting strategy is one of the best investments your business will ever make.


Assistive Technology Switch Is Actuated Using Your Ear Muscles – Hackaday

Assistive technology is extremely fertile ground for hackers to make a difference, because of the unique requirements of each user and the high costs of commercial solutions. [Nick] has been working on Earswitch, an innovative assistive tech switch that can be actuated using voluntary movement of the middle ear muscle.

Most people don't know they can contract their middle ear muscle, technically called the tensor tympani, but will recognise it as a rumbling sound or muffling effect on their hearing when yawning or tightly closing their eyes. Its function is actually to protect your hearing from loud sounds, such as screaming or chewing. [Nick] ran a survey and found that 75% of people can consciously contract the tensor tympani and 17% can do it in isolation from other movements. Using a cheap USB auroscope (an ear camera like the one [Jenny] reviewed in November), he was able to detect the movement using iSpy, an open source software package meant for video surveillance. The output from iSpy is used to control Grid3, a commercial assistive technology software package. [Nick] also envisions the technology being used as a control interface for consumer electronics via earphones.

With the proof of concept done, [Nick] is looking at ways to make the tech more practical to actually use, possibly with a CMOS camera module inside a standard pair of noise-cancelling headphones. Simpler optical sensors like reflectance or time-of-flight are also options being investigated. If you have suggestions or a possible use case, drop by on the project page.

Assistive tech always makes for interesting hacks. We recently saw a robotic arm that helps people feed themselves, and the 2017 Hackaday Prize had an entire stage focused on assistive technology.


China To Enforce First-ever Cryptography Law As It Kicks-off Its First Digital Currency – The Coin Republic

Steve Anderrson Thursday, 02 January 2020, 01:28 EST Modified date: Thursday, 02 January 2020, 02:23 EST

Reports from the Reference News Network state that the cryptography law passed in October 2019 will go into full effect starting 1 January 2020. The legislation divides passwords into core passwords, ordinary passwords, and commercial passwords.

The law allows the Chinese government to hold full management over the core and ordinary passwords, while it has promised to help nurture the industry that manages commercial passwords. Although the law doesn't directly mention cryptocurrency, passwords are at the core of protecting data within a blockchain network.

Blockchain technology is based on a decentralized, transparent ledger system that takes data and distributes it into blocks, each protected by its own hash. The hash is a complex cryptographic digest that helps maintain the integrity of the block. This powerful cryptographic protection and the transparency of the ledger are why blockchain technology is profoundly revolutionary and influential.
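The way a hash ties a block's contents to its integrity can be sketched in a few lines of Python. This is an illustrative toy, not any particular chain's format; the field names and single-block structure are assumptions for the example:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents with SHA-256; any change to the data changes the digest."""
    serialized = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(serialized).hexdigest()

block = {
    "index": 1,
    "prev_hash": "0" * 64,  # link to the previous block's digest
    "transactions": [{"from": "A", "to": "B", "amount": 10}],
}
original = block_hash(block)

# Tampering with any field produces a completely different digest,
# which would also break the prev_hash link in every later block.
block["transactions"][0]["amount"] = 1000
assert block_hash(block) != original
```

Because each block embeds the previous block's digest, altering one record forces an attacker to recompute every subsequent hash, which is what makes tampering detectable.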

This is part of an aggressive push from the Chinese government on the adoption of blockchain technology over the past couple of months. It is a significant move to improve the core technology behind the blockchain network to facilitate China's digital currency. The digital RMB to be launched by the People's Bank of China will be the first digital currency of its kind issued by a major country.

The issuance of a digital RMB will most likely use the blockchain network, and it aims to strengthen the financial economy of the country while maintaining strict supervision of funds. It would also help reduce the load on financial institutions in the regulation of currency.

It was also reported that Chinese leaders have made a policy that makes a strong argument for the acceleration of blockchain-based technology. This cryptography law is a first step in improving the blockchain technology that will likely be backing the digital RMB. The bill also grants the Chinese government authority over the designation of national and industry cryptographic standards.

Reports suggest that the Chinese government will also keep oversight over overseas remittances and production lines, from raw materials through manufacturing on the assembly line and circulation, to make sure none of the records are tampered with.

This move has also been met with some criticism from Facebook CEO Mark Zuckerberg, who is planning on releasing the Libra stablecoin, purportedly sometime during 2020. He argued that allowing Chinese superiority in the cryptocurrency space could prove disastrous for the value of the US dollar, which could quickly lose its leading position in the currency space if the US doesn't innovate.


Global Banks’ Inclination For Blockchain And Cryptography – EconoTimes

Sovereign governments and their central banks across the globe have been exploring the essence and the opportunities of CBDCs (Central Bank Digital Currencies), foreseeing a swift transformation of the prevailing financial system.

The advanced era of FinTech has brought new trends and inventions, such as smart contracts and DeFi, which seem to be lucrative prospects and are likely to hit the financial avenue by enabling alluring use cases for digital assets: flexibility and controllability of financial as well as real assets, efficient trade finance and loan businesses, and interest-bearing contracts.

The First Deputy Governor of Banque de France, Denis Beau, has recently recommended deploying distributed ledger technology (DLT) for euro payment settlements within the Eurozone.

At the Second Annual Capital Markets Technology and Innovation Conference, Denis Beau advocated that the European Central Bank (ECB) should be liberal in experimenting with distributed ledger technology (DLT) as a way of settling euro-denominated transactions.

The Swiss National Bank (SNB) made an effort by signing an operational pact with the Bank of International Settlements (BIS) to delve into digital currencies in the BIS Innovation Hub Centre established in Switzerland.

Although the US Fed has clarified that it has no concrete plans to develop a CBDC, a top Fed authority has reportedly mentioned that the US central bank is pondering the idea of a digital dollar, while Democratic and Republican members of Congress have communicated with Fed Chairman Jay Powell to understand the implications of such a revolutionary adoption.

In a descriptive letter replying to US representatives French Hill and Bill Foster, who had asked whether the Federal Reserve plans to launch a digital currency, Chairman Powell acknowledged the trends and clarified that the Fed has been observing the development of digital currencies keenly.

The emerging economies are not far from this race either.

Bank of China is piloting a blockchain-based bond issuance programme, while the PBoC is eyeing stimulating cryptocurrency and FinTech projects following the President's remarks on blockchain technology. Huang Qifan, the vice chairman of the China International Economic Exchange Center, announced the name of the digital currency to be launched by the People's Bank of China: DCEP.

Indonesia's private lending institution PT Bank Yudha Bhakti has entered a partnership pact with the FinTech firm Akulaku to fortify the bank's digital transformation strategy.

Turkey-based Takasbank has introduced a physical gold-backed transfer system on a blockchain-based platform.

Ripple, which has been popular among the banking community for its ability to transact overseas payments swiftly and efficiently, is perceived as having a competitive advantage. However, the banks have the current SWIFT mechanism (Society for Worldwide Interbank Financial Telecommunication) in place, which seems unlikely to lose its importance in the industry so easily.

The EU has created a blockchain and artificial intelligence (AI) fund worth EUR 400 million. It is interpreted as a movement to keep up with the innovation efforts of competitor countries such as the US and China.

The French credit institution Societe Generale SFH, a subsidiary of one of Europe's largest financial services groups, Societe Generale Group, has also issued a 100 million euro ($112 million) bond as a security token on the Ethereum (ETH) blockchain. But its redemption has not yet been confirmed through the DLT platform.

Spanish Banking Giant Santander Pilots Ethereum-Powered Bond Redemption

Bank of America Taps R3 and TradeIX to strategize and develop the International Trade Network.

Lloyds Bank has announced a partnership with the blockchain platform Komgo to streamline its commercial banking division.

While Banco Santander carried out the transaction through blockchain, the head of its digital investment banking division, John Whelan, confirmed that the bank has carried out an early redemption of its Ethereum blockchain-enabled bond that was issued in September of this year.

HSBC also performed the first blockchain-based letter of credit transaction denominated in Chinese yuan. As per reports, the transaction was successfully executed on the Voltron trade finance blockchain platform, which has been developed by a consortium of eight banks, including renowned banking names such as BNP Paribas and Standard Chartered.


Inside the race to quantum-proof our vital infrastructure – www.computing.co.uk

"We were on the verge of giving up a few years ago because people were not interested in quantum at the time. Our name became a joke," said Andersen Cheng, CEO of the UK cybersecurity firm Post-Quantum. After all, he continued, how can you be post- something that hasn't happened yet?

But with billions of pounds, renminbi, euros and dollars (US, Canadian and Australian) being pumped into the development of quantum computers by both governments and the private sector and with that research starting to bear fruit, exemplified by Google's achievement of quantum supremacy, no-one's laughing now.

One day, perhaps quite soon, the tried and trusted public-key cryptography algorithms that protect internet traffic will be rendered obsolete. Overnight, a state in possession of a workable quantum computer could start cracking open its stockpiles of encrypted secrets harvested over the years from rival nations. Billions of private conversations and passwords would be laid bare and critical national infrastructure around the world would be open to attack.

Often compared with the Y2K problem, the impact could be disastrous. Like Y2K, no-one can be quite sure what the exact consequences will be; unlike Y2K, the timing is unclear. But with possible scenarios ranging from massive database hacks to unstoppable cyberattacks on the military, transport systems, power generation and health services, this is clearly a risk not to be taken lightly.

Critical infrastructure including power generation would be vulnerable to quantum computers

Post-quantum cryptography uses mathematical theory and computer science to devise algorithms that are as hard to crack as possible, even when faced with the massive parallel processing power of a quantum computer. However, such algorithms must also be easy to deploy and use or they will not gain traction.

In 2016, the US National Institute of Standards and Technology (NIST) launched its competition for Public-Key Post-Quantum Cryptographic Algorithms, with the aim of arriving at quantum-safe standards across six categories by 2024. The successful candidates will supplement or replace the three standards considered most vulnerable to quantum attack: FIPS 186-4 (digital signatures), plus NIST SP 800-56A and NIST SP 800-56B (public-key cryptography).

Not all types of cryptography are threatened by quantum computers. Symmetric algorithms (where the same key is used for encryption and decryption) such as AES, which are often deployed to protect data at rest, and hashing algorithms like SHA, used to prove the integrity of files, should be immune to the quantum menace, although they will eventually need larger keys to withstand increases in classical computing power. But the asymmetric cryptosystems like RSA and elliptic curve cryptography (ECC) which form the backbone of secure communications are certainly in danger.

Asymmetric cryptography and public-key infrastructure (PKI) address the problem of how parties can exchange encryption keys where there's a chance that an eavesdropper could intercept and use them. Two keys (a keypair) are generated at the same time: a public key for encrypting data and a private key for decrypting it. These keys are related by a mathematical function that is trivial to perform in one direction (as when generating the keys) but very difficult in the other (trying to derive the private key from the corresponding public key). One example of such a 'one-way' function is factorising very large integers into primes. This is used in the ubiquitous RSA algorithms that form the basis of the secure internet protocols SSL and TLS. Another such function, deriving the relationship between points on a mathematical elliptic curve, forms the basis of ECC, which is sometimes used in place of RSA where short keys and reduced load on the CPU are required, as in IoT and mobile devices.
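The one-way asymmetry can be made concrete with the classic textbook RSA example, using deliberately tiny primes. This is purely illustrative: real RSA uses 2048-bit or larger moduli plus padding schemes, and the numbers below are the standard teaching example, not production parameters.

```python
# Toy RSA keypair generation (requires Python 3.8+ for modular inverse via pow).
p, q = 61, 53          # private: two primes
n = p * q              # public modulus (3233); multiplying p and q is easy
phi = (p - 1) * (q - 1)
e = 17                 # public exponent
d = pow(e, -1, phi)    # private exponent: computable only if you know p and q

msg = 65
ciphertext = pow(msg, e, n)        # anyone can encrypt with the public key (n, e)
plaintext = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert plaintext == msg

# Recovering d from (n, e) alone requires factorising n back into p and q:
# trivial at this size, but infeasible classically at real key sizes, and
# exactly the problem Shor's algorithm makes fast on a quantum computer.
```

The whole security argument lives in that last comment: the public operation (multiplying primes, exponentiating) is cheap, while the inverse (factorising n) is what a quantum computer would collapse.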

It is no exaggeration to say that in the absence of SSL and TLS the modern web with its ecommerce and secure messaging could not exist. These protocols allow data to be transmitted securely between email correspondents and between customers and their banks with all the encryption and decryption happening smoothly and seamlessly in the background. Unfortunately, though, factorising large integers and breaking ECC will be a simple challenge for a quantum computer. Such a device running something like Shor's algorithm will allow an attacker to decrypt data locked with RSA-2048 in minutes or hours rather than the billions of years theoretically required by a classical computer to do the same. This explains NIST's urgency in seeking alternatives that are both quantum-proof and flexible enough to replace RSA and ECC.

NIST is not the only organisation trying to get to grips with the issue. The private sector has been involved too. Since 2016 Google has been investigating post-quantum cryptography in the Chrome browser using NewHope, one of the NIST candidates. Last year Cloudflare announced it was collaborating with Google in evaluating the performance of promising key-exchange algorithms in the real world on actual users' devices.

Of the original 69 algorithms submitted to NIST in 2016, 26 have made it through the vetting process as candidates for replacing the endangered protocols; this number includes NewHope in the 'Lattice-based' category.

One of the seven remaining candidates in the 'Code-based' category is Post-Quantum's Never-The-Same Key Encapsulation Mechanism (NTS-KEM), which is based on the McEliece cryptosystem. First published in 1978, McEliece never really took off at the time because of the large size of its public and private keys (100kB to several MB). However, it is a known quantity to cryptographers, who have had plenty of time to attack it, and it's agreed to be 'NP-hard' (a mathematical term that in this context translates very roughly as 'extremely difficult to break in a human timescale, even with a quantum computer'). This is because it introduces randomisation into the ciphertext with error correction codes.

"We actually introduce random errors every time we encrypt the same message," Cheng (pictured) explained. "If I encrypt the letters ABC I might get a ciphertext of 123. And if I encrypt ABC again you'd expect to get 123, right? But we introduce random errors so this time we get 123, next time we get 789."

The error correction codes allow the recipient of the encrypted message to cut out the random noise added to the message when decrypting it, a facility not available to any eavesdropper intercepting the message.
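The error-correction half of Cheng's description can be sketched with a deliberately simplified toy (this uses a 5-fold repetition code, not the binary Goppa codes McEliece actually relies on, and all function names are mine; in real McEliece, secrecy comes from hiding the code's structure so that only the keyholder can decode efficiently):

```python
import random

def encode(bits, n=5):
    """Encode each bit with an n-fold repetition code."""
    return [b for bit in bits for b in [bit] * n]

def add_random_errors(codeword, n_errors=2):
    """Flip a few positions at random, as the sender deliberately does."""
    corrupted = codeword[:]
    for pos in random.sample(range(len(corrupted)), n_errors):
        corrupted[pos] ^= 1
    return corrupted

def decode(codeword, n=5):
    """Majority vote over each n-bit block recovers the original bits."""
    return [int(sum(codeword[i:i + n]) > n // 2)
            for i in range(0, len(codeword), n)]

message = [1, 0, 1, 1]
noisy = add_random_errors(encode(message))   # same message, different ciphertext each time
assert decode(noisy) == message              # the decoder strips out the deliberate noise
```

Because fresh random errors are injected on every encryption, encrypting the same message twice yields different ciphertexts, exactly as Cheng describes.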

With today's powerful computers, McEliece's large key size is much less of an issue than in the past. Indeed, McEliece has some advantages of its own - encryption/decryption is quicker than RSA, for example - but it still faces implementation challenges compared with RSA, particularly for smaller devices. So for the past decade Cheng's team has been working on making the technology easier to implement. "We have patented some know-how in order to make our platform work smoothly and quickly, to shorten the keys to half the size," he said.

Post-Quantum has open-sourced its code (a NIST requirement, so that the successful algorithms can be swiftly distributed) and packaged it into libraries to make it as 'drop-in' as possible and backwards-compatible with existing infrastructure.

Nevertheless, whichever algorithms are chosen, replacing the incumbents like-with-like won't be easy. "RSA is very elegant," Cheng admits. "You can do both encryption and signing. For McEliece and its derivatives, because it's so powerful in doing encryption, you cannot do signing."

An important concept in quantum resistance is 'crypto-agility': the facility to change and upgrade defences as the threat landscape evolves. Historically, industry has been the very opposite of crypto-agile: upgrading US bank ATMs from insecure DES to 3DES took an entire decade to complete. Such leisurely timescales are not an option now that a quantum computer capable of cracking encryption could be just three to five years away.

Because of the wide range of environments, bolstering defences for the quantum age is not as simple as switching crypto libraries. In older infrastructure and applications encryption may be hard-coded, for example. Some banks and power stations still rely on yellowing ranks of servers that they dare not decommission but where the technicians who understand how the encryption works have long since retired. Clearly, more than one approach is needed.

It's worth pointing out that the threat to existing cryptosystems comes not only from quantum computers. The long-term protection afforded by encryption algorithms has often been wildly overestimated, even against 'bog standard' classical supercomputers. RSA with 768-bit keys, dating back to the algorithm's origins in the 1970s, was thought to be safe for 7,000 years, yet it was broken in 2010.

For crypto-agility, algorithms need to be swappable

Faced with the arrival of quantum computers and a multiplicity of use cases and environments, cryptographers favour a strength-in-depth or hybridised approach. Cheng uses the analogy of a universal electrical travel plug which can be used in many different countries.

"You can have your RSA, the current protocol, with a PQ [post-quantum] wrapper and make the whole thing almost universal, like a plug with round pins, square pins or a mixture of both. Then when the day comes customers can just turn off RSA and switch over to the chosen PQ algorithm".

Code-based systems like NTS-KEM are not the only type being tested by NIST. The others fall into two main categories: multivariate cryptography, which is based on the difficulty of solving systems of multivariate polynomial equations, and lattice-based cryptography, a geometric approach to encrypting data. According to Cheng, the latter offers advantages of adaptability but at the expense of raw encryption power.

"Lattice is less powerful but you can do both encryption and signing,

but it has not been proven to be NP-hard," he said, adding: "In the PQ world everyone's concluded you need to mix-and-match your crypto protocols in order to cover everything."

Professor Alan Woodward (pictured) of Surrey University's Department of Computing said that it's still too early to guess which will ultimately prove successful.

"Lattice-based schemes seem to be winning favour, if you go by numbers still in the race, but there is a lot of work being done on the cryptanalysis and performance issues to whittle it down further," he said. "If I had to bet, I'd say some combination of lattice-based crypto and possibly supersingular isogeny-based schemes will emerge for both encryption and signature schemes."

Quantum mechanics can be an aid in the generation of secure classical encryption keys. Because of their deterministic nature, classical computers cannot generate truly random numbers; instead they produce pseudo-random numbers that are predictable, even if only to a tiny degree. One of Edward Snowden's revelations was that the NSA had compromised a random number generator distributed in RSA Security's products. More recently, weaknesses in RSA key generation were discovered in some IoT devices, where one in 172 certificates was found to share a factor with another. However, a quantum random number generator (QRNG) produces numbers that are truly random, according to quantum theory, resolving this key area of vulnerability.
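The shared-factor weakness is worth a concrete illustration: if two RSA moduli happen to share a prime because of poor randomness, a single greatest-common-divisor computation breaks both, with no factoring required. A toy example with small primes (real moduli are hundreds of digits long, which is what makes factoring each one individually infeasible):

```python
from math import gcd

# Two toy RSA moduli that, through a faulty RNG, share the prime 104729.
p_shared = 104729
n1 = p_shared * 104723
n2 = p_shared * 104717

# Neither modulus gives away its factors on its own, but a single gcd
# between the two exposes the shared prime instantly.
common = gcd(n1, n2)
assert common == p_shared
assert n1 % common == 0 and n2 % common == 0   # both keys are now factored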

QKD commonly uses polarised photons to represent ones and zeros

Whereas post-quantum cryptography is based on maths, the other major area of research interest, quantum key distribution (QKD), is rooted in physics, specifically the behaviour of subatomic particles. QKD is concerned with key exchange, using quantum-mechanics to ensure that eavesdroppers cannot intercept the keys without being noticed.

In BB84, the first proposed QKD scheme and still the basis for many implementations, a quantum mechanical property of a subatomic particle, such as the polarisation of a photon, is manipulated to represent either a zero or a one. A stream of such photons, polarised at random, is then sent by one party to a detector controlled by the other.

Before they reach the detector, each photon must pass through a filter. One type of filter will allow 'ones' to pass, the other 'zeros'; as with the polarisation process, the filters are selected at random, so we'd expect half of the photons to be blocked by the filtering process. Counterintuitively, however, their quantum mechanical properties mean that even those photons that are 'blocked' by a filter still have a 50 per cent chance of passing their correct value to the detector. Thus, we'd expect an overall agreement between transmission and detection of 75 per cent (50 per cent that pass straight through plus 25 per cent that are 'blocked' but still communicate their correct value).

Once enough photons have been transmitted to produce a key of the required length, the parties compare, over a separate channel, the sequence of emitted ones and zeros with the filter used for each, discarding the individual results where they disagree. A classical symmetric encryption key is then created from the remaining string of ones and zeros. This key can be used as an uncrackable 'one-time pad' to encrypt data such as a message or a login.

Should a man-in-the-middle intercept the stream of photons, the parties will be alerted because of the observer effect: measuring the state of a quantum particle will change it. Statistically, the number of photons registered as 'correct' by the detector will drop from 75 per cent to around 62.5 per cent, and this will be noticed when the two parties compare a random sample of their results at the end of the process. Any such discrepancy will cause the key to be rejected. Properly implemented, QKD can be considered a provably unbreakable method of exchanging keys.
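Those statistics can be checked with a small Monte-Carlo sketch of the article's model (an intercept-resend eavesdropper; the function name and simplifications are mine, and real BB84 analyses are usually phrased in terms of error rates on the sifted key):

```python
import random

def bb84_agreement(n_photons=100_000, eavesdrop=False):
    """Fraction of raw detections where the receiver's bit matches the
    sender's, in a toy intercept-resend model of BB84."""
    matches = 0
    for _ in range(n_photons):
        bit = random.randint(0, 1)
        basis_a = random.randint(0, 1)        # sender's random polarisation basis
        value, basis = bit, basis_a
        if eavesdrop:                         # Eve measures and resends each photon
            basis_e = random.randint(0, 1)
            if basis_e != basis:
                value = random.randint(0, 1)  # measuring in the wrong basis randomises the bit
            basis = basis_e
        basis_b = random.randint(0, 1)        # receiver's random filter choice
        measured = value if basis_b == basis else random.randint(0, 1)
        matches += (measured == bit)
    return matches / n_photons

print(bb84_agreement())                 # ≈ 0.75 without an eavesdropper
print(bb84_agreement(eavesdrop=True))   # drops to ≈ 0.625 with one
```

Eve's wrong-basis measurements randomise a quarter of the stream on average, which is exactly the 75 to 62.5 per cent drop the parties look for when they compare samples.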

Switzerland is a QKD pioneer, deploying the technology to secure electoral votes as far back as 2007. The company that helped to achieve this feat, Geneva University spin-off ID Quantique (IDQ), has since become one of the main manufacturers of QKD and QRNG hardware. CEO Grégoire Ribordy (pictured) has seen a recent upsurge of interest, beginning in 2016 when the European Commission unveiled its €1 billion, ten-year Quantum Flagship programme. The market is now starting to mature, he said, adding that his company boasts customers in government, finance and "other organisations that have high-value IP to protect".

There's a certain rivalry between physics and maths, between QKD and post-quantum encryption, not least because funding has been hard to come by. Being hardware-based, QKD has so far gobbled up the lion's share of the research grants, but it's possible that when NIST returns its verdicts more money will flow into PQ. Arguments also rage over the practical limits of security.

"The physicists tend to talk about QKD as being perfectly secure' which sets the cryptographers on edge as there is no such thing in practice," Woodward said.

Ribordy is adamant that both techniques will be required. As with the hybrid approach to adopting algorithms, it's not an either-or situation; it all depends on the use case.

"I think they're actually complementary. Quantum crypto [another name for QKD] will provide a higher security and should be used maybe in backbone networks where there's a lot of at stake, big pipes must be protected with more security, and then the quantum-resistant algorithms can find an application in areas where security is not as critical or maybe where there's less data at stake."

One company that's looking to scale up QKD on a national basis is the startup QuantumXchange. Based in Bethesda, Maryland, USA, it was founded in 2018 with VC funding to provide ultra-secure data networks. During his interview with Computing, president and CEO John Prisco (pictured) bemoaned the fact that his country, while forging ahead with quantum computers, is behind the curve when it comes to defending against them. It's possible that by 2024, when NIST selects its winning algorithms, the game will already be up.

"Everybody is saying, OK, let's fight quantum with quantum and I subscribe to that. We've got quantum computers that are offensive weapons and quantum keys that are the defensive of counterpart to that. The rest of the world outside of the United States is embracing this a lot more quickly - Europe, Japan and China."

As if in answer to his prayers, last month the US House of Representatives voted overwhelmingly to pass the $1.2 billion National Quantum Initiative Act, designed to accelerate the country's efforts in this area, a rare example of bipartisan agreement in the increasingly fractious political landscape of the US.

Quantum particles are uniquely sensitive to any kind of disturbance, so while China may have successfully transmitted quantum keys between Earth and the Micius satellite, this was only possible because of ideal weather conditions at the time (although, interestingly, Woodward believes it could ultimately be the winning approach).

Particles transmitted through the more common fibreoptic cable are also limited by the tendency of the polarised photons to interact with the medium. Even with the most pristine fibre, this limits real-world transmission distance to around 100km. After that, you need intermediary repeaters and 'trusted nodes' to relay the signal. Since it's not possible to directly clone quantum states, the quantum signal must be converted to classical and then back to quantum again, representing a weak point in the otherwise unbreakable chain. So trusted nodes must be very thoroughly secured, which inevitably increases costs and limits current applications. It is also possible for an attacker to interfere with emitters and detectors to corrupt the key generation process.

Other issues? Well, there's a lack of standards and certifications and the equipment is costly. Also, without some sort of secure signature process, how can parties exchanging keys be sure who they are exchanging them with? In addition, it's restricted to point-to-point communications and it's also incompatible with existing networks.

The theory is sound, said Woodward, but the engineering is still a challenge.

"It's in practice that QKD is encountering difficulties. For example, QKD is not yet at a stage where it is using single photons - it uses pulses of light. Hence, the very basis of not being able to clone the quantum state of a photon is put in question as there is more than one of them."

Woodward added that even after the kinks in QKD - be that via satellite, fibreoptic cables or over the airwaves - have been ironed out, the technology will still likely be confined to highly sensitive data and backbone networks because PQ cryptography will be easier to slot into existing infrastructure.

"Whichever [QKD] scheme proves most reliable and robust they all require that expensive infrastructure over what we have now, and so I can envisage it being used for, possibly, government communications but not for home users whose machines are picking a means to communicate securely with their bank's website," he said.

"The post-quantum schemes in the NIST competition would simply replace the software we already have in places such as TLS so the cost would be much lower, and the level of disruption needed for adoption by end-users would be far less."

However, QuantumXchange is working on overcoming some of these limitations. The firm already operates a small number of high security QKD connections between financial institutions in New York and datacentres in nearby New Jersey over dedicated fibreoptic cables using trusted nodes (manufactured by ID Quantique) to extend the reach of its QKD infrastructure. But it is also working on a hybrid system called Phio TX. This will allow the transmission of electronic quantum keys (i.e. keys created using a QRNG) or classical symmetric keys created from the quantum key via a secure channel separate from that used for the encrypted data. The idea is to make the technology more widely applicable by straddling the QKD-PQ divide and removing the point-to-point restrictions.

"The point is to be crypto-agile," Prisco said. "If a company is trying to come up with a quantum-safe strategy they can implement this product that has quantum-resistant algorithms, electronic quantum keys and optical quantum keys, so it becomes a level-of-service discussion. If you have a link that absolutely has to be protected by the laws of physics, you'd use an optical quantum key. If there's virtually no chance of someone intercepting the data with your key you could use a trusted exchange and the combination of the quantum-resistant algorithm with the quantum random number generated key is very powerful."

Go here to read the rest:
Inside the race to quantum-proof our vital infrastructure - http://www.computing.co.uk

5 basic courses for IT competency in the industry – State-Journal.com


Information technology: it's always been a prestigious field of work for the young job aspirants of the twenty-first century. Why not? It pays really well. And demand? What can I say, when almost all kinds of brick-and-mortar businesses now call for IT-focused employees? Besides, it's challenging, lucrative and rewarding. That makes it the right rig to spark interest among young blood.

But IT is not just an industry. It's a world, right? With so many different aspects, categories, and subcategories, it covers such a huge range of work that it's difficult to say what an IT aspirant should select as the easiest way in. But that doesn't mean the IT pathways don't share some common traits. From software development to database creation, all these work zones have a similar basis to start from. And that is the focus of today's discussion.

So, let's check out the five important basics every IT aspirant needs to know about.

Basic Computer Skills and Project Courses:

When you are applying for an IT job, you should already be well acquainted with Windows, Microsoft Word, Excel, PowerPoint, Outlook, SharePoint, and similar tools.

More job-specific requirements demand advanced courses on Excel and PowerPoint, MS Access, Office 365, social media experience, technical writing, digital marketing, HTML, pivot tables, and some project courses. Advanced Excel study is a necessary qualification in most business verticals; it is your pathway to managing everyday spreadsheets and financial data.

Project management skills play a crucial role at different levels of your job profile. These courses teach you how to marshal a vast amount of resources and optimize manpower under a cost-effective budget. Human relationships, leadership, problem-solving, and quick decision-making are some of the soft skills project managers are expected to acquire.

Basic Programming Language and Coding

Since modern technologies demand the involvement of computers in all aspects of communication, every IT professional must acquire some usable knowledge of a computer language. OK, you're not expected to deliver expertise in machine learning, but some primary, simple languages won't hurt you much. Even the customer service crew should know how to code basic stuff.

At least a working knowledge of some languages like Excel-based VBA programming, HTML, or C++ is necessary for a decent job profile. Whether it's a regular office suite or a graphical interface, quantitative reporting or architecture, quality assurance or application development, you need the ABCs of a programming language everywhere.

Also, consider some simple web development courses, cloud computing, Azure, web design and testing skills. These will lift your resume in employers' eyes. You must also sharpen your social skills with confidence: regularly brush up your relationship-building, stress management, conflict resolution, articulation, persuasion, and intrinsic motivation. Your patience will keep you on stable footing.

Big Data Analysis and Business Intelligence

Data is the life-force of modern industries. Manage the data and you control the entire work system. Well, I'm not asking you to become a core data analyst before seeking simple IT jobs, but you should improve yourself in some way, because you're going to need it for the long haul.

Do you manage the clientele or the products? Either way, you're going to end up spending your days with a mountain of data. What percentage of customers are happy with your company's new product? Are the product distribution systems driving more sales? Do you face questions like these in your office days? Then you need to start with basic data analysis and business intelligence courses to learn strategy-making over big data resources. How far should you go? That's up to your job profile. But basic data modeling, big data mining, assembling and organizing, structural design, and quantitative research on statistical data interpretation are expected of every IT professional.

Network Security

When data is the essence, companies need a protective layer around their sensitive data resources. The network is the route by which information gets stolen or leaked; hence, the network is also the armor over it. Good network security protects a company's every move and saves it from losing huge amounts of money to data leakage. That's why a good knowledge of network architecture, encryption algorithms, cryptography, risk assessment, authentication systems, and both host-based and virtual firewalls raises an employee's value well above peers without that knowledge.

Also, knowledge networking connects people in a working environment to collaborate, pooling everyone's knowledge and opinions across different aspects of a project. Network professionals combine their creative efforts to build a sound structure out of raw resources. That's why planning skills, as well as adaptability, flexibility, and quick learning, pave the way towards better career opportunities. Time management is another essential quality for every IT aspirant.

Communication Skills

IT is a field of teamwork. Naturally, effective communication skills are what make a work culture click, which is why this criterion on your resume adds an extra score to your acceptability to employers. Knowing foreign languages expands your field of work to a larger section of the world. Not only public speaking but also digital communication and copywriting techniques fall within this area of expertise. These skills help you express ideas clearly in team meetings and make your views easier to understand.

Well, we have reached the end of our discussion. As you can see, we have covered the five most common criteria in this short discourse to help the IT-focused generation. But you already know that it's not actually the end; it's just the starting point of your journey. Stay blessed and grow better, always.

Read more here:
5 basic courses for IT competency in the industry - State-Journal.com

The Relationship Between Cryptocurrencies and the Global Market – Qrius

Cryptocurrencies and the global market have an intricate relationship, and not one that you would discuss over a cup of tea on a random evening. Cryptocurrencies are incredibly complex, and the technology that underpins them, blockchain, even more so. Therefore, taking a minute out of our busy lives to think about how these digital currencies could affect the global market is not a usual phenomenon. But, being the rational creatures that human beings are, sometimes it becomes necessary to look into how the world economy is being affected by certain elements that run through the subtle networks of society. Thus, with that notion in mind, we have come up with an article that explains the correlation between cryptocurrencies and the global economy.

Basics of Cryptocurrencies:

Long story short, cryptocurrencies are digital currencies that serve as one of the best mediums of financial exchange and transaction. They use the mathematics of cryptography to ensure maximum security in this form of digital transaction. Cryptography also ensures that these digital coins are not easy to counterfeit, while transactions involving them remain as easy as ever. The network through which the transactions take place is known as the blockchain, and it works on a complex algorithm. Any data that goes into this network becomes immutable at once, which means that once a transaction is deemed complete, there is absolutely no way to reverse it.
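The immutability claim can be made concrete with a toy hash chain (a deliberately minimal sketch, not any real blockchain's format; the names are mine): each block commits to the previous block's hash, so altering any earlier entry invalidates every later link.

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# Build a toy chain of three transactions.
h0 = block_hash("genesis", "Alice pays Bob 5")
h1 = block_hash(h0, "Bob pays Carol 2")
h2 = block_hash(h1, "Carol pays Dave 1")

# Tampering with the first transaction changes its hash...
h0_tampered = block_hash("genesis", "Alice pays Bob 500")
assert h0_tampered != h0
# ...so every later hash no longer matches the recorded chain.
assert block_hash(h0_tampered, "Bob pays Carol 2") != h1
```

Because rewriting history means recomputing every subsequent block, and the network holds many copies of the original chain, a completed transaction is effectively irreversible.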

The Appeal That Cryptocurrencies Have Globally:

The way we deal in cryptocurrencies affects the global economic market in ways we might not have imagined before. One transaction in one part of the world affects the entire chain worldwide and affects the world economy substantially. If you had been thinking that it is only fiat currency that affects the economy, you have been misinformed, and it is time to step out of the bubble of ignorance.

The Decentralized Approach-

Cryptocurrencies, Bitcoin especially, do not require an intermediary for a transaction to go through or be deemed complete. The most appealing feature of cryptos is that they use decentralized technology for their transactions. And since no middleman is required, a transaction is quick and frictionless. This further means that cryptocurrencies make a massive contribution to the economy, and probably affect the world economy more quickly than other forms of currency.

Its Independence From The Dollar-

The dollar acts as a frame of reference, or yardstick, for the global economy. However, since cryptocurrencies have nothing to do with banks or any intermediary, they remain independent of the dollar. This is indeed a fresh way for various other financial actors to participate directly in the global economy. There are various payment gateways, though, that make transactions involving cryptocurrencies easy. For instance, you could look up the website of Flexipay to learn more about these gateways.

Its Ability To Remove Impediments From Entering The Market-

Cryptocurrencies have made it easy for various financial actors to enter the financial market without a smidgen of apprehension. Several entrepreneurs are also making use of the ICO system to take their businesses forward with the utmost courage and ease. And the more businesses and entrepreneurs enter the financial market, the greater the contribution to the economy.

Conclusion:

Cryptocurrencies affect the economy widely, and in ways that we hardly think about. These currencies work in a complicated chain, and transactions are set in stone once they are deemed complete. The discussion above shows that cryptocurrencies affect the economy in numerous ways and enhance the way financial systems function.


Continued here:
The Relationship Between Cryptocurrencies and the Global Market - Qrius

Facebooks a mess, but it doesnt mean backdoors are the answer – The Next Web

It's been a tough year for Facebook. It has faced international scrutiny, from its role in elections to the potential regulation of its cryptocurrency, Libra. However, perhaps the most contentious argument for the social media giant, one that is sure to rage on into 2020, is among its longest fought: how to protect the privacy of users on its messaging platforms while navigating the demands of governments who want backdoor access. What should we most prize, consumers' privacy or national security?

Currently, the solution proposed by governments internationally is a backdoor to allow access into messaging platforms, one that, in my opinion, is highly unsatisfactory. With a backdoor, there is potential for abuse by the government agency in question, but perhaps more concerningly, that same backdoor can be found and exploited by others. On the other hand, my own experience with members of the Islamic State has shown that absolute privacy of communications can be dangerous in the wrong hands.

The solution doesn't lie in an open door for anyone with the right tools to climb through. What we need in a trustless environment is a pre-agreed, cryptographically secure, and verifiable way to access certain data sources, one that helps bring tech companies and governments together.

An emergency entrance, with access granted through a consensus voting mechanism by a pre-agreed group, will be the way forward.

Facebook has made its stance on the issue of data privacy pretty clear. With further encryption of its video and calling systems being tested in October 2019, not to mention its current very public lawsuit against the NSO Group, its dedication to end-to-end encryption is plain to see.

It's a position I once took myself. In 2014, my firm developed the world's first quantum-safe instant messaging system, with us having zero knowledge of the contents. It featured encryption so advanced that not even a mature quantum computer, let alone the technology available at the time, would be able to decipher the coding in order to gain access. We were elated.

It was a much-needed victory for privacy in an age when it was widely agreed that the misuse and exploitation of user data was getting out of control. We took the decision to make the solution available to all through the Apple App Store as an easy-to-download application. We never would have predicted that it would end up appearing on an Islamic State recommended technical tools list.

We had created a tool that protected a fundamental right to online privacy. But, in doing so, we had given an abhorrent group the ability to benefit from unfettered, untraceable communications. This prompted a period of great debate for our team. To create government backdoors in what we had claimed was a fully encrypted and privacy-protecting service was counter-intuitive.

However, we simply couldn't reconcile the idea that an organization such as Islamic State might be able to cause great harm using our technology. We felt we were left with no choice but to withdraw the messaging system altogether. Today, we only provide it to companies and governments for carefully selected and compliant use.

In this scenario, it might be easy to argue that a government backdoor would have been appropriate. But we must remember that a backdoor for one is a backdoor for all. Anyone can walk through it, whether that's the government agency it was intended for, a hacker, or even a malicious nation. Facebook, for all its flaws, is right to object to this on behalf of its users.

This is why I believe that governments should consider the creation of an emergency entrance, or side door. Whatever you call it, these are metaphors for a process whereby pre-agreed access to data is enabled within a trustless environment.

In this scenario, the government agency, the social media provider, and a neutral third party such as a court would each safe-keep a fragment of the cryptographic key, which, when combined to reach a voting threshold, could allow sanctioned and pre-agreed access to messaging data. To remove any anxiety about the government keeping the data, both the data and the key management could be hosted by the social media companies.

In a way, this idea, known as threshold cryptography, would be similar to a Swiss bank safe deposit box, which can only be opened if both the bank and the customer are present. Except these cryptographic keys could not be replicated, and companies could even use blockchain to create an immutable record of how, when and why the data was accessed.

It would significantly limit the ability of rogue actors to stroll through a backdoor uninvited. There would be no golden key kept by a social media company, which would remove any insider threat to security and privacy, even if governments weren't pushing for a way in.

Facebook has a responsibility to find a solution to this ongoing debate. It can shout about respecting its users' privacy from the rooftops, and in doing so defend its decision to continue with end-to-end encryption, but that argument only holds true when lives and liberty are not being endangered by the secrecy its messenger applications allow.

It is a government's prerogative to keep its people safe, but if governments think backdoors are the prize, I believe they are mistaken. In this scenario, the data is not even kept by the government. The social media companies should not complain either, as the telco industry already has to comply with lawful intercept warrants.

There is common ground to be found here in the form of key-splitting, and that's been sadly absent from the privacy debate thus far.

Published January 2, 2020 10:00 UTC

The rest is here:
Facebook's a mess, but it doesn't mean backdoors are the answer - The Next Web

Cryptocurrency and Tokenization Activities in Sports – Coin Idol

Jan 02, 2020 at 14:40 // News

Because sports are massively loved and supported globally, tokenization is a new solution that will have a strong impact on the future of raising capital in the games and sports industry.

Sooner or later, the blockchain and cryptocurrency community will witness the first examples of tokenization in professional sport as well, possibly starting with the building and sponsoring of big stadiums and arenas, which are among the largest and most multifaceted projects to be built.

Professional sport is a perfect candidate for tokenization. Briefly, tokenization is an instrument that can be applied effectively within complex ecosystems to reduce friction points and align the incentives and interests of numerous interested parties.

When it comes to financing complex sports infrastructure initiatives, tokenization involves various stakeholders: local administrations, the sports team, sponsors, the media, and large communities of fans. All of these are different participants, and each should be able to benefit from different incentives.

Thus, a cryptocurrency is a significant technological innovation that can help gather the financial resources necessary to construct modern sports grounds, and can serve the interests of all participants by generating the right incentives for each while reducing friction points.

The term "tokenization" refers to the creation of a cryptographic token that is nothing more than the digital representation of an asset, a right, data, or a financial value, stored on a blockchain network and transferred by means of public and private key cryptography.

This happens via the execution of "smart contracts": the terms and conditions of the underlying agreement covering these particular rights, goods, or values are translated into computer programs that are executed automatically and digitally by the software when particular conditions are met. The important point here is the automatic execution, without the need for an intermediary acting as trustee or third party.

In simpler terms, for people who are unfamiliar with groundbreaking innovations like blockchains and cryptocurrencies, this technology literally allows a user to "pack" into a digital container (i.e. the token) not only the personal seat license (PSL) contractual agreements, but also game tickets, access rights to exclusive events and fan meet-up options, and essentially any other relevant formal rights.

The token is kept in the user's electronic private wallet (for example, on their smartphone) and can be transferred directly to any number of other users, so-called peer-to-peer (P2P), in a quick, safe and transparent way. The technology also allows certain contractual conditions to be programmed (i.e. encoded) directly, such as the PSL having certain restrictions on transfer, and to be applied automatically.
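The idea of encoding a contractual restriction directly into the token, so it is enforced automatically without a trustee, can be sketched in plain code. The following is a hypothetical illustration, not a real smart-contract platform: the `SeatLicenseToken` class, its field names, and the one-transfer limit are all invented for the example.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a tokenized personal seat license (PSL)
# whose transfer restriction is enforced by the code itself,
# mirroring how a smart contract applies terms automatically.

@dataclass
class SeatLicenseToken:
    owner: str
    max_transfers: int                # restriction encoded in the token
    transfers_made: int = 0
    history: list = field(default_factory=list)  # audit trail of transfers

    def transfer(self, new_owner: str) -> None:
        # The condition is checked automatically: no intermediary
        # or trustee needs to approve the peer-to-peer transfer.
        if self.transfers_made >= self.max_transfers:
            raise PermissionError("transfer limit reached for this PSL")
        self.history.append((self.owner, new_owner))
        self.owner = new_owner
        self.transfers_made += 1

psl = SeatLicenseToken(owner="alice", max_transfers=1)
psl.transfer("bob")        # allowed: first peer-to-peer transfer
# psl.transfer("carol")    # would raise: the encoded limit applies itself
```

On a real blockchain the same logic would live in contract code and the history would be the immutable ledger; here an in-memory list stands in for both.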

See original here:
Cryptocurrency and Tokenization Activities in Sports - Coin Idol