Google Uses DeepMind’s Artificial Intelligence to Fight Coronavirus: Is It Time to Trust AI? – Tech Times

Google is using its DeepMind artificial intelligence to combat the coronavirus, or COVID-19. With the coronavirus still spreading slowly but surely and no cure yet in sight, is there any hope that this AI might help find one?


In a post published Thursday, DeepMind said it is now using its AlphaFold system to create "structure predictions of several under-studied proteins associated with SARS-CoV-2, the virus that causes COVID-19."

The predictions have not been experimentally verified, but DeepMind is confident the data will be useful to scientists working to better understand the novel coronavirus.

DeepMind stated that understanding a protein's structure usually takes months or even longer, and typically relies on prior knowledge of related protein structures. AlphaFold uses cutting-edge methods to produce "accurate predictions of the protein structure" without prior knowledge of the strain.

Assistance in the fight against the coronavirus is welcome no matter where it comes from. However, as DeepMind notes, the AI has no prior knowledge of the protein, and many what-ifs remain: what if the AI doesn't find anything worthwhile? Does it even know what to look for? Can its output realistically be applied in the real world? Still, there is reason for optimism: any findings, however minuscule, might be another piece in the puzzle of finding a cure, which we would all benefit from.


If DeepMind's artificial intelligence does produce findings, scientists can verify and record them and, if possible, use them toward a cure or vaccine for the deadly virus. Identifying the components needed to produce a cure is difficult science, and even with scientists working around the clock, it will take a gigantic effort.

"Artificial intelligence could be one of humanity's most useful inventions. We research and build safe AI systems that learn how to solve problems and advance scientific discovery for all," the company advertises on its website, which brings us to its advances across many fields. So far, its systems have shown companies how to save energy, helped identify eye diseases, accelerated science and, through the partnership with Google, improved products used all over the world.

So DeepMind shows promise, and humanity is counting on its AI to help solve one of the problems the whole world faces every day. The cure needs to be found sooner rather than later, before the virus mutates again.


2018 TECHTIMES.com All rights reserved. Do not reproduce without permission.


Airlines take no chances with our safety. And neither should artificial intelligence – The Conversation AU

You'd think flying in a plane would be more dangerous than driving a car. In reality it's much safer, partly because the aviation industry is heavily regulated.

Airlines must stick to strict standards for safety, testing, training, policies and procedures, auditing and oversight. And when things do go wrong, we investigate and attempt to rectify the issue to improve safety in the future.

It's not just airlines, either. Other industries where things can go very badly wrong, such as pharmaceuticals and medical devices, are also heavily regulated.

Artificial intelligence is a relatively new industry, but it's growing fast and has great capacity to do harm. Like aviation and pharmaceuticals, it needs to be regulated.

A wide range of technologies and applications that fit under the rubric of artificial intelligence have begun to play a significant role in our lives and social institutions. But they can be used in ways that are harmful, which we are already starting to see.

In the robodebt affair, for example, the Australian government welfare agency Centrelink used data-matching and automated decision-making to issue (often incorrect) debt notices to welfare recipients. What's more, the burden of proof was reversed: individuals were required to prove they did not owe the claimed debt.

The New South Wales government has also started using AI to spot drivers with mobile phones. This involves expanded public surveillance via mobile phone detection cameras that use AI to automatically detect a rectangular object in the driver's hands and classify it as a phone.


Facial recognition is another AI application under intense scrutiny around the world. This is due to its potential to undermine human rights: it can be used for widespread surveillance and suppression of public protest, and programmed bias can lead to inaccuracy and racial discrimination. Some have even called for a moratorium or outright ban because it is so dangerous.

In several countries, including Australia, AI is being used to predict how likely a person is to commit a crime. Such predictive methods have been shown to impact Indigenous youth disproportionately and lead to oppressive policing practices.

AI that assists train drivers is also coming into use, and in future we can expect to see self-driving cars and other autonomous vehicles on our roads. Lives will depend on this software.

Once we've decided that AI needs to be regulated, there is still the question of how to do it. Authorities in the European Union have recently made a set of proposals for how to regulate AI.

The first step, they argue, is to assess the risks AI poses in different sectors such as transport, healthcare, and government applications such as migration, criminal justice and social security. They also look at AI applications that pose a risk of death or injury, or have an impact on human rights such as the rights to privacy, equality, liberty and security, freedom of movement and assembly, social security and standard of living, and the presumption of innocence.

The greater the risk an AI application was deemed to pose, the more regulation it would face. The regulations would cover everything from the data used to train the AI and how records are kept, to how transparent the creators and operators of the system must be, testing for robustness and accuracy, and requirements for human oversight. This would include certification and assurances that the use of AI systems is safe, and does not lead to discriminatory or dangerous outcomes.

While the EU's approach has strong points, even apparently low-risk AI applications can do real harm. For example, recommendation algorithms in search engines are discriminatory too. The EU proposal has also been criticised for seeking to regulate facial recognition technology rather than banning it outright.

The EU has led the world on data protection regulation. If the same happens with AI, these proposals are likely to serve as a model for other countries and apply to anyone doing business with the EU or even EU citizens.

In Australia there are some applicable laws and regulations, but there are numerous gaps, and they are not always enforced. The situation is made more difficult by the lack of human rights protections at the federal level.

One prominent attempt at drawing up some rules for AI came last year from Data61, the data and digital arm of CSIRO. They developed an AI ethics framework built around eight ethical principles for AI.

These ethical principles aren't entirely irrelevant (number two is "do no harm", for example), but they are unenforceable and therefore largely meaningless. Ethics frameworks like this one for AI have been criticised as "ethics washing", and a ploy for industry to avoid hard law and regulation.


Another attempt is the Human Rights and Technology project of the Australian Human Rights Commission. It aims to protect and promote human rights in the face of new technology.

We are likely to see some changes following the Australian Competition and Consumer Commission's recent inquiry into digital platforms. And a long overdue review of the Privacy Act 1988 (Cth) is slated for later this year.

These initiatives will hopefully strengthen Australian protections in the digital age, but there is still much work to be done. Stronger human rights protections would be an important step in this direction, to provide a foundation for regulation.

Before AI is adopted even more widely, we need to understand its impacts and put protections in place. To realise the potential benefits of AI, we must ensure that it is governed appropriately. Otherwise, we risk paying a heavy price as individuals and as a society.

View original post here:
Airlines take no chances with our safety. And neither should artificial intelligence - The Conversation AU

Skeptical of Bitcoin Believes to be the Most Powerful CEO – BTC Wires

Mar 8, 2020 09:48 UTC | Updated: Mar 8, 2020 at 09:48 UTC

By Rajat Gaur

Bitcoin is widely regarded as the first cryptocurrency and the leader of the entire crypto market. It was first proposed in 2008 by the pseudonymous Satoshi Nakamoto as a decentralized, disruptive currency designed to solve problems that plagued earlier digital currencies, including double-spending, and to remove the need for a central authority.

Despite ambitions for Bitcoin to replace fiat currencies and claim the title of global currency, the underlying technology is now more than ten years old and requires second-layer technologies to stay ahead of the competition.

Bitcoin is comparatively slow next to much of the crypto space, and mostly serves as a store of value or a means of value transfer.

Ethereum, the next-largest crypto asset, provides a variety of features more advanced than what Bitcoin offers. At present there are thousands of altcoins on the market, each offering something different. The emerging asset class has even led one of the most powerful CEOs in the crypto market to be skeptical of Bitcoin's future.


In the tweet thread, Armstrong posted about how the early internet developed and how it can be improved with advanced protocols. It is these advances, he argued, that can improve the internet for its users.

In the next tweet, Armstrong concluded that there are many more improvements to come, and that blockchain could push crypto adoption from 50 million users to 5 billion. The Coinbase CEO also believes blockchain can improve privacy, developer tooling, scalability, and decentralized identity.

Bitcoin already has first-mover advantage and brand recognition, and while these features can be layered on top of the Bitcoin protocol, the technology behind the first-ever cryptocurrency lags other altcoins in the crypto space.


F2Pool: an Introduction to the Renowned Mining Pool – CryptoNewsZ

F2Pool is the world's leading cryptocurrency mining pool for several major cryptocurrencies, including Bitcoin, Ethereum, and Litecoin. Based in China, it is the oldest mining pool and presently the largest multi-currency mining pool in the world, mining around 17% of all blocks. F2Pool serves more than 100 countries and ranks among the top three pool operators on more than 20 networks. It has played an important part in securing blockchain infrastructure and educating the global community about cryptocurrency mining.

Handshake, a $10.2 million decentralized domain name project, is backed by prominent investors including Sequoia Capital, SV Angel, and Andreessen Horowitz. Handshake is a permissionless, decentralized naming protocol in which every peer validates and helps manage the root DNS naming zone, with the goal of developing an alternative to existing naming systems and certificate authorities.

On February 5, KDA (Kadena) officially launched on F2Pool. KDA is a public blockchain with a high-performance PoW (proof-of-work) system that incorporates the benefits of Chainweb technology. The Kadena platform integrates private blockchains, public apps, and other compatible blockchains in one place. KDA is also the token used to compensate miners; it is the fee a user pays to have transactions added to a block.

F2Pool is an association of miners, where every miner contributes computing power to find blocks. There are more than 2 million active users in the pool, around 50% of them Chinese. F2Pool charges a fee of 3% on mining rewards and operates on a P2P payment model. Withdrawal fees do not exceed 4%, and payouts are made daily to users' F2Pool wallets.


EARN IT Act ignites Section 230 tug-o-war – Politico

With help from Cristiano Lima, John Hendel and Leah Nylen

Editor's Note: Morning Tech is a free version of POLITICO Pro Technology's morning newsletter, which is delivered to our subscribers each morning at 6 a.m. The POLITICO Pro platform combines the news you need with tools you can use to take action on the day's biggest stories. Act on the news with POLITICO Pro.

Section 230 latest: The battle over the bipartisan EARN IT Act, which could threaten tech giants' legal liability protections, will continue next week when the Senate Judiciary Committee holds a hearing on the bill, probably with testimony from law enforcement officials and leaders from the tech sector.

(Another) TikTok bill: The day after Republican Sen. Josh Hawley announced plans to introduce a bill banning federal employees from using TikTok on their work devices, the House passed similar legislation from Democratic Rep. Abigail Spanberger.

Coronavirus, contd: The FCC is under pressure from Congress to use the same authority and resources it deploys for disaster response to address the threat of the coronavirus.

A message from Business Roundtable:

American consumers, their devices and data constantly travel across state lines. Without a national privacy law, consumers will have inconsistent privacy protections from state to state. Learn more at privacy.brt.org.

HELLO FRIDAY! AND WELCOME TO MORNING TECH. I'm your host, Alexandra Levine. On today's coronavirus misinformation monitor: Tito's lays down the law that no, pouring vodka on yourself will not protect you from COVID-19. ("Per the CDC," the company wrote on Twitter, "hand sanitizer needs to contain at least 60% alcohol. Tito's Handmade Vodka is 40% alcohol.")

Got a news tip? Write Alex at alevine@politico.com or @Ali_Lev. An event for our calendar? Send details to techcalendar@politicopro.com. Anything else? Full team info below. And don't forget: add @MorningTech and @PoliticoPro on Twitter.

WHAT'S NEXT FOR THE HOTLY CONTESTED EARN IT ACT The bipartisan rollout of the EARN IT Act on Thursday sparked widespread pushback from tech industry leaders, civil liberties groups and others, while garnering plaudits from child abuse prevention advocates, and the battle over the bill is just getting started.

Next up: The Senate Judiciary Committee will hold a Wednesday hearing on the bill, which would require companies to prove they are doing enough to curb child abuse online to keep their Section 230 protections. "This hearing is only the beginning," Sen. Richard Blumenthal (D-Conn.) said on Thursday. "We're eager to listen to critics or anyone else who has suggestions for improvement. We take them seriously."

On deck: Chairman Lindsey Graham (R-S.C.) told Cristiano he's planning to bring in trade groups that represent the tech industry to testify at next week's hearing, but not Attorney General William Barr, who this week separately unveiled new voluntary guidelines on combating child exploitation. Blumenthal said he hopes to hear testimony from law enforcement officials and child abuse prevention advocates, in addition to leaders from the tech sector.

But will it pick up steam in the Senate? The bill already has the backing of 10 senators (four Republicans and six Democrats, including the top two officials on Senate Judiciary), but a number of key lawmakers said they're still weighing its merits. "I'm open to talking to them about it," Senate Commerce Chairman Roger Wicker (R-Miss.) said Wednesday. Sen. Rob Portman (R-Ohio), who helped lead the last major push to amend Section 230, is reviewing the EARN IT Act to see if it builds upon the passage of SESTA, spokeswoman Emily Benavides said.

TIKTOK'S TOUGH WEEK, CONTINUED The House passed legislation on Thursday that, in a move aimed at protecting Americans from Chinese surveillance, would ban some airport workers' use of TikTok on their government-issued phones. After the TSA last month banned employees from using the Chinese-owned video app for work, Rep. Abigail Spanberger (D-Va.) included an amendment in a bipartisan bill she co-sponsored, the Rights for Transportation Security Officers Act, codifying that TSA policy.

"TikTok, like other Chinese companies, is required under Chinese law to share information with the government and its institutions," Spanberger said Thursday. Because it could become a tool for surveilling U.S. citizens or federal personnel, "TikTok has no business being on U.S. government-issued devices," she added. The legislation passed the day after Hawley, a tech critic and China hawk, announced plans for a similar measure to ban the use of TikTok by all federal employees on all federal government devices.

CANTWELL TO FCC: STEP UP ON CORONAVIRUS Senate Commerce ranking member Maria Cantwell (D-Wash.) is urging the FCC to respond to some of the challenges posed by COVID-19, just as it has in the past with disaster response, and consider how the FCC's existing authority and programs, as well as temporary policies or rule waivers, may be used to secure the nation's safety and continued well-being.

Examples she offers: Perhaps adopting temporary rules to let Red Cross shelters tap telemedicine subsidies; helping facilitate remote monitoring of patients, especially low-income ones; and finding ways to help spur at-home learning for students in areas where schools may be closing.

VAN HOLLEN TO PUSH FOR QUANTUM COMPUTING CASH Sen. Chris Van Hollen (D-Md.) expressed frustrations Thursday over what he sees as a dearth of proposed Commerce Department funding for the National Institute of Standards and Technology's quantum computing efforts. "The good news I see in the NIST budget is you've increased the funding for AI," he told Commerce Secretary Wilbur Ross during an appropriations hearing. "When it comes to quantum computing in the NIST budget, it's flatlined." He also pressed Ross on Huawei, as John reported for Pros.

AIRWAVES BATTLE OVER 6 GHZ HEATS UP Lobbying continues apace over the FCC's forthcoming decision about what to do with the 6 GHz band (now occupied by utilities that fear disruption). This week saw new pushback to the wireless giants' attempt to get the FCC to auction off a part of this prime mid-band spectrum for exclusive licensed use: California Democratic Reps. Anna Eshoo and Tony Cárdenas, along with Rep. G.K. Butterfield (D-N.C.), asked the FCC to reserve the whole swath of airwaves for unlicensed uses like Wi-Fi, as did hundreds of smaller wireless ISPs on Thursday in a letter to lawmakers.

Sen. Ted Cruz (R-Texas) may also wade in and side with the wireless industry, per a draft letter to the FCC circulating now. In this tentative draft, Cruz said making the whole band available for unlicensed use is "in stark contrast" with European countries that are divvying up the band for both licensed and unlicensed uses. "A similar strategy could create a win-win scenario for both licensed and unlicensed users," the draft said.

And globally, speaking of 6 GHz: Grace Koh, who helped lead last year's U.S. delegation to the World Radiocommunication Conference, recently said on a podcast that China had made a big push whenever she met with its officials bilaterally to see about using this 6 GHz band for 5G, largely due to interest involving Huawei. "What did happen was that Huawei and Ericsson were not successful and China was not successful in getting the entire 6 GHz band studied for 5G," she added.

DO CONSUMERS UNDERSTAND GOOGLE RESULTS? A federal appeals court grappled Thursday with whether average consumers know the difference between the ads and the organic search results that appear on Google. Arguing before the U.S. Court of Appeals for the 2nd Circuit, 1-800 Contacts (which is seeking to reverse an FTC decision that its trademark agreements violated antitrust law) contended that they don't understand.

Federal law allows a company to protect its trademark if use of the trademarked term could confuse consumers. The online contact lens retailer argued that consumers would be confused if they search for 1-800 Contacts on Google or Bing but instead see ads for other companies.

But two of the three judges on the panel weren't so sure. "Even an old guy who is old enough to remember Kodak and film knows the first four things you get on Google, which are labeled ad, you should disregard and move down to the next thing," said Circuit Judge Peter W. Hall, a George W. Bush appointee. Circuit Judge Gerard Lynch was also skeptical. "Is that the standard, everyone has to know that? Twenty years from now when our kids are up there and doing this stuff, we're not even going to be having this conversation," he said.

FTC attorney Imad Dean Abyad told the appeals court that 1-800 Contacts' agreements with rivals were the same as an offline agreement to divide up a market. "1-800 is claiming that digital [ad] space as its own exclusive territory and has agreed with its rivals that they would not advertise in that territory," he said. Abyad also said that 1-800 Contacts' agreements were overly broad because they barred rivals from using the company's name in any kind of ad, even a comparative one. Courts have consistently found that comparative ads aren't trademark violations. "This is not about protecting trademarks," he said. "This is about 1-800 protecting its much higher price."

A message from Business Roundtable:


Consumers deserve consistent privacy protections nationwide, no matter where they are or what they're doing, from banking or shopping online to reading the news or communicating with friends. The security of their personal data shouldn't depend on where they live, work or travel.

That's why Business Roundtable CEOs, who operate in every sector of the U.S. economy and whose companies touch virtually every American consumer, are calling on Congress to pass a comprehensive, nationwide data privacy law.

Learn more at privacy.brt.org.

Barr named Will Levi as his new chief of staff, POLITICO reports. Frances Marshall, former senior counsel for intellectual property at the Justice Department's antitrust division, has joined Apple as senior standards counsel.

ICYMI: In a rare move to take down content posted by President Donald Trump, Facebook said it would remove ads that invoke the Census when directing people to the website of his reelection campaign, POLITICO reports.

Like Shazam, but for faces: Want to know the name of that stranger you ran into at a party? Or that person you see from across the restaurant? There's an app for that, NYT reports; that's precisely how some have used the controversial facial recognition app Clearview AI.

Coercion up close: A factory making computer screens, cameras and other gadgets for a supplier to tech companies including Apple and Lenovo relies on forced labor by Muslim ethnic Uighurs, the AP reports.

Broke: Anthony Levandowski, the self-driving engineer accused by Google of breaching his employment contract and misusing confidential information, filed for bankruptcy, citing a $179 million legal judgment, WSJ reports.

Kremlin watch: How Russia Is Trying To Boost Bernie Sanders' Campaign, via NPR.

Andrew Yang's next move: A political nonprofit called Humanity Forward, POLITICO reports. The core issues: "a universal basic income for all Americans provided by the government, a human-centered economy and data as a property right," Yang said.

First Amazon, now Facebook: Facebook confirmed Thursday that a contractor at its Seattle office had been diagnosed with the coronavirus, Reuters reports.

Droppin' like flies: LinkedIn joined the host of other tech companies, including Facebook, Twitter, Apple and Netflix, that have backed out of SXSW over coronavirus concerns, AdWeek reports. (Also scrubbed: The Red Hat Summit.)

Stars, they're just like us: Twitter CEO Jack Dorsey may cancel his up-to-half-a-year sojourn in Africa over coronavirus concerns, Reuters reports. (But then again, as MT reported, there's a push right now to oust him from the helm of the company.)

Also on Twitter: The platform said it's expanding its rules against hateful conduct to include "language that dehumanizes on the basis of age, disability or disease," CNET reports.

Not the Winklevoss twins: A start-up founded by two MIT researchers is suing Facebook, Reuters reports, alleging the social media giant has stolen and made public technology that could revolutionize the field of artificial intelligence.

Tips, comments, suggestions? Send them along via email to our team: Bob King (bking@politico.com, @bkingdc), Mike Farrell (mfarrell@politico.com, @mikebfarrell), Nancy Scola (nscola@politico.com, @nancyscola), Steven Overly (soverly@politico.com, @stevenoverly), John Hendel (jhendel@politico.com, @JohnHendel), Cristiano Lima (clima@politico.com, @viaCristiano), Alexandra S. Levine (alevine@politico.com, @Ali_Lev), and Leah Nylen (lnylen@politico.com, @leah_nylen).

TTYL.


Unleash the Software Magic with SUSE at the Reach of Your Computer – Tech Times

When one talks about software in a business sense, one does not mean software linked to entertainment or anything unproductive; the type of software used for business is exactly what SUSE has to offer. With an ecosystem of partners that work well and on deadline, the enterprise-grade, open, software-defined infrastructures your company needs are in good hands.


Looking at things on a broader scale, it is not just the product that matters but also the company itself, and one way to tell whether a company is doing well is how it treats its employees. A family-oriented company cares about the product as much as the profits, unlike profit-oriented companies where the product is merely a means to profit, which can lead them to underperform in their products or services.


When looking at ideal companies, SUSE is definitely one of them. Becoming the world's largest independent open source software company did not come easy, and as the company has grown, so have its 1,600 employees. Every single employee plays a big part in SUSE's success, which is why the company is a great fit for your open source software needs.

When doing business with SUSE, you are treated not as a customer but as a partner. The goal of SUSE is to make sure the system works perfectly for you and your company. Announced FY2020 revenue grew 12 percent year over year, including a 67 percent surge in cloud revenue. Achieving this much as an independent company took a great deal of work and research, but SUSE's brilliance is definitely showing.

SUSE offers the market modernized, simplified, and accelerated traditional and cloud-native applications, and its IT landscape benefits nearly any type of platform. With the company growing in both skill and size, SUSE is celebrating the good returns reported for Q1 of 2020.


When dealing with corporate materials, the demand for enterprise-grade software increases, as basic software no longer does the job. Building your own software from scratch would take years to perfect, and instead of spending that time on growing your company, you might find yourself lost trying to develop enterprise-grade software yourself. Let SUSE do it for you. Leave the complicated work to the experts while you focus on what truly matters, like the expansion of your company. SUSE is here to help with your software needs not just as a business, but as a partner.



What the 2020 election means for encryption – The Verge

This is a living guide to encryption: what it is, what it isn't, why it's controversial, and how it might be changed. This guide will be updated as events warrant.

Encryption is the process of scrambling information so only the intended recipients can decipher it. An encrypted message requires a key (a series of mathematical values) to decrypt it. This protects the message from being read by an unwanted third party. If someone without the key tries to hack in and read the message, they'll see a set of seemingly random characters. Using modern encryption techniques, extracting the original message without the key is nearly impossible.
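The key-based process described above can be illustrated with a toy one-time-pad cipher. This is a sketch only; real systems use vetted algorithms such as AES, not hand-rolled XOR, but the principle is the same: with the key, decryption is trivial, and without it the ciphertext looks like random bytes.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # random key, same length as the message

ciphertext = xor_bytes(message, key)     # encrypt: output looks like random bytes
recovered = xor_bytes(ciphertext, key)   # decrypt: XOR with the same key restores the text

assert recovered == message
```

Because XOR is its own inverse, applying the same key twice returns the original message; anyone without `key` sees only noise.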

That basic process is a fundamental building block of network security, ensuring that information can travel over the public internet without being intercepted in transit. Without some form of encryption, it would be impossible to implement basic online services like email, e-commerce, and the SSL system that verifies webpages.

While most uses of encryption are uncontroversial, the wide availability of these techniques has opened up new political questions around lawful access. Presented with a warrant for a particular user's information, businesses are legally required to turn over all the information they have. But if that information is encrypted and the company doesn't have the key, there may be no way to work back to the original data.

Some products hold copies of user keys and decrypt data when served with a warrant, including Gmail, Facebook pages, and most cloud storage providers. But messaging apps like WhatsApp, Telegram, and Signal do not, and the device encryption used by iOS also makes the phone's local data inaccessible. That approach has both privacy and security benefits: since the data is not available outside of the local device, the apps are far more resilient to breaches and centralized attacks.
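The difference between provider-held keys and end-to-end encryption can be sketched in a few lines. The names, the toy XOR cipher, and the dictionary "escrow" here are illustrative assumptions, not any vendor's actual design:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with an equal-length key."""
    return bytes(d ^ k for d, k in zip(data, key))

# Provider-held keys (e.g. typical cloud storage): the service keeps a copy
# of the user's key, so it can decrypt stored data when served with a warrant.
user_key = secrets.token_bytes(16)
provider_key_escrow = {"alice": user_key}        # provider stores the key
stored = xor_bytes(b"backup data 0123", user_key)
assert xor_bytes(stored, provider_key_escrow["alice"]) == b"backup data 0123"

# End-to-end encryption: the key lives only on the users' devices; the
# provider relays ciphertext it has no key for and therefore cannot read.
alice_device_key = secrets.token_bytes(16)
relayed = xor_bytes(b"hello bob! [e2e]", alice_device_key)
# provider_key_escrow has no entry for this key, so the provider sees only noise.
```

The design trade-off follows directly: the escrow dictionary lets the provider comply with a warrant, while the end-to-end message can only be decrypted on a device that holds `alice_device_key`.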

In 2014, James Comey, the then-director of the FBI, wrote a memo spelling out his concerns about encryption. "Those charged with protecting our people aren't always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority," he wrote.

Comey went on to warn that encryption would make it more difficult for law enforcement to catch suspected criminals. If communications are encrypted by default, he said, the government can't monitor and collect communications, even if a judge allows them to do so. Encryption, he summarized, "will have very serious consequences for law enforcement and national security agencies at all levels. Sophisticated criminals will come to count on these means of evading detection. It's the equivalent of a closet that can't be opened. A safe that can't be cracked. And my question is, at what cost?"

The government's position on encryption hasn't evolved a whole lot in the intervening years. Attorney General William Barr and Sen. Lindsey Graham (R-SC) argued last year that hardened encryption makes it difficult to figure out when messaging platforms are used to coordinate crimes. If a large-scale terrorist attack is carried out, the government needs to act quickly to understand the national security risks. Hardened encryption could make this discovery process harder.

In 2016, in the wake of the San Bernardino shooting, the FBI asked Apple to hand over information from the suspect's iPhone. At first, the company complied, giving the FBI data from the suspect's iCloud backup. Then the FBI demanded access to the phone's local storage. This would have involved Apple deploying an entirely new version of iOS to the device, which the company refused to do. In a statement, a company spokesperson said: "We believed it was wrong and would set a dangerous precedent."

The FBI responded by trying to force Apple to help, citing the All Writs Act of 1789. Just before a hearing on the case, however, the FBI was able to unlock the iPhone with the help of an anonymous third-party company. The phone did not contain much new information the FBI didn't already have, but the conflict escalated the fight between tech companies and the government over encryption.

In 2019, after the shooting at the Pensacola Naval Air Station, the government again asked for Apple's assistance unlocking the suspect's iPhone. Apple did not comply, but it did hand over data from the suspect's iCloud backups. In response to Apple's refusal to unlock the shooter's iPhone, President Donald Trump tweeted: "We are helping Apple all of the time on TRADE and so many other issues, and yet they refuse to unlock phones used by killers, drug dealers and other violent criminal elements."

A week later, it was revealed that the company had dropped plans to allow users to encrypt their iCloud backups after the FBI argued the move would harm future investigations.

In March 2019, Facebook CEO Mark Zuckerberg published a memo laying out his vision for a new privacy-focused social network. In it, he stated the company's plan to roll out encryption across its various messaging apps. "People expect their private communications to be secure and to only be seen by the people they've sent them to, not hackers, criminals, over-reaching governments, or even the people operating the services they're using," he wrote.

The news set off a firestorm of criticism from certain politicians, most notably AG Barr. In a letter to the company, Barr, along with officials in the United Kingdom and Australia, wrote, "Companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes." They added that encryption put people at risk by "severely eroding a company's ability to detect and respond to illegal content and activity, such as child sexual exploitation and abuse, terrorism, and foreign adversaries' attempts to undermine democratic values and institutions, preventing the prosecution of offenders and safeguarding of victims." They asked Facebook to stop the encryption rollout. Facebook did not comply with this request.

Republicans seem to want US tech companies to comply with law enforcement in the event of a major national security attack. They do not want US tech companies to make accessing user data more complicated through end-to-end encryption. In his letter to Facebook, Barr asked Zuckerberg to allow law enforcement to obtain "lawful access to content in a readable and usable format," as reported by The New York Times.

Most Democratic presidential candidates are supportive of end-to-end encryption. When asked whether the government should be able to access Americans' encrypted conversations, Sen. Bernie Sanders (I-VT) said: "[I] firmly [oppose] the Trump administration's efforts to compel firms to create so-called backdoors to encrypted technologies." Sen. Elizabeth Warren (D-MA) did not answer directly, but she said that the government "can enforce the law and protect our security without trampling on Americans' privacy. Individuals have a Fourth Amendment right against warrantless searches and seizures, and that should not change in the digital era." During his primary run, former South Bend, Indiana mayor Pete Buttigieg said, "End-to-end encryption should be the norm." Former New York City mayor Mike Bloomberg, in an op-ed from 2016, argued against end-to-end encryption and said tech companies shouldn't be above the law in refusing court orders to hand over user data.

Section 230 of the Communications Decency Act protects websites from lawsuits if a user posts something illegal. There's been a large debate about whether companies should continue to have these protections, with various lawmakers proposing plans to change or amend Section 230.

In January, one proposed change, called the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT), sought to strip tech companies of their Section 230 protections if they didn't comply with new rules for finding and removing content related to child exploitation. And while the accompanying document, titled the National Strategy for Child Exploitation Prevention, didn't lay out many specifics, complying with these rules would likely mean not encrypting some user data.

Apple has taken the lead on the issue so far, and it has been careful to valorize law enforcement and lawful access provisions while firmly opposing a backdoor. As CEO Tim Cook framed it in an open letter at the start of the San Bernardino case, Apple is willing to do everything it can, including turning over iCloud logs and other user data, but unlocking device encryption is a step too far. "Up to this point, we have done everything that is both within our power and within the law to help [the FBI]," Cook wrote. "But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone."

For the most part, other tech companies have lined up behind Google, with the Facebook-owned WhatsApp leading the way. In response to Barr's letter in 2019, Will Cathcart, head of WhatsApp, and Stan Chudnovsky, who works on Messenger, said the company was not prepared to build the government a backdoor in order to access user messages. "Cybersecurity experts have repeatedly proven that when you weaken any part of an encrypted system, you weaken it for everyone, everywhere," they wrote. "It is simply impossible to create such a backdoor for one purpose and not expect others to try and open it."

Still, many tech companies that rely on government contracts have had to walk a more politically delicate line. Microsoft supported Apple publicly during the San Bernardino case, but more recent statements from Microsoft CEO Satya Nadella have taken a softer line. In January 2020, Nadella expressed opposition to backdoors but optimism about legislative or other technical solutions, saying, "We can't take hard positions on all sides."

As tech companies like Facebook continue to move forward with large-scale encryption projects, more major changes could come in the form of legislation aimed at helping or hurting large-scale encryption initiatives. In 2019, Rep. Ted Lieu (D-CA) reintroduced a 2016 bill called the Ensuring National Constitutional Rights for Your Private Telecommunications Act (ENCRYPT), which would create a national standard for encrypted technology. Rep. Zoe Lofgren (D-CA), along with a bipartisan coalition, also introduced the Secure Data Act, which would stop federal agencies from forcing tech companies to build backdoors into their products, thereby weakening encryption. Finally, there's still the draft of the National Strategy for Child Exploitation Prevention, which would make it much harder for tech companies to encrypt their products.


Why SSL Encryption Will not Become a Victim of its Own Success – Infosecurity Magazine

At the start of 2020, there are some technologies originally developed only with the very best of intentions that seem to have a darker side, challenging us to come up with new ways to harness and handle their capabilities.

One of these technologies is encryption, which was developed years ago as a way to enhance the security of digital data and data streams and is now deployed in countless consumer products.

The internet has been an important accelerator behind the use of encryption technology. As a result, more than 80 per cent of today's global internet traffic is encrypted. WhatsApp, for example, uses encryption technology to reassure its users that their messages can only be read by the intended recipient. In a world in which cyber-criminals are active 24/7, trying to get their hands on as much data as possible, this level of security is an essential feature of online data exchange.

300 million attacks per month

However, the prevalence and success of encryption technology has not escaped the attention of internet data thieves. For years, cyber-criminals have been adopting all kinds of disguises to continue their pursuit of targets.

One of their most recent tricks is to send malevolent code in encrypted format in an attempt to sidestep traditional security programs, which are incapable of viewing the contents of encrypted data packets or are deliberately designed not to in order to protect users' privacy. In some cases, a security solution may simply not have enough capacity to check the content of all encrypted traffic without grinding to a halt. Criminals are already deploying encrypted threats at huge scale. In 2019, the Zscaler ThreatLabZ team recorded almost 300 million of these kinds of attacks per month!

Certificate authorities

Many organizations believe that they are protected from attacks on SSL-encrypted data because they use a public key infrastructure (PKI). A PKI provides the technology that is required to encrypt internet traffic, including a component known as a certificate authority.

Certificate authorities are the parties responsible for managing and securing the unique keys and providing websites with the certificates that act as the key to the browser's lock. There are many certificate authorities that do a great job and do everything they can to ensure that communication is secure. But, in principle, anyone can set up a PKI infrastructure and issue certificates.

Many certificate authorities have a good reputation and execute high-level checks and verification processes, but many others aren't as well regarded and are known for issuing certificates to bad actors without any checks. As a result, it is now very easy for these bad actors to construct their own encrypted websites that, at least at first glance, can look entirely legitimate.

This means that a digital transaction may appear secure when, in fact, it is anything but. SSL/TLS encryption is a guarantee of confidentiality and integrity, giving users the assurance that their data cannot be viewed or manipulated while in transit. That little lock shown in your browser doesn't tell you anything about the intentions of the person, or the system, that you are communicating with.
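The integrity half of that guarantee rests on message authentication: the two endpoints share a secret and can detect any in-transit tampering, even though nothing about the check speaks to the sender's intent. A minimal sketch of the principle, using a keyed MAC from the Python standard library (TLS itself uses authenticated ciphers negotiated in the handshake, not this exact construction):

```python
import hashlib
import hmac

# Illustrative stand-in for a secret negotiated during the TLS handshake.
key = b"shared session secret"

def tag(message: bytes) -> bytes:
    # A MAC binds the message to the shared key; any change to the
    # message changes the tag in a way an attacker cannot predict.
    return hmac.new(key, message, hashlib.sha256).digest()

message = b"transfer $100 to account 42"
mac = tag(message)

# The receiver recomputes the tag and compares in constant time.
assert hmac.compare_digest(tag(message), mac)
# A manipulated message fails the check, so tampering is detected.
assert not hmac.compare_digest(tag(b"transfer $900 to account 66"), mac)
```

Note what the check does not do: a malicious site with a valid certificate passes exactly the same kind of verification, which is the article's point about the browser lock.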

A dilemma for CISOs

These developments have produced a complicated dilemma for many CISOs. They don't need to worry about whether or not to use encryption for data in transit. That question has already been answered, because encryption significantly enhances security and is often mandatory anyway. The challenge lies in the incoming data traffic that is already encrypted.

While most CISOs understand that inspecting encrypted data can further boost security, some remain unsure as to whether or not to actually do it. Sometimes, the company may not have the technology needed to check incoming encrypted data effectively; sometimes, the doubt stems from uncertainty in relation to employees' rights to privacy.

This uncertainty means that the status quo is maintained, and that encrypted data traffic is accepted without question, even though the organization has no idea what a data packet contains or whether it could cause harm to the company or its employees.

The General Data Protection Regulation (GDPR) introduced in mid-2018 is one of the reasons why many CISOs doubt the legitimacy of measures to scan encrypted data traffic. Although the regulation does not set out exactly which preventive measures organizations should implement to be considered compliant, it is very clear on one thing: organizations are responsible for providing a secure digital work environment for their employees.

If an organization has no idea what data is coming into its systems and what the impact of it could be, it is not doing everything it could to facilitate a secure digital working environment as described in Article 32 of GDPR.

For any CISOs who have concerns about privacy, remember this: during inspection, the reports and logs (or, more accurately, the files generated from them) can be configured to show only metadata to operators. All PI fields are blocked out. This approach provides enough information to perform a technical check on the data.

If this check suggests that an incident has occurred to justify the disclosure of the PI data, you can initiate a process to gain insight into the obfuscated personal data.

This process applies only in exceptional circumstances, for example, if someone is suspected of leaking data or if you need to know whose systems have been compromised by a hacking attempt. Often, representatives from HR or the legal team are involved in these kinds of processes. Organizations can also set out their processes in privacy policies, which employees are expected to be aware of and understand.
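The metadata-only view described above can be sketched as a simple redaction step over inspection logs. The field names below are hypothetical (real inspection products vary by vendor), but the pattern is the same: operators see the technical verdict while personal fields stay masked until an authorized process unlocks them:

```python
# Hypothetical personal-information fields; real products vary by vendor.
PI_FIELDS = {"username", "src_ip", "email"}

def redact(record: dict) -> dict:
    """Return a copy of a log record with personal fields masked,
    leaving the technical metadata visible to operators."""
    return {k: ("***" if k in PI_FIELDS else v) for k, v in record.items()}

record = {
    "username": "j.doe",
    "src_ip": "10.1.2.3",
    "verdict": "blocked",
    "threat": "encrypted-malware",
}

redacted = redact(record)
# Operators see the verdict and threat type, but not who triggered it.
assert redacted["username"] == "***"
assert redacted["verdict"] == "blocked"
```

Retaining the original record under separate access controls is what makes the exceptional-disclosure process possible later.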

The solution: the security cloud

Organizations are increasingly opting to send and receive all their data traffic via a security cloud. These services have sufficient capacity to analyze vast amounts of data, including encrypted data, in very short timeframes before forwarding it on to end users.

One of the main advantages of this way of working is that the process of decryption and inspection takes place in the cloud, which means that organizations do not need to make huge investments in processing power and that they only receive data that has been approved by the cloud security provider.

Thanks to cloud technology, organizations can continue to benefit from the power of encryption, remain compliant with regulations, such as GDPR, and assure their employees that their privacy and data will be protected across all their devices.


Encryption Flaws Leave Millions of Toyota, Kia, and Hyundai Cars Vulnerable to Key Cloning – Gizmodo

A 2014 Toyota Land Cruiser, one of the models listed as affected by the vulnerability. Photo: Yoshikazu Tsuno (AFP/Getty Images)

Millions of cars with radio-enabled keys made by Toyota, Hyundai, and Kia may be vulnerable to hijacking thanks to a flaw in their encryption implementation, Wired reported this week, citing a study by KU Leuven in Belgium and the University of Birmingham.

The cars in question use Texas Instruments' DST80 encryption, but the way it was built into them means that a hacker could potentially use a relatively inexpensive Proxmark RFID reader/transmitter device near the key fob to trick the car into thinking they have a legitimate key, Wired wrote. While other models of car have proven vulnerable to hacking via relay, in which hackers use radio transmitters to extend the range of a car's key fob until the original key is in range, this method requires that the attacker come within close proximity of the fob and scan it with the RFID device. That would provide enough information to determine the encryption key, clone it using the same RFID device, and use that to disable a part called the immobilizer, which prevents a car from starting without a key in the vicinity.

With the immobilizer disabled, the only obstacle remaining would be the ignition barrel (i.e., key slot) that actually starts the engine. Defeating it only requires classic-era car theft techniques like hotwiring or substituting a screwdriver for the key.

The attack is made possible because the encryption keys used by the cars were easily discovered by reverse-engineering the firmware, the researchers wrote. In Toyota's case, the encryption key was based on a serial number also broadcast with the fob signal, while the Kia and Hyundai cars in question used just 24 random bits of protection (DST80, as implied by the name, supports up to 80). University of Birmingham computer science professor Flavio Garcia told Wired that identifying the correct 24 bits takes "a couple of milliseconds" on a laptop. However, the researchers did not publish certain information about how they cracked the encryption.
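Why 24 bits is so weak is easy to illustrate: 2**24 is only about 16.7 million possibilities, a search space a laptop can exhaust almost instantly. The sketch below uses CRC32 as a cheap stand-in for the transponder's cipher (an assumption; the real DST80 algorithm differs and was not published in full by the researchers), and recovers a 24-bit key from one observed challenge/response pair by brute force:

```python
import zlib

def respond(key: int, challenge: bytes) -> int:
    # CRC32 stands in for the transponder cipher here;
    # the real DST80 algorithm is different.
    return zlib.crc32(key.to_bytes(3, "big") + challenge)

secret_key = 0x0A1B2C                      # one of only 2**24 possible values
challenge = b"\x01\x02\x03\x04"
observed = respond(secret_key, challenge)  # as sniffed via an RFID reader

# 2**24 keys is a tiny search space: exhaust it until a response matches.
recovered = next(k for k in range(1 << 24)
                 if respond(k, challenge) == observed)
assert respond(recovered, challenge) == observed
```

An 80-bit key, by contrast, has 2**80 possibilities, roughly 7 * 10**16 times as many, which is why the truncated keys and not DST80 itself are the weakness described.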

Hyundai told Wired that none of the affected models are sold in the U.S. and that it "continues to monitor the field for recent exploits and [makes] significant efforts to stay ahead of potential attackers." Toyota told the site that the described vulnerability "applies to older models, as current models have a different configuration," and is "low risk."

The full list of affected models is below, including Toyota Camry, Corolla, RAV4, and Highlander models; the Kia Optima, Soul, and Rio; and multiple Hyundai hatchbacks. (The Tesla S used to be vulnerable, but Tesla has updated the firmware, according to Wired.) The researchers noted that this list is non-exhaustive, meaning more models could be affected.

Per Wired, the researchers say the findings are relevant to consumers because, although the attack is rather technically involved, owners can guard against it with measures like attaching a steering lock when necessary. Some of the cars could also potentially be reprogrammed to remove the vulnerability, though the team told Wired that the Tesla S was the only car on the list they were aware had the capability to do so.

[Wired]


US threatens to pull big tech's immunities if child abuse isn't curbed – TechCrunch

The Department of Justice is proposing a set of voluntary principles that take aim at tech giants in an effort to combat online sexual abuse.

The principles are part of a fresh effort by the government to hold the tech companies accountable for the harm and abuse that happens on their platforms, amid the past two years of brewing hostilities between the government and Silicon Valley. But critics also see it as a renewed push to compel tech companies to weaken or undo their warrant-proof encryption efforts under the guise of preventing crime and terrorism.

U.S. Attorney General William Barr announced the proposals at the Justice Department on Thursday with international partners from the U.K., Canada, Australia and New Zealand.

The principles, built by the five countries and tech leaders including Facebook, Google, Microsoft and Twitter, aim to incentivize internet companies and social media giants to do more to prevent child sexual abuse on their platforms.

Barr said he hopes that the principles set new norms across the tech industry to make sure there's "no safe space on the internet for offenders to operate."

The principles come ahead of anticipated bipartisan legislation to Congress, the so-called EARN IT Act, which reports say could effectively force the tech companies' hands by threatening to pull their legal immunities for what their users post if the companies fail to aggressively clamp down on online child sexual abuse.

Sens. Lindsey Graham (R-SC) and Richard Blumenthal (D-CT) announced the legislation shortly after the Justice Department presser ended.

The bill drew a quick rebuke from Senate colleague Ron Wyden (D-OR), who called it a "deeply flawed and counterproductive" bill.

"This bill is a transparent and deeply cynical effort by a few well-connected corporations and the Trump administration to use child sexual abuse to their political advantage, the impact to free speech and the security and privacy of every single American be damned," said Wyden.

Barr warned that the government is analyzing the impact of Section 230 of the Communications Decency Act, which protects tech platforms from legal liability for content created by their users.

Under Barr, the Justice Department has taken a particular interest in dismantling Section 230. Last month, the Justice Department hosted a workshop on Section 230, arguing that the immunity it provides interferes with law enforcement and needs to be reexamined.

"We must also recognize the benefits that Section 230 and technology have brought to our society, and ensure that the proposed cure is not worse than the disease," Barr said last month.

Any change to Section 230, widely regarded as the legal underpinning of all online platforms, could radically alter the landscape of the modern internet and give the government more power to control online speech. Privacy advocates view the government's interest in wielding Section 230 as a cudgel, and as an existential threat to the internet as we know it.

Last month, Wyden, one of Section 230's co-authors, condemned the Trump administration's scrutiny of the law and argued that repealing the law would not be a successful punishment for large tech companies. "The biggest tech companies have enough lawyers and lobbyists to survive virtually any regulation Congress can concoct," Wyden wrote. "It's the start-ups seeking to displace Big Tech that would be hammered by the constant threat of lawsuits."

U.K. Security Minister James Brokenshire lauded the initiative's six existing tech partners, encouraging the rest of the industry to fall in line: "It's critical that others follow them by endorsing and acting on these principles." The minister claimed that plans to encrypt tech platforms are sending predators "back into the darkness" and away from artificial intelligence advances that can expose them.

Barr also questioned whether disappearing messages or certain encryption tools appropriately "balance the value of privacy against the risk of safe havens for exploitation."

But privacy groups remain wary of legislative action, fearing that any law could ultimately force the companies to weaken or break encryption, which government officials have for years claimed helps criminals and sexual predators evade prosecution.

End-to-end encryption has largely become the norm in the past few years, since the Edward Snowden revelations about the vast surveillance efforts of the U.S. and its Five Eyes partners.

Apple, Google and Facebook have made encryption standard in their products and services, a frequent frustration for investigators and prosecutors.

But last year, the Five Eyes said it would contemplate forcing the matter of encryption if tech giants wouldn't acquiesce to the pact's demands.

The government has called for "responsible encryption," a backdoor-like system that allows governments to access encrypted communications and devices with a key that only it possesses. But security experts have universally panned the idea, arguing that there is no way to create a secure backdoor without it somehow being vulnerable to hackers.

The bill has already received heavy opposition. Facebook said that child safety is a top priority, but warned that the EARN IT Act would roll back encryption, which "protects everyone's safety from hackers and criminals."

It's similar to an anti-encryption bill that Sens. Dianne Feinstein (D-CA) and Richard Burr (R-NC) introduced in 2016, which would have forced tech companies to build backdoors into their systems. That bill failed.

The Electronic Frontier Foundation said the bill would "undermine the law that undergirds free speech on the internet." Firefox browser maker Mozilla said the bill "creates problems rather than offering a solution."

"The law enforcement community has made it clear this law is another attempt to weaken the encryption that is the bedrock of digital security," said Heather West, Mozilla's head of Americas policy. "Encryption ensures our information, from our sensitive financial and medical details to emails and text messages, is protected."

"Without it, the world is a far more dangerous place," said West.
