From Snowden to Crypto AG: the biggest leaks since 2010 – Business Insider

Since the Cold War, more than 120 countries reportedly trusted a Swiss company, Crypto AG, to safeguard their most secret messages. However, few of them knew that, according to a leaked report published last week by the Washington Post, the encrypted communications device maker was actually controlled by the CIA and its German counterpart.

The two intelligence agencies reportedly rigged the devices to let them listen in on political and military leaders, other spies, and even private companies, making millions of dollars in profit along the way. Crypto AG helped them spy on uprisings in Latin American countries, Middle Eastern dictators, the Vatican and even the United Nations.

While the CIA report called the operation "the intelligence coup of the century," other major leaks over the past decade have shed light on some closely held state secrets.

Whistleblowers like Edward Snowden and Chelsea Manning, by sharing confidential documents with journalists and organizations like WikiLeaks, have helped expose everything from the NSA's surveillance of millions of American citizens to offshore tax havens used by world leaders and the ultra wealthy.

Here are just some of the most significant things we've learned as a result of recent leaks.


What Cities Can Learn from the Nation’s Only Privacy Commission – Governing

Perhaps no city cares about the privacy of its residents as much as Oakland.

Last year, the California city became one of just a handful around the country that have banned municipal use of facial recognition technology. That came on top of an earlier ordinance that put limits on surveillance technology.

Those laws were largely the handiwork of Oakland's Privacy Advisory Commission, a citizen-led board that can review any and all city policies and regulations through a privacy lens. Other cities have privacy policies or staff in place, while a few have ad hoc groups to address particular issues, such as smart city policies. No other city has a standing group with such a broad charter.

"We know the activities that are going on in the city of Oakland," says Tracy Rosenberg, advocacy director for Oakland Privacy, a nonprofit watchdog group. "Oakland is considering buying some drones. I'm not happy about it, but I know about it, I can weigh in on it, and there's a body looking at the issues. It's not like we're going to find out just by looking up when they're already flying over our heads."

Tracy Rosenberg, advocacy director for Oakland Privacy.

In 2020, data privacy bills are being debated in numerous states, many modeled on a California law that took effect this year. Meanwhile, the tech industry is lobbying Congress to put a national standard in place. The European Union already has stricter data protection policies in place than the U.S.

While privacy can seem like an issue that's national or even international in scope, issues such as surveillance often play out at the local level, particularly in law enforcement.

"There is a really important role for local institutions," says Chris Calabrese, vice president for policy at the Center for Democracy and Technology, a civil liberties group based in Washington. "That is, figuring out how their localities are using surveillance technologies. It's actually not unusual for a state or a town to get a grant from DHS [Department of Homeland Security] to get a drone or surveillance without any discussion by the city council."

That's exactly what happened in Oakland. The city had received DHS funding for a Domain Awareness Center (DAC), which collected and analyzed surveillance data from throughout the city. It began as a security project at the port and airport, but crept out to encompass the entire city, collecting information from 700 cameras in public housing and schools, outdoor gun detection microphones, license plate readers and other sensors.

News about the DAC broke thanks to someone with a temp job in the city's technology office. If that sounds like Oakland had its own version of Edward Snowden, the National Security Agency contractor who made international news with revelations of mass data collection at the federal level, well, the Oakland story happened to break just after Snowden himself burst on the scene.

"They have the great fortune of hitting the public safety agenda in the middle of June 2013, which was two weeks after somebody named Edward Snowden hit the front pages of newspapers," says Brian Hofer, who chairs the Oakland Privacy Advisory Commission.

Realizing that the issues Snowden had brought to the fore were playing out right in their own hometown had a galvanizing effect in Oakland. City officials were already highly familiar with the large aggregation of activist groups who call Oakland home. Many of those groups were already on high alert. At the time, the Occupy movement was still fresh. The city had also been rocked by the killing of Oscar Grant by a transit officer, which formed the basis of the 2013 movie Fruitvale Station, and anticipated the complaints about police violence that would erupt in 2014 in Ferguson, Mo.

"There was just all this activist energy in Oakland at that exact time," Hofer recalls, "and it was just like the match that lit the gasoline fire."

Brian Hofer, chair of the Oakland Privacy Advisory Commission.

At that time, Hofer had never before set foot in Oakland City Hall, but he became part of the coalition of opposition that grew in response to the Domain Awareness Center, which entailed not just lobbying but street protests. The city's project ended up dying a slow death: it was scaled back to the port alone and eventually lost all its funding and staffing.

Activists then pushed for a standing body to deal with privacy matters so that such a project couldn't sneak up on them again. "These kinds of technological projects, smart city projects in some form or another keep coming and you can't always mobilize the entire city to react," Rosenberg says. "We were lucky, but in the future it might be harder."

After a long struggle, the city council narrowly agreed in 2016 to create the standing commission. Hofer now consults formally and informally with other cities that are looking to put safeguards in place as concerns about the potential dark sides of technology use garner more attention. Through monthly conference calls, he trades ideas with municipal privacy professionals in several cities.

"I'm still flying around to conferences where the Domain Awareness Center is still the case study, and that was from 2014," Hofer says. "It's still the shot heard round the world, like you can actually say no to the surveillance state, and also come up with a reasonable solution that still addresses public safety concerns."

The ban on facial recognition and limits on surveillance have attracted most of the attention, but the commission has also worked on a broad range of policies, clarifying general data usage rules and reviewing agreements with outside agencies such as DHS, ATF and the FBI.

Each member of the city council appoints a commissioner. That has sometimes made recruitment a challenge, since members have to come from within relatively small districts. The commission has nevertheless attracted people with a range of backgrounds, including civil rights attorneys, a retired police officer and the occasional technology company employee. In drafting some policies, the commission has relied on outside expertise from entities such as the University of California, Berkeley and the American Civil Liberties Union.

The commission constantly encounters pushback, both from technology companies who warn that constraints will hamper innovation and from city staff who worry about increased paperwork or other administrative burdens. Hofer does his best to address such concerns, arguing that an insistence on transparency and an airing of issues does not mean projects will fail to move forward.

He feels that his most convincing argument comes from building consensus, both within the commission and among city staff. Since the commission started four years ago, it's seen a grand total of one no vote and three abstentions. Every other position has been taken unanimously.

"I've always said that I'm going to resign as chair if there's ever a 5-4 vote that I have to send to the council," Hofer says. "We're an advisory body, we're supposed to be subject matter experts, and if I send a 5-4 vote up, that means we didn't do our job."

The commission is now working on an umbrella set of privacy principles that will govern all kinds of non-law-enforcement activities, such as business licensing and tax collection. He's concerned, given the ever-evolving technological landscape, that information collected innocuously at first could eventually be used in ways that put people in harm's way.

That's certainly been a theme with various uses of technology. License plate readers were initially brought in for parking enforcement, but the data they collect is then stored and used for broader surveillance. The same with smart streetlights, which may first be seen as just a way to cut the electricity bill, but can be equipped with surveillance cameras and sensors. Oakland's commission can ask vendors who come in promising labor and cost savings for the city about the potential privacy issues raised by their products and services.

Four years after its founding, the Oakland privacy commission remains unique in allowing citizens to help draw the lines when it comes to the full range of privacy concerns. In many cities, privacy remains under the purview of government officials themselves, which may or may not provide a forum for hearing citizens' concerns. There can also be turf protection, with police commissions, for example, insisting that they should handle any privacy issues that might arise.

The list of technologies that will affect privacy at the local level, from biometrics to doorbell cameras, will only continue to grow. Governments tend to be reactive, but it might be time for cities and counties to try to get out in front.

"Our argument is it's really a lot easier if you do it on the front end and set up a system for what you can see coming," Rosenberg says. "If you're not actively doing something to be on top of it, you're going to throw up your hands, let everything come through and be dealing with privacy and usage concerns on the back end."


Signal becomes European Commission's messaging app of choice in security clampdown – The Verge

The European Commission has told its staff to switch to the encrypted Signal messaging app in a move that's designed to increase the security of its communications. Politico reports that, earlier this month, a message on the commission's internal messaging boards notified employees about the change. "Signal has been selected as the recommended application for public instant messaging," the message to the EU's executive branch says.

According to Politico, Signal will not be used for all communication. Encrypted emails will be used to send non-classified but sensitive information, and classified documents use tighter security measures still. Signal, meanwhile, is intended to be used for external communications between staff and people outside the organization.

The initiative comes as the EU is attempting to lock down the security of its communications in the wake of high-profile hacks. In June 2018, BuzzFeed News reported that the European Union's embassy in Moscow had been hacked and had information stolen from its network. Later that year, The New York Times reported that the EU's diplomatic communications network had been hacked over the course of a three-year period in a display of the remarkably poor protection given to official communications.

The European Commission is not the only governmental body to tell its staff to switch to Signal. Last December, The Guardian reported that the UK's ruling party, the Conservatives, told its MPs to switch to the service from WhatsApp. At the time, there was speculation that the switch was done in order to take advantage of Signal's disappearing messages feature to stop leaks like those the party saw while using WhatsApp. However, a party spokesperson claimed it was because its recent influx of newly elected MPs meant that it had exceeded WhatsApp's maximum group size.

Signal is generally considered to be one of the most secure messaging apps available. It's open source, uses end-to-end encryption by default, and unlike WhatsApp, it doesn't store any message metadata or use the cloud to back up messages. Edward Snowden said at one point that he uses it every day, and it even has the backing of one of WhatsApp's original co-founders.


How to Protect Bitcoin for Your Heirs With the Push of a Dead Man's Button – Yahoo Finance

What happens to your bitcoin after you die?

This is more than just a philosophical question: It could involve a substantial amount of currency.

The question of crypto and the Great Beyond is what prompted about 20 or so developers to get together in London recently to experiment with repurposing the current lightning protocol to send private messages as a dead man's button, a system that can't be censored and would keep your crypto safe for your heirs.


Lightning Labs infrastructure engineer Joost Jager has been exploring using lightning for messaging over the past year. At the Advancing Bitcoin conference in London, Jager hosted a workshop to explore building a dead man's button with lightning. The mission was to show that lightning can be used as a messaging system as well as a payment network.

These buttons are not new. At the workshop, Jager noted Edward Snowden, the National Security Agency whistleblower, used one in case he died before journalists could reveal the contents of the documents he wanted to make public.

The goal of the workshop was to explore one of lightning's relatively new features, keysend (formerly known as spontaneous payments). It's so experimental it isn't even described in the lightning specifications yet. But it does offer a way to send data (called custom records in LND, the lightning implementation Jager works on) along with a transaction.

Here's how it might work: Imagine a user who wants to pass on a bitcoin (BTC) inheritance. That user would communicate with a service, pushing a button that would send a message every week or so to signify that the user is still alive.


If the button isn't pressed one week, it is assumed the bitcoin user is dead or incapacitated and it's time for the bitcoin to be passed on, at which point the service automatically dispenses a secret, which can be used to retrieve the crypto.
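For readers who want to see the idea in code, here is a minimal Python sketch of the check-in-and-release logic described above. It models only the timer idea, not the actual lightning keysend transport or the implementation Jager published to GitHub; the class, the one-week interval and the secret handling are all illustrative assumptions.

```python
import time
from typing import Optional

CHECK_IN_INTERVAL = 7 * 24 * 60 * 60  # one week in seconds (illustrative)


class DeadMansButton:
    """Holds a secret and releases it only if the owner stops checking in."""

    def __init__(self, secret: str) -> None:
        self._secret = secret               # e.g. material needed to recover the coins
        self._last_check_in = time.time()   # owner is assumed alive at setup
        self._released = False

    def check_in(self) -> None:
        """Called whenever the owner 'presses the button' (in the workshop idea, via a lightning message)."""
        self._last_check_in = time.time()

    def poll(self) -> Optional[str]:
        """Run periodically by the service; returns the secret once the deadline has passed."""
        if not self._released and time.time() - self._last_check_in > CHECK_IN_INTERVAL:
            self._released = True
            return self._secret             # in practice, delivered to the designated heirs
        return None


# Usage sketch: the owner checks in weekly; if a week passes with no
# check-in, poll() hands the secret to whoever is designated to receive it.
button = DeadMansButton(secret="encrypted-seed-or-decryption-key")
button.check_in()
print(button.poll())  # None while the owner keeps checking in
```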

Beyond that, Jager thought some additional features should be added, even if they could make the program trickier to implement. The program should maintain the privacy of the sender and the receiver, he said, and should allow the sender to get proof the service still has the secret.

Developers split into small groups to think about how to build a service that would meet all of these and other goals. The workshop developers came up with some ideas, which Jager published to GitHub. He included a rough implementation, which puts several of the ideas into practice, though he said the code is extremely limited and does not implement everything described.

This design isn't necessarily the best way forward, Jager said, but it's a proof of concept he hopes can inspire other implementations.

Jager told CoinDesk the primary reason he chose the dead man's button for the workshop was that it is a complex enough use case to show off what lightning can do as a messaging system.

But he also thinks a dead man's button could be a real use case for lightning down the road.

"Many people try to arrange their crypto inheritance and need to make up their minds about who they trust. This could be an alternative, assuming that wrinkles are ironed out and the whole process is hidden underneath a user-friendly shell," he said. "This is unlikely to happen short term, but I hope people see the possibilities."

Lawyer Pamela Morgan, an expert on crypto inheritance and author of a book dedicated to helping people develop a plan to pass on their crypto, agrees with Jager that the technology is far from ready. She said she would not encourage users to put any money into any experimental dead man's button systems just yet.

"Dead man's switches are fun projects that excite our imaginations but fail to solve the complex and multidisciplinary challenges of crypto asset inheritance distribution. Relying on such solutions for something as important as inheritance is likely to cause catastrophic loss," she told CoinDesk.

However, she said the technology has promise. Since few crypto enthusiasts have any sort of a plan for what to do with their currencies after they are gone, she's happy to see people exploring ways to make crypto inheritance a more common practice.

"If adding a dead man's switch makes more people actually do inheritance planning for their bitcoin, then I'm all for it because so few people actually do anything," she told CoinDesk.

In the meantime, Jager is pressing on with beefing up lightning's messaging system to make it easier to send messages across the network.

Correction (Feb. 24, 22:52 UTC): This article has been updated to clarify the intent of the workshop.


Bernie Sanders Is the Only Leading Presidential Candidate Publicly Opposing the Patriot Act – In These Times

Many Democrats are still acquiescing to a George W. Bush-era policy that has been in place for nearly 20 years.


Three key provisions of the USA Patriot Act, which give the Trump administration broad surveillance powers, are set to expire on March 15 unless Congress votes to reauthorize them. Sen. Bernie Sanders (I-Vt.) is the only leading Democratic presidential candidate in Congress who is publicly opposing them.

"I voted against the Patriot Act in 2001, 2006, 2011 and 2015. I strongly oppose its reauthorization next month," he tweeted on February 11. "I believe that in a democratic and constitutional form of government, we cannot sacrifice the civil liberties that make us a free country."

One provision is Section 215, the bulk metadata collection program exposed by Edward Snowden. This provision underwent modest post-Snowden reforms in 2015, but its essence remains largely intact in the call detail records (CDR) program. The program authorizes the NSA to seize call records of people deemed a target, and the people those targets communicate with. In 2017 and 2018, this provision allowed the government to collect more than 968 million records. The government recently shut down the CDR program, admitting to its overreach, but the legal authority to reinstate it at any time remains.

"This CDR program was shuttered by the government because of massive over-collection of millions of Americans' records," Sandra Fulton, government relations director for Free Press, tells In These Times. "At this point, eliminating the CDR program is low-hanging fruit for any reform that is at all acceptable." According to Fulton, even if the CDR program is currently shuttered, keeping it on the books is a problem, because the government could reactivate it at any time. "If we find a program that's being an abuse, the government doesn't just get to keep it," she says.

The other two senators among the leading Democratic candidates, Elizabeth Warren (D-Mass.) and Amy Klobuchar (D-Minn.), have not made similar statements publicly opposing the reauthorization, and neither returned In These Times' request for comment.

Sen. Klobuchar voted to reauthorize the Patriot Act in 2011, while Sanders did not (Warren was not yet in the Senate). Both Klobuchar and Warren voted in favor of the USA Freedom Act in 2015, which imposed limited reforms on the Patriot Act; Sanders voted no, citing the inadequacy of the reforms. Warren did, however, vote no in 2018 on a bill to extend the NSA's powers to carry out warrantless surveillance for another six years, as did Sanders. Klobuchar voted yes.

Speaking publicly against the Patriot Act could have a significant impact at a time when Democrats are still acquiescing to a George W. Bush-era policy that has been in place for nearly 20 years. Last November, Democrats voted overwhelmingly for a measure granting a three-month extension of the three Patriot Act provisions, included in a House resolution to prevent a government shutdown, infuriating civil rights activists. Only 10 Democrats in the House voted against the reauthorization, among them Reps. Alexandria Ocasio-Cortez (N.Y.), Ilhan Omar (Minn.), Ayanna Pressley (Mass.) and Rashida Tlaib (Mich.), known as "the squad." But Congressional Progressive Caucus (CPC) co-chairs Reps. Pramila Jayapal (D-Wash.) and Mark Pocan (D-Wisc.), and vice chairs, Reps. Ro Khanna (Calif.) and Barbara Lee (Calif.), all voted for it. (None of Sanders, Warren or Klobuchar was present for the Senate vote.)

As Sam Adler-Bell previously reported, the CPC said the extension was necessary to negotiate for better reforms, but those progressives who voted yes caught considerable heat from activists. "While we would oppose these authorities under any administration, history demonstrates that mass surveillance disproportionately impacts communities of color, immigrants, and other marginalized groups that Donald Trump is actively targeting," the activist organization Demand Progress said in a statement.

Likely in response to criticism, the CPC now says it doesn't plan to acquiesce to Bush-era spy powers so easily in mid-March.

"For far too long, Congress has permitted blatant, unconstitutional violations of Americans' Fourth Amendment rights under the PATRIOT Act," co-chairs Jayapal and Pocan told In These Times via email. "Any long-term reauthorization of this legislation must contain meaningful and substantial reforms to these legal authorities, as proposed in the Safeguarding Americans' Private Records Act (SAPRA), in order to secure our support."

Introduced by Sens. Ron Wyden (D-Ore.) and Steve Daines (R-Mont.) and Reps. Zoe Lofgren (D-Calif.), Warren Davidson (R-Ohio) and Pramila Jayapal (D-Wash.), SAPRA, which was introduced in the House on January 24 by Lofgren, would rescind authority for the CDR program. It has attracted support from a coalition of civil rights and privacy organizations, among them Color Of Change, Committee of Concerned Scientists and Indivisible.

However, the organizations note that the reform has shortcomings. In a letter, the coalition said that SAPRA does not, for instance, prohibit backdoor searches under Section 702, a loophole that poses a dangerous threat to Americans' privacy by allowing the government to search through communications collected under Section 702 of FISA seeking information about Americans without a warrant. Further, it reauthorizes the so-called lone wolf authority, which has never been used and should be repealed just like the Section 215 CDR program. This lone wolf authority allows the government to wiretap someone who is not a U.S. person, and not a part of a terrorist organization, but deemed by the United States to be helping international terrorism (it is believed that this provision has never been used).

Nonetheless, David Segal, the executive director of Demand Progress, tells In These Times that SAPRA is the only genuine reform bill in play.

Whatever this bill's shortcomings, it's almost certain to face opposition not only from the Trump administration, but from the Democratic Party leadership. House Speaker Nancy Pelosi (D-Calif.) played a significant role in November in pushing Democrats to endorse a reauthorization of the Patriot Act, with no reforms, by slipping it into the funding bill. And impeachment manager Rep. Adam Schiff (D-Calif.), who boosted his public profile by emphatically declaring that President Trump is dangerous to this country, was among the yes votes for full reauthorization of that president's spy powers.

There is still significant bipartisan support for the CDR program, bringing significant risk that Democrats could cut a deal for reforms with less teeth, and more loopholes, than SAPRA.

A Sanders spokesperson noted to In These Times that the senator has been a supporter of Wyden's efforts to reform the Patriot Act and cosponsored his bipartisan USA RIGHTS Act. The spokesperson indicated that Sanders opposes the current iteration of the Patriot Act but would likely support Wyden's SAPRA legislation in the Senate, as it goes much further to protect privacy and civil liberties than a sunset of Section 215.

By coming out now against the mass surveillance powers, Sanders appears to be signaling to the CPC that it should find its backbone on this issue. And those who stay silent are implicitly encouraging the opposite.

This piece has been updated to include remarks from a spokesperson for Sanders that were sent following publication.


highlandcountypress.com – The Highland County Press

By U.S. Rep. Warren Davidson, R-Ohio

When our Founding Fathers established this American republic, a wise group insisted that our Constitution include the Bill of Rights to ensure that the federal government they created could not infringe on the natural rights of Americans.

The First and Second Amendments protect speech and the right to bear arms, respectively. After these essential freedoms, however, comes an amendment that most consider obsolete: the Third, which prohibits the federal government from quartering soldiers in Americans homes during peacetime.

On its face, Americans shouldn't have to worry about the Third Amendment. The Founding Fathers' ban on quartering addresses a problem we no longer encounter. But considered in tandem with the Fourth Amendment, the right to keep private property and documents secure against illegal searches, a different picture emerges.

Taken together, the Third and Fourth Amendments dovetail to form a right to privacy as it applies to the home, your property and your person. The government cannot take up residence in your home, and it cannot look through your private effects without a warrant.

Yet, that's exactly what the government has done. Technology developed for national security purposes has worked its way into phones, computers, and tablets, compromising not only the American right to privacy but also the sanctity of our homes.

In less than a month, Congress can make our Founding Fathers proud and reverse this invasive legal trend. Section 215 of the Foreign Intelligence Surveillance Act (FISA) is set to expire on March 15. FISA was originally created in 1978 and massively expanded after 9/11 under the USA Patriot Act. The emphasis is on foreign: there are supposed to be safeguards to protect American citizens from being targeted with these statutes. However, the U.S. government has secretly taken up residence in Americans' smartphones and laptops.

The scope of abuse from this troublesome law gained notoriety in 2013 when whistleblower Edward Snowden went public about the National Security Agency's warrantless warehousing of millions of Americans' phone records and geolocation data. Congress attempted to fix the obvious legal problems by passing the USA Freedom Act in 2015, but we know the story of abuse doesn't end there.

In 2016, the FBI spied on a member of President Trump's campaign staff, abusing the lax standards and selective enforcement mechanisms of FISA, using stories of Russian meddling in the general election to prop up Clinton-sourced opposition research linked to Russia.

Separately, an October 2018 Foreign Intelligence Surveillance Court (FISC) opinion found more widescale abuse. The FISC opinion was declassified on Oct. 8, 2019. Among other findings, it shows that the FBI made 3.1 million warrantless searches of Americans' data in 2017 alone. Some 57,000 individuals were subject to illegal searches by the FBI in April 2018. The year before, the FBI searched over 70,000 email addresses and phone numbers. At no point did a court issue a warrant and these individuals were never notified that they were subject to such a search.

Worse, no one batted an eye after the FISA court made these abuses public -- despite the fact that such abuses strike at the very heart of our civil liberties and the spirit of our constitutional republic.

Fortunately, an unlikely group of legislators from both chambers in Congress has banded together to introduce legislation that will finally protect the Third and Fourth Amendments and overhaul FISA.

In the House, I joined my colleagues Reps. Zoe Lofgren, D-Calif., Pramila Jayapal, D-Wash., Matt Gaetz, R-Fla., Earl Blumenauer, D-Ore., and Ted Yoho, R-Fla., to work with Sens. Ron Wyden, D-Ore., and Steve Daines, R-Mont., to introduce the Safeguarding Americans' Private Records Act (SAPRA).

This bipartisan legislation ends the National Security Agency's mass phone record program and prohibits warrantless searches of GPS, web browsing, and search engine history. It also makes the FISA court more accountable by requiring the court to notify American citizens if they've been investigated under FISA and forcing the court to disclose all opinions within six months of issuance.

SAPRA also creates public reporting requirements about the extent to which intelligence agencies illegally used FISA to surveil Americans or target activity protected under the First Amendment.

My fellow cosponsors and I can only hope that, as the evidence piles up, more of our colleagues will support our bipartisan legislation. With the March 15 deadline approaching, we are in a unique position to correct this prior constitutional malpractice and systemic abuse.

For too long, Congress has allowed an overzealous intelligence community to operate with limited oversight in the name of national security. America can (and must) sustain the world's preeminent intelligence capabilities without infringing on the rights of American citizens. SAPRA will help bring this era of congressional negligence to an end and usher in a newfound appreciation for the power and purpose of those oft-forgotten Third and Fourth Amendments.


Americans, data, elections, encryption and the matter of trust – Security Boulevard

It was a massive tech fail. A systemic disaster. A debacle.

When we hear such words, we tend to assume there's been yet another cybersecurity breach. This time, however, it was something different. But it didn't do much to increase the public's trust in digital technology. In fact, it seemed to do quite the opposite.

The setting was Iowa. The timing was Feb. 3, and then Feb. 4, and then Feb. 5, 6.

The culprit was a smartphone app that didn't enable caucus chairs to report voting results. The backup hotline system didn't work well either, requiring long holds and some hang-ups. That prevented the campaigns and media from receiving and reporting those results to the public. At least one candidate apparently viewed this as an opportunity to get free air time. Others were furious. Meanwhile, the delay helped feed conspiracy theories. And the situation created further distrust of political institutions, electronic voting and digital technology in general.

We're just two months into 2020. But at least one other significant event impacting consumer trust in digital technology occurred this year. This one is largely seen as a positive development. I'm referring to the California Consumer Privacy Act. The CCPA took effect Jan. 1 of this year. This newly enacted rule gives California residents greater control over their personal data. Under CCPA, these individuals can request and expect to receive the data organizations have on them. California residents can demand organizations delete their data. People who live in California also can forbid organizations from sharing their data with third parties.

Our research indicates such measures may engender consumer trust in technology and organizations using it. We surveyed more than 1,000 Americans as part of our research effort. Forty percent said their trust is higher when they can request their data be deleted. Forty-one percent said a feeling of personal data control equates to a greater sense of trust.

Twenty-eight percent believe they have more control over their data than they did a year ago. We learned that 26% feel they have less control of their data, or none at all. And 46% think they have the same personal data control as a year ago.

As in politics, the nation is divided in this arena.

More than half of the survey group said they were willing to accept personal data security risk to do online shopping (60%) or banking (55%) or to make digital payments (54%). More than half (54%) said they are not willing to do the same for the convenience of online voting.

A third (33%) said they are less confident about U.S. election security now than they were during the last presidential election year. More than half the country, 59%, said they are unsure or definitely will not trust the 2020 election results.

The nCipher survey results also suggest that Americans are pretty evenly divided on whether electronic voting (30%), paper ballots (35%) or a combination of the two (30%) are best. At least that was the breakdown prior to the Iowa caucuses, when paper ballots saved the day.

At this time in which government and other organizations clearly need to build trust in how they handle and secure data, it may be useful to revisit advice from our former leaders. President Theodore Roosevelt famously said: "Speak softly and carry a big stick." In today's digital world, it's important to secure and safeguard the privacy of personal data. Encryption and key management in this scenario can act as the big stick.

Nearly half (49%) of Americans said they trust that a company is safeguarding their personal data when it uses encryption. About a third said encrypted ballots (31%) and/or encrypted voter registration data (33%) would increase their trust in election security. Strong data security in the form of encryption can build or rebuild trust in our governments and democracy, in businesses, and in the technologies that Americans use every day.

Americans can play their part in cybersecurity and personal data privacy by practicing good password hygiene. But, as most of us know, that's not always easy.

Nearly three-fourths (74%) of Americans said it is somewhat, very or just plain frustrating when they have to log in to applications at work multiple times a day. More than three-fourths (78%) said they have had to change their password because they forgot it on at least a few occasions. More than a fourth (28%) said they use the same passwords for work and personal uses.

These types of challenges helped inform Entrust Datacard's decision to release its Passwordless SSO Authentication solution, which turns employee smartphones into biometrics-protected virtual smart cards that allow instant proximity-based login to both workstations and applications. The solution eliminates passwords and puts an end to the risk of bad actors stealing user credentials and compromising critical information.

Outside of the workplace, the average person can more effectively and securely shoulder the burden of passwords by using a password manager app. And Americans can protect themselves and their neighbors by following security and data privacy best practices.

There is at least one part of FDR's first inaugural address that applies here: "This is preeminently the time to speak the truth, the whole truth, frankly and boldly."

The truth is that we must all do our part to protect and secure data and devices. Doing so will go a long way in building trust in our always-on, data-driven world.

Please click here for more information about nCipher's security solutions. If you're attending RSA, visit nCipher and Entrust Datacard at booth S-2139.



Backdoor to encryption back on agenda in absurdly named bill – 9to5Mac

An absurdly named bill is set to form the latest attempt to create legislation requiring tech giants to provide a backdoor to encryption.

The Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019 (EARN IT Act) is co-sponsored by Lindsey Graham (R-SC), chairman of the Senate Judiciary Committee, and Senator Richard Blumenthal (D-CT).

The acronym is intended to suggest that tech companies should be required to earn the right to Section 230 protections, which mean that companies providing communication platforms can't be held legally liable for things posted by users.

Reuters reports that the bill seeks to impose conditions on this protection, and that providing a backdoor to encryption is believed to be one of them.

The bill threatens this key immunity unless companies comply with a set of "best practices," which will be determined by a 15-member commission led by the Attorney General [...]

The sources said the US tech industry fears these best practices will be used to condemn end-to-end encryption, a technology for privacy and security that scrambles messages so that they can be deciphered only by the sender and intended recipient. Federal law enforcement agencies have complained that such encryption hinders their investigations.

Online platforms are exempted from letting law enforcement access their encrypted networks. The proposed legislation provides a workaround to bypass that, the sources said.

Graham (pictured above) has previously criticized Apple for using strong encryption in iPhones, and suggested that the company either needs to voluntarily provide a backdoor or have one forced on it by law.

Committee chairman Senator Lindsey Graham (R-SC) warned the representatives of the tech companies, "You're gonna find a way to do this or we're going to do it for you."

Graham didn't appear to understand the contradictory stance he was taking, saying on the one hand that he appreciated that people "cannot hack into my phone" while at the same time asking Apple to create a vulnerability that would inevitably be discovered by others and used to do just that.

Apple has persistently come under government pressure to compromise the privacy of iPhone owners, the San Bernardino, California, case being the highest-profile example, followed by the more recent Pensacola, Florida, shooting. We've previously outlined the arguments for Apple's stance, both before and after the San Bernardino shooting.

Currently, the company appears to have opted for a compromise: Refusing to do anything to weaken iPhones, but deliberately using a weaker encryption method for iCloud backups. Apple doesn't use end-to-end encryption for these, meaning it holds a key and is able to provide a copy of most data stored on an iPhone when served with a court order to do so.

It had been suggested that Apple abandoned plans to adopt end-to-end encryption for iCloud backups after pressure from the FBI, though doubt was soon cast on this version of events.


No Backdoor on Human Rights: Why Encryption Cannot Be Compromised – Bitcoin News

In April 2019, the UK issued an Online Harms White Paper to announce its campaign to rein in harmful speech on social media sites such as Facebook and TikTok. The public consultation period has ended and a full consultation response is expected in Spring 2020. (Initial Consultation Response here.) Legislation to criminalize freedom of speech will follow quickly.


"The United Kingdom has become the first Western nation to move ahead with large-scale censorship of the internet. Boris Johnson has unveiled rules that will punish internet companies with fines, and even imprisonment, if they fail to protect users from harmful and illegal content. Couched in language that suggests this is being done to protect children from pedophiles and vulnerable people from cyberbullying, the proposals will place a massive burden on small companies. Further, they will ultimately make it impossible for those not of the pervasive politically correct ideology to produce and share content." (Mark Angelides, "Britain allows the internet to be censored, a warning for the U.S.")

The bill's exact language is not known, but its thrust is clear. Internet companies with user-generated content will need to enforce anti-harm rules in order to avoid fines, imprisonment, or their sites being blocked. Home Secretary Priti Patel explained, "It is incumbent on tech firms to balance issues of privacy and technological advances with child protection."

The main target of attack is end-to-end encrypted (E2EE) messages that can be read only by a sender and a recipient by using unique cryptographic keys as decoders. Third parties cannot access the content. E2EE is the most effective privacy tool that is both easy to use and available to everyone, often for free. To comply with UK law, however, companies will need to eschew encryption or to install backdoors, portals that allow someone to enter a system in an undetected manner.
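As a rough illustration of why a provider cannot read E2EE traffic, here is a small Python sketch using the PyNaCl library's public-key Box, chosen only for illustration; Signal, WhatsApp and similar apps use their own, more elaborate protocols. The point is simply that only the holders of the right private keys can decrypt, while anything relayed by a server in between is ciphertext.

```python
# pip install pynacl  (assumed for this sketch)
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only public keys are ever exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob: the box combines her private key and his public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A server relaying `ciphertext` sees only random-looking bytes.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

A backdoor, in this picture, amounts to giving some third party a copy of the private keys or a way around them, which is exactly the capability the bill's critics object to.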

Angelides's warning to the U.S. is timely because Congress is considering a similar measure: the EARN It Act. Again, the Act's justification is to protect children and to thwart evil-doers. After all, who else needs encryption? According to the United Nations, everyone.

In 2015, the UN issued a report on encryption and anonymity in the context of human rights. The report found encryption to be key to the right of privacy. In turn, privacy enabled freedom of speech through which people could explore basic aspects of their identity, including religion and sexuality. The report's author David Kaye cautioned against using backdoors because of the unprecedented capacity of authorities, companies, criminals, and the malicious to attack people's ability to share information safely. Kaye acknowledged the alleged need of law enforcement to read encrypted messages, but on a case-by-case basis rather than through a blanket approach.

This is a long-held position for the UN. In 2016, Zeid Ra'ad Al Hussein, UN High Commissioner for Human Rights, published a warning entitled "Apple-FBI case could have serious global ramifications for human rights." Zeid cautioned:

"Encryption tools are widely used around the world, including by human rights defenders, civil society, journalists, whistle-blowers and political dissidents facing persecution and harassment ... Encryption and anonymity are needed as enablers of both freedom of expression and opinion, and the right to privacy. It is neither fanciful nor an exaggeration to say that, without encryption tools, lives may be endangered. In the worst cases, a Government's ability to break into its citizens' phones may lead to the persecution of individuals who are simply exercising their fundamental human rights."

Amnesty International agrees. A 2016 article, "Encryption: A Matter of Human Rights," argued, "Forcing companies to provide backdoors to the encryption deployed constitutes a significant interference with users' rights to privacy and freedom of expression. Given that such measures indiscriminately affect all users' online privacy by undermining the security of their electronic communications and private data, Amnesty International believes they are inherently disproportionate, and thus impermissible under international human rights law."

Why, then, are states rushing to crack open encryption? Because information is power. It is a prerequisite to demanding money and imposing social control. For decades, surveillance functioned from the shadows but now it openly demands access to people's thoughts and lives. Who else but evil-doers would say no?

U.S. Attorney General William Barr has been loud in his demand that law enforcement be able to access encrypted communications, usually through a backdoor. Barr wants this access even when there is no cybersecurity risk or alleged crime. He may soon get what he wants so badly.

The EARN It Act (Eliminating Abusive and Rampant Neglect of Interactive Technologies Act) would establish a National Commission on Online Child Exploitation Prevention to be headed by Barr, who would have the authority to overrule it, making him a one-man power. As well as child exploitation prevention, the Act asserts a vague mandate "and for other purposes." This is a blank check, with only the elimination of election misinformation being specifically mentioned. Republican Lindsey Graham and Democrat Richard Blumenthal are pushing the measure in the Senate on a bipartisan basis.

The draft bill does not mention encryption, but it requires tech companies to assist law enforcement in identifying, reporting, and removing or preserving evidence about child exploitation and for other purposes. E2EE would make it impossible for those companies to provide such assistance.

The EARN It Act would de facto prohibit the E2EE offered by services such as WhatsApp; it would short-circuit Facebook's plans to encrypt its messaging apps; companies like Apple would be in legal jeopardy if they refused to insert backdoors in their software and devices.

Legal jeopardy is the Act's enforcement mechanism. A non-compliant tech company would lose Section 230 immunity in both civil and criminal courts for child exploitation and for as-yet-unspecified offenses that occur on its site or over its platform. The free-speech champion Electronic Frontier Foundation (EFF) explained the significance of Section 230 of the Communications Decency Act; it is the most important law protecting free speech online. The protection is based on distinguishing between a platform and a publisher. Section 230 states, "No provider or user of an interactive computer service [platform] shall be treated as the publisher or speaker of any information provided by another information content provider."

A platform provides services, tools, and products with which users create their own content; it bears no more legal responsibility for this content than a phone company does for the conversations that flow over it. By contrast, a publisher edits or otherwise controls content, which makes it legally liable.

EFF continued, "Section 230 enforces the common-sense principle that if you say something illegal online, you should be the one held responsible, not the website or platform where you said it (with some important exceptions) ... Without it, social media as we know it today wouldn't exist ... And it doesn't just protect tech platforms either: if you've ever forwarded an email, thank Section 230 that you could do that without inviting legal risk on yourself."

EARN It not only strips immunity from non-compliant companies, it also weakens the standard by which they can be sued. It is now necessary for a plaintiff to prove that a company knew an offense was occurring in order to sue; EARN It would require a plaintiff only to show that the company acted recklessly. In a keynote address at the 2019 International Conference on Cyber Security, A.G. Barr defined E2EE as inherently irresponsible. "The costs of irresponsible encryption that blocks legitimate law enforcement access is ultimately measured in a mounting number of victims: men, women, and children who are the victims of crimes, crimes that could have been prevented if law enforcement had been given lawful access to encrypted evidence." To Barr, the mere presence of backdoor-free E2EE constitutes recklessness.

The targets of EARN It seem to be the internet giants that have aroused bipartisan rage. At a recent Senate Judiciary Committee hearing entitled "Encryption and Lawful Access: Evaluating Benefits and Risks to Public Safety and Privacy," Apple and Facebook were attacked for using warrant-proof encryption that prevented authorities from investigating terrorism, organized crime and child sexual exploitation.

EFF explained, "Undermining Section 230 does far more to hurt new startups than to hurt Facebook and Google." 2018's poorly-named Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), the only major change to Section 230 since it passed in 1996, was endorsed by nearly every major Internet company. One consequence of FOSTA was the closure of a number of online dating services, a niche that Facebook set about filling just weeks after the law passed. The legal need to screen or filter content placed smaller companies at a competitive disadvantage with the likes of Google.

Unfortunately, an ongoing backlash against Big Tech may propel EARN It through Congress. Moreover, Congress undoubtedly wants to have better control over social media before the 2020 elections. The EARN It Act will arrive with a cry of "Save our children!" But its impact will be to stifle freedom of speech across the spectrum, to hobble small businesses, and to make all users more vulnerable to criminals, including agents of the state.

Op-ed disclaimer: This is an Op-ed article. The opinions expressed in this article are the author's own. Bitcoin.com is not responsible for or liable for any content, accuracy or quality within the Op-ed article. Readers should do their own due diligence before taking any actions related to the content. Bitcoin.com is not responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any information in this Op-ed article.



Wendy McElroy is a Canadian individualist anarchist and individualist feminist. She was a co-founder of the Voluntaryist magazine and modern movement in 1982, and has authored over a dozen books, scripted dozens of documentaries, worked several years for FOX News and written hundreds of articles in periodicals ranging from scholarly journals to Penthouse. She has been a vocal defender of WikiLeaks and its head Julian Assange.


cloudAshur, hands on: Encrypt, share and manage your files locally and in the cloud – ZDNet

Cloud storage and collaboration services like Dropbox are convenient, but not every business is comfortable with the level of security provided. If employees are sharing files with customer information or details of your next product launch, how do you make that more secure? You can hope that employees use a strong password and don't get phished; you can hope that they use multi-factor authentication (MFA); or you can use an identity service like Okta or AzureAD that wraps those services in a single sign-on system and enforces MFA.

Or if you want to be a bit more hands-on about it and get more control over where and when employees can work on cloud files, iStorage's cloudAshur (pronounced 'assure') is a £99 (ex. VAT) rugged hardware key for PCs and Macs that stores encryption keys (AES-ECB or AES-XTS 256-bit) and authenticates the computer when you plug it into a USB port (USB-B rather than USB-C).

Give each employee a key and the cloudAshur software, and both local files and files stored in the cloud and shared with colleagues via cloudAshur can be encrypted. They can only be viewed or edited after the physical key is placed into a USB port, a 7-15 digit PIN typed in on the keypad, and a username and password entered into the cloudAshur software to sign into the cloud account. An attacker who successfully phishes for the cloud storage credentials will only see encrypted .IST files that they can't open or even preview -- and so will the user until they plug in the USB key, enter the PIN and sign in.
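The general pattern here, encrypt locally before anything reaches the cloud, can be sketched in a few lines of Python with the cryptography package's AES-GCM primitive. This is a conceptual stand-in only: cloudAshur keeps the key on the hardware dongle and uses its own .IST container format, neither of which is modeled below, and the function names are illustrative.

```python
# pip install cryptography  (assumed for this sketch)
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in cloudAshur's case, key material lives on the dongle
aesgcm = AESGCM(key)

def encrypt_for_cloud(path: str) -> bytes:
    """Return an encrypted blob; only this blob is ever uploaded or synced."""
    with open(path, "rb") as f:
        plaintext = f.read()
    nonce = os.urandom(12)                  # unique nonce per file
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

def decrypt_from_cloud(blob: bytes) -> bytes:
    """Reverse the process locally, after the user has authenticated."""
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)
```

The practical consequence matches the article's point: anyone who phishes the cloud credentials gets only the encrypted blob, and without the locally held key it is unreadable.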

The inconvenience of having to do all that just to get some work done is balanced by the way cloudAshur brings together files from different cloud services. You see an extra cloudAshur drive in Explorer or the Finder with virtual folders for each cloud service you use, with the files that have been shared with you, and you drag files you want to encrypt into the folder.

The PIN-protected cloudAshur USB dongle from iStorage lets you share encrypted files with other users -- so long as they have matching devices and have logged into the client app.

You can use cloudAshur individually, to protect your own files, and set it up yourself. But if you want to share encrypted files with colleagues, they need their own cloudAshur that's been provisioned with the same encryption key as yours. That means buying the iStorage KeyWriter software, which uses one cloudAshur as the master key and clones the encryption keys to more cloudAshur devices for other people to use.

You can clone cloudAshur dongles from a master device using the KeyWriter software.

If you do that, your organisation can also use the iStorage cloudAshur Remote Management Console (RMC) software to manage users and devices. This gives an admin much more control: you can see who is using the devices and where they are (including a log of times and files accessed), and if you see unauthorised use you can disable the cloudAshur remotely. You can also set the times and physical locations where the keys can be used, if you want to limit them to business hours and business locations. You can only set one location, using a postcode and a radius around it, which isn't convenient if you want to allow people to work from your different office locations but not from home (and there are no exceptions for VPN connections).

You can also add extra security with the cloudAshur RMC software: encrypting file names so they don't give away any clues, blacklisting known bad IP addresses (annoyingly, you can only do that individually, rather than by specifying the far shorter list of IP addresses you want to allow) and blocking specific file types. The latter is referred to as 'blacklisting', which is confusing when it's next to the IP control setting; we'd also like to see iStorage join other vendors in moving to less contentious terms like 'block' and 'approve'.

The cloudAshur Remote Management Console (RMC) lets you manage users and devices.

Getting the PIN wrong ten times in a row locks the device. You can use the RMC software to change how many wrong attempts you want before this brute-force protection kicks in, and you can use the admin PIN to create a new user PIN. You can also set a one-time recovery PIN that you can give a remote user so they can create their own new PIN. Getting the admin PIN wrong ten times in a row deletes the user PINs and the encryption key. You can't set up the device without changing the default admin PIN -- a fiddly sequence of pressing the shift and lock keys on the device individually and in combination and watching the three colour LEDs blink or turn solid. Even with the limitations of a numeric keyboard, this seems unnecessarily complex.

If someone loses a device or leaves the company without giving it back, you can remotely kill the cloudAshur hardware; you can also temporarily disable a key if it's misplaced (and having both options stops users delaying reporting a key they hope to track down because having to get it reset or replaced will be inconvenient). You can also reset and redeploy a key, so if someone leaves the company you can safely reuse their key (and at this price, you'll want to).

A security system isn't much use if it can be physically cracked open and tampered with. The cloudAshur packaging comes with security seals over both ends of the box, although we were able to peel them off carefully without leaving any marks on the packaging, so a really dedicated adversary who managed to intercept your order could replace them with their own security seal.

The case is extruded aluminium that would be hard to open without leaving marks: iStorage says the design meets FIPS Level 3 for showing visible evidence of tampering and the components are coated in epoxy resin so they can't be swapped out.

The number keyboard is polymer coated to stop the keys you use for your PIN showing enough wear to give attackers a hint. The keys have a nice positive action, so you know when you've pressed them, and the lanyard hole on the end is large enough to fit onto a keyring or security badge lanyard. There's an aluminium sleeve to protect the key from water and dirt -- the device is IP68 rated. The sleeve also stops the battery getting run down if the keypad gets knocked in your bag.

Using cloudAshur isn't particularly complicated, but it is a bit more work than just using a cloud storage service. There are drawbacks like the inability to see previews in the cloud site to check you're opening the right file, and not being able to work offline -- even with a cloud service that syncs files to your device. And any mistakes about the times and locations where people can work could inconvenience employees on business trips.

The biggest threat with cloudAshur may not be hackers but employees who find it too much extra work and just don't encrypt files. This means you'll need to explain why you're asking them to carry a dongle and jump through these extra hoops.

Overall, cloudAshur is fairly well designed and offers a useful security boost -- as long as you can persuade employees to actually use it.
