Zoom Begins Rollout of End-To-End Encryption – My TechDecisions

Zoom next week will begin rolling out its end-to-end encryption offering as a technical preview for 30 days as it seeks feedback from its users, the company announced during its two-day virtual Zoomtopia event.

The enhanced encryption for both free and paid users comes after Zoom in May announced plans to build an end-to-end encryption (E2EE) model into the popular videoconferencing platform to increase meeting security. In a press release, the company says this initial rollout is the first of four phases in releasing the E2EE model.

Zoom earlier this year took 90 days to address security concerns with the platform after reports of meeting hijackers easily joining calls as usage skyrocketed in the early days of the COVID-19 lockdown. The company added new security features such as better meeting controls, stronger password protections, and enhanced encryption.

According to the company, its E2EE uses the same GCM encryption currently offered to Zoom users, but where the encryption keys live has changed. Zoom's cloud typically generates encryption keys and distributes them to meeting participants using Zoom apps as they join. With this new offering, the meeting's host generates encryption keys and uses public key cryptography to distribute them to the other meeting participants.
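
To make that key flow concrete, here is a minimal sketch of the idea, not Zoom's actual protocol or code: the host generates a per-meeting AES-GCM key and delivers it to each participant encrypted under that participant's public key, so a relaying server only ever handles wrapped keys and ciphertext. The PyNaCl and cryptography Python packages are assumed to be available, and all names are illustrative.

```python
# Hypothetical sketch of host-side key distribution (illustrative, not Zoom's code).
import os
from nacl.public import PrivateKey, SealedBox                     # public-key wrapping
from cryptography.hazmat.primitives.ciphers.aead import AESGCM    # GCM for media frames

participant_key = PrivateKey.generate()              # lives only on the participant's device
meeting_key = AESGCM.generate_key(bit_length=256)    # generated by the host, not the cloud

# Host wraps the meeting key for this participant using public-key cryptography;
# the server relays 'wrapped' without being able to read it.
wrapped = SealedBox(participant_key.public_key).encrypt(meeting_key)

# Participant unwraps the meeting key locally and can now encrypt/decrypt media.
unwrapped = SealedBox(participant_key).decrypt(wrapped)
nonce = os.urandom(12)
frame = AESGCM(unwrapped).encrypt(nonce, b"meeting audio frame", None)
assert AESGCM(meeting_key).decrypt(nonce, frame, None) == b"meeting audio frame"
```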

That turns Zoom's servers into oblivious relays that never see the encryption keys required to decrypt meeting content, according to the company.

All participants must have the setting enabled to join a call that is end-to-end encrypted. Hosts can enable the setting at the account, group and user level, and it can be locked at the account or group level, according to the company.

In the first phase, all participants must join from the Zoom desktop client, mobile app or Zoom Rooms.

"End-to-end encryption is another stride toward making Zoom the most secure communications platform in the world," said Zoom CEO Eric S. Yuan in a statement. "This phase of our E2EE offering provides the same security as existing end-to-end-encrypted messaging platforms, but with the video quality and scale that has made Zoom the communications solution of choice for hundreds of millions of people and the world's largest enterprises."

At least in this version, enabling E2EE will disable some features, like joining before the host, cloud recording, streaming, live transcription, breakout rooms, polling, 1:1 private chat and meeting reactions.

The company is planning to roll out better identity management and E2EE SSO integration as part of the second phase, which is tentatively scheduled for 2021.

See more here:
Zoom Begins Rollout of End-To-End Encryption - My TechDecisions

Is Signal secure? How the messaging app protects privacy – Business Insider

You might know that Signal is a popular messaging app that bills itself as being very secure, offering end-to-end encryption for a very high level of privacy.

It's not necessarily obvious, though, what all that means, and how Signal's technology affords any more protection than other messaging apps.

Signal offers end-to-end encryption, which essentially means that your messages are scrambled into an unintelligible collection of characters before leaving your device and are not decrypted back into meaningful content until reaching the Signal app on the recipient's device.

The Signal app boasts more privacy than its competitors.

These encrypted messages can only be unlocked using a key that is shared between the two parties in the conversation. No one else has access to the key or can decrypt the message, not even the developers of the Signal app.
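
As a rough illustration of that idea (and only an illustration: the real Signal Protocol adds a "double ratchet" that rotates keys with every message for forward secrecy), the sketch below uses PyNaCl's Box primitive, assumed to be installed, so that a message encrypted on the sender's device can only be read with the recipient's private key.

```python
# Toy two-party end-to-end encryption sketch (not the Signal Protocol itself).
from nacl.public import PrivateKey, Box

alice, bob = PrivateKey.generate(), PrivateKey.generate()   # private keys stay on the devices

# Alice encrypts with her private key and Bob's public key; a relay sees only ciphertext.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# Only Bob, holding his private key, can recover the plaintext.
assert Box(bob, alice.public_key).decrypt(ciphertext) == b"meet at noon"
```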

Because there is no "back door" to decrypting Signal messages, Signal can't decrypt messages for the government, for example, even under subpoena, not because of policy, but because it's not technically possible.

Signal's encryption algorithm isn't proprietary or even unique. The encryption software used by Signal is open-source (and used by other messaging apps, including WhatsApp) and available for download on GitHub. This actually allows Signal to be more secure, because the open-source software is subject to public scrutiny by developers and security experts. It exposes bugs, flaws, and vulnerabilities sooner than if the software were closed and proprietary.

While the encryption software in Signal might not be unique, the app still has privacy advantages over other messaging apps. Signal records no data about its users or the conversations taking place within the app.

This is in contrast to other apps, like Apple iMessage and WhatsApp, to name two examples, which often store significant amounts of metadata, such as who you spoke to and detailed time logs of when those conversations occurred.

In a recent blog post, Signal creator Matthew Rosenfeld (known online as Moxie Marlinspike) explains that the federal government used a subpoena in 2016 to try to access Signal's user data.

But as Rosenfeld writes, "there wasn't (and still isn't) really anything to obtain. The only Signal user data we have, and the only data the US government obtained as a result, was the date of account creation and the date of last use, not user messages, groups, contacts, profile information, or anything else."

Go here to see the original:
Is Signal secure? How the messaging app protects privacy - Business Insider

U.S., UK and other countries warn tech firms that encryption creates ‘severe risks’ to public safety – CNBC

LONDON - Lawmakers from countries within the Five Eyes intelligence-sharing alliance have warned tech firms that unbreakable encryption technology "creates severe risks to public safety."

Ministers from the U.S., U.K., Canada, Australia and New Zealand published a statement Sunday calling on the tech industry to develop a solution that enabled law enforcement to access tightly encrypted messages.

"We urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content," the statement, which was signed by U.S. Attorney General William Barr and U.K. Home Secretary Priti Patel, said.

The statement, published on the website of the U.S. Department of Justice, was also signed by India and Japan, which are not part of the Five Eyes alliance.

Technology companies like Apple and Facebook encrypt users' communications "end-to-end," meaning that only users can access their own messages. This applies to written messages, as well as audio and video communications.

While citizens benefit from additional privacy, law enforcement agencies see end-to-end encryption as a barrier to their investigations and have been calling on tech companies to introduce backdoors that would give law enforcement agencies access.

"We call on technology companies to work with governments on reasonable, technically feasible solutions," the governments said.

They added that end-to-end encryption poses "significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children."

The nations did concede, however, that some forms of encryption "play a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cyber security."

Ultimately, they said they wanted to develop a solution with the tech firms that enabled users to continue communicating privately and securely, but that also allowed law enforcement and tech firms to monitor criminal activity.

Last year, a group of companies including Apple, Microsoft and WhatsApp opposed a proposal by British spy agency GCHQ that would enable spooks to access people's encrypted messages.

Under the proposal, GCHQ suggested adding "ghost" recipients to suspicious message threads that the sender and the receiver would be oblivious to.

In an open letter published last May, tech firms and privacy groups said such a feature would "threaten fundamental human rights."

Continue reading here:
U.S., UK and other countries warn tech firms that encryption creates 'severe risks' to public safety - CNBC

Homomorphic encryption tools find their niche – CSO Online

Organizations are starting to take an interest in homomorphic encryption, which allows computation to be performed directly on encrypted data without requiring access to a secret key. While the technology isn't new (it has been around for more than a decade), many of its implementations are, and most of the vendors are either startups or have only had products on the market for the past few years.

While it's difficult to obtain precise pricing, most of these tools aren't going to be cheap: Expect to spend at least six figures and sign multi-year contracts to get started. That ups the potential risk. Still, some existing deployments, particularly in financial services and healthcare, are worth studying to see how effective homomorphic encryption can be at solving privacy problems and delivering actionable data insights. Let's look at a few noteworthy examples.

With anti-money laundering (AML) investigations, banks want to be able to correlate and query activities by criminals across multiple institutions but can't reveal who the targets are due to privacy regulations. Homomorphic encryption offers the ability to get this information without disclosing who the subject of the query is, and instead hides this data from the entity that is processing the query. These bank-to-bank transactions are a natural fit for homomorphic encryption. Resolving some of these fraud cases could take months, but with homomorphic encryption they can be resolved within minutes.
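
To make "querying data you cannot read" concrete, here is a toy additive-homomorphic sketch in the Paillier style, with tiny demo parameters and no relation to any vendor's product: two encrypted amounts are combined as ciphertexts, and only the key holder can decrypt the total.

```python
# Toy Paillier-style demo: sums are computed on ciphertexts; only the key
# holder can decrypt the result. Tiny primes, illustrative only, not secure.
import math
import random

def keygen(p=1117, q=1129):                        # small demo primes
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    mu = pow(lam, -1, n)                           # valid because g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pub, m):
    (n,) = pub
    n2 = n * n
    r = random.randrange(2, n)                     # random blinding factor
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (1 + m * n) % n2 * pow(r, n, n2) % n2   # g^m * r^n mod n^2, with g = n + 1

def decrypt(priv, c):
    n, lam, mu = priv
    return (pow(c, lam, n * n) - 1) // n * mu % n  # L(c^lam mod n^2) * mu mod n

def add(pub, c1, c2):
    (n,) = pub
    return c1 * c2 % (n * n)                       # ciphertext product = plaintext sum

pub, priv = keygen()
total = add(pub, encrypt(pub, 17), encrypt(pub, 25))
assert decrypt(priv, total) == 42                  # sum recovered without exposing 17 or 25
```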

That brings up another important point for homomorphic encryption: Because the encryption algorithms rely on complex mathematics, they take more time to process transactions than unencrypted methods. That isn't a surprise to anyone who has worked in the data encryption space, and the slower processing has been considered a roadblock to adoption. Homomorphic encryption vendors dispute this notion.

Excerpt from:
Homomorphic encryption tools find their niche - CSO Online

Could homomorphic encryption be the solution to big data’s problem? – Siliconrepublic.com

Helical founder Eric Hess discusses how homomorphic encryption could change the way data is transferred and processed securely.

While advances in data analytics have enabled businesses to gain expanded insight into large structured and unstructured datasets, these advances have come with increased privacy and misappropriation risks.

Exercising greater control over the life cycle of data and confidentiality agreements has mitigated these risks but outsourcing of sensitive or regulated components of data processing to third parties is still widely viewed as fraught with risk.

If all sensitive data or data processes and algorithms could be shared with or processed by any third party (including competitors) subject to the provider's controls, however, it would open up unimagined avenues of enterprise collaboration, specialisation and integration.

Homomorphic encryption solves for this significant gap and, while commercial viability is still a challenge, compelling use cases are emerging. In the coming years, any organisation endeavouring to become a centre of excellence in big data analytics will have no choice but to embrace homomorphic encryption.

Encryption is a digital safe where information is secured while locked inside. Plaintext data is converted to ciphertext using an algorithm that is sufficiently complicated to make the data unreadable without a decryption key. It can be stored and transmitted in this format and recipients can decrypt it, provided they have the key. Once encrypted data is needed for analysis, compliance or any other use case, it must be converted back to plaintext, which can sacrifice security.

Homomorphic encryption addresses this core weakness by allowing analysis on data in its ciphertext form. Craig Gentry, an early homomorphic encryption innovator, described the process as manipulating the contents of a locked box through gloves that are accessed through ports on the outside of the box.

One party places and locks contents in the box for a third party to manipulate without seeing what they are working on. The box is returned to the controller when the processor has completed the assigned task and custody is never surrendered.

Gentry's dissertation made homomorphic encryption attainable, with one major barrier: computational overhead. Processing ciphertext creates a lot of overhead as the calculations are performed bit by bit. IBM has improved processing overhead, claiming its implementation now runs 75 times faster than before, and a wide range of alternative schemes have further improved processing speeds.

Spurred by the collaborative models being deployed in connection with potential Covid-19 vaccines and treatments, homomorphic encryption will likely experience the highest relative rates of adoption and innovation in clinical research.

Homomorphic encryption can provide a mechanism for the life sciences industry to continue protecting intellectual property while leveraging the collaborative benefits from Covid-19 in other medical research.

Use cases will also be compelling for financial services, where data analytics defines the success or failure of algorithms and is becoming increasingly important as relative high-frequency trading advantages become more elusive. National security and critical infrastructure also provide early compelling use cases.

Encrypted processing will create new opportunities, applications and even industries

New opportunities will be created for data controllers (those with custody of data) to engage with data processors, as well as collaborative opportunities where the parties are both controllers and processors of data. Collaborative opportunities not only offer the benefits of specialisation but the promise of data collectives as well, where members will be able to define the terms of use and the outputs disclosed among them.

Data collectives are not a new concept to securities markets. For example, in 2005 the US Securities and Exchange Commission mandated regulated security markets to act jointly to disseminate consolidated information on quotations and transactions in securities markets.

Now, homomorphic encryption could empower competitive financial firms to not only provide alternatives to these sources, but innovate collectively to create their own proprietary market data products.

For all the promise of machine learning, the process of training and tuning machine learning applications requires big datasets.

Industry collectives could aggregate encrypted data and assign processes to collective members or vendors. Not only would this permit greater specialisation, but the collective dataset would accelerate machine learning in a way that additional computing power or PhDs cannot.

A recent IBM case study leveraging machine learning on a homomorphically encrypted database sought to predict whether bank customers would likely need a loan in the near future. A machine learning algorithm selected the most relevant variables for predicting loan status. The algorithm was trained on both encrypted and unencrypted data to measure accuracy and efficiency. The result was a near identical rate of accuracy and a manageable level of slowdown, a persuasive positive indicator for the arrival of homomorphic encryption's commercial viability.

Homomorphic encryption will also accelerate the movement of big data analytics to cloud environments. Organisations leveraging big data have been wary of cloud security, since downloading big datasets from the cloud for processing can be impractical.

On the other hand, performing data processing for their most sensitive data in the cloud also requires storing the data encryption key in the cloud, making an organisations security only as strong as the cloud environment. With homomorphic encryption, processing can occur in ciphertext form in the cloud with encryption keys stored offline.

Many initiatives endeavouring to harness the power of big data have struggled with resource limitations, current technologies and regulations. Take, for example, financial regulators who struggle with the burdens of monitoring financial audit trails across multiple markets, asset types and participants.

Aggregating and disseminating this data to regulators is critical for surveillance, but creates a treasure trove of highly sensitive, unencrypted data while it is processed, and this occurs across multiple regulators.

This big data problem and the risk that this information will be used to engage in manipulative trading or even destabilise financial markets will only continue to grow unless encryption is deployed throughout the data's life cycle. In fact, regulators only require audit trails related to red flags that their surveillance algorithms identify, which can all be done in a fully encrypted format.

The competing concerns of privacy regulation and the value of data analytics are also an issue that the healthcare industry has struggled with.

Fragmentation of health information is compounded by privacy concerns, which are a significant roadblock to data sharing and have prevented the integration of health data that could facilitate better health outcomes. The utility of digital health information systems could be greatly enhanced by the deployment of homomorphic encryption.

Encrypted processing will create new opportunities, applications and even industries by greatly minimising intellectual property and regulatory concerns. It may even turn competitors into collaborators.

Homomorphic encryption will also force a re-examination of baseline assumptions related to confidentiality and security. How will restrictions on disclosure apply to encrypted processing by third parties? What are appropriate access controls where the entire life cycle of data is encrypted? What is reasonable security for processors of such data?

Privacy regulation will need to be re-examined in light of personal information being mined in an encrypted format. If an organisation is prohibited from sharing or selling data, what are the legal implications of their sharing and processing encrypted data that is never exposed?

Lastly and importantly, how will we know that the technologies we are deploying to accomplish not only homomorphic encryption but homomorphically encrypted processes are complying with the applicable laws, standards and obligations? Solutions will need to be auditable by design.

Homomorphic encryption is about more than big data. It is about solving for trust with tools that have never been available before and for which no similar workaround existed.

By Eric Hess

Eric Hess is the founder of Hess Legal Counsel and Helical. Hess Legal advises securities and digital asset firms on contract, security and privacy, governance, technology licensing and financing issues. Helical offers a cybersecurity-as-a-service platform.

Read the original here:
Could homomorphic encryption be the solution to big data's problem? - Siliconrepublic.com

AeroVironment and Viasat aim to improve radio encryption for Puma AE – Flightglobal

Up against increasingly sophisticated electronic warfare threats from countries such as Russia and China, drone maker AeroVironment and satellite communications company Viasat are partnering to develop better encrypted radio communications for the Puma AE reconnaissance unmanned air vehicle (UAV).

The two companies are working together under a contract granted through the US Army Reconfigurable Communications for Small Unmanned Systems initiative, AeroVironment said on 15 October. Viasat is the prime contractor for the award.

The two companies will seek to strengthen the communications and transmission security of AeroVironment's Digital Data Link radios currently used by the US Army by converting them into a Type 1 crypto communication system for video and data transmission, says AeroVironment.

AeroVironment's Digital Data Link is a small, man-portable digital radio that controls the company's hand-launched tactical UAVs. A Type 1 crypto communication system is equipment classified or certified by the National Security Agency for encrypting and decrypting classified and sensitive national security information.

The US Army, which is one of the main operators of AeroVironment's tactical drones, is pushing to network its various UAVs, aircraft, vehicles and soldiers so that battlefield information can be quickly shared. However, existing tactical communications systems have already been shown to be vulnerable to electronic warfare, including jamming and spoofing. Transmissions have also been used in conflict zones, for example in eastern Ukraine, to geolocate targets for attacks.

The Puma AE is a small fixed-wing UAV used for short-range intelligence, surveillance and reconnaissance. Depending on the ground antennae used, the drone can fly out to 32.3nm (60km) and can carry electro-optical and infrared cameras within a gimbal.

See more here:
AeroVironment and Viasat aim to improve radio encryption for Puma AE - Flightglobal

Western governments double down efforts to curtail end-to-end encryption – The Daily Swig

Security community resists anti-encryption push as counter-productive

ANALYSIS Western governments have doubled down on their efforts to rein in end-to-end encryption, arguing that the technology is impeding investigations into serious crimes including terrorism and child abuse.

In a joint statement (PDF) published over the weekend the Five Eyes (FVEY) intel alliance countries of Australia, Canada, New Zealand, the UK, and US were joined by India and Japan in calling for tech firms to enable law enforcement access to content upon production of a warrant.

The governments also want tech firms such as Apple and Facebook to consult with them on design decisions that might help or hinder this outcome.

The statement's signatories call for tech firms to embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable.

How might this work? GCHQ recently came up with a proposal for adding an extra party into an end-to-end encrypted chat via a ghost feature, a pointer to the sort of approaches intel agencies have in mind.

Security experts have pushed back against the proposals, arguing that they inevitably undermine the privacy and integrity of end-to-end encryption, the current gold standard for secure comms.

In end-to-end encryption systems the cryptographic keys needed to encrypt and decrypt communications are held on the devices of users, such as smartphones, rather than by service providers or other technology providers. Users therefore don't have to trust their ISPs or service providers not to snoop.

Popular instant messaging apps WhatsApp, iMessage, and Signal have placed E2E encryption in the hands of the average smartphone user.

So if governments come knocking with requests for the keys normally necessary to decrypt encrypted communications, then there's nothing to hand over.

Western governments say they support the development of encryption in general, as a means to secure e-commerce transactions and protect the communications of law-abiding businesses and individuals; it's just E2E encryption they have an issue with. Governments have long argued that E2E encryption is hampering the investigation of serious crimes, at least on a larger scale.

Malware can be used by law enforcement against individuals targeted in surveillance operations, a tactic which if successful gives access to content without needing to break encryption.

And police in countries such as the UK, for example, already have the ability to compel disclosure of encryption secrets from suspects.

As the anonymous privacy activist behind the Spy Blog Twitter account noted: "UK already has law for disclosure of plaintext material, regardless of encryption tech, but they want to do it in secret, in bulk."

The tweet referenced the Regulation of Investigatory Powers Act 2000 Part III, which deals with the investigation by law enforcement of electronic data protected by encryption.

Security experts were quick to criticize the latest government moves as a push to mandate encryption backdoors, supposedly accessible only to law enforcement. Several compared it to failed government encryption policies of the 1990s.

These included efforts to control the US export of encryption technologies and attempts to mandate key escrow.

Katie Moussouris, chief exec of Luta Security and an expert in bug bounties, tweeted: "The 1st time they did this (look up crypto wars), it weakened e-commerce and all other web transactions for over a decade, enabling crime. I wish we didn't have to repeat these facts."

Encryption of any type can be viewed as a branch of applied mathematics, but arguments that anyone can implement encryption in a few lines of code miss the point: what governments are seeking is to make encryption tools inaccessible to the broader public, according to noted cryptographer Matthew Green.

One thing that's different this time around compared to the first crypto wars is that governments have more levers to apply pressure on tech firms, including app store bans. Last month, for instance, the Trump administration threatened to ban TikTok in the US over supposed national security concerns unless owner ByteDance sold the technology to a US firm.

Green noted: "The current administration has demonstrated that app store bans can be used as a hammer to implement policy, and you can bet these folks are paying attention."

"I also think that sideloading capability is likely to be eliminated (or strongly discouraged) in a regime where encryption bans are successful," he added.

Cryptographer Alec Muffett expressed fears that the government proposals might eventually result in non-compliant social networks [getting] banned under criminal law.

"End-to-end encryption is a key tool towards securing the privacy of everyone on the planet, as the world becomes more connected. It must not be derailed; instead the police should be better funded for traditional investigation," Muffett said on Twitter.

Read more from the original source:
Western governments double down efforts to curtail end-to-end encryption - The Daily Swig

Encryption Backdoor? The Trump Administration Wants It. – The National Interest

There's been a battle going on for the past several years, across multiple presidential administrations, between the government and big tech companies over encryption.

To simplify a complex issue, several major tech companies, including Apple with the iPhone, offer end-to-end encryption, which gives only users the ability to access their own devices.

Various law enforcement entities have made it clear over the years that they would like to have a way around such encryption, known as a back door, when it comes to conducting investigations into crime, as well as terrorism. Apple, and other tech companies, have long resisted such efforts.

Most notably, that company and the government had a standoff in 2015, over government efforts to unlock an iPhone belonging to one of the San Bernardino shooters, a fight that was repeated earlier this year in the case of a phone belonging to the Pensacola shooter. However, the FBI was eventually able to unlock both the San Bernardino and Pensacola phones, with the help of third parties, and law enforcement and prosecutors are often able to access the iCloud data of criminal targets with the use of subpoenas, something that users agree to when they accept iCloud's terms of service.

Now, the Justice Department has teamed up with its counterparts in several other countries, known as the Five Eyes, to author a letter raising concerns about end-to-end encryption and offering a potential solution.

The letter, described as an international statement, was authored by U.S. Attorney General William Barr, British Home Secretary Priti Patel, Australian Minister for Home Affairs Peter Dutton, New Zealand Minister of Justice Andrew Little and Canadian Minister of Public Safety Bill Blair. India and Japan are also signatories, with no particular individual listed.

The statement says that the undersigned support strong encryption, but are concerned that particular implementations of encryption technology "pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children."

The letter recommends that technology companies work with governments to take certain steps: embed the safety of the public in system designs; enable law enforcement access to content in a readable and usable format where an authorisation is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.

Apple and other tech companies have consistently opposed such efforts, but they have not responded to the most recent statement.

In the event that a new administration takes power in January, it's unclear whether a Biden administration would take a different posture than the Trump Department of Justice. Biden has not addressed the issue during the campaign, although he was vice president during the San Bernardino affair, and Wired reported eight years ago that Biden, as a senator in 1991, added language to an anti-terrorism bill that would have required that providers and manufacturers of electronic communications services "shall ensure that communications systems permit the government to obtain the plaintext contents of voice, data, and other communications when appropriately authorized by law."

Stephen Silver, a technology writer for The National Interest, is a journalist, essayist and film critic, who is also a contributor to Philly Voice, Philadelphia Weekly, the Jewish Telegraphic Agency, Living Life Fearless, Backstage magazine, BroadStreet Review and Splice Today. The co-founder of the Philadelphia Film Critics Circle, Stephen lives in suburban Philadelphia with his wife and two sons. Follow him on Twitter at @StephenSilver.

See the rest here:
Encryption Backdoor? The Trump Administration Wants It. - The National Interest

Dutton pushes against encryption yet again but oversight at home is slow – ZDNet

"We, the undersigned, support strong encryption, which plays a crucial role in protecting personal data, privacy, intellectual property, trade secrets and cybersecurity," wrote a bunch of nations on the weekend -- the Five Eyes, India, and Japan.

As a statement of intent, it's right up there with "Your privacy is very important to us", "Of course I love you", and "I'm not a racist but...".

At one level, there's not a lot new in this latest International statement: End-to-end encryption and public safety.

We like encryption, it says, but you can't have it because bad people can use it too.

"Encryption is an existential anchor of trust in the digital world and we do not support counter-productive and dangerous approaches that would materially weaken or limit security systems," the statement said.

"Particular implementations of encryption technology, however, pose significant challenges to public safety, including to highly vulnerable members of our societies like sexually exploited children."

The obviously important law enforcement task of tackling child sexual abuse framed the rest of the statement's two substantive pages too.

End-to-end encryption should not come at the expense of children's safety, it said. There was only a passing mention of "terrorists and other criminals".

This statement, like all those that have come before it, tries, but of course fails, to square the circle: A system either is end-to-end encrypted, or it isn't.

According to renowned Australian cryptographer Dr Vanessa Teague, the main characteristic of this approach is "deceitfulness".

She focuses on another phrase in the statement, where it complains about "end-to-end encryption [which] is implemented in a way that precludes all access to content".

"That's what end-to-end encryption is, gentlemen," Teague tweeted.

"So either say you're trying to break it, or say you support it, but not both at once."

What's interesting about this latest statement, though, is the way it shifts the blame further onto the tech companies for implementing encryption systems that create "severe risks to public safety".

Those risks are "severely undermining a company's own ability to identify and respond to violations of their terms of service", and "precluding the ability of law enforcement agencies to access content in limited circumstances where necessary and proportionate to investigate serious crimes and protect national security, where there is lawful authority to do so".

Note the way each party's actions are described.

Law enforcement's actions are reasonable, necessary, and proportionate. Their authorisation is "lawfully issued" in "limited circumstances", and "subject to strong safeguards and oversight". They're "safeguarding the vulnerable".

Tech companies are challenged to negotiate these issues "in a way that is substantive and genuinely influences design decisions", implying that right now they're not.

"We challenge the assertion that public safety cannot be protected without compromising privacy or cybersecurity," the statement said.

The many solid arguments put forward explaining why introducing a back door for some actors introduces it for all? No, apparently they're mere assertions.

"We strongly believe that approaches protecting each of these important values are possible and strive to work with industry to collaborate on mutually agreeable solutions."

This too is an assertion, of course, but the word "belief" sounds so much better, doesn't it?

As your correspondent has previously noted, however, the fact that encryption is either end-to-end or not may be a distraction. There are ways to access communications without breaking encryption.

One obvious way is to access the endpoint devices instead. Messages can be intercepted before they're encrypted and sent, or after they've been received and decrypted.

In Australia, for example, the controversial Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018 (TOLA Act) can require communications providers to install software that a law enforcement or intelligence agency has given them.

Providers can also be made to substitute a service they provide with a different service. That could well include redirecting target devices to a different update server, so they receive the spyware as a legitimate vendor update.

Doubtless there are other possibilities, all of which avoid the war on mathematics framing that some of the legislation's opponents have been relying on.

While Australia's Minister for Home Affairs Peter Dutton busies himself with signing onto yet another anti-encryption manifesto, progress on the oversight of his existing laws has been slow.

The review of the mandatory data retention regime, due to be completed by April 13 this year, has yet to be seen.

This is despite the Parliamentary Joint Committee on Intelligence and Security having set itself a submissions deadline of 1 July 2019, and holding its last public hearing on 28 February 2020.

The all-important review of the TOLA Act was due to report by September 30. Parliament has been in session since then, but the report didn't appear.

A charitable explanation would be that the government was busy preparing the Budget. With only three parliament sitting days, and a backlog of legislation to consider, other matters had to wait.

A more cynical explanation might be that the longer it takes to review the TOLA Act, the longer it'll be before recommended amendments can be made.

Those amendments might well include having to implement the independent oversight proposed by the Independent National Security Legislation Monitor.

Right now the law enforcement and intelligence agencies themselves can issue the TOLA Act's Technical Assistance Notices and Technical Assistance Requests. One imagines they wouldn't want to lose that power.

Meanwhile, the review of the International Production Orders legislation, a vital step on the way to Australian law being made compatible with the US CLOUD Act, doesn't seem to have a deadline of any kind.

In this context, we should also remember the much-delayed and disappointing 2020 Cyber Security Strategy. That seems to have been a minimal-effort job as well.

For years now, on both sides of Australian politics, national security laws have been hasty to legislate but slow to be reviewed. The question is, is it planned this way? Or is it simply incompetence?

Excerpt from:
Dutton pushes against encryption yet again but oversight at home is slow - ZDNet

How to use private conversations on Skype to send encrypted calls and messages – Business Insider India

If you use the Skype mobile app or desktop app, you can have private text conversations and voice calls. Private conversations have full end-to-end encryption, so they're more secure than standard messages.

Because they are designed to be secure, private conversations aren't copied or shared between devices in the same account, so you can't continue a private conversation on a different device.

To start a private conversation:

1. Open the Skype app and tap the New Chat icon (it's shaped like a pencil).

3. In the pop-up window, choose the contact you want to chat with.

4. If this is the first time you're having a private conversation with this person, an invitation will be sent automatically, and the conversation will start after the invite is accepted. If you've already had a private conversation with this person on this phone, you can start messaging right away.

To make a private call:

1. Start or continue a private text conversation.

3. In the pop-up window, choose "Private call."

To open an existing private conversation:

1. Open the private conversation on your phone or computer.

Go here to read the rest:
How to use private conversations on Skype to send encrypted calls and messages - Business Insider India