The Parliamentary Joint Committee on Intelligence and Security (PJCIS) in December kicked off an inquiry into extremist movements and radicalism in Australia, considering, among other things, the role of social media, encrypted communications platforms, and the dark web in allowing such activity.
The New South Wales Police Force told the committee that online propaganda continues to instruct, recruit, inspire, cause fear, and encourage attacks. It said this remains a significant driver for global terrorism and the targeting of crowded places in Western countries.
"Extremist groups, across all ideologies have consistently demonstrated a willingness to harness new technologies to amplify their messages, reach new audiences, and coordinate activities," NSW Police said [PDF].
"Digital platforms, including social media, encrypted messaging applications, live-streaming platforms, and the dark web are able to be used effectively by extremist groups. These innovations have allowed new types of communities to emerge, where ideological affinity overcomes a lack of physical proximity.
"Internet-enabled technologies have provided an accessible, low-cost means to establish, engage and empower like-minded groups across divides."
It said that where platforms associated with extremist groups and implicated in terror attacks have been taken down by their hosts, the takedowns have not led to the demise of those platforms but have simply displaced them, with the platforms re-emerging in altered forms and with new hosts.
"Pushing extremists to the fringes of the internet, away from mainstream users, could be a positive but it presents a different set of challenges for law enforcement and intelligence agencies," NSW Police added.
Also providing a submission [PDF] to the inquiry, Facebook said the existence of terrorist or extremist groups within society inevitably leads to terrorist or extremist activity online. The social media giant detailed its work in removing such activity, but told the PJCIS that the committee must consider not just how to prevent the violent manifestations of extremism, but also how to combat hate, labelling it the root cause of extremism.
On encrypted communications, Facebook said end-to-end encryption is the best security tool available to protect Australians from cybercriminals and hackers, but it also poses a legitimate policy question: "How to ensure the safety of Australians if no one can see the content of messages except the sender and the receiver?"
"The solution is for law enforcement and security agencies to collaborate with industry on developing even more safety mitigations and integrity tools for end-to-end encrypted services, especially when combined with the existing longstanding detection methods available to law enforcement," it wrote.
"We already take action against a significant number of accounts on WhatsApp (a fully end-to-end encrypted messaging service) for terrorism reasons, and we believe this number could increase with greater collaboration from law enforcement and security agencies."
See also: Home Affairs concerned with Facebook's plans to create world's 'biggest dark web'
It said it's committed to working with law enforcement, policymakers, experts, and civil society organisations to develop ways of detecting bad actors without needing access to the content of encrypted messages.
It added the creation of backdoors is not the way forward.
Detailing a similar approach to removing terrorist or extremist activity across its platforms, Google told the PJCIS [PDF] that it also engages in ongoing dialogue with law enforcement agencies to understand the threat landscape and respond to threats that affect the safety of its users and the broader public.
Google receives approximately 4,000 requests each year for user data from Australian law enforcement agencies.
The search giant also said encryption is a "critically important tool in protecting users from a broad range of threats".
"Strong encryption doesn't create a law free zone -- companies can still deploy several anti-abuse protections using metadata, behavioural data, and new detection technologies -- without seeing the content of messages encrypted in transit (thereby respecting user privacy)," it wrote.
"While we are unable to provide to law enforcement the unencrypted content of messages encrypted in transit, we are still able to provide a wealth of data and signals that in some instances have proven richer than content data. Metadata such as call location, associated phone numbers, frequency and length of call/text are logged on our servers and can be shared with law enforcement/intelligence when provided with a valid court order."
Offering a similar summary of the work it does in countering terrorist or extremist activity on its platform, Twitter told the PJCIS its goal is to protect the health of the public conversation and to take immediate action against those who seek to spread messages of terror and violent extremism.
"However, no solution is perfect, and no technology is capable of detecting every potential threat or protecting societies and communities from extremism and violent threats on their own," Twitter said [PDF]. "We know that the challenges we face are not static, nor are bad actors homogenous from one country to the next in how they evolve, behave, or the tactics they deploy to evade detection."
The Office of the Australian eSafety Commissioner told the committee that its research on young people and social cohesion showed 33% of young people had seen videos or images promoting terrorism online, and over 50% had seen real violence that disturbed them, racist comments, and hateful comments about cultural or religious groups.
It told the PJCIS it believes the best tactic to prevent terrorist or extremist activity is education.
"Especially in the context of this inquiry, it is important to consider the structural, systemic, and social factors that may lead someone to be attracted to, and engage in, negative or dangerous activity online," its submission [PDF] said. "A whole of community approach and systems approach is therefore needed to understand and address the underlying drivers of this behaviour, as well as provide diversion and alternative pathways to support and assistance.
"Giving individuals the skills and strategies to prevent and respond to harmful experiences online and engage online in ways likely to promote safe and positive online experiences."