Tech giants urged to report algorithm harm to online safety

Posted: March 18, 2022 at 7:48 pm


Social media companies should be forced to report to the federal government on how they use algorithms and how they address harm online, a parliamentary committee has recommended.

The proposal was one of 26 recommendations made in the final report of the social media and online safety committee, which was handed down on Tuesday.

The committee also recommended social media companies be mandated to set the highest privacy settings as a default for people under 18.

A digital safety review was also proposed, which would further examine all online safety legislation and government programs in the space.

Federal funding should be increased to support victims of technology-facilitated abuse, the report recommended.

The committee also called for more work by the eSafety Commissioner, who would examine how social media companies prevent pile-on attacks or harms across multiple online platforms.

Committee chair Lucy Wicks said the report was a crucial step to making online spaces safer.

"For too long, social media platforms have been able to set the rules, enabling the proliferation of online abuse," she said.

"The balance of responsibility for the safety of users online, which until recently has been primarily on users, must be flipped to ensure that social media platforms bear more of the burden of providing safety for their users."

The inquiry was launched in late 2021 and heard from nearly 60 witnesses, with large tech companies among those questioned.

The report said the algorithms of social media companies needed further investigation to determine the scale of the harm they caused, as well as how they could be regulated.

"While algorithms play a key role in the basic function of multiple types of online services, it is clear that they have the potential to enormously accentuate online harm," the report said.

"More transparency is required of social media companies to demonstrate that these concerns are being addressed."

Although the report considered a statutory requirement for social media companies to report on how they were minimising harm from algorithms online, it said a review would be needed to determine how this could be achieved.

Ms Wicks said any response aimed at protecting online users needed to evolve quickly, given the nature of social media platforms.

"Social media companies have to take responsibility to enforce their terms of service, prevent recidivism of bad actors, prevent pile-ons or volumetric attacks (and) prevent harms across multiple platforms," she said.

"The recommendations in this report are an important next step in making our online world and social media platforms safer for all."

Although the report was unanimous, Labor members of the committee said the country's online safety framework should be updated to enable action to be taken against group hate speech.

"A glaring gap in the existing Australian regulatory regime compared to other nations is dealing with hate speech targeting groups," Labor members said in the report.

"This is particularly concerning in the context of this report being handed down at the time of the third anniversary of the Christchurch terrorist atrocity ... We have yet to address the online material that normalises hate and radicalises people to commit these acts of real-world violence."

-AAP
