Investigating Social Media Harm Is A Good Idea, But Parliament Is About To See How Complicated It Is To Fix


(MENAFN- The Conversation) Barely a day has gone by this month without politicians or commentators talking about online harms.

There have been multiple high-profile examples spurring on the conversation. There was the circulation of videos of Bishop Mar Mari Emmanuel being stabbed in the Sydney church attack. The normalisation of violent content online has also been central to the discussion of the domestic violence crisis.

Then, of course, there were the expressions of disdain for the Australian legal system by X (formerly Twitter) owner Elon Musk.

Inevitably, there are calls to “do something” and broad public appetite for changes in regulations. A new parliamentary committee will explore what that change should look like, but will have to contend with a range of legal, practical and ethical obstacles along the way.

Read more: Elon Musk is mad he's been ordered to remove Sydney church stabbing videos from X. He'd be more furious if he saw our other laws

Ten busy days

On May 1 and May 10, the government made two major announcements.

The first was a Commonwealth response to some of the online harms identified by National Cabinet. At the May 1 meeting, the Commonwealth promised to deliver new measures to address violent online pornography and misogynistic content targeting children and young people. This included promised new legislation to ban deepfake pornography and to fund a pilot project on age-assurance technologies.


Communications Minister Michelle Rowland and Financial Services Minister Stephen Jones announced the new committee. Bianca De Marchi/AAP

The second was an announcement establishing a Joint Parliamentary Select Committee to look into the influence and impacts of social media on Australian society. The government wants the committee to examine and report on four major issues:

  • the decision of Meta to abandon deals under the News Media and Digital Platforms Bargaining Code

  • the important role of Australian journalism, news and public-interest media in countering misinformation and disinformation on digital platforms

  • the algorithms, systems and corporate decision-making of digital platforms in influencing what Australians see, and the impacts of this on mental health

  • other issues in relation to harmful or illegal content disseminated over social media, including scams, age-restricted content, child sexual abuse and violent extremist material.

However, the final terms of reference will be drafted after consultation with both the Senate crossbench and the opposition, so they may change a bit.

Why would they do this?

Asking the committee to review the Meta decision is an odd move.

In practice, Financial Services Minister Stephen Jones can “designate” Meta without a referral to the parliament. That is, the minister can decide all of the obligations of the News Media Bargaining Code apply to Meta.

However, a sounding by the committee may help to ensure Meta keeps concentrating on the issue. It also provides the opportunity to restate the underlying principles behind the code and the parlous state of much of the Australian news media.

In relation to harmful or illegal content disseminated over social media, there is already a review of the Online Safety Act underway. The terms of reference seem to ask the committee to provide input into the review.

Read more: This week's changes are a win for Facebook, Google and the government - but what was lost along the way?

The issue of misinformation and disinformation has also been the subject of review. The government released a draft of a proposed bill to combat misinformation and disinformation in June 2023. It would give the Australian Communications and Media Authority (ACMA) power to enforce an industry code, or to make one if the industry cannot.

That draft was criticised by the opposition at the time. However, there have been shifts since then and the committee might be a vehicle for the introduction of an amended version of the bill.

An age-old issue

Online age verification is a simple idea that is hard to implement unless service providers face significant consequences for non-compliance.

Work in this area by the UK's communications regulator, Ofcom, and the UK Information Commissioner's Office is often cited as leading practice. However, the commissioner's website notes “age assurance is a complex area with technology developing rapidly”.


Measures to limit children's access to social media will be investigated by the committee. Shutterstock

One approach is for the minor to identify themselves to a platform by uploading a video or sending a photograph of their ID. This is entirely contrary to the eSafety Commissioner's messaging on online safety. The Commissioner advises parents to make sure children do not share images or videos of themselves and never share their ID.

In practice, the most effective age identification for minors requires parents to intervene. This can be done by using software to limit access or by supervising screentime. If children and teenagers can get around the rules simply by borrowing a device from a school friend, age verification might not do much.

As the International Association of Privacy Professionals has found, age verification and data protection are far harder than they look. It is particularly difficult if the age barrier is not one already in place – such as the adult rights that those over the age of 18 possess – but rather a seemingly arbitrary point in the mid-teens. Other than online, the most important age to verify is 18, for things such as alcohol sales and credit. It is also the age at which contracts can be enforced.

Countries vs companies

One issue that is often raised about social media platforms is how Australia can deal with a global business.

Here, the European approach in the Digital Markets Act provides some ideas. The act defines companies with a strong market position as “gatekeepers” and sets out rules they must follow. Under the act, important data must be shared as directed by the user to make the internet fairer and to ensure different sites and software can communicate with each other. It also calls for algorithms to be made more transparent, though these rules are a bit more limited.


European Commissioner for Europe fit for the Digital Age, Margrethe Vestager, helps administer the Digital Markets Act. Virginia Mayo/AP

In doing so, it limits the power of gatekeeper companies, including Alphabet (Google), Amazon, Apple, ByteDance (TikTok), Meta and Microsoft.

Obviously, Australia can't harness the collective power of a group of nations in the same way the European Union does, but that doesn't preclude some of the measures from being useful here.

There is considerable public support for governments to “do something” about online content and social media access, but there are both legal and practical obstacles to imposing new laws.

There is also the difficulty of getting political consensus on such measures, as seen with the debate surrounding the misinformation bill.

But it's clear that in Australia, both citizens and governments have been losing patience with letting tech companies regulate themselves and shifting responsibility to parents.
