Submission to the Proposals Paper for Mandatory Guardrails for AI in High-Risk Settings

20 November 2024

The rise of artificial intelligence (AI) is an opportunity for Australia to reinvigorate its economy. Embracing AI as a productivity-increasing reform will create a strong economic future for Australia.

The Business Council of Australia (BCA) represents around 130 of the largest companies operating in Australia. Our membership supports the positive role that innovation can play in expanding our economy, creating jobs and delivering benefits for all Australians.

We have consistently supported the safe and responsible use of AI and welcome the engagement with the Department of Industry, Science and Resources and the opportunity to provide this submission on the Proposals Paper, Introducing mandatory guardrails for AI in high-risk settings (Proposals Paper).

AI is the next technological development to offer a step change in the way we do business. It can make tasks easier for workers, reduce complexity and costs for businesses, and enable a range of entirely new services.

In 2018, CSIRO estimated that digital innovation could deliver $315 billion in gross economic value to Australia by 2028. Generative AI alone could add $115 billion to Australia’s economy annually by 2030. CSIRO estimated that 68 per cent of Australian businesses had already integrated AI into their operations, and the McKinsey Global Survey found that 65 per cent of businesses are now regularly using generative AI.

Yet as the AI opportunity has grown, Australia's economic performance has weakened. Increased red tape, an inefficient tax system, and uncertainty around project approvals have meant Australia is missing out on investment and economic growth opportunities. Business investment remains at relatively low levels as a share of the economy. GDP per person fell 0.4 per cent over the June 2024 quarter and 1.5 per cent over the year, the sixth consecutive quarterly decline in average living standards.

Boosting business investment as a share of GDP from its near 30-year lows and keeping capital in Australia is critical to growing our productivity, improving our standard of living, and ensuring Australia is positioned to have a strong base of high-paying, secure jobs into the future.

AI provides the opportunity to achieve these outcomes and assist in improving our current economic standing. Critical to this will be ensuring AI uptake and encouraging both digital innovation and investment. Excessive or poorly designed regulations will put this opportunity at risk.

AI technology must also uphold safety, security and dignity. There must be public trust in the technology. Those creating and deploying AI will have to maintain their social licence by addressing concerns about harm.

However, a regulatory approach aimed at ensuring AI safety must be specifically targeted at risk, without redundant or onerous obligations. The technology that underlies AI may be complex, but the regulatory approach adopted need not be.

AI is not new. Australian businesses have used AI systems for years, from automated decision-making and speech-to-text technologies to machine vision and, more recently, generative AI. The recent growth in the capability and adoption of generative AI has introduced new risks and opportunities, but it is fundamentally similar to what has come before.

Throughout this development, Australians have remained protected by our framework of existing, technology-neutral regulations, such as consumer law, privacy protections and financial system regulation.

Australia must learn from the experiences of other countries to ensure our regulatory approach is best practice, internationally interoperable and in line with consensus-based, internationally recognised technical standards.

The BCA supports a regulatory approach to address genuine, high risks arising from the increased use and capability of AI. But the role our existing regulations play—and where they may be falling short—must be made clear. Australia must be deliberate about who in the AI value chain would be regulated and why.

Most importantly, we must avoid regulation that undermines Australian businesses’ ability to safely and responsibly develop or deploy AI systems and thus stifles the growth of the digital economy in Australia.

