Op-Ed: To regulate AI or not to regulate — Australia decides existing laws will cover AI

Wait and see is actually a realistic approach for now, but AI cannot be allowed to become a sacred cow. Regulate as required.

Photo courtesy of rawpixel.com on Freepik.

Regulation of AI sends a mixed message: it must address possible current risks and a whole future class of tech that could be problematic. Globally, it’s a “deregulate now and let the future deal with the inevitable mess” scenario.

Australia was considering formal regulation but has decided to stay with the current legal framework. This does make at least some legal sense at the moment, if not in the future. If you suffer any kind of legal injury, existing laws can do the job. The problem is that this applies largely to civil law. There’s not much of a compliance element.

That means that behaviors aren’t regulated. Justice comes after the event, and there are no preventative elements regulating the nature of AI networks. Big Tech was worried about “complexity” in regulation.

Since most of these companies come from the US, which has been deregulated to the obscene state the country is in now, that’s quite a laugh. Perish the thought that consumers and other likely recipients of collateral damage should get a word in edgewise or otherwise.

However, in all fairness, legislating for the unknown is pretty thankless. Most legislation and regulation needs heavy tweaking before it actually works. AI could, with ease, cause problems that have never happened before. Even the legal terminology would have to evolve to describe them.

“Thou shalt not … we don’t know what” isn’t likely to be much help as a regulatory frame of reference, either.

An enforcement regime which equates to “Thou shouldn’t have done that thou naughty trillionaires, have a few more million Get Out of Jail Free cards from your idiotic gerbil-like political representatives” isn’t overly impressive, either.

“Thou shalt not crash every global market and bankrupt the world with thy little AI pet” makes a bit more regulatory sense, but existing laws do cover that to some degree.

Big Tech does have a point, for a change. Big Tech is nearly as obnoxious and toxic as Big Pharma in its own way. Its argument is really about trying to define its liabilities. Never mind destroying the world; what if they have to pay for it? Unlimited liability does have something to say about such unrestricted risk.

That’s easy. Any sane law would hold that the liability of an AI’s parent company extends to the full amount of the damage. It would need to include owners of the technologies, those at fault in the use of the tech, and anyone trying to benefit from AI crime, AI fraud, etc.

There are reasonable precautionary requirements for any AI system that fit most existing laws. That doesn’t mean the disasters won’t happen.

For example –

Automated selling on stock markets can already cost billions in a fraction of a second. Does your AI system have an off switch? It needs one. It also needs a recovery function, which could mean unwinding every single transaction. A simple glitch in an LLM prompt could trigger a global crash.
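The off switch and recovery function described above can be sketched in a few lines. This is purely illustrative: the class and method names are invented for the example and are not drawn from any real trading system.

```python
from dataclasses import dataclass, field

@dataclass
class TradingGuard:
    """Hypothetical wrapper putting a kill switch and a journal around automated selling."""
    halted: bool = False
    journal: list = field(default_factory=list)  # record of every executed order

    def execute(self, order: dict) -> bool:
        """Run an order only while the system is live; journal it for later recovery."""
        if self.halted:
            return False               # the off switch: nothing leaves the system
        self.journal.append(order)     # kept so each transaction can be unwound
        return True

    def kill(self) -> None:
        """The off switch every automated trading system needs."""
        self.halted = True

    def recovery_plan(self) -> list:
        """The recovery function: reverse every journaled transaction, newest first."""
        return [{"symbol": o["symbol"], "qty": -o["qty"]}
                for o in reversed(self.journal)]

guard = TradingGuard()
guard.execute({"symbol": "XYZ", "qty": 100})
guard.kill()
blocked = guard.execute({"symbol": "XYZ", "qty": 5000})  # refused: system is halted
plan = guard.recovery_plan()                             # how to unwind what ran
```

The point is structural: the halt and the unwind live in plain code outside the model, so a glitch in a prompt cannot override them.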

Does your AI know not to turn off power and water to major cities? It needs to know that. This can be managed by contractual compliance, to a point. The wheels fall off that argument when the AI doesn’t comply. The damage will definitely be done whether anyone likes it or not.
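One way to make that contractual compliance concrete is a hard-coded denylist checked outside the model itself. The action names and function below are assumptions for the sketch, not any real utility’s API.

```python
# Actions a contract forbids the AI operator from ever taking, enforced in
# plain code outside the model, so non-compliance by the AI cannot bypass it.
FORBIDDEN_ACTIONS = {"shutdown_power_grid", "shutdown_water_supply"}

def dispatch(action: str, execute) -> str:
    """Refuse contractually forbidden actions before they reach the real system."""
    if action in FORBIDDEN_ACTIONS:
        return "refused"      # hard refusal, independent of what the model decided
    execute(action)
    return "done"

log = []
result_ok = dispatch("dim_street_lights", log.append)     # permitted action runs
result_bad = dispatch("shutdown_power_grid", log.append)  # forbidden action blocked
```

Such a guard only helps while every action routes through it; as noted above, the wheels fall off when the AI can act outside that channel.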

Then there’s “lazy” AI. If you’ve ever noticed how much repetition and filler go into AI texts and video scripts, you’ll know the problem. It’s worthless content. It’s called “productive” but doesn’t actually produce anything of any value, even at a hallucinatory level. If you pay for unsalable AI garbage, what are your legal options? None, so far.


__________________________________________________________

Disclaimer
The opinions expressed in this Op-Ed are those of the author. They do not purport to reflect the opinions or views of the Digital Journal or its members.

Digital Journal
Written By

Editor-at-Large based in Sydney, Australia.
