Australian Expert Warns About The Risks Of The Dependence On Artificial Intelligence

In a recent public speech, Mike Burgess, Director-General of the Australian Signals Directorate (ASD), the Australian government's cybersecurity agency, warned about the potential risks of uncontrolled artificial intelligence (AI) and urged the world to think more deeply about new technologies and their impact on our lives.

According to a report published in The Sydney Morning Herald, Mr Burgess urged people not to “sleepwalk” into this issue and to stay alert. “It is right the world embraces artificial intelligence, but we must embrace this with our eyes wide open. We should not sleepwalk into this, where we suddenly find ourselves in a world that is controlled by software and very few people understand how it works”, Mr Burgess asserted. “How much of our world will be outsourced to AI? How much of our brain power and decision-making will we hand over?” he wondered.

The Australian official acknowledged the benefits of AI, but he also stated that its risks are “serious”. “It’s great… for productivity and the economy and society, but with those same great benefits come serious risks that require serious thinking… and I don’t think we’ve done enough of that thinking to date”, he commented.

In spite of his concerns, however, Mr Burgess acknowledged that his agency will pursue projects that use artificial intelligence to “defend Australia” from threats. “ASD will use artificial intelligence to maintain our capability edge, to defend Australia from rogue threats and help advance Australia’s national interests”, he said. “In the spirit of transparency, there wouldn’t be an intelligence agency on this planet that would not be thinking today about how AI could be exploited, ASD included”, he added.

Along these lines, the head of the ASD said that his agency will not only use AI for defensive purposes, but also “to find vulnerabilities or find chinks in the armour” of rival networks.
