Why a high use of AI and low literacy is a threat to your organisation

John O'Neill

Almost half of all Australian employees (49%) are already using AI regularly at work, although less than one quarter (24%) have had any formal training.

And fewer than half of Australian employees feel they have the skills to use these tools effectively – compared with a much higher global average of 60 per cent.

These were among the findings of a University of Melbourne and KPMG study conducted last year of 48,340 people across 47 countries.

Australia also trailed most countries in realising the benefits of AI – 55 per cent of Australians reported experiencing benefits, compared with 73 per cent globally.

The risk isn’t that your people won’t use AI. They already are. And the gap isn’t capability. It’s understanding and preparation.

The cost of low literacy

There are real risks when employees use AI without understanding what it can and can’t do – and without the organisational guardrails to catch the gaps.

AI tools don’t retrieve information the way a search engine does. They predict content based on patterns learned from training data. That means they can be confidently wrong. They can also reflect biases in their training data. And they can hallucinate.

They can produce outputs that look authoritative – but aren’t.

In low-literacy environments, those outputs can get used without question. The commercial and legal liability has already materialised for early movers who didn’t manage it.


Three things organisations can do now

  • Build baseline literacy at the leadership level first – you can’t govern what you don’t understand, and leaders set the tolerance for risk.
  • Find out how AI is actually being used in your organisation right now. Informal adoption is almost always ahead of policy – and that gap is where the liability lives.
  • Match the tool to the task. ChatGPT, Gemini, Claude, and DeepSeek were built differently, trained differently, and excel at different things. Platform choice is a strategic decision, not a preference.

The platforms are different

Let’s consider four AI platforms – ChatGPT, Gemini, Claude, and DeepSeek. They are not the same tool with different logos. Each was built with a different philosophy, trained on different data, and designed to excel at different kinds of work.

But platform capability alone should not determine organisational adoption – it needs to be chosen in the context of your business strategy, including defining the layer of AI capability your organisation is building toward.

BUILDING AI LITERACY IN YOUR ORGANISATION?
Komosion delivers executive AI literacy programs in partnership with university business school academics. If your leadership team is already using AI but hasn’t been formally prepared for it, we can help.