The five AI questions every Australian business owner should be asking right now

Nearly three quarters of corporate boards are perceived to have only moderate or limited AI expertise. Without a framework, small businesses are at an even greater disadvantage. These five questions are a practical place to start.

What’s happening: INSEAD Corporate Governance Centre and KPMG International have jointly released a global set of AI Governance Principles for Boards, built around five areas: strategy, security, workforce, trustworthy AI, and how AI changes leadership itself.

Why this matters: The same five principles designed for boardrooms map almost exactly onto decisions small business owners are already making daily.

Two of the world’s most credible business institutions sat down together and wrote a framework for how organisations should govern artificial intelligence. The result, released this week, was aimed squarely at corporate boards.

But read it carefully and something becomes clear: every question it raises for a director with a fiduciary duty applies just as directly to a business owner with five staff and a ChatGPT subscription.

On 14 April 2026, INSEAD Corporate Governance Centre and KPMG International published their AI Governance Principles for Boards, a five-part framework drawing on the perspectives of board directors worldwide. Its stated goal is to help leaders oversee AI responsibly across strategy, security, workforce management, trust, and how AI changes decision-making itself.

The report’s opening finding sets the tone. According to the KPMG Global AI Pulse Survey, nearly three quarters of boards are perceived to have only moderate or limited AI expertise. In other words, even the people nominally in charge of major organisations are still working out how to handle AI responsibly.

Even big business is still guessing

That finding matters because it reframes the conversation for smaller operators. The assumption has often been that governance is something large businesses do and small ones get to skip. The INSEAD and KPMG framework pushes back on that directly, describing the principles as adaptable across industries and jurisdictions, with governance challenges that are increasingly shared regardless of company size.

“In a period of rapid AI acceleration, the board’s strategic role is becoming more consequential than ever,” said Steve Chase, Global Head of AI and Digital Innovation at KPMG International. “Strong governance provides the confidence organisations need to invest, scale, and execute AI across markets at speed. Trust in AI, and in the governance behind it, is what turns ambition into durable value.”

For a small business, that translates plainly: if you cannot explain how you are using AI, or what happens when it goes wrong, you are carrying risk you probably have not priced in.

The Australian data bears this out. A Deloitte Access Economics report published in November 2025, surveying more than 1,000 Australian SMBs, found that while two thirds of SMBs are using AI, just 5% are fully enabled to realise its potential benefits. A fully AI-enabled business was defined as one with an AI strategy embedded in core processes, staff training in place, and a centralised data system. Separately, Pitcher Partners’ Business Radar 2025 report found that 72% of Australian businesses are actively engaging with AI, while only 13% have made it a true strategic priority with dedicated budgets and scaling plans.

Using AI and governing AI are two very different things. The gap between them is where risk quietly accumulates.

The five questions

The INSEAD and KPMG framework covers five principles. Each one translates into a direct question for any business owner.

Do you know why you are using each AI tool, and what it is actually doing?

The first principle covers strategic oversight. For a small business, this means being deliberate rather than reactive. The most common mistake, according to advisers working with SMEs on AI investment, is buying based on hype rather than business need. A sounder approach starts with identifying a genuine operational problem, understanding what it currently costs in time or revenue, and then assessing whether a specific AI tool can measurably improve the outcome, while meeting basic requirements around data, compliance, and accountability.

Do you know what data your AI tools are touching, and who else can see it?

The second principle covers technology and security oversight, and it is where many small businesses are most exposed without knowing it. In early 2025, a contractor working for an Australian organisation uploaded personal information, including names, contact details, and health records, into an AI system. The result was a serious data spill classified as a notifiable data breach, according to the Australian Cyber Security Centre. The ACSC advises businesses of all sizes to establish a clear internal AI use policy and define what data cannot be uploaded into AI platforms. That is not a large-business problem. It is a problem for any business with staff and an AI subscription.

Are your staff trained to question AI outputs, not just accept them?

The third principle addresses workforce transformation and the preservation of human judgement. The INSEAD and KPMG framework is explicit that productivity gains from AI must be balanced against keeping people genuinely in the loop. Also in 2025, a lawyer used AI to prepare a court document that included fabricated legal cases. Those cases were submitted without verification, the error was discovered, and the lawyer was subsequently barred from practice, according to the Australian Cyber Security Centre. This is not an argument against AI. It is an argument for making sure your team understands that AI outputs require human checks, particularly for anything consequential.

Does the way you use AI reflect how you want your business to be seen?

The fourth principle is about building trustworthy AI that aligns with a business’s values and its obligations to customers. For a small business, this is less abstract than it sounds. It comes down to whether your customers would be comfortable knowing how their data is used, whether your AI tools treat people fairly, and whether you could explain your decisions if asked. Research cited in a Dynamic Business report in November 2024, drawing on data from compliance firm Vanta, found that only 54% of Australian businesses have a formal AI policy, which is 11 percentage points lower than in the UK. Not having a policy is itself a position, and not a defensible one as regulatory scrutiny increases.

Are you keeping yourself genuinely informed, or just delegating all of it to the tool?

The fifth principle examines how AI changes the work of leaders themselves. For a board, this means directors can no longer treat AI as purely a technical matter. For a business owner, it means exactly the same thing. Research from TrendAI, published in March 2026 and drawing on a survey of 3,700 business and IT decision-makers across 23 countries, found that 67% had felt pressured to approve AI despite known security concerns. Almost one in five Australian respondents described those concerns as extreme but said they had been overridden to keep pace with competitors. Staying informed is not about becoming a technology expert. It is about asking enough of the right questions to make decisions you can stand behind.

When something goes wrong

The INSEAD and KPMG report frames accountability as a central priority, not a compliance detail. That framing matters because it shifts the question from “are we covered?” to “are we responsible?”

For Australian SMEs, the regulatory environment is also tightening. Privacy law changes passed in December 2024 and the Australian Government’s National AI Plan, released in December 2025, both signal that regulators intend to ask not just whether organisations use AI, but how they govern it. The National AI Plan also consolidated SME support within the National AI Centre and framed responsible governance as part of Australia’s broader economic strategy.

Annet Aris, Senior Affiliate Professor of Strategy and Academic Director of INSEAD Corporate Governance Centre, described the framework as suitable for organisations regardless of their level of AI maturity. That phrase is doing important work. It means there is no minimum size requirement for responsible AI use.

The practical questions small business owners face when evaluating AI tools are governance questions, whether or not they are labelled that way. Who is accountable when something goes wrong? What data are we using and why? Are we checking the outputs? Can we explain our decisions to a customer or a regulator if asked?

Link to report: https://www.insead.edu/system/files/2026-04/ai-principles-for-boards-report-2026.pdf


Yajush Gupta

Yajush writes for Dynamic Business and previously covered business news at Reuters.