Despite the high enthusiasm around artificial intelligence (AI), 80.7% of organisations globally are in the “AI dust” quadrant on the NCS AI+DR Matrix.
The matrix categorises companies into four quadrants based on their combined AI and digital resilience (DR) levels, and is part of a study NCS commissioned from International Data Corp (IDC). DR measures how robust and adaptable IT systems are and how well they can recover from disruptions.
Those in the “AI dust” quadrant are in the very early stages of AI adoption and are likely to fall behind due to their inability to either harness AI or maintain robust digital operations. Meanwhile, companies in the “Game Changers” quadrant are well-positioned to achieve long-term and sustained success with AI and other innovative technologies.
Case in point: “Game Changers” were found to have gained 1.3 times more revenue and 1.4 times more cost savings than those in the other quadrants. Another IDC study also found that the top 5% of organisations realised an average return of US$8 for each dollar invested in AI.
“Game changers have a well-defined AI strategy, and are able to fund and execute it. [Their AI strategy] involves the business, tech and substantial parts of the organisation… so it translates into multiple AI use cases instead of a single AI application,” Wynthia Goh, senior partner at NEXT, NCS, tells DigitalEdge.
Planning for AI success
Since AI is widely considered a general-purpose technology, businesses should first build an AI opportunity map. “They should decide which part of the organisation to start with and map AI use cases for that team or department. Thereafter, they can prioritise the use cases based on the projected impact on the organisation, such as by solving the biggest pain points,” Goh advises.
She continues: “They can also turn to partners like NCS to conduct a health check and gain a better understanding of their digital resiliency status, before embarking on their AI journey or scaling their AI efforts.”
Having an AI centre of excellence (COE) will also help organisations scale their use of AI effectively. “Most companies are in the early stage of their AI journey and may not be familiar with what’s possible, so the idea of the AI COE is to concentrate resources, expertise and knowledge to manage the AI conversation within the organisation and drive business outcomes. For example, the COE will orchestrate the effort to build an opportunity map and figure out what needs to be done after that,” Goh explains.
She also highlights the need for business and tech teams to be involved in AI discussions to ensure alignment, as well as strong support from senior management. “The AI journey requires significant changes, which means activating resources in many parts of the organisation. So, the senior management plays a very important role in setting the direction; they need to endorse and show visible support [for the company’s AI strategy and efforts].”
AI for revenue generation
Most companies experimenting with AI today are focused on “low-hanging fruit”, or use cases aimed at improving productivity and reducing operational costs. However, as their AI strategies mature, the real test will be using AI to drive revenue growth.
[To successfully use AI for revenue generation], organisations need to deeply understand the AI applications they’re developing and build the muscle to deal with the risks [that come with them]. They must know how to respond if something goes wrong, and ensure governance frameworks are in place to safeguard any AI tools interacting with the public.
Wynthia Goh, senior partner at NEXT, NCS
Businesses, she emphasises, must also start by designing the end-user experience or identifying the customer pain point, and only then figure out how AI can help, rather than being tech-led.
Additionally, Goh believes AI needs to be sustainable, with a lower carbon footprint and operating cost, for it to become mainstream.
“Our clients are concerned that deploying AI at scale will require a lot of energy and translate to higher operating costs. To address this, large language models (LLMs) will need to become more energy efficient as they become bigger in size. Also, LLMs may not be necessary in all cases – it makes more sense to use small language models in generative AI applications handling very specific tasks. So, organisations should also look at cost optimisation [when designing their AI strategy or deploying AI at scale],” she says.
In short, organisations that map out their AI opportunities with clarity and align the right skills and technology to seize them will be best positioned to become game changers and maximise returns on their AI investments.