
How closed AI LLMs will impact Australia & New Zealand data centre operations

Despite all the hype, and the rise of Nvidia’s share price, the cost of using LLMs [Large Language Models] for general business remains high due to their resource-intensive nature. But smaller, more closed, lightweight models are emerging, which means the impact on data centre operators is still unclear but may favour a hybrid approach.

AI consulting firm AIOrdinate builds products and solutions in two specific domains – digital customer experience and Generative AI. As such, the company is well placed to describe the key attributes data centre operators need to consider when hosting AI workloads, in contrast to traditional colocation services.

Founder and principal ordinate at AIOrdinate, Shashank Shekhar, said there are already several exciting AI/ML initiatives making waves in the ANZ region. “There is innovation of AI/ML within precision agriculture, where AI/ML is used to optimise crop yields, water usage etc,” he said. “In healthcare and medical research, AI-powered tools are assisting in disease diagnosis, treatment planning, drug discovery, and personalised medicine.” He used the example of New Zealand’s Precision Driven Health, which is developing AI-powered tools for personalised cancer treatment.

“Finance and fintech identify and prevent financial fraud using AI and ML algorithms,” he said. For example, ANZ Bank utilises AI to identify suspicious transactions and to prevent fraud. 

Shekhar identifies Responsible AI as a key theme in the evolution of Gen AI and LLMs. “As the capabilities of Generative AI and LLMs continue to advance rapidly, there arises a critical need to address ethical considerations, bias mitigation, transparency, accountability, and the potential societal impacts of these technologies,” he told W.Media. “This involves not only technical measures such as bias detection and fairness assessments but also broader considerations around data governance, user consent, and regulatory compliance.”

DCs, AI and the sovereignty issue

Both Australia and New Zealand, like other markets, currently rely heavily on overseas LLMs, which offer benefits such as access to cutting-edge technology and economies of scale. However, Shekhar says that relying solely on foreign-developed AI raises concerns about data privacy, security and potential biases reflecting the developers’ cultures and values. There is growing recognition in both countries of the need to develop sovereign AI capabilities. He said this could involve investing in R&D, supporting local startups, and developing national AI strategies.

“For example, Australia established the Australian Institute for Machine Learning (AIML) and New Zealand launched the AI Alliance,” he said. “Both governments are [also] formulating national AI strategies outlining goals and directions for the future of AI in their respective countries. For example, in Australia, The Defence Science and Technology Group is developing AI for military applications. In New Zealand, the Ministry of Business, Innovation and Employment (MBIE) is funding AI research projects.”

Shekhar said that while there’s a push for sovereign AI development, it’s unlikely that either country will completely abandon overseas LLMs. “A more practical approach might involve a hybrid strategy, utilising both local and international resources while mitigating risks associated with over-reliance on foreign technology,” he said. 

How will closed LLMs impact DCs?

Shekhar told W.Media that while closed LLMs can be more efficient, secure, and customised than general-purpose LLMs, they may also have limitations in terms of scalability, diversity and innovation.

“The impact of lighter closed models on the data centre market and how they use data centres depends on several factors, such as the size, complexity of the model, the type and source of the data, the frequency and purpose of the use and so on,” he said. “Apart from reduced demand for large data centres, there will be increased demand for edge computing, but importantly reduced energy consumption and carbon footprint.”

Even so, building and deploying custom LLMs is time-consuming due to a lack of expertise, high initial costs, and ongoing expenses associated with curating and maintaining quality data. Additionally, the expense of constructing a general-purpose LLM supporting multiple tasks exceeds that of developing task-specific, smaller LLMs. 

“Relying on external companies for LLM infrastructure poses production risks and further adds to the overall expense, particularly when using models from providers like OpenAI or Google with recurring costs,” said Shekhar. “There is a pressing necessity to capitalise on task-specific LLMs’ capabilities while also making judicious decisions regarding the utilisation of pre-trained models versus training a completely new model, and choosing between Retrieval-Augmented Generation (RAG) approaches and fine-tuning strategies.”

He added: “Companies with a large data corpus stand to benefit in the long run by opting for on-premises or private cloud deployment over relying solely on APIs from top individual LLMs, such as GPT-4.”
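To make the RAG-versus-fine-tuning distinction concrete, here is a minimal, illustrative sketch of the RAG pattern Shekhar refers to: rather than retraining a model on private data, relevant documents are retrieved at query time and supplied as context in the prompt. The corpus, scoring function and prompt template are all hypothetical simplifications (real deployments use vector embeddings and an actual LLM call), not a description of any specific product.

```python
# Illustrative sketch of Retrieval-Augmented Generation (RAG).
# A toy word-overlap score stands in for a real embedding-based retriever.

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of words the query and document share."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble an LLM prompt grounded in the retrieved context."""
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical in-house corpus that never leaves the organisation.
corpus = [
    "Our Sydney data centre offers liquid cooling for GPU racks.",
    "The cafeteria menu changes every Monday.",
]
prompt = build_prompt("What cooling does the Sydney data centre offer?", corpus)
print(prompt)
```

The appeal for companies with large private corpora is that the data stays on-premises and only the assembled prompt is sent to whichever model is used, whereas fine-tuning bakes the data into the model weights.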

No AI factories in ANZ, yet

AI factories are emerging – data centres dedicated to a single application and customer. But in ANZ, data centres will remain hybrid for now, much as the food sector produces multiple products on a single production line. Shekhar said that initially there will be flexible infrastructure options available to host AI. “For example, AWS already offers dedicated instances, reserved instances, and Outposts, providing customisable solutions for specific workloads,” he said. “They might enhance these options with additional configurations for AI workloads, like high-performance GPUs, specialised networking, and AI-optimised cooling systems.”

He also sees companies like AWS partnering with companies building AI factories to offer joint solutions and leverage each other’s expertise. They might also develop reference architectures and best practices for building AI factories on AWS. As a result, new managed services will emerge. “AWS might introduce managed services like AI cluster management, resource provisioning and performance optimisation specifically for AI factories. This would ease the burden on customers and ensure optimal resource utilisation.”

While AI factories represent a growing trend, they are unlikely to replace multi-tenant data centres entirely. “Multi-tenant data centres offer economies of scale, making them more cost-effective for many applications,” he said. “They offer a wider range of options and resources, catering to diverse computing needs beyond AI factories.”

In the longer term, he added, there is scope for modular data centres, edge computing services, and specialised AI hardware to develop.

[Author: Simon Dux]


