Why countries are in a race to build AI factories in the name of sovereign AI

Now that AI has become a fundamentally important technology, and the world has gravitated toward intense geopolitical battles, it’s no wonder that “sovereign AI” is becoming a national issue.

Think about it. Would the U.S. allow the data it generates for AI to be stored and processed in China? Would the European Union want its people’s data to be accessed by big U.S. tech giants? Would Russia trust NATO countries to manage its AI resources? Would Muslim nations entrust their data for AI to Israel?

Nvidia has earmarked $110 million to invest in AI startups working on sovereign AI infrastructure, and plenty of countries are investing in AI infrastructure on their own. That’s real money aimed at jumpstarting global adoption of AI. The question is whether this discussion is mostly thought leadership in service of a sales pitch, or whether nations truly need to embrace sovereign AI to stay competitive with the rest of the world. Is it a new kind of arms race that makes sense for nations to pursue?

A wake-up call

Digital rendering of Nvidia’s Jensen Huang

Jensen Huang, CEO of Nvidia, pointed to the rise of “sovereign AI” during an earnings call in November 2023 as a reason why demand for Nvidia’s AI chips is growing. The company noted that investment in national computing infrastructure was a new priority for governments around the world.

“The number of sovereign AI clouds is really quite significant,” Huang said in the earnings call. He said Nvidia wants to enable every company to build its own custom AI models.

The motivations weren’t just about keeping a country’s data in local tech infrastructure to protect it. Rather, governments saw the need to invest in sovereign AI infrastructure to support economic growth and industrial innovation, said Colette Kress, CFO of Nvidia, in the earnings call.

That was around the time when the Biden administration was restricting sales of the most powerful AI chips to China, requiring a license from the U.S. government before shipments could happen. That licensing requirement is still in effect.

As a result, China reportedly began its own attempts to create AI chips to compete with Nvidia’s. But it wasn’t just China. Kress also said Nvidia was working with the Indian government and its large tech companies like Infosys, Reliance and Tata to boost their “sovereign AI infrastructure.”

Meanwhile, French private cloud provider Scaleway was investing in regional AI clouds to fuel AI advances in Europe as part of a “new economic imperative,” Kress said. The result was a “multi-billion dollar opportunity” over the next few years, she said.

Huang said Sweden and Japan have embarked on creating sovereign AI clouds.

“You’re seeing sovereign AI infrastructures, people, countries that now recognize that they have to utilize their own data, keep their own data, keep their own culture, process that data, and develop their own AI. You see that in India,” Huang said.

He added, “Sovereign AI clouds coming up from all over the world as people realize that they can’t afford to export their country’s knowledge, their country’s culture for somebody else to then resell AI back to them.”

Nvidia itself defines sovereign AI as “a nation’s capabilities to produce artificial intelligence using its own infrastructure, data, workforce and business networks.”

Keeping sovereign AI secure

Credit: VentureBeat using DALL-E

In an interview with VentureBeat in February 2024, Huang doubled down on the concept, saying, “We now have a new type of data center that is about AI generation, an AI generation factory. And you’ve heard me describe it as AI factories. Basically, it takes raw material, which is data, transforms it with these AI supercomputers that Nvidia builds, and it turns them into incredibly valuable tokens. These tokens are what people experience on the amazing” generative AI platforms like Midjourney.

I asked Huang why, if data is kept secure regardless of its location in the world, sovereign AI needs to exist within the borders of any given country.

He replied, “There’s no reason to let somebody else come and scrape your internet, take your history, your data. And a lot of it is still locked up in libraries. In our case, it’s Library of Congress. In other cases, national libraries. And they’re digitized, but they haven’t been put on the internet.”

He added, “And so people are starting to realize that they had to use their own data to create their own AI, and transform their raw material into something of value for their own country, by their own country. And so you’re going to see a lot. Almost every country will do this. And they’re going to build the infrastructure. Of course, the infrastructure is hardware. But they don’t want to export their data using AI.”

The $110 million investment

Shilpa Kolhatkar (left) of Nvidia speaks with Jon Metzler of U.C. Berkeley.

Nvidia has earmarked $110 million to invest in AI startups helping with sovereign AI projects and other AI-related businesses.

Shilpa Kolhatkar, global head of AI Nations at Nvidia, gave a deeper dive on sovereign AI at the U.S.-Japan Innovation Symposium at Stanford University. The July event was staged by the Japan Society of Northern California and the Stanford US-Asia Technology Management Center.

Kolhatkar did the interview with Jon Metzler, a continuing lecturer at the Haas School of Business at the University of California, Berkeley. The conversation focused on how to achieve economic growth through investments in AI technology. Kolhatkar noted that Nvidia had transformed itself from a graphics company into a high-performance computing and AI company long before ChatGPT arrived.

“Lots of governments around the world are looking today at how can they capture this opportunity that AI has presented and they [have focused] on domestic production of AI,” Kolhatkar said. “We have the AI Nations program, which kind of matches the AI strategy that nations have in place today. About 60 to 70 nations have an AI strategy in place, built around the major pillars of creating the workforces and having the ecosystem. But it’s also around having everything already within the policy framework.”

AI readiness?

Examples of generative AI by Getty Images.

Nvidia plays a role in setting up the ecosystem and the infrastructure, meaning the supercomputers. The majority of Nvidia’s focus and engineering effort is in the software stack on top of the chips, she said. As a result, Nvidia has become more of a platform company than a chip company. Metzler asked Kolhatkar to define how a country might develop “AI readiness.”

Kolhatkar said that one measure is how much computing power a country has, in terms of raw AI compute, storage and the energy required to power such systems. Does it have a skilled workforce to operate the AI? Is the population ready to take advantage of AI’s great democratization, so that the knowledge spreads well beyond data scientists?

When ChatGPT (based on GPT-3.5) emerged in November 2022 and generative AI exploded, it signaled that AI was finally working in a way that ordinary consumers could use to automate many tasks, find new information or create things like images on their own. If there were errors in the results, it could be because the model wasn’t trained on the right information. It quickly followed that different regions had their own views on what counted as the right information.

“That model was trained primarily on a master data set and a certain set of languages in western [territories],” Kolhatkar said. “That is why the internationalization of having something which is sovereign, which is specific to a nation’s own language, culture and nuances, came to the forefront.”

Then countries started developing generative AI models that cater to the specificities of a particular region or particular nation, and, of course, the ownership of that data, she said.

“The ownership is every country’s data and proprietary data, which they realized should stay within the borders,” she said.

AI factories

Nvidia’s notion of AI factories.

Nvidia is now in the process of helping countries create such sovereign infrastructure in the form of “AI factories,” Kolhatkar said. That’s very similar to the drive that nations ignited with factories during the Industrial Revolution more than 100 years ago.

“Factories use raw materials that go in and then goods come out and that was tied to the domestic GDP. Now the paradigm is that your biggest asset is your data. Every nation has its own unique language and data. That’s the raw material that goes into the AI factory, which consists of algorithms, which consists of models and out comes intelligence,” she said.

Now countries like Japan have to consider whether they’re ahead or falling behind when it comes to being ready with AI factories. Kolhatkar said that Japan is leading the way when it comes to investments, collaborations and research to create a successful “AI nation.”

She said companies and nations are seriously considering how much of AI should be classified as “critical infrastructure” for the sake of economic or national security. Where industrial factories could create thousands of jobs in a given city, now data centers can create a lot of jobs in a given region as well. Are these AI factories like the dams and airports of decades ago?

“You’re kind of looking at past precedents from physical manufacturing as to what the multiplier might be for AI factories,” Metzler said. “The notion of AI factories as maybe civic infrastructure is super interesting.”

National AI strategies?

Cerebras Condor Galaxy at Colovore Data Center

Metzler raised the question of what strategies countries might pursue in the AI race. For instance, he noted that smaller countries may need to team up to create larger regional networks, giving them some measure of sovereignty.

Kolhatkar said that can make sense if your country, for instance, doesn’t have the resources of any given tech giant like Samsung. She noted the Nordic nations are collaborating with each other, as are nations like the U.S. and Japan when it comes to AI research. Different industries or government ministries can also get together for collaboration on AI.

If Nvidia is taking a side on this, it’s in spreading the tech around so that everyone becomes AI literate. Nvidia has an online university, dubbed the Deep Learning Institute, for self-paced e-learning courses. It also has a virtual incubator, Nvidia Inception, which has supported more than 19,000 AI startups.

“Nvidia really believes in democratization of AI because the full potential of AI can not be achieved unless everybody’s able to use it,” Kolhatkar said.

Energy consumption?

AI power consumption

As for the fallout of sovereign AI, Metzler noted that countries will have to grapple with sustainability issues around how much power is being consumed.

In May, the Electric Power Research Institute (EPRI) released a whitepaper that quantified the potential for exponential growth in AI power requirements. It projected that electricity consumption by U.S. data centers alone could more than double by 2030, rising as much as 166% above 2023 levels.

It noted that each ChatGPT request can consume 2.9 watt-hours of electricity. That means AI queries are estimated to require roughly 10 times the electricity of traditional Google queries, which use about 0.3 watt-hours each. That’s not counting emerging, computation-intensive capabilities such as image, audio and video generation, which have no comparable precedent.

EPRI looked at four scenarios. Under the highest growth scenario, data center electricity usage could rise to 403.9 TWh/year by 2030, a 166% increase from 2023 levels. Meanwhile, the low growth scenario projected a 29% increase to 196.3 TWh/year.
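
For readers who want to sanity-check those numbers, here is a minimal back-of-the-envelope sketch in Python; the roughly 152 TWh 2023 baseline it derives is an inference from the percentages above, not a figure quoted from the EPRI report:

```python
# Back-of-the-envelope check of the energy figures cited above.
# The 2023 baseline is inferred from the 2030 projections, not quoted directly.

CHATGPT_WH_PER_QUERY = 2.9   # watt-hours per ChatGPT request (EPRI)
GOOGLE_WH_PER_QUERY = 0.3    # watt-hours per traditional Google search (EPRI)

HIGH_2030_TWH = 403.9        # high-growth scenario, TWh/year by 2030
HIGH_GROWTH_PCT = 166        # percent increase over 2023
LOW_2030_TWH = 196.3         # low-growth scenario, TWh/year by 2030
LOW_GROWTH_PCT = 29          # percent increase over 2023

# Per-query energy ratio: roughly the "10 times" figure cited above.
print(f"ChatGPT vs. Google query energy: {CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_QUERY:.1f}x")

# Implied 2023 baseline from each scenario; both should land near the same value.
print(f"Implied 2023 baseline (high scenario): {HIGH_2030_TWH / (1 + HIGH_GROWTH_PCT / 100):.1f} TWh")
print(f"Implied 2023 baseline (low scenario):  {LOW_2030_TWH / (1 + LOW_GROWTH_PCT / 100):.1f} TWh")
```

Both scenarios imply roughly the same 2023 baseline, a useful consistency check on the projections.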

“It’s about the energy efficiency, sustainability is pretty top of mind for everyone,” Kolhatkar said.

Nvidia is trying to make each generation of AI chip more power efficient even as it makes each one more performant. She also noted the industry is trying to create and use sources of renewable energy. Nvidia also applies its own AI, through its Nvidia Omniverse software, to create digital twins of data centers. These buildings can be architected with energy consumption in mind and with an eye toward minimizing duplicative effort.

Once they’re done, the virtual designs can be built in the physical world with a minimum of inefficiency. Nvidia is even creating a digital twin of the Earth to predict climate change for decades to come. And the AI tech can also be applied to making physical infrastructure more efficient, like making India’s infrastructure more resistant to monsoon weather. In these ways, Kolhatkar thinks AI can be used to “save the world.”

She added, “Data is the biggest asset that a nation has. It has your proprietary data with your language, your culture, your values, and you are the best person to own it and codify it into an intelligence that you want to use for your analysis. So that is what sovereignty is. That is at the domestic level. The local control of your assets, your biggest asset, [matters].”

A change in computing infrastructure

Nvidia Blackwell has 208 billion transistors.

Computers, of course, don’t know national borders. If you string internet cables around the world, the information flows, and a single data center could theoretically serve its information globally. If that data center has layers of security built in, there should be no worry about where it’s located. That is the supposed advantage of computing: it creates a “virtual” infrastructure that isn’t tied to any one place.

But these data centers need backups, as the world has learned that extreme centralization isn’t good for things like security and control. A volcanic eruption in Iceland, a tsunami in Japan, an earthquake in China, a terrorist attack on infrastructure or possible government spying in any given country — these are all reasons for having more than one data center to store data.

Besides disaster backup, national security is another reason each country is driven to require its own computing infrastructure within its borders. Before the generative AI boom, there was a movement to ensure data sovereignty, in part because some tech giants overreached, inserting themselves between users and the applications that generated their personal data. Data best practices resulted.

Roblox CEO Dave Baszucki said at the Roblox Developer Conference that his company operates a network of 27 data centers around the world to provide the performance needed to keep its game platform running across different computing platforms. Roblox has 79.5 million daily active users spread throughout the world.

Given that governments around the world are coming up with data security and privacy laws, Roblox might very well have to change its data center infrastructure so that it has many more data centers that are operating in given jurisdictions.

There are 195 nation states in the world, and if the policies become restrictive, a company might conceivably need to have 195 data centers. Not all of these divisions are parochial. For instance, some countries might want to deliberately reduce the “digital divide” between rich nations and poor ones, Kolhatkar said.

There’s another factor driving the decentralization of AI: the need for privacy, not only for the governments of the world but also for companies and individuals. The celebrated “AI PC” trend of 2024 offers consumers personal computers with powerful AI tech so they can run AI privately in their own homes. That way, it’s not so easy for the tech giants to learn what you’re searching for and what data you’re using to train your own personal AI.

Do we need sovereign AI?

Nvidia humanoid robots.

Huang suggested that countries see sovereign AI as necessary so that a large language model (LLM) can be built with knowledge of local customs. As an example, the Russian-derived spelling of the city is “Chernobyl,” with an “e,” while the Ukrainian spelling is “Chornobyl.” That’s just a small example of why local customs and culture need to be taken into account for systems used in particular countries.

Some people are concerned about the trend as it drives the world toward more geographic borders, which, in the case of computing, really don’t or shouldn’t exist.

Kate Edwards, CEO of Geogrify and an expert on geopolitics in the gaming industry, said in a message, “I think it’s a dangerous term to leverage, as ‘sovereignty’ is a concept that typically implies a power dynamic that often forms a cornerstone of nationalism, and populism in more extreme forms. I get why the term is being used here but I think it’s the wrong direction for how we want to describe AI.”

She added, “‘Sovereign’ is the wrong direction for this nomenclature. It instantly polarizes what AI is for, and effectively puts it in the same societal tool category as nuclear weapons and other forms of mass disruption. I don’t believe this is how we really want to approach this resource, especially as it could imply that a national government essentially has an enslaved intelligence whose purpose is to reinforce and serve the goals of maintaining a specific nation’s sovereignty — which is the basis for the great majority of geopolitical conflict.”

Are countries taking Nvidia’s commentary seriously or do they view it as a sales pitch? Nvidia isn’t the only company succeeding with the pitch.

AMD competes with Nvidia in AI/graphics chips as well as CPUs. Like Nvidia, it is seeing an explosion in demand for AI chips. AMD also continues to expand its efforts in software, with the acquisition of AI software firms like Nod.AI and Silo AI. AI is consistently driving AMD’s revenues and demand for both its CPUs and GPUs/AI chips.

Cerebras WSE-3

Cerebras Systems, for instance, announced in July 2023 that it was shipping its giant wafer-scale AI processors to the technology holding group G42, which was building the world’s largest supercomputer for AI training, named Condor Galaxy, in the United Arab Emirates.

It started as a planned network of nine interconnected supercomputers, with a total capacity of 36 exaFLOPs, aimed at significantly reducing AI model training time. The first AI supercomputer on the network, Condor Galaxy 1 (CG-1), had 4 exaFLOPs and 54 million cores, said Andrew Feldman, CEO of Cerebras, in an interview with VentureBeat. Those computers were based in the U.S., but they are operated by the firm in Abu Dhabi. (That raises the question, again, of whether sovereign AI tech has to be located in the country that uses the computing power.)

Now Cerebras has broken ground on a new generation of Condor Galaxy supercomputers for G42.

Rather than manufacture individual chips, Cerebras prints its cores across an entire silicon wafer the size of a large pizza. Each wafer holds the equivalent of hundreds of conventional chips, with hundreds of thousands of cores per wafer. That’s how Cerebras gets to 54 million cores in a single supercomputer.
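
As a rough illustration, the arithmetic works out as in the sketch below; the per-wafer core count (about 850,000, Cerebras’ published figure for its WSE-2) and the number of systems in CG-1 (64) are assumptions, not figures quoted in this article:

```python
# Back-of-the-envelope sketch of how a cluster of wafer-scale processors
# reaches ~54 million cores. Both constants below are assumptions based on
# Cerebras' published WSE-2 specs, not figures stated in this article.

CORES_PER_WAFER = 850_000    # approximate cores on one wafer-scale engine (assumed)
SYSTEMS_IN_CG1 = 64          # assumed number of wafer-scale systems in CG-1

total_cores = CORES_PER_WAFER * SYSTEMS_IN_CG1
print(f"Total cores: {total_cores:,}")  # ~54.4 million, in line with the ~54 million cited above
```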

Feldman said, “AI is not just eating the U.S. AI is eating the world. There’s an insatiable demand for compute. Models are proliferating. And data is the new gold. This is the foundation.”
