Nvidia and Foxconn are collaborating to establish “AI factories”: data centers dedicated to advancing robotics, self-driving cars, and generative AI services. The move fits a broader trend of companies investing heavily in data centers to capitalize on the excitement surrounding AI. Here is a closer look at what the two companies have announced, and at what these facilities will cost, in dollars and in resources.

The recent surge in factory construction, particularly in the United States, may soon be joined by a wave of “AI factories.” Nvidia has announced an expansion of its longstanding partnership with Foxconn, the contract manufacturer best known for assembling iPhones. Together, the companies plan to build a “new class of data centers” meant to accelerate the AI industrial revolution.

These data centers, built by Foxconn around Nvidia’s GPUs and CPUs, will power autonomous systems such as industrial robots and self-driving cars, and will serve generative AI offerings similar to OpenAI’s ChatGPT. Nvidia CEO Jensen Huang described the emergence of a new kind of production, in which the product is intelligence itself, and called the resulting facilities AI factories. Their job is to run the intensive training that improves the AI behind these machines, making self-driving cars, for instance, smarter over time.

Huang described a feedback loop in which the vehicles accumulate life experience, collecting extensive driving data that is sent back to the AI factory; the factory improves the software and then pushes the update out to the entire fleet of AI-powered vehicles. Neither Nvidia nor Foxconn has disclosed where these data centers will be built, when construction will begin, or how much the companies will invest in them.
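Conceptually, the cycle Huang outlines is a fleet-learning loop: collect data in the field, retrain centrally, redeploy to every vehicle. The Python sketch below is purely illustrative; the class and function names are hypothetical placeholders and do not correspond to any Nvidia or Foxconn software.

```python
from dataclasses import dataclass, field
from typing import List

# All names below are hypothetical placeholders, not Nvidia or Foxconn APIs.

@dataclass
class Vehicle:
    """A fleet vehicle that logs driving data and accepts over-the-air updates."""
    vehicle_id: int
    model_version: int = 0
    driving_log: List[str] = field(default_factory=list)

    def collect_driving_data(self) -> List[str]:
        # Stand-in for the sensor data a real car would accumulate in the field.
        return list(self.driving_log)

    def install_update(self, new_version: int) -> None:
        # Over-the-air update to the model produced by the AI factory.
        self.model_version = new_version


def retrain_in_ai_factory(current_version: int, fleet_data: List[List[str]]) -> int:
    """Stand-in for the compute-intensive training run inside the data center."""
    print(f"Retraining on data from {len(fleet_data)} vehicles...")
    return current_version + 1


def fleet_learning_cycle(fleet: List[Vehicle], model_version: int) -> int:
    """One pass of the collect -> retrain -> redeploy loop described above."""
    new_data = [v.collect_driving_data() for v in fleet]          # 1. gather field data
    new_version = retrain_in_ai_factory(model_version, new_data)  # 2. train in the AI factory
    for v in fleet:                                               # 3. update the whole fleet
        v.install_update(new_version)
    return new_version


if __name__ == "__main__":
    fleet = [Vehicle(vehicle_id=i) for i in range(3)]
    fleet_learning_cycle(fleet, model_version=0)
```

Each step of that loop hides enormous cost; the retraining stage in particular is exactly the workload the new data centers are being built to absorb.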

The AI-factory announcement comes as major tech companies such as Microsoft pour billions of dollars into AI initiatives, including the construction of new data centers, in an effort to tap the potential of generative AI.

Jonathan Gray, president and COO of the investment firm Blackstone, pointed to the well-publicized AI race and estimated that major tech firms will spend around $1 trillion over the next five years, with a significant share of that money going toward data centers.

Beyond the substantial cost of AI-focused data centers, concerns have been raised about their environmental impact, starting with their considerable energy requirements. Research from the University of Massachusetts found that training a single large AI model can produce carbon dioxide emissions roughly equivalent to 300 round-trip flights between San Francisco and New York. Projections tied to sales of AI hardware from chipmakers such as Nvidia suggest that by 2027 the AI industry could consume as much electricity annually as a country the size of Sweden or the Netherlands.

Generative AI also requires large volumes of water to cool data-center servers. Microsoft reportedly used 700,000 liters (approximately 185,000 gallons) of fresh water in its data centers to train OpenAI’s GPT-3 language model. Google’s latest environmental report indicated that cooling its servers in 2022 consumed roughly as much water as it takes to irrigate 37 golf courses. And one study estimated that ChatGPT’s servers go through about one 16.9-ounce bottle of water for every 20 to 50 questions answered.
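For a sense of scale, those figures can be sanity-checked with quick arithmetic. The short Python sketch below converts the reported training water use to gallons and derives an implied per-question range; it assumes a 16.9-ounce bottle holds roughly 500 ml and uses the standard liters-to-gallons factor.

```python
# Back-of-the-envelope arithmetic for the water figures cited above.
LITERS_PER_US_GALLON = 3.785
ML_PER_BOTTLE = 500  # a 16.9-ounce bottle holds roughly 500 ml (assumption)

# Microsoft's reported GPT-3 training water use, converted to gallons.
training_water_liters = 700_000
training_water_gallons = training_water_liters / LITERS_PER_US_GALLON
print(f"Training water use: about {training_water_gallons:,.0f} gallons")  # ~185,000

# Implied water use per ChatGPT question: one bottle per 20 to 50 answers.
per_question_low = ML_PER_BOTTLE / 50
per_question_high = ML_PER_BOTTLE / 20
print(f"Roughly {per_question_low:.0f}-{per_question_high:.0f} ml of water per question")
```

Run as written, this reproduces the article’s figures: about 185,000 gallons for training, and on the order of 10 to 25 ml of water per question answered.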

In short, Nvidia and Foxconn’s collaboration on AI factories aims to accelerate progress in robotics, self-driving cars, and generative AI services. But as companies pour money into data centers to harness AI’s potential, the environmental costs of that buildout, in both energy and water, remain a serious consideration.