With one week left in office, President Biden signed an executive order that sets aside federal land for building artificial intelligence (AI) data centers, with the full cost borne by developers of AI foundation, or frontier, models such as OpenAI’s GPT-4o. AI model developers must also secure clean energy sources for these data centers, as intense AI workloads are notorious energy guzzlers.
The latest order follows Biden’s October 2023 executive order setting out guardrails for powerful frontier, or foundation, AI models. That order includes provisions ensuring the government gets to evaluate AI systems for cybersecurity and other national security risks before they are deployed.
Biden also pledged to develop labeling and content provenance mechanisms, so consumers can tell which content is AI-generated. Trump issued the first executive order on AI in 2020, calling for its use in the federal government. Different states (California, Texas and others) also have their own AI rules.
UK and EU Regulations
AI regulation in the U.S. differs from that in the U.K. and Europe. The EU AI Act is much more sweeping legislation that assesses AI applications based on three risk levels: unacceptable risk (such as government-run scoring of individuals based on social standing), high risk (such as resume-scanning AI tools that rank job applicants) and everything that is neither banned nor deemed high risk.
U.K. Prime Minister Keir Starmer announced on Monday (Jan. 13) an action plan to make Britain a leader in AI, including expanding its data center capacity for AI workloads. Starmer said formal AI regulations are coming. His predecessor Rishi Sunak unveiled an AI regulatory framework for existing regulators to follow.
AI Chip Export Controls
Biden also expanded on his 2022 and 2023 AI chip export controls, which are meant to keep China and other adversary nations from getting their hands on AI hardware. This week’s new regulations carve the world into haves and have-nots: 18 allies and partners face no restrictions at all, while buyers of smaller orders, up to the computing power of 1,700 advanced GPUs, also get the green light. These buyers are typically universities and research organizations.
However, more than 120 other countries reportedly face new rules on setting up AI computing facilities. Trusted entities include those based in countries that are close U.S. allies and not headquartered in a “country of concern.” Entities not based in allied countries can still buy up to 50,000 advanced GPUs per country. Biden also set rules that would keep an AI model’s weights secret from untrusted entities, among other security controls.
Nvidia Decries Chip Limits
The rules are expected to impact Nvidia, whose GPU chips have been the silicon of choice for training and inference of AI models. The company has a market share pegged at more than 80%.
Nvidia positioned itself for an AI revolution back in 2006; its GPUs were initially developed to handle gaming and other graphics-intensive applications. Co-founder and CEO Jensen Huang bet the company’s future on the pivot to AI, despite AI advances having stalled in past so-called “AI winters.”
Ned Finkle, Nvidia’s vice president of government affairs, decried Biden’s new rules. He wrote in a blog post that the advance of AI globally is “now in jeopardy.” He said Biden’s “misguided” policy “threatens to derail innovation and economic growth worldwide.”
Finkle called Biden’s expanded export control rules a “200+ page regulatory morass, drafted in secret and without proper legislative review.” Such regulatory action “threatens to squander America’s hard-won technological advantage,” he added.
Finkle lauded the first Trump administration for fostering an environment of AI innovation and said he “looked forward” to a return to Trump’s policies as the former president prepares to take the oath of office.
The Semiconductor Industry Association weighed in with its own statement. “We’re deeply disappointed that a policy shift of this magnitude and impact is being rushed out the door days before a presidential transition and without any meaningful input from industry.”
OpenAI’s Plan for US
As OpenAI CEO Sam Altman signaled plans to attend the inauguration of President Trump, his AI startup preemptively rolled out a blueprint to keep America at the forefront of AI development.
“We believe America needs to act now to maximize AI’s possibilities while minimizing its harms. AI is too powerful a technology to be led and shaped by autocrats, but that is the growing risk we face, while the economic opportunity AI presents is too compelling to forfeit,” according to OpenAI’s “AI in America” economic plan.
OpenAI’s vision hinges on the following beliefs: