- UCLA Law experts examine OpenAI’s for-profit conversion
- Changing purposes requires extraordinary circumstances
OpenAI recently announced how it plans to remove nonprofit control of its enterprise. But the company may only abandon its nonprofit purposes under specific conditions, which likely don’t exist in this situation.
The company explained in a December 2024 blog post that the profit-making operations would become a form of for-profit company—a Delaware public benefit corporation—that may consider the public interest in its decision-making. The nonprofit that now controls those for-profit operations would lose its control and concentrate on making grants.
This plan turns the existing structure on its head. It not only deprives the nonprofit of its powers but also relieves the operational subsidiary of its duty to advance the nonprofit purposes.
Legal wrangling over whether OpenAI can wrest control from the nonprofit heated up in a California courtroom on Feb. 5. In the suit, Elon Musk, a former member of OpenAI’s board who later founded competitor X.AI Corp., argues that a nonprofit can’t abandon its original mandate. Delaware’s attorney general, warning that the parties have paid insufficient attention to the public’s interest in their legal briefings, committed to protecting “OpenAI’s mission and its obligations to the public.”
The law that governs nonprofits, especially when applied to complex entities such as OpenAI, is poorly understood. But the legal principles are straightforward.
When establishing a nonprofit, its founders make a public promise that the entity will advance its stated legal purposes. If the board wants to redirect assets toward new purposes, it must jump through legal hoops. The state attorney general participates in the process, and a state court typically must bless any change in purpose.
To start the process, the court will decide whether the existing purposes have become unlawful, impossible, impracticable, or wasteful. If so, the court will then decide whether to approve the proposed change. Nonprofits can’t pick any new purpose they want—only one that approximates the old one or is as near as possible.
It’s far from certain that OpenAI could convince a court that it needs to change its purpose. OpenAI’s existing purpose, found in its Delaware certificate of incorporation, is “to ensure that artificial general intelligence benefits all of humanity, including by conducting and/or funding artificial intelligence research.”
The certificate further says that the corporation may also research or support efforts to safely distribute technology and its associated benefits. Finally, it promises that the “resulting technology will benefit the public and the corporation will seek to distribute it for the public benefit when applicable.”
These laudable purposes are neither illegal nor impossible. They are unlikely to be found wasteful, which is a legal judgment made when a nonprofit has more assets than needed to advance the stated purposes. OpenAI is arguing the opposite—that it doesn’t have enough capital to develop artificial general intelligence, or AGI.
This leaves only one pathway open: impracticability, a term that is ill-defined and rarely applied. OpenAI’s announcement that investors don’t like nonprofit constraints isn’t likely to persuade a court that the existing purpose is impracticable. OpenAI’s nonprofit purpose is to protect humanity, not to win the race to achieve AGI.
Even if a court finds that OpenAI needs to change its purposes, it’s doubtful that the proposed purposes would be close enough to “as near as possible.” OpenAI envisions a new structure in which the well-resourced nonprofit would pursue “charitable initiatives in sectors such as health care, education, and science.” Such purposes can do a lot of good, but they don’t have much to do with the specific goal of ensuring AGI benefits all of humanity.
Under appropriate conditions, however, the nonprofit need not serve as the parent company to advance purposes that approximate its current ones.
First, the nonprofit must receive fair market value for the assets it holds, including compensation for loss of control over the operating company and an interest in the enormous value OpenAI anticipates. The nonprofit also should insist on sufficient liquid resources to operate.
Second, it needs to be governed by an independent board so that it’s free of oversight by the new for-profit. This means a board in which more than 50% of the members have no economic interest in OpenAI or its partners.
Third, its new activities should involve monitoring, evaluating, and reporting on the operations of AI companies, including the new OpenAI for-profit.
The law allows abandoning the nonprofit purposes governing its assets only under extraordinary circumstances, which don’t exist here. Applicable law also requires that the nonprofit retain the resources it deserves: it must receive fair market value, gain independence from OpenAI, and keep the purpose of ensuring that AGI benefits humanity. The public is owed no less.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author Information
Rose Chan Loui is the founding executive director for UCLA School of Law’s Lowell Milken Center on Philanthropy and Nonprofits.
Jill R. Horwitz is a professor at UCLA School of Law.