
OpenAI Is Not Open Source — But Neither Are Plenty of Other 'Open' Organizations

While OpenAI's name is misleading, the term "open" is ambiguous. Here's why the tech industry needs to clarify what it truly means to be "open" in any context.

Rhode Island, it has been observed, is neither a road nor an island.

You could make a similar remark about OpenAI, the company behind ChatGPT and the models that power tools like GitHub Copilot. OpenAI may be producing AI, but it's not open. OpenAI's flagship technology is not open source, and it's hard to argue that the company is open in any broader sense.

On the other hand, it's also hard to fault OpenAI too much for choosing a name that implies something other than what most people imagine when they hear the word "open" today. "Open" is one of the most ambiguous, and arguably most misused, terms in the history of technology, so you can't single out OpenAI for calling itself open when it's not actually open source.

To prove the point, let's look at what OpenAI does, what "open" means, and why OpenAI is just the latest of many examples from the technology industry where open turns out not to mean what reasonable people think it means.

Is OpenAI Open Source?

Let's make one thing clear: OpenAI's flagship products are not open source software. The code behind tools like ChatGPT is closed, meaning no one outside OpenAI can view, modify, or reuse it.

Nor are most of OpenAI's products free of charge. Some of its tools are currently available without payment, but the company monetizes many of its products and services, especially for customers who use them extensively. Even if everything OpenAI created were free of charge, that alone wouldn't make its products open source. Still, the fact that the company is clearly not out to make everything free is worth noting, because free-to-use software is often conflated with open source software.

From a broader perspective, there is nothing in particular that makes OpenAI's culture, business model, or ecosystem engagement "open." Most of its technology is developed behind closed doors, with little or no input from the public — a fact that distinguishes OpenAI from projects like OpenTelemetry or the Open Container Initiative, which are examples of vendor-agnostic, community-based efforts to advance technology. OpenAI collaborates with a handful of businesses, most notably Microsoft, but it would be hard to argue that it's taking an "open" approach to partner engagement; most of its partnerships have a clear profit-oriented goal.

So, in short, there's nothing at all that makes OpenAI open.


The Problem with Defining 'Open'

But again, it feels unfair to me to be overly critical of OpenAI about its choice of name.

As others have noted, the company originated as a non-profit, so in its earlier years it could make some kind of claim to being "open" in the sense that it wanted its products to be freely available to everyone. Beyond that, there's the issue that the word "open" itself means everything and nothing.

Historically, the term featured in the names of initiatives whose goal was to help standardize software platforms. A prominent example is X/Open, which was founded in the 1980s to promote standardization for Unix-like operating systems (and which later became part of the basis for The Open Group, which continues to promote vendor-neutral technology standards today).

Then, in 1998, "open" took on a separate meaning when the term "open source software" was coined. Open source software is software whose source code is publicly available for study and reuse, which is quite different from a vendor-neutral technology standard. The adoption of the term was further complicated by two facts: "open source" was coined as an alternative to "free software," a term that had been around since Richard Stallman launched the GNU project in the mid-1980s, and "open source" already meant something entirely different in the context of intelligence organizations.

More recently, companies like Docker, which calls itself an "open platform" even though not all of its code is open source, have adopted the label "open" without bothering to explain what, exactly, they do to earn it.

My point here is that the meaning of "open" is open to a wide range of interpretations. Being open doesn't necessarily mean producing open source software; indeed, historically, the term "open" had nothing to do with open source code.

Should OpenAI Call Itself 'Open'?

For the record, I'm not trying to argue that OpenAI has a great name. I think it's misleading, and I'd love to see it changed.

But I also think it's a little unfair to accuse OpenAI of abusing the term "open" to serve its own agenda. In taking advantage of a term that has long been subject to substantial ambiguity, the company is in good company. Rather than debating the merits of OpenAI's name, the technology industry should perhaps instead consider ways to clarify what being open actually means, in any context.

About the author

Christopher Tozzi is a technology analyst with subject matter expertise in cloud computing, application development, open source software, virtualization, containers and more. He also lectures at a major university in the Albany, New York, area. His book, "For Fun and Profit: A History of the Free and Open Source Software Revolution," was published by MIT Press.