Now developers can officially integrate ChatGPT into their products and services. Get ready, y’all.
OpenAI has announced that it’s now letting third-party developers integrate ChatGPT into their apps and services via an API and that doing so will be significantly cheaper than using its existing language models. The company is also making Whisper, its AI-powered speech-to-text model, available through an API, and it’s making some important changes to its developer terms of service.
OpenAI says its ChatGPT API can be used for more than just creating an AI-powered chat interface — though it also highlights several companies that have been using it for that purpose, including Snap’s My AI feature, which was announced earlier this week. The company says its new model family, called gpt-3.5-turbo, is the “best model for many non-chat use cases.”
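Per OpenAI’s API reference, a request to the ChatGPT endpoint names the gpt-3.5-turbo model and passes the conversation as a list of role-tagged messages. Here’s a minimal sketch of what that request body looks like (the helper function and system prompt are illustrative, not part of any SDK; nothing is actually sent over the network):

```python
import json

def build_chat_request(user_text: str) -> dict:
    """Build the JSON body the chat completions API expects."""
    return {
        "model": "gpt-3.5-turbo",  # the new, cheaper model family
        "messages": [
            # Each message has a role ("system", "user", or "assistant")
            # and its text content.
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_text},
        ],
    }

payload = build_chat_request("ChatGPT is great!")
print(json.dumps(payload, indent=2))
```

In practice, this payload would be POSTed to OpenAI’s chat completions endpoint with an API key in the `Authorization` header; for non-chat use cases, the same structure applies with a single user message.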
It’s worth noting that the model likely isn’t the same one that Bing is using, which Microsoft has called a “new, next-generation OpenAI large language model” that’s “even faster, more accurate and more capable” than ChatGPT and GPT-3.5. However, given how much money Microsoft has invested in OpenAI, it’s not a surprise that it has access to tech not available to the average developer. Microsoft is also using a healthy dose of its own tech for Bing.
OpenAI is offering 1,000 tokens for $0.002 and says that’s “10x cheaper than our existing GPT-3.5 models,” thanks in part to “a series of system-wide optimizations.” While 1,000 seems like a lot, it’s worth noting that sending one snippet of text for the API to respond to could cost several tokens. (“Tokens” are the blocks of text that the system breaks sentences and words into in order to predict what text it should output next.)
According to OpenAI’s documentation, “ChatGPT is great!” takes six tokens — its API breaks it up into “Chat,” “G,” “PT,” “ is,” “ great,” and “!”. The company provides a tool for checking how many tokens it’ll take to interpret a string of text and says that a general rule of thumb is that “one token generally corresponds to ~4 characters” in English.
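That rule of thumb makes it easy to ballpark what a request will cost before sending it. A quick sketch (the helper names are hypothetical, and the estimate is deliberately rough; exact counts come from OpenAI’s tokenizer tool):

```python
import math

PRICE_PER_1K_TOKENS = 0.002  # gpt-3.5-turbo pricing at launch

def estimate_tokens(text: str) -> int:
    """Rough count using the ~4-characters-per-token rule of thumb."""
    return max(1, math.ceil(len(text) / 4))

def estimate_cost(text: str) -> float:
    """Estimated dollar cost to process the text at $0.002 per 1,000 tokens."""
    return estimate_tokens(text) / 1000 * PRICE_PER_1K_TOKENS

# "ChatGPT is great!" is 17 characters, so this estimates 5 tokens;
# the real tokenizer counts 6, which shows why this is a ballpark only.
print(estimate_tokens("ChatGPT is great!"))  # → 5
```

The takeaway: even a few-hundred-word prompt costs well under a cent, which is the “10x cheaper” part in concrete terms.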
The company says that developers will also be able to get a dedicated instance of ChatGPT if they’re running a monstrous amount of data through the API. Its post says that doing so will give you more control over what model you’re using, how long you want it to take to respond to requests, and how long conversations with the bot can be.
While ChatGPT is likely to garner the most attention, OpenAI has also announced another new API for Whisper, its speech-to-text model. The company says you can use it to transcribe or translate audio at a cost of $0.006 per minute. Technically, the Whisper model is open source, so you can run it on your own hardware without paying anything. However, OpenAI likely has access to more powerful hardware, so if you’re looking for a quick turnaround or need to do transcription on lower-powered devices like phones, using the API may be the way to go.
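At $0.006 per minute, the API-versus-self-hosting tradeoff is easy to put in dollar terms (a hypothetical helper for illustration, not part of any OpenAI SDK):

```python
WHISPER_PRICE_PER_MINUTE = 0.006  # Whisper API price quoted by OpenAI

def whisper_api_cost(audio_seconds: float) -> float:
    """Dollar cost to transcribe or translate audio of the given length."""
    return (audio_seconds / 60) * WHISPER_PRICE_PER_MINUTE

# A one-hour podcast episode: 60 minutes * $0.006 = $0.36
print(f"${whisper_api_cost(3600):.2f}")  # → $0.36
```

At roughly 36 cents per hour of audio, the API only loses to self-hosting once you’re transcribing at a scale where your own GPU time costs less than that.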
OpenAI is also announcing some policy changes that it says are based on developer feedback. A big one: it will no longer use data submitted through the API to train its models unless customers explicitly okay that usage.
In other words, it’s going from an opt-out system to an opt-in one. This change could help alleviate some concerns about putting proprietary information into the bot, as some companies have barred employees from using the tech entirely. If the bot is learning from user input, feeding it trade secrets would be a bad idea, as there’s always the possibility it could spit that data back out to someone else.
The company also says it’s working on improving its uptime and that its “engineering team’s top priority is now stability of production use cases.”
While several developers have come up with workarounds to include chat services in their apps — including by using OpenAI’s regular GPT API, which has been available for a while — the introduction of an official ChatGPT API feels like it could be the moment the floodgates open. While there are plenty of companies working on their own AI chatbot models, that sort of thing is completely out of reach for most developers. Now, they’ll be able to just use OpenAI’s tech.