OpenAI is opening the floodgates for ChatGPT by launching an API that makes it easy for third-party websites and apps to integrate the chatbot into their products.
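For a concrete sense of what that integration looks like, here is a minimal sketch using OpenAI's official Python package as it shipped at the time of the launch, calling the gpt-3.5-turbo model that powers the ChatGPT API; the API key and prompt are placeholders.

    import openai

    openai.api_key = "YOUR_API_KEY"  # placeholder; set to your own key

    # Send a chat-style prompt to the ChatGPT (gpt-3.5-turbo) endpoint
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Suggest a quick weeknight dinner recipe."},
        ],
    )

    # The generated reply; usage is billed by the token
    print(response["choices"][0]["message"]["content"])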
What’s notable about ChatGPT’s API is the cost: AI-powered chatbots usually require data centers to run, forcing the provider to spend massively on hardware and electricity to serve millions of users. But through software optimizations, OpenAI has managed to reduce the costs to run ChatGPT by 90%.
OpenAI, which has also received billions in funding from Microsoft, now plans to pass those cost savings on to users of the ChatGPT API, the San Francisco-based company announced in a blog post on Wednesday.
Specifically, OpenAI plans to charge API users $0.002 for every 1,000 tokens, or roughly 750 words, generated by ChatGPT. That is a tenth of the price of API access to the company’s earlier “Davinci” GPT-3.5 large language model, which can also power chatbot services.
That said, third-party apps that adopt ChatGPT could still end up paying OpenAI tens of thousands of dollars a day. A 750-word response is about 1,000 tokens, so if 10 million queries are made through the API and ChatGPT generates a full-length response for each one, the bill comes to roughly $20,000.
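For readers who want to check the math, here is a back-of-the-envelope sketch in Python, assuming every query returns a full 1,000-token (roughly 750-word) response:

    # Rough daily-cost estimate for the scenario above
    PRICE_PER_1K_TOKENS = 0.002    # dollars, ChatGPT API pricing
    TOKENS_PER_RESPONSE = 1_000    # roughly 750 words
    QUERIES_PER_DAY = 10_000_000

    daily_cost = QUERIES_PER_DAY * (TOKENS_PER_RESPONSE / 1_000) * PRICE_PER_1K_TOKENS
    print(f"${daily_cost:,.0f} per day")  # -> $20,000 per day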
As a result, apps and websites that embrace ChatGPT may try to offset the costs through paid subscriptions or targeted ad schemes. Snapchat is among the first to adopt ChatGPT into its app, but the feature is currently only available to paying Snapchat+ subscribers. Other early adopters include the study aid app Quizlet, which is using the ChatGPT API to power an AI tutor, and Instacart, which is tapping the technology to answer customer questions about recipes and food.