This is an archive article published on December 20, 2022

Here’s how much it costs to run OpenAI’s ChatGPT chatbot per day in dollars

It might cost around $3 million per month for OpenAI to run ChatGPT.

ChatGPT is an AI-powered chatbot that can offer human-like natural responses (Image credit: OpenAI / ChatGPT)

ChatGPT is an AI-powered chatbot capable of holding interactive, conversation-style exchanges. Developed by OpenAI, ChatGPT is known for its human-like responses, and the best part is that it is available for free. But while you can access ChatGPT for free, OpenAI is spending a lot of money to keep it up and running.

Here are some lesser-known facts about OpenAI’s ChatGPT that give us a hint about the various aspects of developing and running an AI service.

Running ChatGPT costs around $100,000 per day

According to the analysis, ChatGPT is hosted on Microsoft’s Azure cloud, so OpenAI doesn’t have to buy and set up a physical server room. At current rates, Microsoft charges about $3 an hour for a single A100 GPU, which works out to roughly $0.0003 per word generated on ChatGPT.



A response from ChatGPT usually contains at least 30 words, so a single response costs the company at least about 1 cent (30 × $0.0003 ≈ $0.009). In total, OpenAI is estimated to be spending at least $100,000 per day, or $3 million per month, on running costs.
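The arithmetic behind these estimates can be checked in a few lines. This is a back-of-the-envelope sketch using only the figures cited above, which are the analysis's estimates rather than official OpenAI numbers.

```python
# Back-of-the-envelope estimate of ChatGPT's running costs,
# using the figures cited in the article (estimates, not official numbers).

COST_PER_WORD = 0.0003      # USD per generated word (article's estimate)
WORDS_PER_RESPONSE = 30     # minimum typical response length

cost_per_response = COST_PER_WORD * WORDS_PER_RESPONSE
print(f"Cost per response: ${cost_per_response:.4f}")   # $0.0090, about 1 cent

daily_cost = 100_000        # estimated daily running cost, USD
monthly_cost = daily_cost * 30
print(f"Monthly cost: ${monthly_cost:,}")               # $3,000,000
```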

A single ChatGPT query uses at least 8 GPUs

According to Tom Goldstein, Associate Professor at the University of Maryland, a single NVIDIA A100 GPU can run a 3-billion-parameter model in about 6 ms per word. At that speed, a single A100 GPU would take around 350 ms to generate just one word with a model of ChatGPT’s size.

Given that ChatGPT’s underlying GPT-3.5 model has about 175 billion parameters, serving a single query requires at least five A100 GPUs just to load the model and generate text. And since ChatGPT outputs around 15-20 words per second, it needs a server with at least eight A100 GPUs.
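Goldstein's estimate can be reproduced with simple scaling. The sketch below assumes latency grows linearly with parameter count, which is a simplification, and uses the 15-20 words-per-second throughput figure cited above.

```python
# Back-of-the-envelope check of the GPU figures cited in the article.
# Assumes per-word latency scales linearly with parameter count
# (a simplification) and uses the article's throughput estimate.
import math

SMALL_PARAMS = 3e9          # 3-billion-parameter reference model
SMALL_MS_PER_WORD = 6       # ~6 ms per word on one A100 (cited figure)
LARGE_PARAMS = 175e9        # GPT-3.5 parameter count

# Linear scaling with parameters:
ms_per_word_one_gpu = SMALL_MS_PER_WORD * (LARGE_PARAMS / SMALL_PARAMS)
print(f"~{ms_per_word_one_gpu:.0f} ms per word on a single GPU")  # ~350 ms

# To hit the observed ~20 words per second, each word must take ~50 ms,
# so the work has to be split across several GPUs running in parallel:
required_ms_per_word = 1000 / 20
gpus = math.ceil(ms_per_word_one_gpu / required_ms_per_word)
print(f"At least {gpus} GPUs in parallel")  # 7; the article cites at
                                            # least 8, allowing overhead
```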


ChatGPT doesn’t have answers to all your questions

While ChatGPT is currently the most capable AI chatbot, it was trained on data collected up to 2021. Hence, it may not be able to give you accurate responses to every query.

ChatGPT has over one million users

Within a few days of its official launch, ChatGPT crossed 1 million users. While many of these may not be active users, the company has clearly managed to gather a lot of users in a short time. However, OpenAI will have to do a lot more to retain all these users and turn ChatGPT into a profitable AI tool.

