Tired of shortages, OpenAI considers making its own AI chips
The hardware situation is said to be a top priority for OpenAI, which currently relies on a massive supercomputer built by Microsoft, one of its largest backers. The supercomputer uses 10,000 Nvidia graphics processing units (GPUs), according to Reuters. Running ChatGPT is expensive: each query costs approximately 4 cents, according to Bernstein analyst Stacy Rasgon. If queries grew to even a tenth of the scale of Google search, the initial investment in GPUs would be around $48.1 billion, with annual maintenance costs of about $16 billion.
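Those figures pass a rough sanity check. Only the 4-cent per-query cost and the "tenth of Google search" framing come from the article; Google's query volume (roughly 8.5 billion searches per day is a commonly cited estimate) is my own assumption, not from the source. A minimal sketch under those assumptions:

```python
# Back-of-envelope check on the ChatGPT cost figures.
# From the article: ~4 cents per query (Bernstein's Stacy Rasgon),
# hypothetical scale of one tenth of Google search.
# Assumed (NOT from the article): ~8.5 billion Google searches/day.
COST_PER_QUERY = 0.04            # USD per query
GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumed estimate
SCALE_FRACTION = 0.1             # "a tenth of the scale of Google search"

queries_per_day = GOOGLE_QUERIES_PER_DAY * SCALE_FRACTION
annual_inference_cost = queries_per_day * 365 * COST_PER_QUERY

print(f"Queries per day: {queries_per_day:,.0f}")          # 850,000,000
print(f"Annual inference cost: ${annual_inference_cost / 1e9:.1f}B")  # $12.4B
```

That yields roughly $12 billion a year in inference cost alone, the same order of magnitude as the $16 billion annual figure; the $48.1 billion is the separate up-front GPU investment, which this sketch does not model.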
Source: Tired of shortages, OpenAI considers making its own AI chips
But… where is this $50+ billion going to come from? I know a lot of people like ChatGPT, but many of the non-technical users I've discussed it with have since moved back to standard web searches.