What makes you say that? Their competitor is Google, which has already baked AI into search and also lets you talk to Gemini separately. Google already makes a hundred billion a year from text search and won't shut down a free Gemini that adds value to that search.
Also, DeepSeek and Alibaba would love to capture OpenAI's users.
Inference (i.e. running an already trained LLM) requires current top-end hardware, but over time that will become commodity hardware. So there's only a small window (perhaps the next decade) where running an LLM as a service instead of locally even makes sense. They will likely not go away because they are too expensive, but because they are too cheap.
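To make the "run it locally" point concrete, here's a minimal sketch of local inference using the Hugging Face transformers library with a small open-weight model. The model choice (TinyLlama) is just an illustrative assumption; any checkpoint that fits your hardware works the same way:

    # Minimal local-inference sketch (Python, Hugging Face transformers).
    # Assumes a small open-weight model that fits on commodity hardware;
    # TinyLlama/TinyLlama-1.1B-Chat-v1.0 is only an illustrative choice.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
        device_map="auto",  # use a GPU if available, otherwise fall back to CPU
    )

    out = generator("Why is local LLM inference getting cheaper?", max_new_tokens=64)
    print(out[0]["generated_text"])

Nothing here depends on a hosted service; as this kind of setup gets cheap enough, paying for inference as a service becomes harder to justify for many uses.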