The vast majority of people are not clamoring for these "LLM slopbots," as evidenced by how many subscriptions are subsidized. I emphasize that point because the actual costs are off the charts!
We were promised cures for cancer and solutions to global warming. While there are genuinely great use cases, particularly, for my tastes, in machine learning and cancer detection, the vast majority of users on Gemini, "Nano banana," OpenAI, etc., are producing photos of monkeys riding surfboards. Fun, I'm sure. Useful? Not so sure.
However, when most folks talk about "AI" these days, and about the massive infrastructure build-out, they largely mean these LLMs, so my viewpoint should be seen through that lens. I'm a huge proponent of ML and even SLMs. LLMs... not so much.
Using the latest models means using a subsidized tier. The numbers are not adding up, and I'm not convinced that silicon development and energy efficiency will improve quickly enough to offset the complexity of the latest reasoning models, which burn through tokens internally before producing an output. Consumers are going to have to switch from a "how many tokens do I get for $20" mindset to a "what is the cost per task" model.
The challenge with this is that in order to familiarize the market with these tools, you need low-friction, flat-rate access. If users have to weigh the cost of every single prompt, it kills the experimentation needed to actually learn the technology.
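To make the "cost per task" point concrete, here is a minimal sketch of the math. Every number in it is hypothetical (the token price, the reasoning overhead, the task size); real figures vary widely by provider and model, but the shape of the problem is the same: hidden reasoning tokens multiply the true cost of each task.

```python
# Hypothetical illustration of flat-rate vs. per-task pricing.
# All numbers are made up for the sake of the arithmetic.

PRICE_PER_1M_TOKENS = 15.00   # assumed provider price, USD per million tokens
REASONING_OVERHEAD = 10       # assumed hidden "thinking" tokens per visible output token

def cost_per_task(visible_output_tokens: int) -> float:
    """True cost of one task once internal reasoning tokens are billed."""
    total_tokens = visible_output_tokens * (1 + REASONING_OVERHEAD)
    return total_tokens * PRICE_PER_1M_TOKENS / 1_000_000

# Under these assumptions, a 2,000-token answer burns ~22,000 tokens in total:
task_cost = cost_per_task(2_000)         # 0.33 USD per task
tasks_in_flat_rate = 20.00 / task_cost   # ~60 tasks before a $20/month plan loses money
```

Sixty tasks a month is roughly two per day, which is why flat-rate plans only pencil out while someone else is absorbing the difference.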
Finally, I'll share with y'all the following article, which is quite eye-opening (bolding is my own):
Please note the summary conclusion:

"To put this in perspective, consider that the Hoover Dam in Nevada generates about 4 TWh per year; the Palo Verde nuclear power plant in Arizona generates 32 TWh per year, and the Three Gorges Dam in China is expected to generate 90 TWh per year. But between 2028 and 2030, given the current rate of growth, AI data center power demands will increase by 350 TWh, which is nearly three times as much energy as all of those generating facilities combined.[2]

No single change will shrink that gap. For the semiconductor industry to continue growing at the current pace, it will require changes from the grid on down, and from the chip up. And even then, it’s not clear if that will really close the gap, or whether it will simply enable AI data centers to grow even larger."

~ Ed Sperling, https://semiengineering.com/crisis-ahead-power-consumption-in-ai-data-centers/
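The article's comparison checks out on the back of an envelope. Using only the generation figures quoted above:

```python
# Sanity check of the article's comparison, using its own quoted figures.
hoover_dam = 4        # TWh/year
palo_verde = 32       # TWh/year
three_gorges = 90     # TWh/year
combined = hoover_dam + palo_verde + three_gorges   # 126 TWh/year

projected_ai_growth = 350                # TWh of added demand, 2028-2030
ratio = projected_ai_growth / combined   # ~2.78, i.e. "nearly three times"
```

So the projected growth alone is about 2.8 Hoover-Dams-plus-Palo-Verdes-plus-Three-Gorges of new demand, which is the gap the article says no single change will close.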
Just my $0.02.