Let's be fair - not all of the masses are so ignorant.
If you compare API vs subscription, you probably get more bang for your buck paying $20 USD/month than paying per million tokens via API calls. At least for OAI models. It's legitimately a good deal for heavy users.
For simpler stuff and/or if you have decent hardware? For sure - go local. Qwen3-4B 2507 Instruct matches or surpasses GPT-4.1 nano and mini on almost all benchmarks...and you can run it on your phone. I know because it (or the ablit version) is my go-to at home. It's stupidly strong for a 4B.
But if you need SOTA (or near to) and are rocking typical consumer grade hardware, then $20/month for basically unlimited tokens is the reason for subscription.
Ah, but a subscription to OpenAI ChatGPT ($20 USD) gives you access to ChatGPT 5.3 codex bundled in, with some really generous usage allowances (well, compared to Claude).
I haven't looked recently, but API calls to Codex 5.2 via OR were silly expensive per million tokens; I can't imagine 5.3 is any cheaper.
To be fair to your point: I doubt many people sign up specifically for this (let's say 20%, if we're making up numbers). It's still a good deal though. I can chew through 30 million tokens in pretty much a day when I'm going hammer and tongs at stuff.
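Just to put rough numbers on that: here's a back-of-envelope break-even sketch. The API rate below is a made-up placeholder, not a real OpenAI price - plug in whatever the current per-million-token rate actually is.

```python
# Back-of-envelope: flat $20/month subscription vs per-token API billing.
# All rates here are illustrative assumptions, NOT real OpenAI pricing.

SUBSCRIPTION_USD_PER_MONTH = 20.0
API_USD_PER_MILLION_TOKENS = 10.0    # assumed blended input/output rate
TOKENS_PER_HEAVY_DAY = 30_000_000    # the "30 million tokens in a day" figure above
DAYS_PER_MONTH = 30

# What that heavy usage would cost if paid per token via the API
monthly_api_cost = (TOKENS_PER_HEAVY_DAY / 1_000_000) * API_USD_PER_MILLION_TOKENS * DAYS_PER_MONTH

# How many tokens per month before the subscription beats the API
breakeven_tokens = SUBSCRIPTION_USD_PER_MONTH / API_USD_PER_MILLION_TOKENS * 1_000_000

print(f"API cost at heavy usage: ${monthly_api_cost:,.0f}/month")
print(f"Subscription breaks even after {breakeven_tokens / 1_000_000:.0f}M tokens/month")
```

At those assumed rates, the subscription pays for itself after a couple of million tokens a month, and heavy use via the API would run into the thousands of dollars.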
Frankly, I don't understand how OAI remain solvent. They're eating a lot of shit in their "undercut the competition to take over the market" phase. But hey, if they're giving it away, sure, I'll take it.