On top of that, apparently long-forgotten findings from the days of plasma-vs-LCD (home plasma displays are now a thing of the past) consistently showed that better color and contrast beat better resolution every time in subjective quality assessments. So I think a better question to ask is why networks keep streaming disgustingly compressed video with crushed black levels. Packing in more pixels can only make that worse, not better.
On file-sharing networks I'll see some movie that's in theaters now with 13,000+ participants. It turns out to be an awful camcorder or phone capture, really shaky and blurry. The audio is marginal at best; I'd have trouble understanding it even if it were in English or a language I know somewhat, but it often turns out to be in a language where the only words I know are Da and Do-svidaniya.
A few weeks later the Blu-ray comes out and there are maybe 3 people sharing it. If you're lucky the download completes, but most of the time I just buy the Blu-ray anyway.
I have an ATSC 3.0 receiver and an antenna pointed at Syracuse. There is exactly one ATSC 3.0 transmitter there, with beautiful 1080p signals from the big three networks; unfortunately the power allocation of the ABC signal makes it unviewable, just like the ATSC 1.0 ABC signal. My Android devices can decode the AC-4 audio, but the Xbox One and Apple devices just can't. Instead of a 4K signal they are just multiplexing a bunch of normal channels. I hear the real reason Sinclair is pushing ATSC 3.0 is that they want to add a gambling app to sports broadcasts.
I have stimulus cash burning a hole in my pocket and can't buy the electronics I really want. I think about buying a 4K HDR TV (those are available), but for the 4K to really be worth it I'd need a bigger TV, which means changing the setup in the room and finding a different place to put it. Thank God I have a man cave and won't have to argue with my wife about it.
I have a laptop with a 4K screen. It's cool and most apps work OK, but the size of the labels and UI components in Lightroom is never right, particularly when I have multiple monitors plugged in. If people keep shoving money at me for taking photographs, maybe I can afford Capture One in addition to Creative Cloud, and that will be better.
As for 4K gaming, the 1070 card in that laptop wouldn't come close to decent frame rates. The adulterer down the road gloats that he has a recent-model GFX card he uses to mine Ethereum, but GFX cards good enough for 4K gaming just aren't available to decent folks.
https://www.blu-ray.com/movies/search.php?keyword=&studioid=...
There are tons of 4K UHD Blu-rays. All of Apple's TV+ content is in 4K and HDR. Isn't most of Netflix's original content in 4K now too?
If you've got a capable gaming PC, lots of games support 4K resolution, HDR, and high frame rates, provided you have a display that can handle them.
I personally have no trouble feeding 4K content into my LG OLED TV, and it looks glorious.
Then there's the other question: does the average person even notice the difference between a 1080p TV and a 4K TV?
Most of the time when you see a 1080p stream from cable TV or most streaming services, it's so heavily compressed it doesn't look that good. I'm still blown away when I walk into Best Buy and see the demo content, even on a 1080p TV.
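A rough back-of-the-envelope calculation shows why compression matters more than pixel count. The bitrates below are assumed typical figures for illustration, not measurements from any particular service:

```python
# Compare average encoded bits available per pixel per frame.
# Bitrates are hypothetical but in the right ballpark:
#   ~8 Mbps for a compressed cable/streaming 1080p feed,
#   ~30 Mbps for a 1080p Blu-ray,
#   ~15 Mbps for a typical 4K stream.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average encoded bits per pixel per frame at a given bitrate."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

cable_1080 = bits_per_pixel(8, 1920, 1080, 30)    # compressed cable/stream
bluray_1080 = bits_per_pixel(30, 1920, 1080, 24)  # 1080p Blu-ray
stream_4k = bits_per_pixel(15, 3840, 2160, 30)    # typical 4K stream

print(f"cable 1080p:   {cable_1080:.3f} bits/pixel")
print(f"Blu-ray 1080p: {bluray_1080:.3f} bits/pixel")
print(f"4K stream:     {stream_4k:.3f} bits/pixel")
# At these assumed bitrates the 4K stream gets FEWER bits per pixel than
# even compressed 1080p cable: more pixels spread the same bits thinner.
```

Under these assumptions a 1080p Blu-ray has several times the bits per pixel of a 4K stream, which is why it can look better despite the lower resolution.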
It's also more expensive to shoot, and being able to see people's skin blemishes in ultra-sharp clarity doesn't land in the pro column.