For average folks it's not a biggie. Go for what fits your needs, situation and budget.
Last year, I replaced my ancient Dell 2005FPW, which I bought 17 years ago, with a 21.5-inch LG UltraFine I found on eBay for half the original MSRP ($342 US). It was brand new, with a 2017 manufacture date, so I have no idea where it had been hiding all those years.
The Dell used the same 20-inch panel that the Cinema Display of that era featured: $700 on sale for the Dell, $1,299 for the nearly identical Apple monitor. I still remember the price differential, and reading the AnandTech review that pointed it out. The Apple tax goes way back.
Comparing my old monitor to the new one, the form factor and case color are almost identical. The panels are completely different: the Dell is 1680×1050 with a matte finish, while the LG has a semi-glossy 4096×2304 "Retina" display.
For me, personally, this was a massive upgrade and easily worth what I paid for it. However, when I've had non-tech users sit down and use my Mac mini with the LG, they didn't notice a difference. One person asked me when I was going to set up my new monitor. Two others had no idea that I had swapped anything.
We tech nerds notice and value this sort of upgrade, but many average users don't perceive anything beyond the size and color of the monitor casing. It all depends on the individual: for some, a standard-definition monitor is just fine; for others, the "Retina" experience is a must. When Apple changed macOS to favor "Retina" displays, I stayed on Mojave, because newer versions gave me physical headaches on the Dell. I only switched to Monterey when I got the UltraFine.
From here on out, I'm sticking to ~218ppi displays with my Macs. I prefer running at Apple's recommended resolution, and anything else is unacceptable to me. However, plenty of folks are fine with standard definition, or anything in between, because we all have different preferences, tolerances, and eyesight.
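If you're wondering where that ~218ppi figure comes from, pixel density is just the panel's diagonal pixel count divided by its diagonal size in inches. Here's a quick sketch of the math for the two monitors above (the 20.1-inch diagonal for the Dell is the commonly listed spec; the helper function is mine):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixel density: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Dell 2005FPW: 1680x1050 on a 20.1-inch panel
print(f"Dell 2005FPW: {ppi(1680, 1050, 20.1):.0f} ppi")  # ~99 ppi

# LG UltraFine 4K: 4096x2304 on a 21.5-inch panel
print(f"LG UltraFine: {ppi(4096, 2304, 21.5):.0f} ppi")  # ~219 ppi
```

The LG lands right at Apple's ~218ppi "Retina" target, which is why it looks native at 2x scaling, while the Dell sits under 100ppi.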