A new study published in Nature by University of Cambridge researchers just dropped a pixelated bomb on the entire Ultra-HD market, but as anyone with myopia can tell you, if you take your glasses off, even SD still looks pretty good :)

  • tankplanker@lemmy.world · 6 days ago

    Quality of the system is such a massive dependency here. I can well believe that someone watching old reruns from a shitty streaming service, upscaled to 1080p or 4K by a TV they bought from the supermarket with coupons collected from their breakfast cereal, is going to struggle to tell the difference.

    Likewise, if you fed the TVs from a high-end 4K Blu-ray player with a disc considered reference quality, such as Interstellar, you would still struggle to tell the difference even on a more midrange TV, unless the screens get comically large for the viewing distance and the 1080p panel starts to look pixelated.

    I think very few people would expect amazing sound from the old wired Apple earphones they got free with their iPhone 4, yet people seem to ignore the same thing with cheap TVs. I am not advocating for ultra-high-end audio/videophile nonsense with systems costing tens of thousands, just pointing out that quite large and noticeable gains are available much lower down the scale.

    Depending on what you watch and how you watch it, good-quality HDR for the right content is an absolute home run for the difference between standard 1080p and 4K HDR, if your TV can do true black. Shit TVs do HDR shittily; it's just not comparable to a decent TV and source. It's like playing high-res lossless audio on those old Apple wired earphones vs. playing low-bitrate MP3s.

  • TheFeatureCreature@lemmy.ca · 9 days ago

    Kind of a tangent, but properly encoded 1080p video with a decent bitrate actually looks pretty damn good.

    A big problem is that we’ve gotten so used to streaming services delivering visual slop, like YouTube’s 1080p option which is basically just upscaled 720p and can even look as bad as 480p.

    • Feyd@programming.dev · 9 days ago

      Yeah I’d way rather have higher bitrate 1080 than 4k. Seeing striping in big dark or light spots on the screen is infuriating

    • woelkchen@lemmy.world · 9 days ago

      A big problem is that we’ve gotten so used to streaming services delivering visual slop, like YouTube’s 1080p option which is basically just upscaled 720p and can even look as bad as 480p.

      YouTube locks the good bitrates behind the premium paywall, and even as a premium user you don't get to select a high bitrate when the source video was low res.

      That’s why videos should be upscaled before upload to force YouTube into offering high bitrate options at all. A good upscaler produces better results than simply stretching low-res videos.

      • azertyfun@sh.itjust.works · 7 days ago

        I think the premium thing is a channel option. Some channels consistently have it, some don’t.

        Regular YouTube 1080p is bad and feels like 720p. The encoding on videos with “Premium 1080p” is catastrophic. It’s significantly worse than decently encoded 480p. Creators will put a lot of time and effort in their lighting and camera gear, then the compression artifacting makes the video feel like watching a porn bootleg on a shady site. I guess there must be a strong financial incentive to nuke their video quality this way.

    • notfromhere@lemmy.ml · 9 days ago

      I can still find 480p videos from when YouTube first started that rival the quality of the compressed crap “1080p” we get from YouTube today. It’s outrageous.

      • IronKrill@lemmy.ca · 7 days ago (edited)

        Sadly most of those older YouTube videos have been run through multiple re-compressions and look so much worse than they did at upload. It’s a major bummer.

    • SaharaMaleikuhm@feddit.org · 8 days ago

      This. The visual difference of good vs bad 1080p is bigger than between good 1080p and good 4k. I will die on this hill. And Youtube’s 1080p is garbage on purpose so they get you to buy premium to unlock good 1080p. Assholes

      • TheFeatureCreature@lemmy.ca · 8 days ago

        The 1080p for premium users is garbage too. YouTube's video quality in general is shockingly poor. If there is even a slight amount of noisy movement on screen (foliage, confetti, rain, snow, etc.) the video can literally become unwatchable.

    • deranger@sh.itjust.works · 9 days ago

      HEVC is damn efficient. I don’t even bother with HD because a 4K HDR encode around 5-10GB looks really good and streams well for my remote users.

    • Omega_Jimes@lemmy.ca · 8 days ago

      I've been investing in my Blu-ray collection again and I can't believe how good 1080p Blu-rays look compared to "UHD streaming".

  • Hackworth@piefed.ca · 9 days ago (edited)

    I can pretty confidently say that 4k is noticeable if you’re sitting close to a big tv. I don’t know that 8k would ever really be noticeable, unless the screen is strapped to your face, a la VR. For most cases, 1080p is fine, and there are other factors that start to matter way more than resolution after HD. Bit-rate, compression type, dynamic range, etc.

    • Credibly_Human@lemmy.world · 9 days ago (edited)

      Seriously, articles like this are just clickbait.

      They also ignore all sorts of usecases.

      Like for a desktop monitor, 4k is extremely noticeable vs even 1440P or 1080P/2k

      Unless you’re sitting very far away, the sharpness of text and therefore amount of readable information you can fit on the screen changes dramatically.

      • cmnybo@discuss.tchncs.de · 9 days ago

        The article was about TVs, not computer monitors. Most people don’t sit nearly as close to a TV as they do a monitor.

        • Credibly_Human@lemmy.world · 9 days ago

          Oh absolutely, but even TVs are used in different contexts.

          Like the thing about text applies to console games, applies to menus, applies to certain types of high detail media etc.

      • Rai@lemmy.dbzer0.com · 9 days ago

        Complete bullshit articles. The same thing happened when 720p became 1080p. So many echoes of "oh, you won't see the difference unless the screen is huge"… like no, you can see the difference on a tiny screen.

        We’ll have these same bullshit arguments when 8k becomes the standard, and for every large upgrade from there.

        • CybranM@feddit.nu · 8 days ago

          I agree to a certain extent but there are diminishing returns, same with refreshrates. The leap from 1080 to 4k is big. I don’t know how noticeable upgrading from 4k to 8k would be for the average TV setup.

          For vr it would be awesome though

    • Tarquinn2049@lemmy.world · 9 days ago (edited)

      So, a 55-inch TV, which is pretty much the smallest 4k TV you could get when they were new, has benefits over 1080p at a distance of 7.5 feet… how far away do people watch their TVs from? Am I weird?

      And at the size of computer monitors, for the distance they are from your face, they would always have full benefit on this chart. And even working into 8k a decent amount.

      And that’s only for people with typical vision, for people with above-average acuity, the benefits would start further away.

      But yeah, for VR for sure, since having an 8k screen there would directly determine how far away a 4k flat screen can be properly re-created. If your headset is only 4k, a 4k flat screen in VR is only worth it when it takes up most of your field of view. That’s how I have mine set up, but I would imagine most people would prefer it to be half the size or twice the distance away, or a combination.

      So 8k screens in VR will be very relevant for augmented reality, since performance costs there are pretty low anyway. And still convey benefits if you are running actual VR games at half the physical panel resolution due to performance demand being too high otherwise. You get some relatively free upscaling then. Won’t look as good as native 8k, but benefits a bit anyway.

      There is also fixed and dynamic foveated rendering to think about. With an 8k screen, even running only 10% of it at that resolution, 20% at 4k, 30% at 1080p, and the remaining 40% at 540p, and even with the overhead of so many foveation steps, you'll get a notable reduction in performance cost. Fixed foveated would likely need to lean towards bigger percentages of higher res, but it has the performance advantage of not having to move around at all from frame to frame, and can benefit from more pre-planning and optimization.
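
      As a rough sketch of that pixel budget (the percentages are just the illustrative ones above, and I'm assuming an 8k panel of 7680x4320 with each step down halving the linear resolution):

```python
# Back-of-the-envelope pixel budget for the foveated split described above.
# The area shares and densities are illustrative, not a real renderer's settings.
NATIVE_8K = 7680 * 4320  # ~33.2 million pixels

steps = [
    (0.10, 1.0),       # 10% of the area at full 8k density
    (0.20, 1 / 4),     # 20% at 4k density (half the linear res -> 1/4 the pixels)
    (0.30, 1 / 16),    # 30% at 1080p density
    (0.40, 1 / 64),    # 40% at 540p density
]

rendered = sum(area * density * NATIVE_8K for area, density in steps)
print(f"pixels rendered:       {rendered / 1e6:.1f} M")          # ~5.8 M
print(f"fraction of native 8k: {rendered / NATIVE_8K:.1%}")      # ~17.5%
print(f"vs a native 4k render: {rendered / (3840 * 2160):.1%}")  # ~70%
```

      Even with this naive accounting (no overhead for the extra passes), the budget lands well under a native 4k render, which is the point.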

      • 4am@lemmy.zip · 9 days ago

        A lot of us mount a TV on the wall and watch from a couch across the room.

        • Tarquinn2049@lemmy.world · 9 days ago (edited)

          And you get a TV small enough that it doesn’t suit that purpose? Looks like 75 inch to 85 inch is what would suit that use case. Big, but still common enough.

      • Damage@feddit.it · 9 days ago (edited)

        I've got an LCD 55" TV and a 14" laptop. On the couch, the TV screen looks to me about as big as the laptop screen on my belly/lap, and I've got perfect vision; on the laptop I can clearly see the difference between 4K and Full HD, on the TV, not so much.

        I think TV screens aren’t as good as PC ones, but also the TVs’ image processors turn the 1080p files into better images than what computers do.

        • Tarquinn2049@lemmy.world · 9 days ago (edited)

          Hmm, I suppose the quality of the TV might matter. Not to mention actually going through the settings and making sure it isn't doing anything to process the signal. And also not streaming compressed crap to it. I do visit other people's houses sometimes and definitely wouldn't know they were using a 4k screen to watch what they are watching.

          But I am assuming actually displaying 4k content to be part of the testing parameters.

          • Damage@feddit.it · 8 days ago

            Yeah well my comparisons are all with local files, no streaming compression

            • Tarquinn2049@lemmy.world · 8 days ago

              Also, usually when people use the term "perfect" vision, they mean 20/20. Is that the case for you too? Another term for that is average vision, with people who can see better than that having "better than average" vision.

              • Damage@feddit.it · 8 days ago

                Idk what 20/20 is, I guess you guys use a different scale, last mandatory vision test at work was 12/10 with 6/7 on I don’t remember which color recognition range, but I’m not sure about the latter 'cause it was ok last year and 6/7 the year before also. IIRC the best score for visual acuity is 18/10, but I don’t think they test that far during work visits, I’d have to go to the ophthalmologist to know.

                • Tarquinn2049@lemmy.world · 8 days ago

                  I would imagine it’s the same scale, just a base 10 feet instead of 20 feet. So in yours you would see at 24 feet what the average person would see at 20 feet. Assuming there is a linear relation, and no circumstantial drop off.
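
                  If it helps, the conversion between the two notations is just a ratio; a tiny sketch, treating 12/10 as a decimal-style score in tenths (an assumption about how that test was scored):

```python
# Convert a visual-acuity fraction to decimal acuity and to 20-foot notation.
def describe(numerator: float, denominator: float) -> str:
    acuity = numerator / denominator      # 12/10 -> 1.2, 20/20 -> 1.0
    snellen_den = 20 / acuity             # 20/17 for acuity 1.2
    far = 20 * acuity                     # resolves at 24 ft what a 20/20 eye does at 20 ft
    return (f"decimal {acuity:.2f}, roughly Snellen 20/{snellen_den:.0f}; "
            f"resolves at {far:.0f} ft what 20/20 resolves at 20 ft")

print(describe(12, 10))
print(describe(20, 20))
```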

    • frongt@lemmy.zip · 9 days ago

      There’s a giant TV at my gym that is mounted right in front of some of the equipment, so my face is inches away. It must have some insane resolution because everything is still as sharp as a standard LCD panel.

    • Frezik@lemmy.blahaj.zone · 9 days ago

      The counterpoint is that if you’re sitting that close to a big TV, it’s going to fill your field of view to an uncomfortable degree.

      4k and higher is for small screens close up (desktop monitor), or very large screens in dedicated home theater spaces. The kind that would only fit in a McMansion, anyway.

    • EndlessNightmare@reddthat.com · 8 days ago

      8K would probably be really good for large computer monitors, due to viewing distances. It would be really taxing on the hardware if you were using it for gaming, but reasonable for tasks that aren’t graphically intense.

      Computer monitors (for productivity tasks) are a little different, though, in that you are looking at a section of the screen rather than the screen as a whole, as one might with video. So having extra screen real estate can be rather valuable.

    • ubergeek@lemmy.today · 9 days ago

      Good to know that pretty much anything looks fine on my TV, at typical viewing distances.

    • Lemming6969@lemmy.world · 9 days ago

      People are legit sitting 15+ feet away and thinking a 55 inch TV is good enough… Optimal viewing angles for most reasonably sized rooms require a 100+ inch TV and 4k or better.

    • SCmSTR@lemmy.blahaj.zone · 9 days ago

      How many feet away is a computer monitor?

      Or a 2-4 person home theater distance that has good fov fill?

    • ZoteTheMighty@lemmy.zip · 9 days ago

      Would be a more useful graph if the y axis cut off at 10, less than a quarter of what it plots.

      Not sure in what universe discussing the merits of 480p at 45 ft is relevant, but it ain't this one. If I'm sitting 8 ft away from my TV, I will notice the difference if my screen is over 60 inches, which is where a vast majority of consumers operate.

  • treesquid@lemmy.world · 9 days ago

    4k is way better than 1080p, it’s not even a question. You can see that shit from a mile away. 8k is only better if your TV is comically large.

    • balance8873@lemmy.myserv.one · 9 days ago (edited)

      I think you overestimate the quality of many humans’ eyes. Many people walk around with slightly bad vision no problem. Many older folks have bad vision even corrected. I cannot distinguish between 1080 and 4k in the majority of circumstances. Stick me in front of a computer and I can notice, but tvs and computers are at wildly different distances.

    • SereneSadie@lemmy.myserv.one · 9 days ago

      I can immediately tell when a game is running at 1080p on my 2K monitor (yeah, I’m not interested in 4K over higher refresh rate, so I’m picking the middle ground.)

      It's blatantly obvious when everything suddenly looks muddy and washed together.

      • psycotica0@lemmy.ca · 9 days ago

        I think that’s relevant to the discussion though. Most people sit like two feet from their gaming monitor and lean forward in their chair to make the character go faster.

        But most people put a big TV on the other side of a boring white room, with a bare white ikea coffee table in between you and it, and I bet it doesn’t matter as much.

        I bet the closest people ever are to their TV is when they’re at the store buying it…

  • fritobugger2017@lemmy.world · 9 days ago

    The study used a 44 inch TV at 2.5m. The most commonly used calculator for minimum TV size at a given distance says that at 2.5m the TV should be at least 60 inches.
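
    For anyone who wants to reproduce the calculator's answer, the usual rule of thumb is a viewing angle of roughly 30 degrees (the exact angle varies by source, so treat the output as approximate):

```python
import math

def min_diagonal_inches(distance_m: float, viewing_angle_deg: float = 30.0) -> float:
    """Smallest 16:9 diagonal that fills the given viewing angle at this distance."""
    distance_in = distance_m / 0.0254
    width_in = 2 * distance_in * math.tan(math.radians(viewing_angle_deg / 2))
    return width_in * math.hypot(16, 9) / 16   # width -> diagonal for 16:9

print(f"{min_diagonal_inches(2.5):.1f} in")    # ~60 in at 2.5 m, matching the calculator
```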

    My own informal tests at home with a 65 inch TV looking at 1080 versus 4K Remux of the same movie seems to go along with the distance calculator. At the appropriate distance or nearer I can see a difference if I am viewing critically (as opposed to casually). Beyond a certain distance the difference is not apparent.

    • markko@lemmy.world · 8 days ago

      Exactly. This title is just clickbait.

      The actual study’s title is “Resolution limit of the eye — how many pixels can we see?”.

      • definitemaybe@lemmy.ca · 8 days ago

        Can’t believe I had to scroll down this far to find this:

        Here’s the gut-punch for the typical living room, however. If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish. The scientists made it crystal clear: once your setup hits that threshold, any further increase in pixel count, like moving from 4K to an 8K model of the same size and distance, hits the law of diminishing returns because your eye simply can’t detect the added detail.

        On a computer monitor, it’s easily apparent because you’re not sitting 2+ m away, and in a living room, 44" is tiny, by recent standards.

      • SaveTheTuaHawk@lemmy.ca · 8 days ago

        Exactly why big box stores force you to look at TVs in narrow aisles, not at the distances typical at home. They also only properly calibrate the picture on the highest-margin models.

  • Surp@lemmy.world · 8 days ago

    8k no. 4k with a 4k Blu-ray player on actual non upscaled 4k movies is fucking amazing.

    • Stalinwolf@lemmy.ca · 8 days ago (edited)

      I don't know if this will age like my previous belief that the PS1 had photo-realistic graphics, but I feel like 4k is the peak for TVs. I recently bought a 65" 4k TV and not only is it the clearest image I've ever seen, but it takes up a good chunk of my living room. Any larger would just look ridiculous.

      Unless the average person starts using abandoned cathedrals as their living rooms, I don't see how larger TVs with even higher definition would even be practical. Especially if you consider we already have 8k for those who do use cathedral entertainment systems.

      • brucethemoose@lemmy.world · 8 days ago (edited)

        (Most) TVs still have a long way to go with color space and brightness. AKA HDR. Not to speak of more sane color/calibration standards to make the picture more consistent, and higher ‘standard’ framerates than 24FPS.

        But yeah, 8K… I dunno about that. Seems like a massive waste. And I am a pixel peeper.

        • JigglySackles@lemmy.world · 8 days ago

          For media I highly agree. 8k doesn’t seem to add much. For computer screens I can see the purpose though as it adds more screen real estate which is hard to get enough of for some of us. I’d love to have multiple 8k screens so I can organize and spread out my work.

          • brucethemoose@lemmy.world · 8 days ago

            Are you sure about that? You likely use DPI scaling at 4K, and you’re likely limited by physical screen size unless you already use a 50” TV (which is equivalent to 4x standard 25” 1080p monitors).

            8K would only help at like 65”+, which is kinda crazy for a monitor on a desk… Awesome if you can swing it, but most can’t.


            I tangentially agree though. PCs can use “extra” resolution for various things like upscaling, better text rendering and such rather easily.

            • JigglySackles@lemmy.world · 8 days ago

              Truthfully I haven’t gotten a chance to use an 8k screen, so my statement is more hypothetical “I can see a possible benefit”.

              • brucethemoose@lemmy.world · 8 days ago (edited)

                I’ve used 5K some.

                IMO the only ostensible benefit is for computer type stuff. It gives them more headroom to upscale content well, to avoid anti aliasing or blurry, scaled UI rendering, stuff like that. 4:1 rendering (to save power) would be quite viable too.

                Another example would be editing workflows, for 1:1 pixel mapping of content while leaving plenty of room for the UI.

                But for native content? Like movies?

                Pointless, unless you are ridiculously close to a huge display, even if your vision is 20/20. And it’s too expensive to be worth it: I’d rather that money go into other technical aspects, easily.

        • SpacetimeMachine@lemmy.world · 8 days ago

          The frame rate really doesn’t need to be higher. I fully understand filmmakers who balk at the idea of 48 or 60 fps movies. It really does change the feel of them and imo not in a necessarily positive way.

          • brucethemoose@lemmy.world · 8 days ago (edited)

            I respectfully disagree. Folks' eyes are 'used' to 24P, but native 48 or 60 looks infinitely better, especially when stuff is filmed/produced with that in mind.

            But at a bare minimum, baseline TVs should at least eliminate jitter with 24P content by default, and offer better motion clarity by moving on from LCDs, using black frame insertion or whatever.

    • HugeNerd@lemmy.ca · 8 days ago

      I think you’re right but how many movies are available in UHD? Not too many I’d think. On my thrifting runs I’ve picked up 200 Blurays vs 3 UHDs. If we can map that ratio to the retail market that’s ~1% UHD content.

  • OR3X@lemmy.world · 8 days ago

    ITT: people defending their 4K/8K display purchases as if this study was a personal attack on their financial decision making.

    • treesquid@lemmy.world · 8 days ago

      My 50" 4K TV was $250. That TV is now $200, nobody is flexing the resolution of their 4k TV, that’s just a regular cheap-ass TV now. When I got home and started using my new TV, right next to my old 1080p TV just to compare, the difference in resolution was instantly apparent. It’s not people trying to defend their purchase, it’s people questioning the methodology of the study because the difference between 1080p and 4k is stark unless your TV is small or you’re far away from it. If you play video games, it’s especially obvious.

      • michaelmrose@lemmy.world · 8 days ago

        Old people with bad eyesight watching their 50" 12 feet away in their big ass living room vs young people with good eyesight 5 feet away from their 65-70" playing a game might have inherently differing opinions.

        12’ 50" FHD = 112 PPD

        5’ 70" FHD = 36 PPD

      The study basically says that FHD is about as good as you can get 10 feet away on a 50" screen, all other things being equal. That doesn't seem that unreasonable.
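
      Those PPD figures are easy to sanity-check. A rough sketch, using the small-angle approximation at the centre of a 16:9 screen, so expect a few PPD of wiggle room versus other calculators:

```python
import math

def pixels_per_degree(diag_in: float, horiz_px: int, distance_ft: float) -> float:
    """Approximate pixels per degree at the centre of a 16:9 screen."""
    width_in = diag_in * 16 / math.hypot(16, 9)
    ppi = horiz_px / width_in
    inches_per_degree = distance_ft * 12 * math.tan(math.radians(1))
    return ppi * inches_per_degree

print(f"{pixels_per_degree(50, 1920, 12):.0f} PPD")  # ~111: 50-inch FHD at 12 ft
print(f"{pixels_per_degree(70, 1920, 5):.0f} PPD")   # ~33:  70-inch FHD at 5 ft
print(f"{pixels_per_degree(70, 3840, 8):.0f} PPD")   # ~106: 70-inch 4K at 8 ft
```

      Against the ~94 PPD ceiling quoted elsewhere in this thread, the 12 ft / 50-inch case is already past the point where extra pixels help, while the close-up cases are nowhere near it.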

    • Nalivai@lemmy.world · 8 days ago (edited)

      Right? “Yeah, there is a scientific study about it, but what if I didn’t read it and go by feelings? Then I will be right and don’t have to reexamine shit about my life, isn’t that convenient”

    • michaelmrose@lemmy.world · 8 days ago

      They don't need to; this study does it for them. 94 pixels per degree is the top end of perceptible. On a 50" screen 10 feet away, 1080p = 93. Closer than 10 feet, or larger than 50", or some combination of both, and it's better to have a higher resolution.

      For millennials home ownership has crashed but TVs are cheaper and cheaper. For the half of motherfuckers rocking a 70" TV that cost $600 in their shitty apartment, where they sit 8 feet from the TV, it's pretty obvious 4K is better at 109 vs 54 PPD.

      Also, although the article points out that there are other features that matter as much as resolution, these aren't uncorrelated factors. 1080p TVs of any size in 2025 are normally bargain-basement garbage that suck on all fronts.

  • lepinkainen@lemmy.world · 8 days ago

    4k with shit streaming bitrate is barely better than high bitrate 1080p

    But full bitrate 4k from a Blu-ray IS better.

    • kadu@scribe.disroot.org · 8 days ago (edited)

      But full bitrate 4k from a Blu-ray IS better.

      Full Blu-Ray quality 1080p sources will look significantly better than Netflix 4K.

      Hence why “4K” doesn’t actually matter unless your panel is gigantic or you’re sitting very close to it. Resolution is a very small part of our perceived notion of quality.

  • the_riviera_kid@lemmy.world · 8 days ago

    Bullshit, actual factual 8k and 4k look miles better than 1080. It's the screen size that makes a difference. On a 15-inch screen you might not see much difference, but on a 75-inch screen the difference between 1080 and 4k is immediately noticeable. A much larger screen would have the same results with 8k.

        • Soup@lemmy.world · 8 days ago

          Literally this article is about the study. Your “well-known” fact doesn’t hold up to scrutiny.

          • the_riviera_kid@lemmy.world · 8 days ago

            The other important detail to note is that screen size and distance to your TV also matters. The larger the TV, the more a higher resolution will offer a perceived benefit. Stretching a 1080p image across a 75-inch display, for example, won’t look as sharp as a 4K image on that size TV. As the age old saying goes, “it depends.”

            That's literally in the article you are claiming to be correct; maybe you should try reading it sometime.

            • Soup@lemmy.world · 8 days ago

              Yes, but you got yourself real pissy over it and have just now admitted that the one piece of criticism you had in your original comment was already addressed in the article. Obviously if we start talking about situations that are extreme outliers there will be edge cases but you’re not adding anything to the conversation by acting like you’ve found some failure that, in reality, the article already addressed.

              I'm not sure you have the reading comprehension and/or the intention to have any kind of real conversation to continue this discussion further.

        • JigglySackles@lemmy.world · 8 days ago

          So I have a pet theory about studies like this. There are many things out there that many of us take for granted as givens in our daily lives. But there are likely just as many people to whom this knowledge is either unknown or not actually apparent. The reasons can be any number of things: a lack of experience in the given area, skepticism that their anecdotal evidence is truly correct despite appearances, and so on.

          What these “obvious thing is obvious” studies accomplish is setting a factual precedent for the people in the back. The people who are uninformed, not experienced enough, skeptical, contrarian, etc.

          The studies seem wasteful upfront, but sometimes a thing needs to be said aloud to galvanize the factual evidence and give basis to the overwhelming anecdotal evidence.

    • kadu@scribe.disroot.org · 8 days ago

      It’s the screen size that makes a difference

      Not by itself; the distance is extremely relevant. And at the distance a normal person sits from a large screen, it needs to get very large indeed for 4k to matter, let alone 8k.

    • mean_bean279@lemmy.world · 8 days ago

      I like how you’re calling bullshit on a study because you feel like you know better.

      Read the report and go check the study. They note that the biggest gains in human visibility for displays come from contrast (the largest factor), brightness, and color accuracy, all of which have drastically improved over the last 15 years. Look at a really good high-end 1080p monitor and a low-end 4k monitor and you will actively choose the 1080p monitor. It's more pleasing to the eye, and you don't notice the difference in pixel size at that scale.

      Sure, distance plays some role, but they also accounted for that by performing the test at the same distance with the same screen size. They're controlling for a variable you aren't even controlling for in your own comment.

      • SeriousMite@lemmy.world · 8 days ago

        This has been my experience going from 1080 to 4K. It’s not the resolution, it’s the brighter colors that make the most difference.

        • M0oP0o@mander.xyz · 7 days ago

          And that's not related to the resolution, yet people have tied higher resolutions to better quality.

      • Corhen@lemmy.world · 8 days ago

        Have a 75" display, the size is nice, but still a ways from a theater experience, would really need 95" plus.

  • arthurpizza@lemmy.world · 8 days ago

    An overly compressed 4k stream will look far worse than good quality 1080p. We keep upping the resolution without adopting newer codecs or adjusting the bitrate.
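
    To put numbers on "not adjusting the bitrate", assume an illustrative 16 Mbps stream (a made-up but plausible figure, not any particular service's) and see how the per-pixel budget collapses as resolution rises:

```python
def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float = 24.0) -> float:
    """Average encoded bits available per pixel per frame at a given stream bitrate."""
    return bitrate_mbps * 1e6 / (width * height * fps)

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: {bits_per_pixel(16, w, h):.3f} bits/pixel at 16 Mbps, 24 fps")
# 1080p: 0.322 bits/pixel
# 4K:    0.080 bits/pixel -- a quarter of the budget per pixel for the same stream
```

    A better codec (HEVC, AV1) claws some of that back, but if the bitrate stays flat, the 4K stream is spreading the same data over four times as many pixels.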

    • Psythik@lemmy.world · 8 days ago (edited)

      This is true. That said, if you can't tell the difference between 1080p and 4K from the pixels alone, then either your TV is too small or you're sitting too far away, in which case there's no point in going with 4K.

      At the right seating distance, there is a benefit to be had even by going with an 8K TV. However, very few people sit close enough/have a large enough screen to benefit from going any higher than 4K:


      Source: https://www.rtings.com/tv/learn/what-is-the-resolution

    • Squizzy@lemmy.world · 8 days ago

      I went looking for a quick explainer on this, and that side of YouTube goes so in-depth that I am more confused.

      • starelfsc2@sh.itjust.works · 8 days ago

        On codecs and bitrate? It’s basically codec = file type (.avi, .mp4) and bitrate is how much data is sent per second for the video. Videos only track what changed between frames, so a video of a still image can be 4k with a really low bitrate, but if things are moving it’ll get really blurry with a low bitrate even in 4k.

        • sue_me_please@awful.systems · 7 days ago

          “File types” like avi, mp4, etc are container formats. Codecs encode video streams that can be held in different container formats. Some container formats can only hold video streams encoded with specific codecs.

          • starelfsc2@sh.itjust.works · 5 days ago

            ah yeah I figured it wasn’t quite right, I just remember seeing the codec on the details and figured it was tied to it, thanks.

      • Redex@lemmy.world · 7 days ago

        I'll add another explanation for bitrate that I find understandable: you can think of resolution as basically the max quality of a display; no matter the bitrate, you can't display more information/pixels than the screen possesses. Bitrate, on the other hand, represents how much information you are receiving from e.g. Netflix. If you didn't use any compression, in HDR each pixel would require 30 bits, or 3.75 bytes of data. A 4k screen has about 8 million pixels. An HDR stream running at 60 fps would therefore require about 1.7GB/s of download without any compression. Bitrate is basically the measure of that, i.e. how much we've managed to compress that data flow. There are many ways you can achieve this compression, and a lot of it relates to how individual codecs work, but put simply, one of the many methods effectively involves grouping pixels into larger blocks (e.g. 32x32 pixels) and saying they all have the same colour. As a result, at low bitrates you'll start to see blocking and other visual artifacts that significantly degrade the viewing experience.
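
        The arithmetic in that paragraph checks out; here's a small sketch of it, assuming 10 bits per channel for HDR (30 bits per pixel) and no chroma subsampling:

```python
WIDTH, HEIGHT, FPS = 3840, 2160, 60
BITS_PER_PIXEL = 30                     # 10-bit HDR, three channels, no subsampling

raw_bits_per_second = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS    # ~14.9 Gbit/s
print(f"{raw_bits_per_second / 8 / 1e9:.2f} GB/s")     # ~1.87 GB/s (decimal)
print(f"{raw_bits_per_second / 8 / 2**30:.2f} GiB/s")  # ~1.74 GiB/s, the ~1.7 figure above
```

        A high-tier 4K stream at around 15-25 Mbps is therefore carrying well under 1% of that raw figure, which is why the codec's choices matter so much.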

        As a side note, one cool thing that codecs do (not sure if literally all of them do it, but I think most by far), is that not each frame is encoded in its entirety. You have, I, P and B frames. I frames (also known as keyframes) are a full frame, they’re fully defined and are basically like a picture. P frames don’t define every pixel, instead they define the difference between their frame and the previous frame, e.g. that the pixel at x: 210 y: 925 changed from red to orange. B frames do the same, but they use both previous and future frames for reference. That’s why you might sometimes notice that in a stream, even when the quality isn’t changing, every couple of seconds the picture will become really clear, before gradually degrading in quality, and then suddenly jumping up in quality again.

      • HereIAm@lemmy.world · 7 days ago

        For an ELI5 explanation, this is what happens when you lower the bit rate: https://youtu.be/QEzhxP-pdos

        No matter the resolution you have of the video, if the amount of information per frame is so low that it has to lump different coloured pixels together, it will look like crap.

      • null_dot@lemmy.dbzer0.com · 8 days ago

        The resolution (4k in this case) defines the number of pixels to be shown to the user. The bitrate defines how much data is provided in the file or stream. A codec is the method for converting data to pixels.

        Suppose you’ve recorded something in 1080p (low resolution). You could convert it to 4k, but the codec has to make up the pixels that can’t be computed from the data.

        In summary, the TV in my living room might be more capable, but my streaming provider probably isn’t sending enough data to really use it.

  • 4am@lemmy.zip · 9 days ago

    Highly depends on screen size and viewing distance, but nothing reasonable for a normal home probably ever needs more than 8k for a high end setup, and 4K for most cases.

    Contrast ratio/HDR and per-pixel backlighting type technology is where the real magic is happening.

    • MangoCats@feddit.it · 9 days ago

      Depends on your eyes quite a bit, too. If I’m sitting more than 15’ back from a 55" screen, 1080p is just fine. Put on my distance glasses and I might be able to tell the difference with 4K.

  • deranger@sh.itjust.works · 9 days ago

    If you read RTINGS before buying a TV and setting it up in your room, you already knew this. Screen size and distance to TV are important for determining what resolution you actually need.

    Most people sit way too far away from their 4K TV.

    • SCmSTR@lemmy.blahaj.zone · 9 days ago

      People who keep their tiny displays on the opposite side of the room are so funny to me. It's a similar reaction to the one I have to giant-guy, tiny-car.

      I remember one time I saw a maybe 27 inch computer monitor on the wall above a fireplace and it was just like… I need to leave before I say something.

      • PancakesCantKillMe@lemmy.world · 9 days ago

        Something like a dozen years ago I visited my father and sat 20+ feet away from his 27" television struggling to make things out. It was comically small for the room. I asked if he was interested in buying a newer and larger one. He agreed and we made the change to a 43". A modest increase and it helped quite a bit, though an even larger model would’ve been my choice. This satisfactorily infuriated his wife who then had to learn a new remote. Change is hard for some.

        • MelodiousFunk@slrpnk.net · 9 days ago

          As someone who grew up with a 20-some inch CRT in a console format (think TV-as-furniture that sits on the floor and not a TV that sits on furniture), and then eventually got a 19" hand me down for bedroom use… yeah, the commonality of enormous flat panels still makes me shake my head in wonder sometimes.

          That said, when my parents’ 27" CRT died about 10 years ago, we gave them our old 55" plasma. It was hilariously oversized for the space. But it was free (to them) and we made it work.

      • kaitco@lemmy.world · 9 days ago

        Michael Scott vibes.

        “Brand New Plasma TV. Fits right into the wall.”

    • 7U5K3N@lemmy.dbzer0.com · 9 days ago

      My father in law loves to sit and research… that’s his thing… made a career out of it yadda yadda yadda…

      He asked me about a new TV… I was like…well have you seen rtings.com?

      My MIL had to remind him to eat… lmfao

      He just rabbit holed for days. It was like he clicked a TV tropes link or something.

      Anyway, he made a very informed decision and loves his TV. Haha

  • Baggie@lemmy.zip · 9 days ago

    Honestly after using the steam deck (800p) I’m starting to wonder if res matters that much. Like I can definitely see the difference, but it’s not that big a deal? All I feel like I got out of my 4k monitor is lower frame rates.

    • floquant@lemmy.dbzer0.com · 8 days ago (edited)

      Pixel density is what makes content appear sharp, rather than raw resolution. 800p on a 7" screen is plenty; if you think about it, a 50" 1080p TV is almost 10x the size (edit: more than 50x the area, as pointed out below) with only a ~25% difference in (vertical) resolution.
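
      Putting rough numbers on that (assuming the Deck's 1280x800 panel is 16:10 and the TV is 16:9):

```python
import math

def ppi(diag_in: float, horiz_px: int, aspect: tuple[int, int]) -> float:
    """Pixels per inch from diagonal size, horizontal resolution and aspect ratio."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)
    return horiz_px / width_in

print(f"Steam Deck, 7-inch 1280x800: {ppi(7, 1280, (16, 10)):.0f} PPI")   # ~216
print(f"50-inch 1080p TV:            {ppi(50, 1920, (16, 9)):.0f} PPI")   # ~44
```

      Roughly five times the pixel density on the handheld, which is why 800p holds up in your hands; what ultimately matters is how many of those pixels land per degree of your vision at the distance you actually sit.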

      • Baggie@lemmy.zip · 7 days ago

        Yes, but wouldn’t we be using % of your vision vs pixels in display? Steam deck being right in front of my face and tv 5 or 6 metres away etc.

        Absolutely, higher res does look sharper though, which is great for movies etc. I'm coming at it more from a performance-versus-visual-fidelity angle. What I'm trying to express is that, given 800p still looks surprisingly good, I'm starting to question the industry pushing higher-resolution displays for gaming applications.

      • pirat@lemmy.world · 8 days ago

        if you think about it

        I tried that, and I’m not totally sure about the correctness of my numbers, but your numbers intuitively seem off to me:

        a 50" 1080p TV is almost 10x the size [of a 7" screen]

        How did you arrive at this? I’d argue a 50" screen is much more than 10 times the size of a 7" screen.

        The inches are measured diagonally, and I can see how 50" is somewhat "almost 10x" of 7": 49" would be 7 times the diagonal of a 7", and 7-point-something is "almost" 10.

        But if we assume both screens have a 16:9 ratio, the 50" screen has a width of ≈110.69 cm and height of ≈62.26 cm, while the 7" is only ≈15.50 by ≈8.72 cm.

        The area of the 7" is 135.08 cm² while for the 50" it’s ≈6891.92 cm². The ratio between these two numbers is ≈51.02, which I believe means the 50" screen is more than 51x the physical size.
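
        The same numbers fall out of a few lines (both screens assumed 16:9, as above); since the aspect ratio is the same, the area ratio is simply (50/7)²:

```python
import math

def dimensions_cm(diag_in: float) -> tuple[float, float]:
    """Width and height in cm of a 16:9 screen with the given diagonal in inches."""
    diag_cm = diag_in * 2.54
    return diag_cm * 16 / math.hypot(16, 9), diag_cm * 9 / math.hypot(16, 9)

w50, h50 = dimensions_cm(50)   # ~110.7 x 62.3 cm
w7, h7 = dimensions_cm(7)      # ~15.5 x 8.7 cm
print(f"area ratio: {(w50 * h50) / (w7 * h7):.1f}x")   # ~51.0x, i.e. (50/7)**2
```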

        At least, that number seems more realistic to me. I’m looking at my 6.7" phone screen right now and comparing it to my 55" TV screen, and it seems very possible that the phone screen could fit more than 50 times inside the TV screen, not just “almost 10x”.

        If I totally misunderstood you, please explain what you mean.

        My numbers for width and height were calculated using this display calculator site that someone else mentioned somewhere under this post, and I rounded the decimals after doing the calculations with all decimals included.

          • floquant@lemmy.dbzer0.com · 8 days ago (edited)

            Haha no, you have not misunderstood at all! I was just driving at a point and did no calculations whatsoever; by «50" is almost 10x 7"» I meant only that 50 is "almost" 70 and nothing else x) As your calculations show, it's actually a much bigger difference in area, but that stat seemed enough to make my point and easier to understand :)

          Thank you for actually thinking about it and taking the time to do the math ^^

            • pirat@lemmy.world · 8 days ago (edited)

            Oh, I see. But yeah, it’s a pretty big difference.

            You’re welcome. I like to think that I like thinking about things and stuff.