• her-1g@alien.topB
    1 year ago

    I'm still rocking my 2411 BenQ at 144Hz, going on a decade now. Lol. I also have a modest 2060 and I only play CS, so no big deal for me.

  • Xvailer@alien.topB
    1 year ago

    What games are even capable of running at 540 frames??? No one is gonna buy this for any modern CoD title because you can barely get 200-300fps on a 4090. I know you can get some nutty frame rates in games like CS2, but I can't see that many setups being able to push out that many frames.

  • kinkyskinnyslutCD@alien.topB
    1 year ago

    Really don't understand this trend, you will never notice that. Even 240Hz is hardly noticeable over 165. OLEDs just need to come down in price, but most gamers who don't have 40xx GPUs can't really run much over 1440p at 240Hz anyway.

  • needledicklarry@alien.topB
    1 year ago

    Very cool, now find me a decently priced rig that can actually run games consistently above that refresh rate. This is future tech; barely anyone will be able to take advantage of it right now.

    • whosat___@alien.topB
      1 year ago

      It’s a $900 monitor for professional esports players. It’s like you’re complaining about F1 cars being too fast because roads are too slow for them.

      • Pyro_Light@alien.topB
        1 year ago

        Having played near that level for several years, I highly doubt 99% of esports players could tell the difference between 240 and this. 160+ is hard enough for most of the semi-pro players I've played with, and 99% of the pro players I've talked to or played with say 144 is plenty in 99% of situations.

  • Sebastianx21@alien.topB
    1 year ago

    Anything above like 90-100 FPS is barely perceptible to me and provides close to no benefit; even my 165Hz monitor is currently locked at 100 for more consistent frames rather than more frames.

    So what's the point of 540Hz exactly? And don't tell me it's a competitive advantage; it's not, no human will be able to utilize the single-digit-millisecond improvements it provides.
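
    For a sense of scale, here is a quick back-of-the-envelope sketch (plain Python; the figures are just 1000 divided by the refresh rate, not measurements of any particular setup) of how much frame time each step up actually buys:

    ```python
    # Frame time in milliseconds at common refresh rates, and how much each
    # step shaves off the previous one. Illustrative arithmetic only.
    rates_hz = [60, 100, 144, 240, 360, 540]

    prev_ms = None
    for hz in rates_hz:
        frame_ms = 1000 / hz
        if prev_ms is None:
            print(f"{hz:>3} Hz -> {frame_ms:5.2f} ms per frame")
        else:
            print(f"{hz:>3} Hz -> {frame_ms:5.2f} ms per frame "
                  f"({prev_ms - frame_ms:.2f} ms less than the previous step)")
        prev_ms = frame_ms
    ```

    Going from 240Hz to 540Hz shaves only about 2.3 ms off each frame, which is exactly the single-digit-millisecond territory being argued about here.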

    • 2roK@alien.topB
      1 year ago

      I'm sure that you and your friends think you can see a difference between 240 and 360, but objectively, no, you all suffer heavily from bias and placebo.

      • bunkSauce@alien.topB
        1 year ago

        You are wrong. The same thing was said about 120 vs 240, and, surprisingly, even about 60 vs 120.

        This is an age-old argument, but those who use 120 vs 240 will tell you it is not just noticeable, it affects your hand-eye coordination and timing.

        Placebo should not be ruled out, but I believe you are the one with the bias here.

    • reelznfeelz@alien.topB
      1 year ago

      Do you have the equipment to reliably test whether the people you know can tell 240 from 360 at home?

      • Fredasa@alien.topB
        1 year ago

        Not what you're looking for, but I remember watching a documentary (decades ago) where they wired people up to EEG machines and showed them images for increasingly brief instants. Past a certain point, like 1/1000th of a second or something, the individual would stop reporting seeing anything, but their brain would still register it. Furthermore, it would register it more quickly than in the cases where the person was still able to consciously notice it.

    • edireven@alien.topB
      1 year ago

      It's pretty f*cking funny to read this when 15 years ago you were arguing with people on the internet about the superiority of 120Hz monitors and they would tell you that the human eye cannot see any difference above 60Hz.

      • FireMaker125@alien.topB
        1 year ago

        Reminds me of people claiming that 30FPS was better than 60 because “the human eye can only see 30FPS”. At least the “30FPS is cinematic” argument made a little sense, despite the fact that it is completely wrong in the context of games.

      • VexatiousJigsaw@alien.topB
        1 year ago

        Human visual processing breaks down at different rates. I had an extremely informative college course for which I took fairly poor notes, but here are the basics.

        Object recognition - ~24Hz. We actually got a live demo of this one: flashing 24 unrelated images per second, students could call out when a duck was visible. About half of us could spot it even when most of the other images were unrecognizable. This limit comes from the visual cortex.

        Flicker fusion threshold - ~60Hz, the rate beyond which the eye can no longer perceive that a light is flickering rather than solidly lit. It also depends on the display technology. This limit comes from the rods and cones of the retina themselves.

        Motion recognition - 500-1000Hz. Inside the retina itself, the human eye has special ganglion cells which detect edge patterns traveling between rods and cones. These are clever hair-trigger comparators that can spot differences in detection smaller than the original signal (still ~60Hz but unsynchronized), but this information travels to the brain separately from the object-recognition signals and is never fully re-incorporated. The motion blur/interpolation used in film almost completely blinds these cells, and the absence of their signals is relatively subtle. Unblurred 60Hz film and video games can still trigger a stop-motion effect. These cells need much higher rates to be completely fooled.

        Think of it another way: if you have a vertical line moving across the screen, you want at least one frame for each motion cell. If the line skips pixels you get motion stripes rather than a moving image. If it jumps multiple pixels per frame, the skipped motion cells report zero motion and the perception of motion breaks down. To complete the illusion you need nearly a new frame for every column of pixels (not every rod/cone is connected to a motion cell). This is why perception still subtly changes up to absurdly high frame rates.
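
        As a rough illustration of that last point, here is a small sketch; the screen width and crossing speed below are made-up numbers for illustration, not figures from the course:

        ```python
        # How far a line sweeping across the screen jumps per refresh, assuming
        # it crosses a 1920-pixel-wide display in one second. Illustrative only.
        screen_width_px = 1920
        crossing_time_s = 1.0
        speed_px_per_s = screen_width_px / crossing_time_s

        for hz in (60, 144, 240, 540, 1000):
            jump_px = speed_px_per_s / hz   # pixels of motion per frame
            print(f"{hz:>4} Hz -> the line jumps {jump_px:5.1f} px every frame")

        # To advance one pixel per frame -- roughly one frame per column of
        # motion cells -- you would need a refresh rate on the order of
        # speed_px_per_s, i.e. about 1920 Hz in this example.
        ```

        Even at 540Hz the line still skips several pixels per refresh, which is why perception can keep changing well past the refresh rates in this thread.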

      • AngelicTrader@alien.topB
        11 months ago

        They just moved the goalposts. Now it's “the human eye can't see past 1000Hz.”

        We'll see about that!

      • Bobbar84@alien.topB
        1 year ago

        I have to wonder if they have ever even tried higher refresh rates/FPS. The difference is so apparent.

  • ItsNjry@alien.topB
    1 year ago

    The diminishing returns after 144Hz are insane. I have a 240Hz and a 144Hz and it's hard to tell the difference. It's there, but you have to have them side by side or be playing a really competitive shooter to notice.

  • thelonelyward2@alien.topB
    1 year ago

    I have a monitor with a 165Hz refresh rate. From 60 to 100 there is a difference; from 100 to 144 I see nothing, and from 100 to 165 I see nothing, so I turned it back down and left it at 100Hz. I really could not see a difference. 540 is ridiculous.

  • Staalone@alien.topB
    1 year ago

    Cool. I like how we keep getting higher refresh rate monitors and yet keep getting lower and lower frame rates because of poor optimization and disappointing performance from newer graphics cards.

    • Gosexual@alien.topB
      1 year ago

      Because they probably switched from 60Hz to 144Hz at first and thought that 240Hz would be a similar jump in upgrade, realized it's not, and now need a reason to justify overspending.

      • aesthetically-@alien.topB
        1 year ago

        I don't quite agree with that. While the jump from 60Hz to 144Hz was definitely an extreme change, 144Hz to 240Hz was still extremely noticeable and very beneficial. Specifically, I got mine for only about $50 more than a 144Hz, so I wouldn't call that overspending for a considerable jump in quality.

        • bjv2001@alien.topB
          1 year ago

          Yeah, the jump from 144Hz to 280Hz for me was incredibly noticeable and made the experience in the FPS games I play far more enjoyable. Sure, it's less of a boost than 60 to 144Hz, but nearly doubling the refresh rate is nothing to scoff at.

    • bunkSauce@alien.topB
      1 year ago

      As an owner of a 240Hz TN panel, I find switching to my 140Hz very noticeable… but only for competitive gaming.

      For instance, in Rocket League, when I switch my muscle memory seems to feel a bit delayed; my timing is always just a bit different. It isn't something I noticed until I had gamed at 240Hz for months, but it is definitely there, and it definitely affects my performance.

      All that said, my monitor isn't expensive… I have the 1080p 240Hz for competitive play, and my 4K 140Hz TV for all my pretty and relaxed games.