• Kyrgizion@lemmy.world · ↑34 ↓4 · 8 months ago

60Hz 4K until I die. Or until I can afford 144Hz 4K. Which’ll probably be around the same date, give or take.

    • CanadianCorhen@lemmy.ca · ↑37 ↓1 · 8 months ago

      I’m very much a 1440p 144hz guy.

      I’d like 4k, but would take this compromise for now.

      I want my next screen to be 4k, 144hz oled

      • Domi@lemmy.secnd.me · ↑4 · 8 months ago

        1440p high refresh rate gamers unite. I have an Alienware AW3423DWF and boy are those new OLED panels beautiful. Expensive but beautiful. I still remember playing Left 4 Dead right after I got it and even without HDR I was baffled by the credits at the end of the match. Just white text floating in nothingness.

        They also recently released the AW3225QF which is 4k@240.

      • yeehaw@lemmy.ca · ↑1 · 8 months ago

        I have a 2.5K ultrawide 144Hz. Even when this PC was new it struggled on that era of games :(. We need better graphics cards that don’t cost the price of a mortgage.

  • mox@lemmy.sdf.org · ↑16 ↓1 · 8 months ago (edited)

    FreeSync is for variable refresh rates, which 60Hz monitors generally don’t support anyway. So this headline is nothing but clickbait.

    Also, I don’t know of any sub-120Hz VRR monitors that are still being made, but if they exist, they’re not aimed at anyone who cares about FreeSync branding.

    So this whole article is a pointless waste of time.

    • notfromhere@lemmy.ml · ↑2 · 8 months ago

      I recently got a 75hz 1080p monitor with FreeSync branding from Costco, so yea they are still made.

  • FiveMacs@lemmy.ca · ↑18 ↓5 · 8 months ago

    Provided the 60Hz monitors still work, who cares… if they pull some arbitrary bullshit to stop stuff from working just for profit, then get fucked. I personally don’t care about their certification or claims.

      • andrew@lemmy.stuart.fun · ↑3 ↓4 · 8 months ago (edited)

        Open source is the best. That doesn’t mean the recommendation to move off 60hz isn’t profit motivated. Especially when driving displays at over 60hz means selling more graphics cards since your older one may not go far beyond 60.

        • SchmidtGenetics@lemmy.world · ↑5 · 8 months ago (edited)

          That’s a good point, but FreeSync isn’t mandatory to use, I thought? Couldn’t you still use vsync at 120Hz and lower?

        • Formes@lemmy.ca · ↑4 · 8 months ago

          In a roundabout way? Maybe. But no.

          The first commercially available variable refresh monitor came out like a decade ago, needing expensive bespoke hardware to drive it. Now? We’re reaching commodity-level costs. And yet we still have piles and piles of bottom-tier and crap-tier products being shoved onto the market.

          Sooner or later, the machines and production lines for making those monitors will need overhaul, and at that point - it would 100% make sense to just go to variable refresh.

          The reality is, the beneficiary is you: if you get a GPU upgrade, you get more frames. If you don’t, variable refresh can still provide a smoother, better game experience. This is especially true as frame generation and upscaling techniques have gotten extremely good in the last few years.

          you don’t need to upgrade the GPU to benefit

          I want to spell that out clearly: AMD doesn’t need you to buy a new GPU for you to benefit. Neither does NVIDIA. But it also means that if you buy a variable refresh monitor today, then when you upgrade your GPU, you get to really take advantage of it.

          Where my perspective comes from

          I did the monitor upgrade before a GPU upgrade a few years ago. Variable refresh is king. HDR when the content supports it is amazing - provided the monitor has decent HDR support (low end monitors… don’t).

          I had my previous multi-monitor setup for over a decade, and went through 3 system builds with it. Your monitor is something that hangs around, and it has more impact on your overall experience than you realize. Same with the keyboard and mouse. Unironically, the part you can most get away with cheaping out on in your first build is… the GPU. A decent CPU will last a good 5-6 years at least these days. So get a decent monitor and good peripherals - those will stick around when you upgrade the GPU. Then start the upgrade cycle: CPU, then GPU, then GPU, then back to the CPU. The reality is, once you have a base system, storage carries over, the PSU can cycle over a build, and the case can be reused.

          So I guess what I am saying is: Spend the money on the things liable to hang around the longest. It will lead to a better overall experience.
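
A minimal sketch of the pacing argument above (the numbers are illustrative, not measured from any real hardware): a GPU holding a steady 45 fps on a fixed 60Hz vsync’d display gets shown at uneven intervals, while a variable refresh display shows every frame at an even ~22 ms.

```python
import math

# Illustrative numbers only: a GPU that renders a steady 45 fps.
REFRESH_MS = 1000 / 60   # fixed 60 Hz refresh period
RENDER_MS = 1000 / 45    # ~22.2 ms per rendered frame

def intervals_fixed(n_frames):
    """With vsync on a fixed-rate display, each finished frame waits
    for the next refresh tick, so on-screen intervals jump between
    one and two refresh periods."""
    shown = [math.ceil(i * RENDER_MS / REFRESH_MS) * REFRESH_MS
             for i in range(1, n_frames + 1)]
    return [b - a for a, b in zip(shown, shown[1:])]

def intervals_vrr(n_frames):
    """With VRR the display refreshes when the frame is ready, so
    every on-screen interval equals the render time."""
    return [RENDER_MS] * (n_frames - 1)

fixed = intervals_fixed(10)
vrr = intervals_vrr(10)
print(max(fixed) - min(fixed))  # big swing between intervals -> visible stutter
print(max(vrr) - min(vrr))      # zero swing -> even pacing
```

Same GPU and same frame rate in both cases - only the display policy changes - which is the sense in which you benefit without a GPU upgrade.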

  • ArbitraryValue@sh.itjust.works · ↑6 ↓3 · 8 months ago

    Maybe I’m weird because anything over 20 FPS looks smooth to me (and I know it doesn’t to other people) but what’s the point of going over 60 FPS? Can anyone actually see the difference or is this just a matter of “bigger numbers must be better”?

    • CaptainEffort@sh.itjust.works · ↑11 · 8 months ago

      There’s a huge difference once you use it for long enough. I have a 144Hz monitor and love getting to play games that high, they’re so smooth! If you play long enough the difference becomes night and day.

      • Nik282000@lemmy.ca · ↑2 ↓2 · 8 months ago

        There’s a huge difference once you use it for long enough.

        sus. If you can’t notice a big difference right away then what difference IS there?

        • CaptainEffort@sh.itjust.works · ↑3 ↓1 · 8 months ago (edited)

          It basically just takes your eyes a second to adjust; the same is true going from 30 fps to 60. If you only play at 30 you might not notice the leap to 60, but once you start playing at 60 regularly your eyes will adjust, and 30 will start to look choppy. Once that happens the difference becomes easy to point out, and you’ll be able to appreciate the frame increase.

          Eventually it’ll become night and day: it takes zero effort to notice the difference between frame rates, and the difference is a massive deal. I still only play at 1080p, for example, so I can hit 120-144 fps consistently - that smoothness is infinitely more important to me than a sharper image.

        • Buddahriffic@lemmy.world · ↑1 · 8 months ago

          It’s hard to say exactly what it is, but if my monitor ends up getting set to 60hz because some game has weird defaults, I’ll notice that something is off until I change the setting and get it back to 144hz. Maybe it’s the monitor itself being tuned for 144hz so the pixels fade a bit before they get refreshed? Or maybe my eyes/brain can tell the difference after getting used to the higher refresh rate.

          I think it is different for games that are fps locked to 60fps while the monitor is set to 144hz, which suggests that it might be the fading thing (or something similar).

          Though I did notice a big difference in overwatch when I upgraded my GPU from one that would get fps in the range of 60-90 to one that could consistently get over 120.

          It’s really hard to quantify your own senses, so all I can say for sure is that I definitely notice it when my monitor is set to 60hz instead of 144hz.
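
There may be a concrete numeric reason the fps-locked-to-60 case on a 144Hz panel feels different: 144/60 = 2.4 refreshes per frame, so frames can’t be held for a whole, constant number of refreshes. A quick sketch (pure arithmetic, no hardware assumed):

```python
# 60 fps content on a fixed 144 Hz display: frame i is ready at
# i * (144/60) = 12i/5 refresh periods and appears on the next
# refresh tick, i.e. ceil(12*i/5) (done with exact integer math).
ticks = [-(-12 * i // 5) for i in range(1, 13)]

# how many refresh periods each frame stays on screen
hold = [b - a for a, b in zip(ticks, ticks[1:])]
print(hold)  # an uneven mix of 2s and 3s -> judder
```

With VRR (or the monitor set to 120Hz, an even multiple of 60) every frame is held the same length, which may be why the locked-60 case looks different from a 60Hz monitor.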

    • TheOneCurly@lemm.ee · ↑10 · 8 months ago

      There are diminishing returns but I can absolutely tell the difference between my 165Hz display and my wife’s 240Hz.

    • glimse@lemmy.world · ↑10 · 8 months ago

      Lucky you. Seriously. I wish I didn’t care because it means displays are more expensive for me.

      I definitely thought it was all hype, but once I saw games at 120+ fps, even 60 fps looks choppy to me. I also very much notice the difference between 30 fps and 60 fps video, but 120 fps (at full speed) didn’t do much for me.

      For what it’s worth, I was a professional video editor for years so I’m a bit more inclined to notice than the average person

      • Formes@lemmy.ca · ↑1 · 8 months ago

        I’m kind of in that boat - digital art and the like, mostly. I never understood buying a monitor over about 22" at 1080p. I want decent colour reproduction - I get it, it won’t be perfect unless you spend a fortune, but it should at least be decent.

        120hz w/ good HDR support is fantastic for content that supports it, and 240hz is just buttery smooth. Variable refresh is pretty much a must for modern gaming.

    • Fermion@mander.xyz · ↑8 · 8 months ago (edited)

      It really depends on what and how you play. If reaction time is important, you’ll feel more than see the difference in refresh rates. If none of your games require sub-second reaction accuracy, then it’s much more a nice-to-have luxury than a game changer.

      Also, frame-time pacing matters a lot. If your system very consistently puts out 30 fps, you’ll have more accurate keypresses than if you normally get 50 but it hangs on a few frames and dips to 30 fps. Your nervous system adapts pretty well to consistent delay, but it’s much harder to compensate for delay that varies a lot.

      I don’t really play first person shooters so resolution matters more to me than framerate.
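
The jitter-versus-average point above can be made concrete with made-up frame-time streams: a locked 30 fps has zero frame-time variance, while a nominally faster stream that hitches every tenth frame has a better average but much worse pacing.

```python
import statistics

# Made-up frame-time streams in milliseconds (illustrative only).
steady = [1000 / 30] * 60            # locked 30 fps: constant 33.3 ms
hitchy = ([20.0] * 9 + [100.0]) * 6  # mostly 20 ms with periodic stalls

# mean delay is similar, but the spread is what reflexes can't track
print(statistics.mean(steady), statistics.pstdev(steady))  # ~33.3 ms, 0.0 jitter
print(statistics.mean(hitchy), statistics.pstdev(hitchy))  # 28.0 ms, 24.0 jitter
```

The hitchy stream averages ~36 fps, faster than the locked 30, yet its delay swings by tens of milliseconds - exactly the kind of inconsistency the comment above says your nervous system can’t adapt to.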

    • RaoulDook@lemmy.world · ↑8 · 8 months ago

      It’s very easy to tell the difference when you see them in person. I have a 60Hz monitor and a 144Hz monitor on the same PC; drag a window across the desktop from one to the other and the missing animation frames going from 144 to 60 make the movement look choppy on the slower screen. In games, the animation becomes smooth to the point of being lifelike and visually vibrant when your framerate can climb to 90-100+ FPS.

    • TheSambassador@lemmy.world · ↑7 · 8 months ago

      For 3d games where the whole screen is moving and changing as the camera moves, I’ve noticed a big difference between 60 and 144. It just makes the game feel absurdly smooth.

      For smaller games with more static views it doesn’t really make much difference.

      It mostly depends on the speed of the game.

    • cevn@lemmy.world · ↑2 · 8 months ago

      Occasionally my fps gets set to 60. As soon as I start playing Rocket League I can tell it’s off. I went to a friend’s house and asked why everything was so choppy, checked his monitor settings, and it was set to 60 instead of 144. There are people who can see the difference.

    • Incandemon@lemmy.ca · ↑2 · 8 months ago

      I don’t play competitive games, so I don’t need the extra shooting accuracy. What I have found is that the higher refresh rate makes panning maps in RTS games or looking around quickly in FPS games much smoother. It’s an overall nicer experience, but not really better gaming than at 60Hz.

    • ILikeBoobies@lemmy.ca · ↑2 ↓2 · 8 months ago

      I can tell you I notice no difference between 240 and 60

      Stability is all that really matters

    • CountVon@sh.itjust.works · ↑2 · 8 months ago

      To be “FreeSync certified”, a monitor has to meet certain minimum specs and pass tests of its ability to handle Variable Refresh Rate (VRR). In exchange, the manufacturer gets to put the FreeSync logo on the box and include FreeSync support in its marketing. If a consumer buys an AMD graphics card and a FreeSync-certified monitor, then FreeSync (AMD’s implementation of VRR) should work out of the box. The monitor might also be certified by Nvidia as G-Sync Compatible, in which case a customer with an Nvidia graphics card should have the same experience with G-Sync.

  • stanka@lemmy.ml · ↑1 · 8 months ago

    What does this mean for standard TVs that people use for gaming? LG/Sony/Samsung OLEDs tend to be able to do 4K@120, having native 120Hz panels. Maybe this only covers “monitors” getting FreeSync certified.

    • Dudewitbow@lemmy.zip · ↑2 · 8 months ago

      Those handle VRR through the HDMI 2.1 hardware spec, which is a bit different from the traditional method of VRR.

      It’s the main reason current-gen consoles have VRR (through the HDMI 2.1 spec).

      • stanka@lemmy.ml · ↑1 · 8 months ago

        Rtings says that the LG TVs (the B2 at least) support VRR via several standards: HDMI 2.1, FreeSync, and G-Sync. I have a console hooked up, but no good-enough GPU in a PC.

        • Dudewitbow@lemmy.zip · ↑1 · 8 months ago (edited)

          It’s FreeSync/G-Sync over the HDMI 2.1 standard. Nvidia doesn’t do G-Sync over a standard (pre-2.1) HDMI connection - no non-2.1 HDMI monitor/TV will accept VRR over HDMI from an Nvidia card. Only AMD had FreeSync over HDMI (on very low-end budget monitors).

          G-Sync Compatible is basically G-Sync over the DisplayPort standard. G-Sync Ultimate runs through the dedicated FPGA module, which uses DisplayPort as the medium.

          • stanka@lemmy.ml · ↑1 · 8 months ago

            I would love to learn more about this. Know of any technical papers or references?

            • Dudewitbow@lemmy.zip · ↑2 · 8 months ago

              Idk about technical documents per se, but here’s a news article from when AMD introduced VRR over HDMI way back, noting how VRR on HDMI wasn’t a thing yet, so AMD partnered with monitor makers to use a different scaler that would make it compatible with FreeSync.

              VRR over DisplayPort would be in the DisplayPort 1.2a specification sheet. VRR over HDMI (officially) is under the HDMI 2.1b sheet.

  • Dizzy Devil Ducky@lemm.ee · ↑1 ↓6 · 8 months ago

    To this day, I will never understand why people want their media to look absolutely perfect in super hyper 4k definition.