• Ulrich@feddit.org · 76 points · 8 days ago

    tl;dw: some of the testing shows 300-500% improvements with the 16GB model. Some games are completely unplayable on 8GB while delivering an excellent experience on 16GB.

    It really does seem like Nvidia is intentionally trying to confuse their own customers for some reason.

    • inclementimmigrant@lemmy.world (OP) · 16 points · 8 days ago

      To me it sounds like they're preying on gamers who aren't tech savvy or who are desperate. Just a continuation of being anti-consumer and anti-gamer.

      • NekuSoul@lemmy.nekusoul.de · 2 points · 7 days ago

        Yup. This is basically aimed at people who only know that integrated GPUs are bad and that they need a dedicated card, so system manufacturers can build a pre-built that technically checks that box for as little money as possible.

      • Ulrich@feddit.org · 2 points · 8 days ago

        Okay, well, that's the low-hanging fruit, but explain the correlation to me: how does confusing their customers fuel their greed?

          • Ulrich@feddit.org · 2 points · 8 days ago

            So their strategy is making and selling shitty cards at high prices? Don’t you think that would just make consumers consider a competing brand in the future?

            • frazorth@feddit.uk · 1 point · 7 days ago

              The reviews said that it was a better card than the other brand.

              Just imagine how bad those must have been!

              They don’t know they’ve been ripped off.

            • BCsven@lemmy.ca · 1 point · 7 days ago

              It's like teens and iPhones: they don't care that a used three-year-old iPhone costs more than a new Android, they want the iPhone branding.

            • gaael@lemm.ee · 6 points · 8 days ago

              For most consumers it might not; the amount of Nvidia propaganda and advertisement in games is huge.

            • MBech@feddit.dk · 3 points · 8 days ago

              Yeah, I don't know why buying a shitty product should convince me to throw more money at the company. They don't have a monopoly, so I'd just go to their competitor instead.

    • mindbleach@sh.itjust.works · 2 points · 8 days ago

      They had trouble increasing memory even before this AI nonsense. Now they have a perverse incentive to keep it low on affordable cards, to avoid undercutting their own industrial-grade products.

      Which only matters thanks to anticompetitive practices leveraging CUDA's monopoly. Refusing to give up the fat margins on professional equipment is what killed DEC. They successfully miniaturized their PDP mainframes while personal computers became serious business, but they refused to let those run existing software. They crippled their own product and the market destroyed them. That can't happen here, because ATI is not allowed to participate in the inflated market of… linear algebra.

      The flipside is: why the hell doesn’t any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?

      • schizo@forum.uncomfortable.business · 3 points · 8 days ago

        The flipside is: why the hell doesn’t any game work on eight gigabytes of VRAM? Devs. What are you doing? Does Epic not know how a texture atlas works?

        It’s not that they don’t work.

        Basically what you'll see is kind of like a cache miss, except the stall to go "oops, don't have that" and fetch the required bits is very slow. So you can see 8GB cards getting 20fps and 16GB ones getting 40 or 60, simply because the path to get the missing textures is fucking slow.

        And worse, you’ll get big framerate dips and the game will feel like absolute shit because you keep running into hitches loading textures.

        It's made worse in games where you can't reasonably predict which texture you'll need next (e.g. Fortnite and other online games that are, you know, played by a lot of people). But even in games where you can make a reasonable guess, you still run into the simple fact that textures in a modern game are higher quality, and therefore bigger, than the ones from five years ago, so 8GB in 2019 and 8GB in 2025 are not equivalent.

        It cripples a GPU that could otherwise perform substantially better, all for a relatively small decrease in BOM cost. They're trash, and should all end up in the trash.
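
        A minimal sketch of that cache-miss analogy (the numbers and names below are invented for illustration, not measurements from the video):

```python
# Toy model of texture streaming stalls, not a benchmark: a "miss" on a
# non-resident texture costs far more than a hit, so frame time balloons
# once the working set no longer fits in VRAM.
import random

def simulate(vram_gb, frames=500, working_set_gb=10, texture_gb=0.05,
             hit_cost_ms=0.02, miss_cost_ms=2.0, base_frame_ms=10.0):
    """Average frame time when each frame touches a random slice of the working set."""
    capacity = int(vram_gb / texture_gb)        # textures that fit in VRAM
    total = int(working_set_gb / texture_gb)    # textures the game wants
    resident = set()
    times = []
    for _ in range(frames):
        frame_ms = base_frame_ms
        for tex in random.sample(range(total), 100):
            if tex in resident:
                frame_ms += hit_cost_ms         # already in VRAM: cheap
            else:
                frame_ms += miss_cost_ms        # stall: fetch over the bus
                if len(resident) >= capacity:
                    resident.pop()              # evict an arbitrary texture
                resident.add(tex)
        times.append(frame_ms)
    avg = sum(times) / len(times)
    return avg, 1000 / avg

for vram in (8, 16):
    avg_ms, fps = simulate(vram)
    print(f"{vram} GB VRAM: ~{avg_ms:.0f} ms/frame (~{fps:.0f} fps)")
```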

        • mindbleach@sh.itjust.works · 3 points · 8 days ago

          That’s what I’m on about. We have the technology to avoid going ‘hold up, I gotta get something.’ There’s supposed to be a shitty version that’s always there, in case you have to render it by surprise, and say ‘better luck next frame.’ The most important part is to put roughly the right colors onscreen and move on.

          id Software did this on Xbox 360… loading from a DVD drive. Framerate impact: nil.
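
          A rough sketch of that "always have a fallback" idea (class and method names here are hypothetical, not id's or any engine's actual API):

```python
# Never stall the frame: sample whatever mip is resident right now, queue an
# async load for the good one, and let a later frame pick it up.

class StreamedTexture:
    def __init__(self, name):
        self.name = name
        self.resident_mip = "lowest"   # tiny placeholder, always kept in VRAM
        self.pending = False

    def sample(self, streamer):
        """Draw with what we have; ask for better in the background."""
        if self.resident_mip != "full" and not self.pending:
            streamer.request(self)
            self.pending = True
        return self.resident_mip       # this frame may look blurry; the next won't

class Streamer:
    def __init__(self):
        self.queue = []

    def request(self, tex):
        self.queue.append(tex)

    def tick(self):
        """Runs alongside rendering; uploads happen off the critical path."""
        for tex in self.queue:
            tex.resident_mip = "full"
            tex.pending = False
        self.queue.clear()

streamer = Streamer()
rock = StreamedTexture("rock_albedo")
for frame in range(3):
    print(f"frame {frame}: drew '{rock.name}' at mip {rock.sample(streamer)}")
    streamer.tick()
```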

          • PenguinTD@lemmy.ca · 1 point · 7 days ago

            Virtual texture tech isn't almighty, though; you can still hit page swapping if the allocation is smaller than what you need. Past a certain threshold it behaves like a traditional cache miss, because you can't keep enough "tiles" in memory. Texture-quality popping and then stuttering are the symptoms, progressing as the allocated VRAM goes from slightly lower than needed to severely insufficient.
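
            A rough model of that threshold (tile counts and the access pattern are made up; the point is just that an LRU tile cache falls off a cliff once the working set exceeds the budget):

```python
# Hit rate of an LRU tile cache as the tile budget shrinks below the working set.
from collections import OrderedDict
import random

def hit_rate(budget_tiles, needed_tiles, frames=500, samples_per_frame=256):
    cache = OrderedDict()
    hits = misses = 0
    for _ in range(frames):
        for tile in random.choices(range(needed_tiles), k=samples_per_frame):
            if tile in cache:
                cache.move_to_end(tile)        # refresh LRU position
                hits += 1
            else:
                misses += 1
                cache[tile] = True
                if len(cache) > budget_tiles:  # over budget: evict oldest tile
                    cache.popitem(last=False)
    return hits / (hits + misses)

needed = 1000                                   # tiles the scene actually wants
for budget in (1200, 1000, 900, 700, 500):
    print(f"budget {budget:>4} tiles: hit rate {hit_rate(budget, needed):.0%}")
```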

  • WormFood@lemmy.world · 62 points · 8 days ago

    it is 2019, the 2060 super has 8gb of vram. it is 2020, the 3060ti has 8gb of vram. it is 2023, the 4060ti has 8gb of vram. it is 2025, the 5060ti has 8gb of vram.

  • filister@lemmy.world · 43 points, 1 down · 8 days ago

    The fact that NVIDIA is not allowing AIBs to send the 8GB card to reviewers is quite telling. They are simply banking on uninformed purchasers and system integrators to sell this variant. That's another low for NVIDIA, but it's hardly surprising anyone.

    Planned obsolescence.

    • HeyJoe@lemm.ee · 3 points · 8 days ago

      I agree, but it's still crazy that there are people out there making $500-plus purchases without the smallest bit of research. I really hope this card fails, if only because it deserves to.

  • kugmo@sh.itjust.works · 15 points, 10 down · 7 days ago

    On the flip side, every game worth playing uses 2GB of VRAM or less at 1080p.

    • jnod4@lemmy.ca · 2 points, 2 down · 7 days ago

      I don't even have a GPU, but to be honest I don't even game anymore because I work more hours than there are in a day.

      • notthebees@reddthat.com · 3 points · 7 days ago

        I have a game that eats 11 GB of VRAM on low at 1080p (I play it windowed). It suffers from some Unreal Engine shenanigans, and it's also a few years old.

        • Madbrad200@sh.itjust.works · 2 points · 7 days ago

          Just like normal RAM: if there's more available, it'll get used. That doesn't mean the game requires more than 8GB to run well.

          I played all of Cyberpunk on high settings with 8GB.
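
          A tiny sketch of that behaviour (the numbers and the sizing rule are hypothetical, not any particular engine's logic):

```python
# "Uses 11 GB" isn't the same as "needs 11 GB": many engines size their
# texture streaming pool from whatever VRAM they detect at startup.

def streaming_pool_mb(detected_vram_mb, reserve_mb=1024, fraction=0.8):
    """Spend a fixed fraction of whatever is left after a safety reserve."""
    return int((detected_vram_mb - reserve_mb) * fraction)

for vram in (8192, 16384, 24576):
    print(f"{vram // 1024} GB card -> ~{streaming_pool_mb(vram) // 1024} GB used for textures")
```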

          • notthebees@reddthat.com · 2 points · 7 days ago

            You're not wrong, but then there are games like this one that need at least 6 GB (more on DX12) to run on low without running out of memory and either crashing or failing to launch. That's an actual issue with this particular game.

            Edit: Cyberpunk has gotten a lot better, though, and will run on hardware it has no business running on.

        • ihatefascist@lemm.ee · 2 points · 7 days ago

          UE is probably the worst engine ever made; even games from 20 years ago look better than that blurry mess of an engine. I hope nobody makes games on it anymore. Most of them are also badly optimized, and I never understood why people like that engine.

          • I Cast Fist@programming.dev · 1 point · 7 days ago

            Same reason people still use Unity after the whole shitfest over the "per-install tax": a large community, a huge knowledge base, tutorials everywhere, and professional courses that focus on it.

            Kinda ironic that the Unreal Tournament games (99, 2004, 3) were all incredibly optimized.

      • amorangi@lemmy.nz · 3 points · 7 days ago

        Video editing and AI require as much VRAM as you can get. Not everyone uses the cards just for gaming.
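
        As a back-of-the-envelope example for the AI side (rough arithmetic only, ignoring activations and the KV cache):

```python
# Model weights alone scale with parameter count and precision, which is why
# 8 GB runs out fast for local AI workloads.

def weights_gb(params_billion, bytes_per_param):
    return params_billion * 1e9 * bytes_per_param / 1024**3

for params, bits in ((7, 16), (7, 4), (13, 16), (13, 4)):
    gb = weights_gb(params, bits / 8)
    print(f"{params}B parameters @ {bits}-bit: ~{gb:.1f} GB just for weights")
```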

        • Alaknár@lemm.ee · 6 points, 3 down · 7 days ago

          Then don't buy the current-gen low-end card for video editing, mate. Get a previous-gen card with more VRAM, or go AMD.