• schizo@forum.uncomfortable.business · +118 · 1 month ago (edited)

    Amazing what happens when your primary competitor spends 18 months stepping on every rake they can find.

    And then, having run out of rakes, they invested deeply in a rake factory so they could keep right on stepping on them.

    This’ll probably be a lot more interesting a year from now, given that the product lines for the next ~9 months or so are out and uh, well…

    • shortwavesurfer@lemmy.zip · +5 · 1 month ago

      All of my computers had been Intel for many, many years. About a year and a half ago I got my first AMD computer; I had seen other people’s machines with AMD processors but had never owned one myself. Now I do: one with an AMD Ryzen 5.

  • Jesus@lemmy.world · +61/-1 · 1 month ago

    If you look at who is manufacturing silicon, the numbers look even worse for Intel. All of these competitors (AMD, Apple, Qualcomm, etc.) are using TSMC fabs.

    TSMC is the real 500lb gorilla in the room.

      • Wahots@pawb.social · +23/-3 · 1 month ago

        Pray they don’t, but I’m almost certain they will now that the US is appointing complete morons to every portion of the US government. The US won’t really be able to help until this rot gets cleaned out. China has four years before we can really help Taiwan again. (Or at least give them air superiority)

      • randon31415@lemmy.world · +11/-3 · 1 month ago

        Biden just finalized the Arizona TSMC plant.

        If that gets invaded, I think semiconductors are the least of our problems.

        • chutchatut@lemm.ee · +15/-1 · 1 month ago

          But the Arizona plant wouldn’t be allowed to manufacture the most cutting edge chips.

            • Da Bald Eagul@feddit.nl · +17/-1 · 1 month ago

              Taiwan is incentivized to keep the latest and greatest local, so they can hopefully get protection from the USA and Europe.

            • ColeSloth@discuss.tchncs.de · +8 · 1 month ago

              Taiwan’s rule: foreign TSMC fabs have to be a generation behind. This would definitely change if China took over Taiwan, but who knows what China would do or allow at that point. They could shut the whole US fab down if they wanted. Even if someone (Taiwan, China, or TSMC) did try to re-tool the US fab in a few years, it would cost billions and take a lot of time.

              • Dudewitbow@lemmy.zip · +1 · 30 days ago (edited)

                You also have to keep in mind that the client that purchases cutting-edge nodes first is Apple. AMD currently only uses them for Zen 5c, and Qualcomm uses them for the Snapdragon Elite/8 Gen 4. Mobile usually gets them first for efficiency reasons (and better yields due to smaller dies). Other markets have historically been a node behind already (e.g. despite the 9800X3D being new, it’s only an N4 die with an N6 IO die).

    • TheGrandNagus@lemmy.world · +17/-1 · 1 month ago

      And Intel. Intel has been using TSMC fabs for a while.

      They used to get a 40% discount, too, but that stopped recently when Pat Gelsinger said people should stop buying from TSMC because there’s a good chance they’ll be invaded.

      TSMC’s CEO didn’t like that, and said “ok, no more 40% discount for you. Effective immediately.” (TL;DR’d, obviously).

  • walden@sub.wetshaving.social · +52 · 1 month ago (edited)

    I just built a computer for a friend and she decided to get an AMD when I told her it was about the same performance but used half as much electricity.

    This is a person who knows nothing about computers. Judging by that, Intel is losing its “household name” status in a big way.

  • circuitfarmer@lemmy.sdf.org · +33/-1 · 1 month ago

    Not surprised. I switched to AMD CPU and GPU about a year ago. Could not be happier. Ryzen sips power and I run mine in Eco mode (since I’m on an air cooler). Performance is still fantastic.

    • addie@feddit.uk · +6 · 1 month ago

      Invested in a water-cooling setup back when I had a Bulldozer chip, which was near essential. Now on a Ryzen, getting it to exceed about 35 degrees is very difficult. It’s been very good for the long-term stability of my desktop: all the niggling hard disk issues seem to just go away when they’re not subjected to such thermal cycling any more.

      Fantastic chips.

  • brucethemoose@lemmy.world · +20/-1 · 1 month ago (edited)

    I feel like they are dropping the ball in the GPU space though, both on desktop and in servers.

    They’re not really leveraging it. They killed the Steam Deck line of “small core count, GPU-heavy APUs”, which is why Valve hasn’t updated it and competitors seem so power hungry. They all but killed server APUs, making them mega expensive and HPC-only. They’re finally coming out with an M Pro-like consumer APU, but it took until 2025, and pricing will probably be a joke just like their Radeon Pro GPUs…

    And I don’t even wanna get into the AI space. They get like 99% there and then go “nah, we don’t really care about this market, let Nvidia have their monopoly and screw everyone over.” It makes me want to pull my hair out.

    • vin@lemmynsfw.com · +1 · 1 month ago

      It wouldn’t be possible to dethrone Nvidia in AI anyway, at least not alone.

  • GHiLA@sh.itjust.works · +10 · 30 days ago (edited)

    My thought process:

    Desktop: I need cost for performance…

    Server: fps for the Jellyfin, transcodes for the transcode god

    • frezik@midwest.social · +4 · 30 days ago (edited)

      I’d drop in an old Nvidia GPU for transcoding anyway. There are lots of old cards that support NVENC. Don’t neglect the Quadro cards, either: lots of them are cheap on eBay and will transcode just fine without even needing their own cooling fan.

    • frazorth@feddit.uk · +1 · 30 days ago

      Transcoding worked vastly better with QuickSync the last time I bought a machine.

      Does AMD transcoding work as well these days?

        • frazorth@feddit.uk · +1 · 30 days ago (edited)

          Damnit.

          I wonder if that’s because the transcoding hardware is crap or they just aren’t concentrating on it in the software.

  • Myro@lemm.ee · +7/-1 · 30 days ago

    Sad but true. Intel’s performance was poor over the last year. I shudder thinking about my Mac with an Intel CPU; there must be burn victims from that thing. Still, less competition is never a good thing.

    • Blackmist@feddit.uk · +11 · 1 month ago

      It’s taken this long for Intel to lose gamer trust.

      Intel also have lower power consumption iirc, which is useful for laptops etc.

      AMD have the best server chips: https://www.cpubenchmark.net/high_end_cpus.html

      You have to remember that most people aren’t “choosing a CPU” so much as buying a PC. If the majority of pre-built retail PCs have Intel, then the majority of purchases will be Intel.

        • Blackmist@feddit.uk · +2 · 30 days ago

          That’s under load. At idle (which is where your average home PC will spend most of its time) I think Intel still has the edge.

          It’s certainly a consideration for a battery-powered device. An Intel laptop watching a video, reading emails, or staring at a spreadsheet will likely have better battery life than a similar-spec AMD device.

          We’ve reached a point where most everyday computing tasks can be handled by a cheapo N100 mini PC.

          • 486@lemmy.world · +3 · 30 days ago

            Actually, AMD’s mobile parts are pretty good at idle power consumption, and so are their desktop APUs. Their normal CPUs, which use the chiplet design, are rather poor when it comes to idle power consumption. Intel isn’t really any better than the monolithic parts at idle, and Intel CPUs have horrible power consumption under load. Their newest CPUs are more efficient than the 13th and 14th gen, but still don’t match, let alone exceed, AMD.

          • daellat@lemmy.world · +2 · 30 days ago

            I would have to ask for a source on that; I can’t really find anything comparing many CPUs.

            However, this video compares top-end models on otherwise pretty much identical laptops, and AMD definitely wins in YouTube playback on battery: https://youtu.be/X_I8kPlHJ3M?si=8a4Tkmd556hQh7BZ

            But if you’ve got anything better to compare, I’m all ears.

            • Blackmist@feddit.uk · +2 · 30 days ago

              It may well be the case that they’re similar or even swapped now. I can see that the N100 is pretty low power compared to the newest low end AMD chips, but then the AMD chips are better in terms of what they can do.

              This one reckons they’re pretty similar.

              https://www.reddit.com/r/Amd/comments/10evt0z/ryzen_vs_intels_idle_power_consumption_whole/

              This one reckons Intel are better.

              https://news.ycombinator.com/item?id=32809852

              I doubt there’s much in it either way. Even if AMD are ahead now, laptops don’t get replaced right away, normies replace shit when it fails or is too slow to run whatever shit Google shoehorned into Chrome this year, and the most popular laptops are probably the ones with the lowest sticker price.

              • daellat@lemmy.world · +1 · 30 days ago

                Ah yeah, I should have specified that I was looking at the laptop side of things, since the person I originally replied to mentioned power usage matters more there (which is understandable). There are only a handful of laptop chips I can recognize in that first link, all of them AMD, but I don’t know the naming scheme of modern Intel laptop parts anymore.

    • frezik@midwest.social · +8 · 30 days ago (edited)

      Servers need very high uptime. Also, when something is documented to work a certain way, it had damn well better work as stated.

      Intel had a long reputation of solid engineering. Even when they were losing at both performance and performance per watt, they could still fall back on being steady. The 13th/14th gen degradation problems have shot that argument to hell, and server customers are jumping ship.

    • CriticalMiss@lemmy.world · +5 · 30 days ago (edited)

      Note: I’m not from the US, so in a lot of cases going to a manufacturer’s website and purchasing computers is not an option. Resellers are still the ones in charge here.

      I work in IT, and when it was time for a hardware refresh, the reseller we’re in contact with said they don’t stock AMD because there’s no demand, which in a way creates a chicken-and-egg problem. I asked if it would be possible to get laptops with AMD chips, and they said yes, but we’d have to wait. So we bought 4 Intel machines for the meantime and placed a custom order for ones with AMD chips. The ThinkPads we’re buying are significantly cheaper with AMD chips; I was honestly a bit baffled there was no demand.

      Regardless, we’re happy with the purchase, and so are the users, who say the computers run noticeably cooler than their Intel 8th-gen predecessors. It just goes to show that enterprise makes up a huge chunk of the desktop market share nowadays (as younger generations tend to simply not use a computer and do everything on their phone), and that market just isn’t ready for the transition yet. They’ve been going strong with Intel for 30-40 years. Weaning off that tit is gonna take some time.

  • Routhinator@startrek.website · +1 · 1 month ago (edited)

    Still not ready to trust AMD/ATI again. I used them exclusively right up until they bought ATI and then decided, fuck open source, and the drivers for Linux tanked.

    I hear about all the issues folks have had with Intel/Nvidia, but I have yet to experience any of them. From where I’m sitting they’re still working great. And their open source support hasn’t been perfect, but it’s consistent, instead of going from golden to “fuck you, Linux folks” overnight.