• wichwigga@alien.topB

    I remember upgrading to a 3600 from an i5-3470; the difference was massive. Then upgraded again to the 5800X3D. Thanks AM4.

    • Minnieal28@alien.topB

      I just went from a 4790K to a 7800X3D (with an RTX 3070 Ti in both) and got almost double the frames. I’m glad that AMD is supporting upgradability. That’s how I justified spending $500 on a motherboard.

  • Lainofthewired79@alien.topB

    I had a 9600K for the longest time, and it worked very well for most of that time, but as time went on its weaknesses showed themselves really quickly. I then upgraded to a 5800X, and man, that was such an amazing upgrade.

    I gave that 5800X to my sister, who does actual productivity work on her PC, and one time she exclaimed to me, “Man, I love this PC!”

    • ThisToastIsTasty@alien.top

      The oldest one I have running is an i5-2500K with 4x2 GB sticks of DDR3 RAM.

      The next one is an 1800X.

      It’s nice having so many extras, just to have old cases holding backup drives.

      • ayunatsume@alien.topB

        Use them as NAS systems. You just need SATA HBA cards and maybe a faster LAN card.

        Also, the 2500K can go faster with 1866 RAM, so try to push your speed up. I got massive gains in minimum fps going to 1866 CL10 with two 8 GB sticks.
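
        For a sense of why 1866 CL10 was a straight upgrade, here’s a minimal sketch of the standard DDR true-latency math; the DDR3-1333 CL9 baseline is an assumed comparison point, not something from the comment above:

        ```python
        # True (absolute) CAS latency in nanoseconds:
        # latency_ns = CAS * 2000 / transfer_rate_in_MT/s
        def cas_latency_ns(cas: int, transfer_rate_mts: int) -> float:
            return cas * 2000 / transfer_rate_mts

        # Assumed DDR3-1333 CL9 baseline kit vs. the DDR3-1866 CL10 kit above
        baseline = cas_latency_ns(9, 1333)
        upgraded = cas_latency_ns(10, 1866)

        print(f"DDR3-1333 CL9:  {baseline:.1f} ns")  # ~13.5 ns
        print(f"DDR3-1866 CL10: {upgraded:.1f} ns")  # ~10.7 ns
        # The faster kit wins on both latency (~10.7 vs ~13.5 ns) and
        # bandwidth (1866/1333 = ~40% more), hence the minimum-fps gains.
        ```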

  • xenonisbad@alien.topB

    It’s weird to make a “5 years later” test where, out of the 21 games tested, most are older than a year:

    • two 5-year-old games (Assetto Corsa Competizione, Shadow of the Tomb Raider)
    • a 3-year-old port of a 7-year-old game (Horizon Zero Dawn)
    • two 3-year-old games (Watch Dogs: Legion, Hitman 3)
    • three 2-year-old games (Far Cry 6, Total War: Warhammer 3, Halo Infinite)
    • a 1-year-old remaster of a 5-year-old game (Spider-Man Remastered)
    • three 1-year-old games (A Plague Tale: Requiem, COD Modern Warfare 2, Callisto Protocol)

    A few of the titles that are from this year are kinda questionable choices:

    • a game that comes out every year (F1 23)
    • a game that started as an expansion for a 3-year-old game, and is heavily based on the original game’s tech (AC Mirage)
    • this year’s expansion for a 3-year-old game (Cyberpunk 2077: Phantom Liberty)

    I think it would be a much more interesting video if the “5 years later” video were about checking how those CPUs perform in today’s games, not how well they perform in games released over the last 5 years.

    • oginer@alien.topB

      You forgot to mention they also test BG3, TLoU remaster, RE4, Starfield, Hogwarts Legacy and Callisto Protocol, all being less than 1 year old. And CP 2077 had a big graphics update with the expansion.

      It looks like a pretty balanced selection, with games from the last 5 years, 1/3 being less than 1 year old.

    • dedoha@alien.topB

      Most of them are either demanding or good-looking despite being old, or just easier and faster to test because of built-in benchmarks (like the F1 series) that provide additional data. I agree that some choices are questionable, especially when HUB benches RT performance: Spider-Man Remastered instead of the newer Miles Morales, or Far Cry 6 RT.

    • RealThanny@alien.topB

      The purpose is to see how the processors aged over the five years since their release. Testing games that span those five years, including titles released this year (which you neglected to list, given your obvious contrarian agenda), is the obvious way to do that.

      • xenonisbad@alien.topB

        The purpose is to see how the processors aged over the five years since their release.

        The way they word it, it’s performance after 5 years, not during those 5 years. They even combined all those games across 5 years into one slide. After reading the title, watching the intro, and seeing the summary results, I had a different idea of what they were testing than what they actually tested, which I find weird.

        including titles released this year (which you neglected to list, given your obvious contrarian agenda)

        Sure, it’s an agenda, and not the fact that listing games from 2023 would have added literally nothing to my comment. /s

        I don’t know how you imagine criticism would work if we were obligated to also list the things that are correct. Of course, when I’m complaining about the problems I see, I won’t list everything I don’t see a problem with. You wrote about the things in my comment you disagree with, but you haven’t listed anything you think I got right. So, do you think literally everything in my comment is wrong, or do you have an obvious contrarian agenda?

        I wrote how many games they tested in total and counted the games older than 1 year. Even if “out of 21 games tested most are older than a year” is, for reasons unknown to me, not specific enough for you, the number of games from this year is only “hidden” behind basic math.

        If by “listed” you mean the fact that I wrote out the names of the older games: I did that so anyone could verify whether I was right about how many games of what age they tested. Someone could say “well, this game had an overhaul this year, so it’s more like a this-year game”, or “well, this game isn’t 3 years old, it’s 2 years and 10 months old”, or “well, this port has many new features, so it’s more CPU-heavy than the original”. I literally wrote it so it would be easier to disagree with me if people have relevant information that I don’t.

    • Slyons89@alien.topB

      It’s practical to use their existing test suite. It allows for comparison against other parts they have previously tested. Plus plenty of people are still playing games from the past 5 years.

      • xenonisbad@alien.topB

        It’s practical to use their existing test suite. It allows for comparison against other parts they have previously tested.

        It’s not a good idea to compare hardware performance across different tests. Even if they test the same game at the same settings, the testing procedure can differ, and that makes those tests kinda incomparable.

        Just look at the 5800X3D’s performance in this video and compare it to the 5800X3D’s performance in their 14900K video from 1 month ago (see the quick delta calculation after this list):

        • Baldur’s Gate 3, Ultra 1080p: 132/100 vs 145/106
        • TLOU Part 1, Ultra 1080p: 139/116 vs 152/129
        • Assetto Corsa Competizione, Epic 1080p: 175/143 vs 161/128
        • Spider-Man Remastered, 1080p High + High RT: 144/119 vs 122/93
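
        Putting rough numbers on those swings (a minimal sketch; the assumption that each “x/y” pair is average/1% low fps is mine, based on how HUB usually reports results):

        ```python
        # (avg, 1% low) fps pairs for the same 5800X3D in two different HUB videos
        runs = {
            "Baldur's Gate 3":            ((132, 100), (145, 106)),
            "TLOU Part 1":                ((139, 116), (152, 129)),
            "Assetto Corsa Competizione": ((175, 143), (161, 128)),
            "Spider-Man Remastered":      ((144, 119), (122, 93)),
        }

        for game, ((avg_a, low_a), (avg_b, low_b)) in runs.items():
            print(f"{game}: avg {(avg_b - avg_a) / avg_a:+.1%}, "
                  f"1% low {(low_b - low_a) / low_a:+.1%}")

        # Deltas land between roughly -22% and +11% for the *same* CPU, in both
        # directions -- which is exactly the problem with cross-video comparisons.
        ```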

        I’m all for using a few-years-old CPU in benchmarks to see how it compares to the newest CPUs, but if that were the goal of the video, they would’ve also tested the newest CPUs.

        Plus plenty of people are still playing games from the past 5 years.

        Of course they are; plenty of people are still playing games older than 5 years too. I don’t have a problem with testing older games, but when someone says “5 years later”, “today’s games”, “to see which platform aged the best”, “in 2023”, and most of the games aren’t “today’s”, aren’t from 5 years after the CPUs were released, weren’t released “in 2023”, and tell us nothing about how the CPUs aged over 5 years, I find it confusing and kinda dishonest.

  • capn_hector@alien.topB

    All of these processors were utterly wiped out by the “spend $100 more on an 8700K and overclock” option.

    There is such a thing as false economy; sometimes spending more money results in a thing that lasts longer and gives better results throughout that whole timespan… classic “boots theory” stuff.

    Having your $2000 build last 2-3 years less than it otherwise could have, because you didn’t spend $100 more on the 8700K when you had the option, is stupid and not good value. Reviewers over-fixated on the min-maxing; the order of the day was “cut out everything else and shunt it all into your GPU”, and some reviewers took it as far as saying you should cut down to a 500 W PSU or even less. Today that extra $100 you spent on your GPU is utterly wasted, while going from an 8600K to an 8700K, or buying a PSU that doesn’t cut out on transient loads, got you a much better system in the long term, even if it didn’t win the benchmarks on day 1.

    (and yes, transients were already problematic back then, and in fact have remained pretty constant at around 2x average power draw…)
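
    As a back-of-the-envelope illustration of why that matters (only the ~2x transient factor comes from the paragraph above; the component wattages are hypothetical examples):

    ```python
    # PSU sizing sketch: average draw looks safe, transient peaks may not be.
    gpu_avg_w = 220          # hypothetical GPU average draw
    transient_factor = 2.0   # transients ~2x average, per the comment above
    rest_of_system_w = 150   # hypothetical CPU + board + drives + fans

    average_w = gpu_avg_w + rest_of_system_w
    peak_w = gpu_avg_w * transient_factor + rest_of_system_w

    print(f"Average draw:   ~{average_w:.0f} W")  # ~370 W -- a 500 W PSU looks fine
    print(f"Transient peak: ~{peak_w:.0f} W")     # ~590 W -- can trip a 500 W PSU's OCP
    ```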

    • SENDMEJUDES@alien.topB

      You really think people with $2000 builds would buy a $200 CPU? Comparing the 3600 and the 8600K is about the 99% of builds that are prebuilts and sub-$1000 builds.

      I agree with you that going all out on the graphics card is not optimal, except if you plan to upgrade after 4-5 years (which many people with budget builds do).

      (enormous) downsides of Zen1/Zen+.

      ? They were pretty close, and way more future-proof than the lower-core-count Intel counterparts. They also had smoother 0.1% lows (so less stuttering) from day 1 of Zen. Latency was a non-issue with non-crap RAM timings.

  • conquer69@alien.topB

    I remember a friend deciding between the 9600K and the 2600X, since the 3600 hadn’t come out yet. The 9600K came out like 9 months earlier.

    • RealThanny@alien.topB

      It was, though I don’t know how long the original was up. It appeared in my RSS feed, but the video was removed by the time I tried to watch it. Either it’s the same video, and the original release was a mistake for timing reasons, or they had to make an edit to remove some kind of mistake or encoding SNAFU.

  • LifePineapple@alien.topB

    The Ryzen 3600 beats the Intel 9600K? But I have a very reliable source which says that:

    In terms of performance, the i5-9600K is almost unbeatable for desktop

    the overclocked Ryzen 3600 is approximately 13% worse for gaming

    The masterfully hyped Ryzen 3600 may well be the best CPU for multimedia producers on a tight budget but in today’s market there are faster and less expensive alternatives for gamers, streamers and general desktop users.

    /s

  • Royale_AJS@alien.topB

    Even if there’s a 10% difference in gaming, the longevity of the AM4 platform far outweighs the slight performance win on the Intel side. The fact that the same board running the 3600 could have started with an R5 1600 and ended on a 5950X or a 5800X3D is worth more than any slight generational win.

    • wichwigga@alien.topB

      Same. It applies to a lot of normal people who don’t upgrade every year or are more modest with their PC spending.

    • AntiworkDPT-OCS@alien.topB

      I remember their RX 580 8 GB and 4 GB vs GTX 1060 video. Seeing the longevity gains from VRAM in those old videos convinced me to prioritize it in my GPUs.

  • riderer@alien.topB

    The 3600 was under $100 back then? Wth? How?

    The 1600 was only sub-$100 for a short time.

    • chithanh@alien.topB

      AMD re-released the 3600 in a boxed WOF (“without fan”) configuration (100-100000031AWOF); that one dipped below $100 when on sale, and permanently in late 2022.

    • capn_hector@alien.topB

      It absolutely never was; the real bottom for the 3600 was $160-ish, and then the pandemic and the 5000 series hit and prices went up like crazy.

      He is thinking of the 1600AF for sure.

    • Reeggan@alien.topB

      Maybe he meant it as a ballpark price, idk. PCPartPicker pricing history shows lows of $130 and $120 in 2022, and $100 only recently in 2023; I don’t think he’s referring to that.

    • TroubledMang@alien.topB

      The 3600 was $150 regularly at Micro Center. Later, the 5000-series/5600X came out and was in short supply, and the hype from the better-performing 5000 series pushed prices on the 3000 series up, lol. Soon after, the 3600 went back up to $200.

      I ended up doing a bunch of 9700K builds because they were on sale for $200 at MC. Better for gaming than the 3000 series.