You heard him, 4090 users: upgrade to a more powerful GPU.

  • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +24/−42

    Runs great on my 5000-series AMD CPU and 3000-series Nvidia GPU. Those came out two years ago now, and I’m averaging about 50 fps on a 4K monitor.

    If that isn’t optimized, idk what is. Yes, I had high-end stuff from two years ago, but now it’s solidly mid-range.

    People are so damn entitled. There used to be a time in PC gaming when, if you were more than a year out of date, you’d have to scale down to windowed 640x480. If you want “ultra” settings you need an “ultra” PC, which means swapping out parts every few years. Otherwise, be content with High settings at 1080p, a perfectly valid option.

    • ocassionallyaduck@lemmy.world · 1 year ago · +31/−2

      I mean, this was also before video cards cost as much as some used cars or more than a month’s rent for some people.

          • JJROKCZ@lemmy.world · 1 year ago · +3/−1

            For only $300 more I have a mortgage on a 2,000 sq ft home in a large American city….

            I have a 6900 XT because I got a promotion recently and wanted to treat myself and finally get off the R9 300 series, but it wasn’t $1600; I think I paid $1100.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +12/−21

        I’m not saying it’s not an expensive hobby; it is. PC gaming on ultra is an incredibly expensive hobby, but that’s the price of the hobby. Saying a game isn’t optimized because it doesn’t run ultra settings on hardware that came out 4+ years ago is nothing new, and to me it’s a weird thing to demand. If you want ultra, you pay ultra prices. If you don’t want to, or can’t, that’s 100% acceptable, but then just be content to play on High settings, maybe at 1080p.

        If PC gaming in general is too expensive, that’s why consoles exist. You get a pretty great experience on a piece of hardware that costs only a few hundred dollars.

          • NuPNuA@lemm.ee · 1 year ago · +2/−1

            I don’t know if you’ve noticed, but everything became more expensive in the last year. Food, housing, etc. It’s called inflation, and PC parts aren’t immune.

          • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +2/−14

            4090 is definitely nuts, but with inflation the 4080 is right about on par. As usual, team red is very close in comparison for a much lower cost. You don’t have to constantly run the highest of the high end to get those sweet graphics; it’s about personal taste. Personally, paying 40% more for a 10% jump in graphics isn’t for me, but every 2-3 generations I step back and reevaluate. Tbh it’s usually a game like Starfield that makes me wonder if I should get a new one. It runs great for now, though; I probably have at least one, hopefully two, more generations before I upgrade again.

            • ono@lemmy.ca · 1 year ago · +12 · edited

              4090 is definitely nuts, but with inflation the 4080 is right about on par.

              On par with the competing product? Sure. On par with inflation? Not by a long shot. GPU prices tripled a couple years back. Inflation accounted for only a small fraction of that. They have come down somewhat since then, but nowhere close to where they should be even with inflation.

              As usual, team red is very close in comparison

              Indeed. Both brands being overpriced doesn’t make them any less overpriced. Cryptocurrency and scalping may be mostly gone now, but corporate greed persists.

              That’s not Todd Howard’s fault, but when he makes a snarky comment expecting everyone to cough up that kind of money to play his game, it’s more than a little tone deaf.

              • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +1

                I’ll admit I didn’t know the 4000 series was that high, but yeah, $1200 for the mid-range card is too much. If it stays like this I may switch back to team Red. I do believe costs are probably higher (I remember buying my first board with an AGP slot; the ones now are… a bit more complicated and complex to make), but the jump from $800 in 2020 to $1200 in 2023 is too much.

            • Nunchuk@lemmy.bigsecretwebsite.net · 1 year ago · +7

              the 4080 is right about on par

              Adjusted for inflation in the US, the 1080 Ti cost only $876 in today’s money when it came out. The 4080 launched at $1231 in today’s money. You are simply incorrect.
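
              A minimal sketch of that adjustment, for anyone checking the math; the CPI figures below are approximate placeholders, and the $699 / $1199 launch MSRPs are the commonly cited ones:

              ```python
              # Inflation adjustment is a CPI ratio: price * (CPI_now / CPI_then).
              # The CPI values are approximate US CPI-U figures (placeholders only).
              def adjust(price: float, cpi_then: float, cpi_now: float) -> float:
                  return price * cpi_now / cpi_then

              CPI_NOW = 307.0  # approx. late 2023

              print(f"1080 Ti: ${adjust(699, 243.8, CPI_NOW):.0f}")   # ~$880, near the $876 cited
              print(f"4080:    ${adjust(1199, 297.7, CPI_NOW):.0f}")  # ~$1236, near the $1231 cited
              ```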

              • Hoomod@lemmy.world · 1 year ago · +7

                The dude digs a hole and then grabs a bigger shovel.

                Some people just really love a company and will do anything to excuse its shortcomings.

                Starfield is poorly optimized, and that’s really all there is to it. I’m sure in a few weeks modders will (once again) fix some obvious issues; Bethesda has no incentive to do the work itself when the community will do it for free.

              • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +1 · edited

                Okay, I’ll admit I didn’t know that’s how much the 4080 was; the last time I checked was the 3000 series, and yeah, that’s a lot. (I thought it started around $800-900.) I stick to my points though: if you want ultra gaming, it’s going to cost an arm and a leg. My main point is still that you shouldn’t expect older hardware to get ultra settings, and that’s okay. You can play a game on medium settings and still have a blast.

    • MooseLad@lemmy.world · 1 year ago · +19/−1

      People are entitled because they don’t want to spend thousands of dollars on components only for them to be outdated within a fraction of the lifecycle of a console?

      How about all the people who have the minimum or recommended specs and still can’t run the game without constant stuttering? I meet the recommended specs, and I’m playing on low everything with upscaling turned on; my game turns into a laggy mess and runs at 15 fps if I have the gall to use the pause menu in a populated area. I shouldn’t have to save and reload the game just to get it to run smoothly.

      Bethesda either lied about the minimum/recommended requirements or lied about optimization. Let’s not forget their history of janky PC releases, dating back to Oblivion, which was six games and seventeen versions of Skyrim ago.

      • NuPNuA@lemm.ee · 1 year ago · +1

        Consoles don’t even last their whole lifetime anymore; both machines required Pro models to keep up with performance last gen, and rumor has it Sony is gearing up for one this gen too.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +6/−9

        And no one is saying they have to; that’s the point of mine that keeps getting overlooked. If someone wants to play at a sick 4K 120 fps, that’s awesome, but you’re going to pay a premium for it. If people are upset because they can’t play ultra settings on hardware that came out 5 years ago, to me that’s snobby behavior. The choice is either pay up for top-of-the-line hardware, or be happy with medium settings and maybe come back in a few years and play it on ultra.

        If the game doesn’t play at all on lower-end hardware (like Cyberpunk on release), that’s a different matter and needs to be addressed. Cyberpunk plain did not work on lower-end hardware, and that’s not fair at all; it wasn’t about how well it played, it’s that it didn’t play.

        • Hoomod@lemmy.world · 1 year ago · +2/−1 · edited

          4K 120 fps would be great.

          But the 4090 only averages 75 fps at 4K on the high preset; the 7900 XTX averages 74 fps.

          You can skim the Gamers Nexus review of the 7700 XT; it has a section dedicated to Starfield.

          You need a 6700 XT or a 4060 Ti to get 60 fps on High at 1080p.

          • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +3

            Idk what to tell you, mate. I’m on a 3080 at 1440p and I’m averaging 60 fps. My settings are all ultra except for a couple, with FSR on at 75% resolution scale. To me, that’s optimized; I don’t even expect 60 fps in an RPG. In Cyberpunk I’ve never gotten higher than 50.

            • Hoomod@lemmy.world · 1 year ago · +1

              Why don’t you set it to ultra “except for a couple” until you get 60+ in Cyberpunk?

    • _Decoy_@lemmy.world · 1 year ago · +21/−5

      You’re missing the point.

      There are a lot of games that look much better AND run much better.

      It’s not about how often you upgrade.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +11/−9

        I mean, yeah, but by what metric? There are a thousand things that can affect performance, not just what we see. We know Starfield has a massive drive footprint, so most of that is probably high-end textures, shaders, etc. Then the world sizes themselves are large. I don’t know, how do you directly compare two games that look alike? Red Dead 2 still looks amazing, but at 5 years old it’s already starting to show its age; it also had a fixed map size, so it got away with a few things. Every game is going to have differences.

        My ultimate point is that you can’t expect to get ultra settings in a brand-new game unless you’re actively keeping up on hardware. There’s no rule saying you have to play at 4K ultra settings, and people getting upset about that are nuts to me. It’s a brand-new game; my original comment was me saying I’m surprised it runs as well as it does on last-generation hardware.

        I played Borderlands 1 on my old ATI card back in 2009 in windowed mode, at 800x600, on Low settings. My card was a few years old and that was the best I could do, but I loved it. The expectation that a brand-new game has to work flawlessly on older hardware is a new phenomenon to me; it’s definitely not how we got started in PC gaming.

    • bfg9k@lemmy.world · 1 year ago · +17/−4

      I have an AMD 3800X and an RTX 2070, and I am barely seeing 30 fps on the lowest settings at 1080p and 1440p.

      DOOM Eternal runs just fine at 144 fps on High and looks miles better.

      It’s just not optimised.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +6/−21

        DOOM Eternal also came out 3.5 years ago now, and your card is nearly 5 years old. That’s the performance I would expect from a card that old playing a brand-new game that was meant to be a stretch.

        I’m sorry, but this is how PC gaming works. Brand-new cards are really only awesome for about a year, then good for a few years after that; then new releases start making you think it’s about time. I’ve had the 3000 series and the 1000 series; before that I was an ATI guy with a Sapphire card, and before that the ATI 5000 series. It’s just how it goes in PC gaming; this is nothing new.

    • NightOwl@lemmy.one · 1 year ago · +8/−2

      Why do people use “entitled” like it’s a bad thing? Why wouldn’t consumers be entitled, as opposed to spending money as though it were an act of charity? It’s pretty weird how the mindset of gamers has shifted over the years to where the fact that they are consumers has been forgotten.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +5/−6

        I say entitled because gamers should just be happy. Be happy with the hardware you have, even if it can’t put out 4K; turn off the FPS counter and play the game. If you’re enjoying it, who cares if it occasionally dips to 55? The entitlement comes from expecting game makers to produce games that run flawlessly at ultra settings on hardware that’s several years old. If you want that luxury, you have to spend a shitload of money on top-of-the-line gear; otherwise, just be happy with your rig.

        • NightOwl@lemmy.one · 1 year ago · +8/−1 · edited

          Products are just products, designed to get money out of people. I don’t have an appreciation for them like they’re some sports team; it comes down simply to whether something is worth spending money on or not. Being entitled is a good thing, since it encourages less consumerist behavior, and a lot of people could use less frivolous spending in their lives.

          You can try to spin it as a negative, but I find this hail-corporate approach to consumerism very odd. Wanting more value for your money is a good standard to have.

          • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +2/−2

            I’m actually agreeing with you: people should be happy to play games on their older hardware, even if it can’t pull off ultra specs. We don’t need to always be buying the latest generation of GPUs; it’s okay to play on medium settings. We don’t have to have the top-of-the-line latest card/processor/drive; we can enjoy ours for years, even if it means newer games don’t play on ultra. If you have the funds to buy new ones every generation, more power to you, but I buy my cards to last 8-10 years. The flip side is just expecting that games won’t run on ultra.

            • NightOwl@lemmy.one · 1 year ago · +1/−2

              People should expect more optimization from the games they look at, and better price-to-performance offerings in hardware. Pushing what counts as acceptable further into the premium tier leads to worse consumer offerings over the long run. What’s considered acceptable hardware has drifted further out of reach each generation, while disposable income has not kept up.

              Complacency and a constantly falling bar for what’s acceptable lead to worse standards. Bad prices and bad optimization should not get a pass. Doing PR management of “be happy with your hardware and performance” is not the job of consumers, aside from those who are paid to run that type of campaign.

              • szczuroarturo@programming.dev · 1 year ago · +2/−1

                Hmm, I don’t know if you’ve ever noticed, but there’s usually very little difference between ultra and high/very high, yet a big difference in performance. Ultra settings were always designed to make the PC sweat, and I assume it’s similar with Starfield. Then there’s the advent of 4K, which pushed this ridiculous standard even higher (and which makes very little sense on PC, unless you play from your couch like on a console). In fact, the fact that old graphics cards are still faring so well is an anomaly rather than the standard.

              • deranger@sh.itjust.works · 1 year ago · +1 · edited

                That’s the thing: I’d say this game is pretty well optimized. People have unrealistic expectations of what their hardware can do. That’s a better way of putting it than “entitled”.

                None of the 3D Bethesda games played this well at release. I speak from first-hand experience, building PCs since 1999 and playing Oblivion, FO3, New Vegas, Skyrim, and FO4 at release. Playing those games on years-old hardware required lower-than-native resolutions and medium settings, exactly what you see in Starfield currently.

    • dsstant@lemmy.world · 1 year ago · +9/−3

      I’m running it on a Ryzen 1600 AF and a 1070, NOT a Ti, at 1440p with 66% resolution scale and a mix of mostly low, some medium settings. 100% GPU and 45% CPU usage, a solid 30 fps in cities. I won’t complain at all; I’m just happy it runs solidly at all on under-minimum-spec hardware.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +3/−10

        This is a great way to view it, and I think you’re getting excellent results for that card. Kudos to you for getting it running!

    • jjjalljs@ttrpg.network · 1 year ago · +3

      I’m happy with my games at 1080p, and I’m going to be sad when they start requiring higher resolutions.

    • NuPNuA@lemm.ee · 1 year ago · +3

      PC gamers enjoyed a bit of a respite from constantly needing to upgrade during the PS4/Xbone era. Those machines were fairly low-end even at launch, and with them being the primary development targets for most games, it was easy to optimize PC ports even for old hardware.

      Then the new consoles came out, a genuine jump in tech the way console generations used to be, and now PCs need to be upgraded to keep up; people who got used to the last decade on PC are upset they can’t rock the same hardware for multiple years anymore.

    • NekuSoul@lemmy.nekusoul.de · 1 year ago · +3

      Runs great on my 5000-series AMD CPU and 3000-series Nvidia GPU

      Just specifying the series doesn’t really say much. Based on that and the release year, you could be running a 5600X and an RTX 3060, or a 5950X and an RTX 3090. There’s something like a ~2.5x performance gap between those.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +13/−7

        I mean, there isn’t one thing you can point to and say “aha, that’s causing all the lag”; things just take up more space, more compute power, more memory as games grow. As hardware capabilities grow, software will find a way to utilize them. But if you want a few examples:

        • Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), so textures had to be scaled up to accommodate it (and remember, that’s width and height, so 4x the memory and 4x the space on the drive; see the sketch after this list).
        • Engines have generally grown more high-fidelity: more particles, more fog, (not in Starfield, but) ray tracing, which is younger than 2017, and so on. All of these higher-fidelity features require more compute power. Take anti-aliasing: it’s always something like 8x, but that’s 8x the resolution, and resolutions have only gone up over time.
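
        A rough sketch of that texture scaling (uncompressed RGBA, ignoring mipmaps and the block compression real games use, so the absolute sizes are illustrative only; the 4x jump per resolution step is the point):

        ```python
        # Uncompressed RGBA8 texture memory: width * height * 4 bytes.
        # Real games use compressed formats (BCn) and mipmaps, so read these
        # numbers as illustrating the 4x-per-step scaling, not actual sizes.
        def texture_mib(side: int, bytes_per_pixel: int = 4) -> float:
            return side * side * bytes_per_pixel / 2**20

        for side in (1024, 2048, 4096):
            print(f"{side}x{side}: {texture_mib(side):6.1f} MiB")
        # 1024x1024:    4.0 MiB
        # 2048x2048:   16.0 MiB  (4x)
        # 4096x4096:   64.0 MiB  (4x again)
        ```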

        I don’t know, what do you want? A list of everything that’s happened since then? Entire engines have come and gone in that time; the engines we used back then are at least a version ahead now, Starfield’s included. I don’t understand what you’re asking, because to me it comes off as “well, Unreal 5 has the same settings as 4, so it’s basically the same”.

        • Edgelord_Of_Tomorrow@lemmy.world (OP) · 1 year ago · +4/−10 · edited

          Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), so textures had to be scaled up to accommodate it (and remember, that’s width and height, so 4x the memory and 4x the space on the drive).

          Texture resolution has not considerably affected performance since the ’90s.

          Changing graphics settings in this game barely affects performance anyway.

          Take anti-aliasing: it’s always something like 8x, but that’s 8x the resolution, and resolutions have only gone up over time.

          Wtf are you talking about? Nobody uses SSAA these days. TAA has basically no performance penalty, and FSR improves performance when used.

          If you’re going to argue this point, at least understand what’s going on.

          The game is not doing anything that other games haven’t achieved in a more performant way. They have created a teetering mess of a game that barely runs.

          • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +10/−6

            Texture resolution has not considerably affected performance since the ’90s.

            If that were true, there wouldn’t be lower-resolution textures at lower settings; higher resolutions take up far more space, memory, and compute time. I’m definitely not going to re-learn what I know about games from Edgelord here.

                • regbin_@lemmy.world · 1 year ago · +1 · edited

                  Only if you run out of VRAM. With sufficient VRAM, the frame rate barely changes between the lowest and highest texture quality.

          • avater@lemmy.world · 1 year ago · +3/−1

            Texture resolution has not considerably affected performance since the ’90s.

            Lol, try playing a game with 4K textures at 4K on an Nvidia graphics card without enough VRAM, and you’ll see how it affects your performance 😅

            I wouldn’t say Starfield is optimized as hell, but I think it runs reasonably, and many people will fall flat on their asses in the next months when they realize their beloved “high-end rig” is mostly dated as fuck.

            To run games on newer engines (like UE5) with acceptable frame rates and detail, you need a combination of modern components, not just a “beefy” GPU…

            So yeah, get used to low frame rates if you still have components from 4 years ago.

            Changing graphics settings in this game barely affects performance anyway.

            That sounds like you’re CPU-bound…

              • avater@lemmy.world · 1 year ago · +3/−3

                I don’t know and I don’t care what’s wrong with your system, but the AMD driver tells me I’m averaging 87 fps at high details on a 5800X and a Radeon 6900, a system that’s now two years old, and I think that’s just fine for 1440p.

                So no, the game is not unoptimized. Sure, it could use a few patches, and performance will get better (remember, it’s a fucking Bethesda game, for Christ’s sake…), but for many people the answer will be to upgrade their rig or play on Xbox.

            • regbin_@lemmy.world · 1 year ago · +1

              The game might be much more CPU-bound on Nvidia cards, probably due to shitty Nvidia drivers.

              I have a 5800X paired with a 3080 Ti, and I can’t get my frame rate any higher than the 60s in cities.

              • avater@lemmy.world · 1 year ago · +1

                Sorry to hear that. No problems here with an AMD card, but I’ve been team AMD all my life, so I have no experience with Nvidia cards and their drivers.

    • Unaware7013@kbin.social · 1 year ago · +3

      I’m running it on a Ryzen 5 2600 and an RX 570, and it seems to run relatively well, other than a crash to desktop every hour or so.

    • regbin_@lemmy.world · 1 year ago · +2/−1 · edited

      I have a PC with a 5800X, a 3080 Ti, and 64 GB of DDR4-3600. I play at 1440p with 80% render scale and Medium-High settings (mostly Medium), and it’s barely above 60 FPS outdoors. It runs like shit.

      Luckily it can go 140+ FPS indoors.
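
      For reference, 80% render scale at 1440p means the game renders well below native resolution; a quick sketch of the arithmetic:

      ```python
      # Render scale applies per axis, so pixel count falls with the square of it.
      def internal_res(width: int, height: int, scale: float) -> tuple[int, int]:
          return round(width * scale), round(height * scale)

      w, h = internal_res(2560, 1440, 0.80)
      print(f"{w}x{h}")                                       # 2048x1152
      print(f"{w * h / (2560 * 1440):.0%} of native pixels")  # 64%
      ```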

      • NuPNuA@lemm.ee · 1 year ago · +1/−1

        Why does it need to go above 60 fps? It’s not a twitch FPS where every bit of latency counts; it’s an RPG, and 60 is perfectly smooth.

        • regbin_@lemmy.world · 1 year ago · +1

          60 FPS is quite smooth and playable, but far from perfectly smooth; there’s still noticeable judder during continuous camera motion.
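
          The frame-time numbers make the difference concrete:

          ```python
          # Frame time in milliseconds for a given frame rate.
          for fps in (30, 60, 90, 120):
              print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
          #  30 fps ->  33.3 ms
          #  60 fps ->  16.7 ms
          #  90 fps ->  11.1 ms
          # 120 fps ->   8.3 ms
          ```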

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +1/−1 · edited

        I’m curious: I have a 3080 as well, also at 1440p, and I’m getting ultra across the board (maybe a setting or two at high) with an average of 60 fps. It’s installed on an SSD, right? Render scale for me is 75%. The only other thing I can think of is that I overclocked my RAM, but I don’t think that would account for that huge a jump.

        • regbin_@lemmy.world · 1 year ago · +1/−1

          Exactly my point. I want 90 FPS at least and lowering the settings didn’t help at all.

          • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +2/−1

            Oh, well then I’d readjust expectations. Doom and fast-paced shooters usually go that high because they have quick, fast-paced combat, but RPGs focus on fidelity over frame rate. Hell, Skyrim at launch only offered 30 fps, and as I mentioned, I never got above 45 in Cyberpunk. 60 in an RPG is really a good time; don’t let the number on the screen dictate your experience. Comparing a fast shooter and an RPG like this is apples and oranges.

            I’m honestly shocked a game like this can run at 60 fps; below 45 is where I start to get annoyed in RPGs. If you want frame rates that high, you may need to window it at 1080p and lower the settings further.

            • regbin_@lemmy.world · 1 year ago · +2/−1

              Nah, 60 is not good enough for me; I’m fine with it on a mobile game or a handheld. I have no problem getting a 90 FPS minimum in A Plague Tale: Requiem and Cyberpunk 2077.

              In Starfield, not even 720p at the lowest settings will help, because the game is very heavily CPU-dependent. Looking at the HW Unboxed benchmarks, the 5800X only managed a 57 FPS average. You need a 7800X3D or a 13600K to get a 90 FPS average.

              • Scrubbles@poptalk.scrubbles.tech · 1 year ago · +2/−1

                As long as you know you’re definitely not in the key demographic, then. For RPGs, 60 fps is pretty much the standard. It’s fine if you want more, but the game was not built as an FPS; it was built as an RPG. Those are the people I’m annoyed with: the ones complaining at Bethesda for not building an RPG that runs the way you describe on hardware that’s already several years out of date. That’s just not possible.

                • regbin_@lemmy.world · 1 year ago · +2/−1

                  Bullshit, there’s no “standard” FPS for a genre. Also, the 3080 Ti is a $1200 last-gen GPU and the 5800X is a $450 last-gen CPU; it’s ridiculous that they can’t even push 100+ FPS at the lowest settings. The CPU overhead in this game is insane. I used to target a 120 FPS minimum in every game I play, hence the high-end build, but now even 90 FPS is too much? lmao

                  How about people with a Ryzen 5 5600 and an RTX 3060 who want to play at 60 FPS? Keep in mind we’re not talking about 120 FPS, just a measly 60, and those parts are barely 2 years old.