
Huntn

macrumors Core
Original poster
May 5, 2008
23,539
26,655
The Misty Mountains
I purchased a GeForce RTX 2070 in 2019. I play on my home-built PC, originally built in 2013 and upgraded in 2019. I paid the most I had ever paid for a graphics card ($500). So far I have been happy with it, until last night I got into a space battle in X4: Foundations and noticed the jerks (low frames), something I had not seen for many years now. My kneejerk response was to say maybe I need an RTX 3070, but I don't really want to pay $800 for a card.

Then I saw the RTX 3060 card for $440, so I placed an order, but it won't ship for a week.

Then I started researching the performance of the GeForce RTX 2070 vs. the RTX 3060, and it's really not that much different in actual games. Then I became aware that my motherboard uses the PCIe 3.0 standard vs. the newer PCIe 4.0, and although the newer card is compatible, it does represent a slight performance throttle. Even so, one article said PCIe 3.0 and 4.0 are so close to each other performance-wise that it was not worth upgrading the motherboard to see the slight difference. And in the benchmarks, the 2070 actually edges out the 3060 in some games; in others it is close, with the 3060 getting 5-15 extra frames.


Anyway, I'm thinking I should enjoy my current card a while longer, make some adjustments and suck it up for X4: Foundations, and wait and see what happens with the next-gen cards.

Of possible interest, I used GeForce Experience to optimize X4, and it switched the resolution from 1080 up to a higher resolution. In the space battle at 1080, both before and after optimizing, I was running about 21-28 FPS. I'll check and see what this "optimization" does to frames, and if I'm not satisfied, I'll keep the other optimizations, put it back on 1080, and see if that helps.

What is the future of computer graphics cards? Just more and more expensive, I presume.
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
I purchased a GeForce RTX 2070 in 2019. I play on my home-built PC, originally built in 2013 and upgraded in 2019. I paid the most I had ever paid for a graphics card ($500). So far I have been happy with it, until last night I got into a space battle in X4: Foundations and noticed the jerks (low frames), something I had not seen for many years now. My kneejerk response was to say maybe I need an RTX 3070, but I don't really want to pay $800 for a card.

Then I saw the RTX 3060 card for $440, so I placed an order, but it won't ship for a week.

Then I started researching the performance of the GeForce RTX 2070 vs. the RTX 3060, and it's really not that much different in actual games. Then I became aware that my motherboard uses the PCIe 3.0 standard vs. the newer PCIe 4.0, and although the newer card is compatible, it does represent a slight performance throttle. Even so, one article said PCIe 3.0 and 4.0 are so close to each other performance-wise that it was not worth upgrading the motherboard to see the slight difference. And in the benchmarks, the 2070 actually edges out the 3060 in some games; in others it is close, with the 3060 getting 5-15 extra frames.

Anyway, I'm thinking I should enjoy my current card a while longer, make some adjustments and suck it up for X4: Foundations, and wait and see what happens with the next-gen cards.

Of possible interest, I used GeForce Experience to optimize X4, and it switched the resolution from 1080 up to a higher resolution. In the space battle at 1080, both before and after optimizing, I was running about 21-28 FPS. I'll check and see what this "optimization" does to frames, and if I'm not satisfied, I'll keep the other optimizations, put it back on 1080, and see if that helps.

What is the future of computer graphics cards? Just more and more expensive, I presume.
Does X4 not support DLSS?
 

mi7chy

macrumors G4
Oct 24, 2014
10,495
11,155
The $400 3060 Ti FE from Best Buy is the best bang for the buck, and the current NIB valuation is $620, so it's worth more than you paid for it. Second best was the $579 reference 6800, which is what I have, before AMD killed it and replaced it with the crappier $549 6750 XT.
 

appltech

macrumors 6502a
Apr 23, 2020
688
166
You could wait for the summer. I've heard stories that a few cryptos will move to a new protocol (sort of), so GPU mining will not be so interesting or profitable (if it still is). So if everything is OK with the global situation and the dollar, prices will decrease.
I have an ASUS ROG RX 580 in my first rig, and a 3070 Ti 8GB in the second rig.
 
  • Like
Reactions: Huntn

Huntn

macrumors Core
Original poster
May 5, 2008
23,539
26,655
The Misty Mountains
Does X4 not support DLSS?
Don't know; I'll assume it does.
The $400 3060 Ti FE from Best Buy is the best bang for the buck, and the current NIB valuation is $620, so it's worth more than you paid for it. Second best was the $579 reference 6800, which is what I have, before AMD killed it and replaced it with the crappier $549 6750 XT.
True, but by my estimation the price I saw and was willing to pay did not get me that much. I've decided to wait. And yes, I hate spending over $500 for a card.
You could wait for the summer. I've heard stories that a few cryptos will move to a new protocol (sort of), so GPU mining will not be so interesting or profitable (if it still is). So if everything is OK with the global situation and the dollar, prices will decrease.
I have an ASUS ROG RX 580 in my first rig, and a 3070 Ti 8GB in the second rig.
How much did you pay for the 3070 Ti? I've decided to hold off. I did not see myself getting that much bang for my buck as compared to what I already own. In some of the game benchmarks, the 2070 held its own or came close to the 3070... I'll wait and take a chance. :D
 
  • Like
Reactions: Irishman

Colstan

macrumors 6502
Jul 30, 2020
330
711
Seeing how AMD's RDNA3 and Nvidia's Lovelace are expected later this year, I'd suggest waiting for reviews on that generation. I realize that there's always something better around the corner, but ~6 months isn't that much time. If those cards are too pricey, then you'd likely be able to score a better deal on current-gen GPUs, since the crypto craze seems to be winding down and early adopters will want the new cards.
 
  • Like
Reactions: Huntn

T'hain Esh Kelch

macrumors 603
Aug 5, 2001
6,342
7,209
Denmark
Then I became aware that my motherboard uses the PCIe 3.0 standard vs. the newer PCIe 4.0, and although the newer card is compatible, it does represent a slight performance throttle. Even so, one article said PCIe 3.0 and 4.0 are so close to each other performance-wise that it was not worth upgrading the motherboard to see the slight difference.
PCIe 4.0 has twice the bandwidth of PCIe 3.0, so I definitely wouldn't call them close to each other.
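
For anyone who wants the arithmetic, here's a minimal sketch assuming the commonly quoted transfer rates (8 GT/s for gen 3, 16 GT/s for gen 4, both with 128b/130b encoding); these are theoretical maximums, not measured numbers:

Code:
#include <cstdio>

// Rough PCIe link bandwidth: transfer rate (GT/s) x 128b/130b encoding
// efficiency x lane count, converted from gigabits to gigabytes per second.
double pcie_gbps(double gtps, int lanes) {
    return gtps * (128.0 / 130.0) * lanes / 8.0;
}

int main() {
    std::printf("PCIe 3.0 x16: %.1f GB/s\n", pcie_gbps(8.0, 16));   // ~15.8
    std::printf("PCIe 4.0 x16: %.1f GB/s\n", pcie_gbps(16.0, 16));  // ~31.5
    return 0;
}

Whether a graphics card actually saturates that link in games is a separate question, of course.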
 
  • Like
Reactions: Huntn

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
In some of the game benchmarks, the 2070 held its own or came close to the 3070... I'll wait and take a chance. :D
I'd be curious to see the settings for this to be true; as far as I was aware, the 3070 should be matching/beating a 2080.
 
  • Like
Reactions: Huntn

Huntn

macrumors Core
Original poster
May 5, 2008
23,539
26,655
The Misty Mountains
PCIe 4.0 has twice the bandwidth of PCIe 3.0, so I definitely wouldn't call them close to each other.
I was just quoting what an article said, which was that paying for the motherboard upgrade to get PCIe 4.0 was not going to make that much difference in card performance vs. putting a PCIe 4.0 card on a PCIe 3.0 motherboard. I wonder whether the extra PCIe bandwidth makes a significant difference to in-game performance?

There might be an argument that upgrading from the 2070 to the 3060 might not be worth it if you have a PCIe 3.0 motherboard?
I still play on my GTX 1080 Ti. I haven't found the need to upgrade given my monitor is still at 1080p.
It depends on the demands of the games you like playing. I've noticed that in X4, on my 4K monitor, my RTX 2070 produces about the same frames whether running at 2K or 4K resolution.
I'd be curious to see the settings for this to be true; as far as I was aware, the 3070 should be matching/beating a 2080.
In most of the game benchmarks, the 3060 was slightly better on frames, but in some the 2070 was roughly equivalent or slightly better. See the game benchmarks in the second link in post 1.
 

Colstan

macrumors 6502
Jul 30, 2020
330
711
I was just quoting what an article said, which was that paying for the motherboard upgrade to get PCIe 4.0 was not going to make that much difference in card performance vs. putting a PCIe 4.0 card on a PCIe 3.0 motherboard. I wonder whether the extra PCIe bandwidth makes a significant difference to in-game performance?

There might be an argument that upgrading from the 2070 to the 3060 might not be worth it if you have a PCIe 3.0 motherboard?
From a technical perspective, PCIe 4.0 has twice the bandwidth of PCIe 3.0, which is what that post was pointing out. That extra bandwidth doesn't do you any good if the device doesn't utilize it. The only mainstream products that have been shown to benefit from gen 4.0 are high-speed SSDs, and that's not in gaming scenarios. An external SSD with an internal SATA interface running over USB 3.0 has game load times almost as fast as an internal NVMe drive on PCIe 4.0. That may change with Microsoft's DirectStorage, but not right now.

Regardless, with current-generation graphics cards, you're unlikely to see any tangible benefit between PCIe 3.0 and 4.0. Perhaps next-gen cards will be able to benefit, but not what's currently on the market, in most scenarios.
 
  • Like
Reactions: Huntn and Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
From a technical perspective, PCIe 4.0 has twice the bandwidth of PCIe 3.0, which is what that post was pointing out. That extra bandwidth doesn't do you any good if the device doesn't utilize it. The only mainstream products that have been shown to benefit from gen 4.0 are high-speed SSDs, and that's not in gaming scenarios. An external SSD with an internal SATA interface running over USB 3.0 has game load times almost as fast as an internal NVMe drive on PCIe 4.0. That may change with Microsoft's DirectStorage, but not right now.

Regardless, with current-generation graphics cards, you're unlikely to see any tangible benefit between PCIe 3.0 and 4.0. Perhaps next-gen cards will be able to benefit, but not what's currently on the market, in most scenarios.
That isn't 100% true. The RX 6500 XT seems to need PCIe 4.0 because they only gave it an x4 connection instead of x16, and it only has a 4GB framebuffer.
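
To put rough numbers on that narrow link (same back-of-the-envelope estimate as above; theoretical maximums only):

Code:
#include <cstdio>

// Same link-bandwidth estimate as earlier in the thread, applied to the
// RX 6500 XT's x4 connection instead of a full x16 slot.
double pcie_gbps(double gtps, int lanes) {
    return gtps * (128.0 / 130.0) * lanes / 8.0;
}

int main() {
    std::printf("PCIe 3.0 x4: %.1f GB/s\n", pcie_gbps(8.0, 4));   // ~3.9
    std::printf("PCIe 4.0 x4: %.1f GB/s\n", pcie_gbps(16.0, 4));  // ~7.9
    return 0;
}

Once the 4GB framebuffer overflows and assets start streaming across the link, losing half of an already-narrow connection on a gen 3 board is going to show.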
 
  • Like
Reactions: Huntn and Irishman

Colstan

macrumors 6502
Jul 30, 2020
330
711
That isn't 100% true. The RX 6500 XT seems to need PCIe 4.0 because they only gave it an x4 connection instead of x16, and it only has a 4GB framebuffer.
The 6500 XT is an edge case, and if a user has to resort to using one, then they can't be too choosy about their mainboard, either. So, that's true in a technical sense, but it's an absolutely worthless product that AMD should be ashamed of; still, it does fit the exception to the rule, so to speak. I get that AMD was trying to address shortages by slapping what is essentially a laptop GPU onto the desktop, but they did it in the most humiliating way possible. (And I say that as someone who likes AMD and appreciates their technology.)
 
  • Like
Reactions: Huntn and Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
The 6500 XT is an edge case, and if a user has to resort to using one, then they can't be too choosy about their mainboard, either. So, that's true in a technical sense, but it's an absolutely worthless product that AMD should be ashamed of; still, it does fit the exception to the rule, so to speak. I get that AMD was trying to address shortages by slapping what is essentially a laptop GPU onto the desktop, but they did it in the most humiliating way possible. (And I say that as someone who likes AMD and appreciates their technology.)
I know, I was just being pedantic, lol. Especially since you can hide the hit by not running over the framebuffer size.
 

jav6454

macrumors Core
Nov 14, 2007
22,303
6,257
1 Geostationary Tower Plaza
It depends on the demands of the games you like playing. I've noticed that in X4, on my 4K monitor, my RTX 2070 produces about the same frames whether running at 2K or 4K resolution.
The games I like or buy run at 60+ (some 100+) fps at the mentioned resolution. I see no need to increase it. The only one that slows down, sometimes into the teens, is Cities: Skylines. However, from what everyone describes, CS is mainly CPU-bound.
 
  • Like
Reactions: Huntn

erayser

macrumors 65816
Apr 9, 2011
1,253
1,185
San Diego
I'm running an RTX 3090 purchased (at MSRP) in November 2020 at Microcenter. I can't believe it's been a year and a half since I built my PC. Because of how hard GPUs were to get, I still feel like my rig is new. My kid bought his RTX 3070 at the same time. We were really fortunate to get our hands on 30-series cards at MSRP in 2020. I'm NOT planning to upgrade for a long while... I mean, my previous build had GTX 780s in tri-SLI. My kid's rig does more gaming... and still runs like a beast.

I do more photo editing and content creation on my rig, but I do game here and there. Sometimes I pull out the VR headset and play online with friends... but lately I've been playing CP2077 again after the last big patch came out.

I do have a motherboard that supports PCIe Gen 4.0. I don't see a big difference going from Gen 3.0 to 4.0, but I keep it set to Gen 3.0 because I'm running an AMD Ryzen 5900X and continually get USB disconnects with AMD on Gen 4.0. It doesn't hinder gaming or my productivity... but the Windows error sound gets annoying. Going to Gen 3.0 fixes the disconnects... AMD still hasn't fixed the underlying issue. Makes me want to go back to Intel on my next build, but that's years away.

Hopefully the next time I upgrade, parts will be more available... but I don't see prices dropping. I hope I'm wrong. I do have play/build money saved up that I rarely touch for my next build. I like paying for my builds outright, with money to spare. I just hope it's not a struggle to buy parts like it was over the last year and a half.
 
  • Like
Reactions: Huntn

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
I have a reference 6900 XT with an EK water block. I like to think that I am going to skip upgrading this next go-around, but I will probably fall victim to FOMO.

So far everything I play runs fine @ 1440p Ultra settings. RT can be hit or miss depending on the game, though.
 
  • Like
Reactions: Irishman and Huntn

Irishman

macrumors 68040
Nov 2, 2006
3,401
845
I have a reference 6900 XT with an EK water block. I like to think that I am going to skip upgrading this next go-around, but I will probably fall victim to FOMO.

So far everything I play runs fine @ 1440p Ultra settings. RT can be hit or miss depending on the game, though.

I have a dumb question for you: if I bought a new AMD 6900 XT, could it be made to run every ray-tracing game? Even the ones made specifically for Nvidia hardware (e.g. Quake 2 RTX)?
 

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
I have a dumb question for you: if I bought a new AMD 6900 XT, could it be made to run every ray-tracing game? Even the ones made specifically for Nvidia hardware (e.g. Quake 2 RTX)?
As long as the game uses either DXR or the vendor-agnostic Vulkan RT extensions, it will work just fine on the 6000-series GPUs. Q2 RTX was actually updated when the 6000 series came out to use the vendor-agnostic extensions, so it runs "fine".

The only game I know of with RT that doesn't (shouldn't?) work is the updated Crysis, because they use some hack to add Nvidia's Vulkan extension to their DX11 engine.
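
For the curious, the "vendor-agnostic" part on the DirectX side comes down to querying a capability tier rather than checking the GPU's model. A minimal sketch of what such a DXR check might look like (device creation and error handling omitted; link with d3d12.lib):

Code:
#include <windows.h>
#include <d3d12.h>

// Returns true if the D3D12 device reports any DXR raytracing tier,
// regardless of whether the GPU is from Nvidia, AMD, or Intel.
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts)))) {
        return false;
    }
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

A game that instead gates RT on the adapter's name or vendor ID is how otherwise-capable cards end up locked out.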
 
  • Like
Reactions: Irishman

maflynn

macrumors Haswell
May 3, 2009
73,572
43,555
I have a dumb question for you: if I bought a new AMD 6900 XT, could it be made to run every ray-tracing game? Even the ones made specifically for Nvidia hardware (e.g. Quake 2 RTX)?
Cyberpunk did not support RT with AMD cards when the game first came out. At some point they added support.

I'm rocking an RTX 2060 card, and I think the time may be coming to replace it. The 7900 XT is not something I'm willing to buy, so I'll need to wait and see what a possible 7600 XT will look like. I'm choosing to avoid Nvidia for a number of reasons, but chief among them is their pricing.
 
  • Like
Reactions: Irishman

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
Cyberpunk did not support RT with AMD cards when the game first came out. At some point they added support.

I'm rocking an RTX 2060 card, and I think the time may be coming to replace it. The 7900 XT is not something I'm willing to buy, so I'll need to wait and see what a possible 7600 XT will look like. I'm choosing to avoid Nvidia for a number of reasons, but chief among them is their pricing.
Odd.
I found a list though...
 
  • Like
Reactions: Irishman

maflynn

macrumors Haswell
May 3, 2009
73,572
43,555
I forget where I saw it, but on rollout the game did not. I could be wrong, but I'm pretty sure it lacked support.

diamond.g

macrumors G4
Mar 20, 2007
11,155
2,466
OBX
I forget where I saw it, but on rollout the game did not. I could be wrong, but I'm pretty sure it lacked support.
That is fair. They probably looked for RTX 20/30 series cards instead of just checking for DXR support.
 