It's powerful, granted, but I wish they'd stop using so many artificial elements to inflate performance figures. I miss the days when a card could be tested in its basic form, hit 60-120 fps with a 50% uplift over last gen, and everyone would be happy. Nowadays it's all DLSS, Frame Gen and other factors, and sure, a game like Cyberpunk may push out 300 fps at ultra settings, but with all the back-end upscaling on the card it's literally just 1080p upscaled four times to buggery, and the image can end up looking blurrier or lower quality than if you just ran everything maxed out on a 3080 with all upscaling turned off at flat native resolution. Yeah, the frame rate may be cut by two thirds and Nvidia's marketing gets blown to shit, but at least it's an honest depiction.
It's literally going the same way as the iPhone: these gimmicky performance-bolstering features exclusive to new cards are actually covering up bad optimisation, or giving a false experience just to convince users the card is better than it is. I always turn all this stuff off, even if it means getting XXfps less, because I know the image quality will be an accurate representation of the game and not hardware bloat.
Nvidia announces the RTX 50-series, led by the $1,999 RTX 5090 with 'twice the performance of the 4090' | PC Gamer
WWW.PCGAMER.COM
With more AI acceleration than ever even the RTX 5070 is claimed to offer RTX 4090-level performance for $549.
The 5070 also looks like only a mild improvement on the already pretty lame 4070, but the price isn't too bad.
Nvidia's GeForce RTX 5070 at $549 — How does it stack up to the previous generation RTX 4070? | Tom's Hardware
WWW.TOMSHARDWARE.COM
Same price, different architectures, and some big tech changes.