Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Taima posted:

It's all relative. 2077 will run on a 1080Ti... probably fairly well. It will, however, be far behind even 2000-level GPUs. That's just the reality of DLSS 2.0, let alone the other optimizations that have been made during the tenure of the 2000 series, and any witchcraft the 3000 series will employ in terms of ray tracing etc.

My point is we're basically working from completely different goalposts. If the goal is "this game will run and look ok" then fine, conceded. If you think it's going to look anywhere near as good as even a 2000-level system, you're dreaming.


You can be as skeptical as you want about DLSS 2.0, but as someone who has seen its effects across multiple titles, and given the incredibly vast amount of dev help that Nvidia is injecting into 2077, I think you're crazy if you think it won't make a giant difference. But all questions will be answered shortly and we can revisit it then.

DLSS 2.0 is as close to magic as we've come in my entire 20+ year tenure following graphics cards, starting with my good old Riva TNT in 1998. It took a while (DLSS 1.0 was poo poo), but we're here now, and it's the real deal. I'm totally ok with skeptics doubting it, because I've seen it firsthand :shrug:

e: and for the record, people might be thinking "didn't you just talk poo poo on the 2000 series" and yes I did, and do. The 2000 series needed DLSS 2.0 out the gate, and Nvidia failed to make that happen. Now, when DLSS is making real inroads, we are already effectively in Ampere country. So I don't even really count it as a 2000-series feature, though that series of cards will benefit from it greatly.

What about an i7 7700k?


Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Sorry I'm not a CPU expert by any stretch. I'm sure someone could answer that question with greater accuracy than I could.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Kraftwerk posted:

What about an i7 7700k?

Nobody knows for sure. Best guess is that a 4-core/8-thread i7 is probably fine, especially if you overclock it decently, but 4-core/4-thread i5s are starting to show their age a little in some new titles, and that probably applies to CP too.

VelociBacon
Dec 8, 2009

Kraftwerk posted:

What about an i7 7700k?

Are you asking if that will be good enough for the game? Yeah, and of course it depends on the GPU. I wouldn't upgrade from a 7700k for this game.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

VelociBacon posted:

Are you asking if that will be good enough for the game? Yeah, and of course it depends on the GPU. I wouldn't upgrade from a 7700k for this game.

Excellent. Then my plan to get a 3080 and leave the CPU alone is complete.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Witcher 3 already runs significantly better in cities on a 6C12T. 4C8T may be pushing it for 2077.

Indiana_Krom
Jun 18, 2007
Net Slacker
There are games that suffer significantly on 4c8t, but they remain the exception and not the rule for now. Couldn't tell you where 2077 will end up on that scale until it gets benchmarked, so basically if I were still using my 7700k I would just wait and see.

(I'm not still using my delidded and overclocked 7700k because Shadow of the Tomb Raider runs close to 50% faster on my stock 9900k with the same GPU and RAM capacity/speed.)

shrike82
Jun 11, 2005

lol it’s kinda dumb to talk about an unreleased game needing a 2 or 3 series to run. It’s going to be designed to run on current gen and next gen consoles. And even with the 3090 and DLSS2.0, I’m pretty sure you’re not going to get 4K60 “ultra” so it’s more about what settings you turn down.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

shrike82 posted:

lol it’s kinda dumb to talk about an unreleased game needing a 2 or 3 series to run. It’s going to be designed to run on current gen and next gen consoles.

Well no one made that claim so I think we’re all on the same page there.

sauer kraut
Oct 2, 2004

Kraftwerk posted:

What about an i7 7700k?

4c/8t is the possible edge case for Cyberpunk, but we just don't know.
If they add a 'reduce cycles spent on background NPC scripting' option like Hitman 2 did, you'll be fine. If the PC version is more in line with next-gen consoles, maybe not.

What does your RAM situation look like? If it's something decent, like 16 gigs of 2933/CL14 or thereabouts, that can help a lot.

shrike82
Jun 11, 2005

It kinda sucks that the A100 PCIe has no RT support. Would have been cool to be able to build an AIO gaming/deep learning box.

repiv
Aug 13, 2009

e: never mind, missed that the Ampere whitepaper explicitly says no RT cores

shrike82
Jun 11, 2005

https://www.microway.com/knowledge-center-articles/in-depth-comparison-of-nvidia-ampere-gpu-accelerators/

quote:

The NVIDIA “Ampere” Datacenter GPUs have been designed for computational workloads rather than graphics workloads. RT cores for accelerated raytracing are not included in A100. Similarly, video encoding units (NVENC) are not included.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Taima posted:

I actually think you're super wrong. The lack of DLSS 2.0 support will make your GPU look like poo poo in 2077. It will run, and if that's your concern then more power to you, but it will be absolutely nothing compared to even a 2000 series, let alone a 3 series. A 2070 is going to run a train on the 1080Ti in cyberpunk, that's just the truth.

People who haven't tried DLSS 2.0 continue to underrate the enormous impact it will produce in titles that support it (and increasingly that's looking like most titles with heavy GPU workloads).

They are not going to make the PC version inaccessible to all but the top 1-2% of hardware. Obviously a 2070 will run it better because of DLSS 2.0, but a 1080Ti isn't going to struggle. At no point have I undersold DLSS 2.0; in fact, if you check my post history in this thread you will find that I have described it as black magic, or revolutionary, so please don't jump down my throat for things I haven't said. I don't even know why you started going on about the 2000 series cards :confused: What a super aggressive post.

Zedsdeadbaby fucked around with this message at 07:57 on Jun 29, 2020

Craptacular!
Jul 9, 2001

Fuck the DH
Yeah, I don't know what that guy is going on about. I can understand having used raytracing without DLSS and then with it and going 'wow, this is night and day', but a 1080Ti with raytracing off won't run it poorly at all.

Really what it comes down to is that most people haven't seen anything where raytracing makes a big difference except stuff like Minecraft. In theory, world-building games like that should look excellent now, but cinematic big-budget AAA games will be built with details reamed out the rear end whether raytracing is enabled or not, so it won't be as much of a difference-maker.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
https://www.youtube.com/watch?v=vrJ330NGjCU

Given what's possible with Witcher 3, I imagine that you'd still be able to run Cyberpunk 2077.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

My apologies, that was uncalled for and frankly I seem to have misunderstood your post entirely, or was trying to respond to someone else and got mixed up or something. My bad.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
It's all good :)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zedsdeadbaby posted:

They are not going to make the PC version inaccessible to all but the top 1-2% of hardware.

They won't, obviously. But I think that Taima isn't wrong in some respects: the PS5/XBoXXX are enormously more powerful relative to the average PC than we're used to. So even just assuming that it's being designed with the consoles as a target is going to mean a lot of people are going to have systems that struggle to run it with modern pretties turned on. CD Projekt Red is not at all afraid of having features and settings that are only accessible to the top 1% or whatever (see: HairWorks), and to that end I think Taima is also right that if you want to be able to slam everything to UltraMAXXX, or want to play it at 4K/60+, DLSS 2.0 is going to be a huge factor. A "free" 30-50% performance bump just isn't something you can hand-wave away in terms of what it lets you cram in there as a dev.

Does that mean people with Pascal and older cards won't be able to play it? Of course not. But just like in TW3, it may very well mean that many people need to turn off the fancy options, turn down the settings, and--heaven forbid!--run at Medium settings. Same with the fact that it'll be released on PS4/XBox1: yeah, it'll run, but I would not at all be surprised to find that it runs at ~30fps at a sub-1080p internal render target (720p for the PS4 non-pro?), and of course without any of the fancy stuff enabled. I also wouldn't be surprised if the current-gen console versions cut out background NPCs and such to some extent to help with their lack of performance.

Now, how will a 1080Ti do? Who knows--but I think all signs point to it being able to run the 1440p60 that was asked about acceptably well (at Medium settings). Again, zero chance you'll be turning on ray-tracing and all the fancies and still keeping a reasonable framerate, but that's just how it goes--you'll certainly be able to play the game.
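
(A rough back-of-the-envelope sketch, not anything from the thread: the "free" bump comes from how few pixels DLSS actually shades. Assuming the commonly cited DLSS 2.0 render scales of roughly 0.67x per axis for Quality and 0.5x for Performance, the internal render target behind a 4K output carries well under half the native pixel count.)

code:

// Illustrative arithmetic only -- the 0.67x / 0.5x factors are the commonly
// cited DLSS 2.0 Quality / Performance render scales, not anything confirmed
// for Cyberpunk 2077 specifically.
#include <cstdio>

int main() {
    const double out_w = 3840.0, out_h = 2160.0;   // 4K output target

    const double scales[] = {1.00, 0.67, 0.50};    // native, Quality, Performance
    const char*  names[]  = {"native", "DLSS Quality", "DLSS Performance"};

    for (int i = 0; i < 3; ++i) {
        // Shading cost scales roughly with pixel count, so this ratio is a
        // crude proxy for the headroom the upscale buys back.
        const double pixels = (out_w * scales[i]) * (out_h * scales[i]);
        std::printf("%-17s %5.1f%% of native 4K pixels\n",
                    names[i], 100.0 * pixels / (out_w * out_h));
    }
    return 0;
}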

v1ld
Apr 16, 2012

DrDork posted:

But just like in TW3, it may very well mean that many people need to turn off the fancy options, turn down the settings, and--heaven forbid!--run at Medium settings.

TW3 ran well at far more than Medium on not-the-best cards. I got a consistent 45+ fps at 1080p on a 7970 (a common card for much of the game's early life) with everything at Ultra except Hairworks off and Draw Distance and AO one notch below Ultra. TW3 was one of my standard examples of a game designed to look great on mid-range and lower systems. They put a bunch of work into making the visuals scale down well.

The 7970 was 3 years old when TW3 released in 2015, so I think that's a fair comparison to a middle-of-the-road card by then.

DLSS is going to be a very good boost for FPS and folks with cards that have it will have breathing room at higher resolutions. But I personally don't see them designing high end visual features that are only accessible with DLSS since that's work that none but a small set of folks will be able to experience.

E: Of course, the real reason for the 7970's wonderful longevity in this and every other case is that it matched and even bettered the hardware in the then current gen consoles. That's relevant to CP2077 too I would think.

v1ld fucked around with this message at 19:28 on Jun 29, 2020

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I think it’s also quite possible that Medium Settings of Compromise will look better on CP2077 than they did in TW3, given general advances on the software and asset side as well.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

v1ld posted:

DLSS is going to be a very good boost for FPS and folks with cards that have it will have breathing room at higher resolutions. But I personally don't see them designing high end visual features that are only accessible with DLSS since that's work that none but a small set of folks will be able to experience.

E: Of course, the real reason for the 7970's wonderful longevity in this and every other case is that it matched and even bettered the hardware in the then current gen consoles. That's relevant to CP2077 too I would think.

Well, two things here: I think you're right that the console hardware bit is relevant--the next-gen consoles (which are obviously more of the target than the current-gen ones) are considerably more powerful relative to average PCs right now than the PS4/XBox1 were when they launched. So that's likely gonna bump things up a notch for target systems, but the fact that it'll still release on the PS4/XBox1 says it'll at least run on charity-case hardware.

The other part is that CD Projekt Red absolutely designs high-end visual features and settings knowing that they're only going to be usable by a small segment of the population at release. They did this with TW3 as well. HairWorks by itself cost a Titan X almost 20FPS at 1080p. Foliage visibility had a similar impact of around 20fps per step, HBAO cost 10FPS, Shadow quality could eat 10FPS per step, etc. Turn half those things on at the same time and your Titan was suddenly struggling to stay at 1080p@60. And the impacts were more pronounced on lower-end hardware, so most people didn't play with that stuff on Ultra--especially because it generally looked great even on Medium/Low.

With that in mind--and the level of interest NVidia has taken with the game--I don't see how you come away thinking that the game won't heavily leverage DLSS, RTX, etc., to really push the bounds of what you can do with it. I also have no doubt that it'll look great without them, since they do have to keep the consoles in mind, but everything points to it being a "this is why you should buy a 30-series card" title so far.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Subjunctive posted:

I think it’s also quite possible that Medium Settings of Compromise will look better on CP2077 than they did in TW3, given general advances on the software and asset side as well.

I'd much rather lower the resolution scale first than the settings; TAA/TXAA and image sharpening/scaling have come a long way. Doom Eternal, for example, looks a million times clearer and sharper at 1080p than Destiny 2 does at 4K, simply because the latter has obsolete AA.

I will likely set Cyberpunk to standard High settings, set the resolution to 1440p and lower the resolution scale gradually until it's a solid 60 at the most taxing point I can find in the first area. It's what I do for every game. So far I've not had anything where it had to go below 75%, so touch wood.

There are always one or two settings that are ridiculously expensive; I usually ding those down to medium as well. MHW's volumetric rendering and Witcher 3's HairWorks come to mind.
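
(For what it's worth, a minimal sketch of that tuning loop; the measurement callback is a purely hypothetical stand-in for dragging the render-scale slider and replaying the most taxing scene you can find.)

code:

// Illustrative only: walk the render scale down in 5% steps until the
// worst-case framerate holds 60, matching the procedure described above.
#include <cstdio>
#include <functional>

double FindRenderScaleFor60Fps(const std::function<double(double)>& measureMinFps) {
    for (double scale = 1.00; scale >= 0.70; scale -= 0.05) {
        const double minFps = measureMinFps(scale);
        std::printf("scale %3.0f%% -> %5.1f fps min\n", scale * 100.0, minFps);
        if (minFps >= 60.0) return scale;   // highest scale that holds a solid 60
    }
    return 0.70;  // in practice it rarely has to go below ~75%
}

int main() {
    // Dummy measurement: pretend fps improves roughly with the drop in pixel count.
    auto fakeMeasure = [](double scale) { return 52.0 / (scale * scale); };
    std::printf("use %.0f%% render scale\n",
                FindRenderScaleFor60Fps(fakeMeasure) * 100.0);
    return 0;
}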

Zedsdeadbaby fucked around with this message at 19:56 on Jun 29, 2020

v1ld
Apr 16, 2012

DrDork posted:

With that in mind--and the level of interest NVidia has taken with the game--I don't see how you come away thinking that the game won't heavily leverage DLSS, RTX, etc., to really push the bounds of what you can do with it. I also have no doubt that it'll look great without them, since they do have to keep the consoles in mind, but everything points to it being a "this is why you should buy a 30-series card" title so far.

I can agree with this (minus Hairworks), but there are two different meanings of "DLSS is required" here, since the original statement about what DLSS as a requirement means, made by Taima (not you), is considerably stronger than your statement, which I can get behind:

Taima posted:

The lack of DLSS 2.0 support will make your GPU look like poo poo in 2077. It will run, and if that's your concern then more power to you, but it will be absolutely nothing compared to even a 2000 series, let alone a 3 series.

The difference is in whether the game will look good or like poo poo without DLSS. I think the game will look great on current high-end PCs without DLSS, but it may run at 45 fps close to Ultra even before adding the extra beauty/fidelity of RTX. And it will look even better on the next-gen consoles, also without DLSS.


Re: Hairworks. I turned that off as soon as I installed the game and never thought of it again. It was purely in the "cool experiment, bro" feature set for me and I never considered it part of the core visual features of the game until the last two pages made it clear I was quite alone in that thinking. It's possible that there may be similar features in CP2077 if NVidia is this actively involved again.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

v1ld posted:

The difference is in whether the game will look good or like poo poo without DLSS. I think the game will look great on current high-end PCs without DLSS, but it may run at 45 fps close to Ultra even before adding the extra beauty/fidelity of RTX. And it will look even better on the next-gen consoles, also without DLSS.

I guess that depends on your definition of "looks like poo poo by comparison". I don't think a Pascal or earlier card is gonna be able to handle RTX options any better than it does in current games (read: basically unable to do them at good framerates), and in that sense there'll be a noticeable visual difference between card generations. But I don't think TW3 looks like poo poo, and that game is how old now? So yeah, it's all about frame of reference in that sense.

I don't think HairWorks was a "core" visual feature, but it certainly was there, in the same way that pushing a setting to Ultra isn't required but is presented as an option. In that vein, RTX options won't be required, and the game won't be terrible to play without them, but they'll be included and probably heavily promoted because it's an excellent way for NVidia to convince people to finally move off 9/10-series cards. And while RTX options certainly make more of an impact on the overall scene than HairWorks ever did, they're also way more expensive. At 1080p, BF V's RTX High option dropped a 1080Ti from ~170FPS to 55, with mins in the low 30s. So, yeah, if you don't have a 20/30-series card, you're turning that poo poo off immediately.
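
(Translating those BF V figures into frame time, which is just back-of-the-envelope arithmetic on the numbers above, makes the cost easier to see: the ray-traced effects are adding roughly 12 ms to every frame on that card, about twice the raster frame budget it started with.)

code:

// Rough arithmetic on the BF V figures quoted above (1080Ti at 1080p,
// ~170 fps raster-only vs ~55 fps with RTX High) -- nothing here is
// measured, it just converts the fps numbers into per-frame cost.
#include <cstdio>

int main() {
    const double fps_raster = 170.0;               // approx. raster-only
    const double fps_rtx    = 55.0;                // approx. with RTX High

    const double ms_raster  = 1000.0 / fps_raster; // ~5.9 ms per frame
    const double ms_rtx     = 1000.0 / fps_rtx;    // ~18.2 ms per frame

    std::printf("raster: %.1f ms  rtx: %.1f ms  added RT cost: %.1f ms/frame\n",
                ms_raster, ms_rtx, ms_rtx - ms_raster);
    return 0;
}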

Honestly, one thing I'm real interested to see is how they do RTX-style options on the next-gen consoles, given that AMD has promised to support ray-tracing. Obviously they can't simply use RTX, since that's NVidia only, so what sort of translation is going to be done and how effective will it be? :iiam:

repiv
Aug 13, 2009

DrDork posted:

Honestly, one thing I'm real interested to see is how they do RTX-style options on the next-gen consoles, given that AMD has promised to support ray-tracing. Obviously they can't simply use RTX, since that's NVidia only, so what sort of translation is going to be done and how effective will it be? :iiam:

"RTX" is just branding, there isn't really anything Nvidia specific about the raytracing implementations we've seen so far other than they supported it first.

Games with "RTX" features are actually using the standardized DX12 DXR interface which AMD (and probably Intel) will have participated in developing.

repiv fucked around with this message at 20:47 on Jun 29, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
There's a decent amount of footage of ray-traced PS5 games; it's mostly reflections and ambient occlusion, nothing fancy like global illumination though. They seem to be using it to complement rasterized graphics, which is about what can be expected of consoles.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

repiv posted:

"RTX" is just branding, there isn't really anything Nvidia specific about the raytracing implementations we've seen so far other than they supported it first.

Games with "RTX" features are actually using the standardized DX12 DXR interface which AMD (and probably Intel) will have participated in developing.

Ah, gotcha. For some reason I'd thought they'd taken the DX12 parts and then tacked on NVidia-only extensions / features. But if that's not the case, then all the better for the industry in general, since having consoles and PCs unable to share implementations would hobble adoption considerably.

repiv
Aug 13, 2009

DrDork posted:

Ah, gotcha. For some reason I'd thought they'd taken the DX12 parts and then tacked on NVidia-only extensions / features. But if that's not the case, then all the better for the industry in general, since having consoles and PCs unable to share implementations would hobble adoption considerably.

Yeah, Nvidia actually played nice and avoided lock-in as much as was feasible. They did make a proprietary raytracing extension for Vulkan, but that was more out of necessity: Vulkan has a much larger committee, so everything takes longer, and they still haven't fully finalized their official raytracing extension as of today.

That means Wolfenstein Youngblood's raytracing probably won't work on AMD since it uses the NV extension, but that game sucks, so it's not a huge loss.
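
(Purely as an illustrative sketch, not from the thread: whether a game can take that path comes down to whether the driver reports the VK_NV_ray_tracing extension at all, which AMD's drivers never will.)

code:

// Illustrative check for NVIDIA's proprietary Vulkan raytracing extension.
// An AMD (or Intel) driver will simply never list "VK_NV_ray_tracing",
// which is why NV-extension-only titles get no raytracing there.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool HasNvRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    for (const VkExtensionProperties& ext : exts) {
        if (std::strcmp(ext.extensionName, "VK_NV_ray_tracing") == 0) {
            return true;
        }
    }
    return false;
}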

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Maybe that's what it was: I was crossing the streams between their Vulkan extensions and the DX12/RTX ones. In any event, that's good for moving forward with a less fragmented industry.

I do think that the PS5/XBoxXXX won't really have the oomph to do much with RT stuff other than the reflections and minor window dressings we've seen so far. It's still a step in the right direction, but there's only so much you can do with that hardware.

repiv
Aug 13, 2009

I wonder if Nvidia making early moves to get standardized RT into DirectX actually pushed AMD to start developing their own raytracing hardware earlier than they would have otherwise.

AMD may not have known about Turing but NV kramering into the DirectX working group and going "HEY WOULDN'T IT BE COOL IF WE HAD A RAYTRACING ABSTRACTION" probably gave the game away.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I'm still very, very curious to learn more about Crytek's software ray tracing.

shrike82
Jun 11, 2005

The extended gameplay trailer from last week didn’t include any RTX effects so it sounds like it’s going to come in hot.

I’m still skeptical about how ubiquitous RTX/DLSS2 will be in the short term - people should also look at whether other AAA games, e.g. Halo Infinite, end up supporting it. If we’re talking about a gradual rollout, that just shifts focus to when the 4-series launches.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

shrike82 posted:

The extended gameplay trailer from last week didn’t include any RTX effects so it sounds like it’s going to come in hot.

Is the RTX stuff supported on the streaming thing they used?

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

shrike82 posted:

The extended gameplay trailer from last week didn’t include any RTX effects so it sounds like it’s going to come in hot.

I’m still skeptical about how ubiquitous RTX/DLSS2 will be in the short term - people should also look at whether other AAA games, e.g. Halo Infinite, end up supporting it. If we’re talking about a gradual rollout, that just shifts focus to when the 4-series launches.

Short term we'll see little of either (I'm assuming you mean ray tracing in general when you say RTX), but they are both too revolutionary and too much of a paradigm shift for Nvidia/AMD to simply let them wither away. In the long term they will both become the standard for sure.

You can bet AMD software engineers will be beavering away at their own implementation of DLSS. There's a real impetus for them to develop their own upscaling/reconstruction method, especially as the PS5 Pro/XSX refreshes are expected to push 8K output (a bit like how the PS4 Pro/X1X pushed 4K output).

Zedsdeadbaby fucked around with this message at 23:08 on Jun 29, 2020

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Yeah. I get the skepticism for DLSS 2 (though to be fair that's primarily from just a few people in this thread) because the history of graphics cards is filled to the brim with techniques that more or less turned out to be nonsense.

I really do get that, but DLSS 2 is substantial and, crucially, multi-purpose in a way that we haven't really seen from an in-house GPU tech. It's not niche: it benefits almost literally everyone, from the high-end gamer to the people still rocking 960s or whatever, along with Nvidia on a market-share level.

Its use intersects perfectly between user and platform. Which is to say, Nvidia and the end users of Nvidia cards both have an extremely strong interest in making this tech proliferate. That stands in stark contrast to somewhat similar initiatives like PhysX and Hairworks, where Nvidia had a far, far greater stake in their adoption than users ever did.

Therefore we can expect that Nvidia will spend obscene amounts of money to help AAA developers implement the tech. Hell, they already did that with DLSS 1.0, until it turned out to be poo poo, but that was clearly their game plan from the get go.

If it were just a tech that devs had to painstakingly implement, maybe things would be different, but Nvidia's developer outreach arm is absurdly well funded, and that puts this thing over the edge from merely "amazing but niche" to "this is totally loving happening, at the VERY least on the AAA level".

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Something worth noting for the CP2077 discussion - one of the people who got to play the preview said he was getting 60-80 FPS at 1080p on a 2080Ti. Don't know if that's DLSS on or off, but that's a ballpark for how demanding the game is with things cranked up. On the other hand, yes, it also has to run ~30 fps-ish on what is essentially a decade-old laptop CPU and a Radeon 7950 or whatever.

shrike82
Jun 11, 2005

Wasn’t there some confusion about whether that was through GeForce Now, which is limited to 1080p?

The 3-series is going to end up another transitional generation given how immature the RTX and DLSS implementations are. The 3060 and 3070 aren’t going to run games with RTX effects at high settings, and I suspect 3080 and 3090 buyers would have bought them regardless of their feature set.

Anyway, pricing should be interesting. People have been claiming they’re going to cut prices vs. the 2-series, so that’d be a much better sales pitch.

Cactus
Jun 24, 2006

Taima posted:

Man, this is the worst general timeframe in graphics cards - that stagnant middle ground where you know much, MUCH better stuff will be available in a few months, but you're just forced to wait with no information (especially since Nvidia seems to be playing chicken with AMD and holding their cards (as in playing cards, not graphics cards) till the last possible second), creating a scenario where we know next to nothing for sure.

Like holy poo poo, just post the 3090 you cowards. My wallet will grow consciousness and buy the preorder before I even wake up in the morning.

Personally speaking, I feel like we've been waiting forever... Turing really should have had HDMI 2.1 and didn't, to much disappointment, but at this point, having HDMI 2.1 technology available and having to wait for a card that supports it is just torture. I can't even remember a time when it was this stupid in terms of waiting for the tech stack to mature. There are these separate technologies that all have to come together at once, and in the meantime your fuckin' dick is flapping in the wind at 4K/60 waiting for 120hz support.

I realize this is not everyone's issue, and most people are sitting on smaller, higher-refresh panels, but goddamn can the HDMI 2.1 era just loving start already, poo poo. OLED gaming is already taking the crown at 60hz; 120hz will be untouchable with OLED response times and color accuracy/infinite contrast.

It's not just HDMI 2.1 people who are waiting for this. Ray tracing needs to get less intensive immediately. The Turing cards (of which I own a 2080, so no bias here) will become a footnote: at best a step towards something good, and at worst a failure. A true low point in graphics card history. Let's move on to something better.

As a 970 owner waiting to upgrade... I feel this post in the depths of my bones.


sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
Yeah, I'm eager to upgrade my PC, but it's just a bad time, with the RTX 30 series and Zen 3 both coming out soon as (at least rumored) big leaps forward.

Also everything is out of stock or price-hiked because of COVID anyway.
