|
Taima posted:
It's all relative. 2077 will run on a 1080Ti... probably fairly well. It will, however, be far behind even 2000-level GPUs. That's just the reality of DLSS 2.0, let alone the other optimizations that have been made during the tenure of the 2000 series, and any witchcraft the 3000 series will employ in terms of ray tracing etc.

What about an i7 7700k?
|
# ? Jun 28, 2020 21:57 |
|
Sorry I'm not a CPU expert by any stretch. I'm sure someone could answer that question with greater accuracy than I could.
|
# ? Jun 28, 2020 22:08 |
|
Kraftwerk posted:
What about an i7 7700k?

Nobody knows for sure. Best guess is that 4-core/8-thread i7s are probably fine, especially if you overclock them decently, but 4-core/4-thread i5s are starting to show their age a little in some new titles, and that probably applies to CP too.
|
# ? Jun 28, 2020 22:35 |
|
Kraftwerk posted:
What about an i7 7700k?

Are you asking if that will be good enough for the game? Yeah, though of course it depends on the GPU. I wouldn't upgrade from a 7700k for this game.
|
# ? Jun 28, 2020 22:36 |
|
VelociBacon posted:
Are you asking if that will be good enough for the game? Yeah and of course depends on the GPU. I wouldn't upgrade from a 7700k for this game.

Excellent. Then my plan to get a 3080 and leave the CPU alone is complete.
|
# ? Jun 28, 2020 22:39 |
|
Witcher 3 already runs significantly better in cities on a 6C12T. 4C8T may be pushing it for 2077.
|
# ? Jun 28, 2020 23:30 |
|
There are games that suffer significantly on 4c8t, but they remain the exception and not the rule for now. Couldn't tell you where 2077 will end up on that scale until it gets benchmarked, so basically if I was still using my 7700k I would just wait and see. (I'm not still using my delidded and overclocked 7700k, because Shadow of the Tomb Raider runs close to 50% faster on my stock 9900k with the same GPU and RAM capacity/speed.)
|
# ? Jun 28, 2020 23:55 |
|
lol it’s kinda dumb to talk about an unreleased game needing a 2 or 3 series to run. It’s going to be designed to run on current gen and next gen consoles. And even with the 3090 and DLSS2.0, I’m pretty sure you’re not going to get 4K60 “ultra” so it’s more about what settings you turn down.
|
# ? Jun 29, 2020 00:16 |
|
shrike82 posted:
lol it’s kinda dumb to talk about an unreleased game needing a 2 or 3 series to run. It’s going to be designed to run on current gen and next gen consoles.

Well, no one made that claim, so I think we’re all on the same page there.
|
# ? Jun 29, 2020 00:44 |
|
Kraftwerk posted:
What about an i7 7700k?

4c/8t is the possible edge case for Cyberpunk, but we just don't know. If they add a 'reduce cycles spent on background NPC scripting' option like Hitman 2, you'll be fine. If the PC version is more in line with next-gen consoles, maybe not. What does your RAM situation look like? If it's something decent, 16 gigs of 2933/CL14 or thereabouts, that can help a lot.
|
# ? Jun 29, 2020 01:02 |
|
It kinda sucks that the A100 PCIe has no RT support. Would have been cool to be able to build an AIO gaming/deep learning box.
|
# ? Jun 29, 2020 01:05 |
|
e: never mind, missed that the Ampere whitepaper explicitly says no RT cores
|
# ? Jun 29, 2020 01:08 |
|
https://www.microway.com/knowledge-center-articles/in-depth-comparison-of-nvidia-ampere-gpu-accelerators/

quote:
The NVIDIA “Ampere” Datacenter GPUs have been designed for computational workloads rather than graphics workloads. RT cores for accelerated raytracing are not included in A100. Similarly, video encoding units (NVENC) are not included.
|
# ? Jun 29, 2020 01:11 |
|
Taima posted:
I actually think you're super wrong. The lack of DLSS 2.0 support will make your GPU look like poo poo in 2077. It will run, and if that's your concern then more power to you, but it will be absolutely nothing compared to even a 2000 series, let alone a 3 series. A 2070 is going to run a train on the 1080Ti in cyberpunk, that's just the truth.

They are not going to make the PC version inaccessible to all but the top 1-2% of hardware. Obviously a 2070 will run it better because of DLSS 2.0, but a 1080ti isn't going to struggle.

At no point have I undersold DLSS 2.0; in fact, if you check my post history in this thread you will find that I have described it as black magic, or revolutionary, so please don't jump down my throat for things I haven't said. I don't even know why you started going on about the 2000 series cards. What a super aggressive post.

Zedsdeadbaby fucked around with this message at 07:57 on Jun 29, 2020
|
# ? Jun 29, 2020 07:52 |
|
Yeah, I don't know what that guy is getting off on. I can understand having used raytracing without DLSS and then with, and going 'wow, this is night and day', but a 1080ti with raytracing off won't run it poorly at all. Really what it comes down to is that most people haven't seen anything where raytracing adds a big difference except stuff like Minecraft. In theory, world-building games like that should look excellent now, but cinematic AAA big-budget games will be built with details reamed out the rear end whether raytracing is enabled or not, so it won't be as much of a difference-maker.
|
# ? Jun 29, 2020 08:32 |
|
https://www.youtube.com/watch?v=vrJ330NGjCU

Given what's possible with Witcher 3, I imagine that you'd still be able to run Cyberpunk 2077.
|
# ? Jun 29, 2020 08:41 |
|
My apologies, that was uncalled for and frankly I seem to have misunderstood your post entirely, or was trying to respond to someone else and got mixed up or something. My bad.
|
# ? Jun 29, 2020 09:53 |
|
It's all good
|
# ? Jun 29, 2020 10:07 |
|
Zedsdeadbaby posted:
They are not going to make the PC version inaccessible to all but the top 1-2% of hardware.

They won't, obviously. But I think that Taima isn't wrong in some respects: the PS5/XBoXXX are enormously more powerful relative to the average PC than we're used to. So even just assuming that it's being designed with the consoles as a target is going to mean a lot of people are going to have systems that struggle to run it with modern pretties turned on.

CD Projekt Red is not at all afraid of having features and settings that are only accessible to the top 1% or whatever (see: HairWorks), and to that end I think Taima is also right that if you want to be able to slam everything to UltraMAXXX, or want to play it at 4k>60, DLSS 2.0 is going to be a huge factor. A "free" 30-50% performance bump just isn't something you can hand-wave away in terms of what it lets you cram in there as a dev.

Does that mean people with Pascal and older cards won't be able to play it? Of course not. But just like in TW3, it may very well mean that many people need to turn off the fancy options, turn down the settings, and--heaven forbid!--run at Medium settings. Same with the fact that it'll be released on PS4/XBox1: yeah, it'll run, but I would not at all be surprised to find that it runs at ~30fps at a sub-1080p internal render target (720p for the PS4 non-pro?), and of course without any of the fancy stuff enabled. I also wouldn't be surprised if the current-gen console versions cut out background NPCs and such to some extent to help with their lack of performance.

Now, how will a 1080Ti do? Who knows--but I think all signs point to it being able to capably run the 1440p60 that was asked about acceptably well (Medium settings). Again, zero chance you'll be turning on ray-tracing, all the fancies, and still keep a reasonable framerate, but that's just how that goes--you'll certainly be able to play the game.
|
# ? Jun 29, 2020 15:02 |
|
DrDork posted:
But just like in TW3, it may very well mean that many people need to turn off the fancy options, turn down the settings, and--heaven forbid!--run at Medium settings.

TW3 ran well at far more than Medium on not-the-best cards. I got a consistent 45+ fps on a 7970 at 1080p, common for much of its early life, with everything at Ultra except for Hairworks off and Draw Distance and AO one notch below Ultra. TW3 was one of my standard examples of a game designed to look great on medium to lower systems. They put a bunch of work into scaling down well for visuals. The 7970 was 3 years old when TW3 released in 2015, so I think that's a fair comparison to a middle-of-the-road card by then.

DLSS is going to be a very good boost for FPS, and folks with cards that have it will have breathing room at higher resolutions. But I personally don't see them designing high-end visual features that are only accessible with DLSS, since that's work that none but a small set of folks will be able to experience.

E: Of course, the real reason for the 7970's wonderful longevity in this and every other case is that it matched and even bettered the hardware in the then-current-gen consoles. That's relevant to CP2077 too, I would think.

v1ld fucked around with this message at 19:28 on Jun 29, 2020
|
# ? Jun 29, 2020 19:12 |
|
I think it’s also quite possible that Medium Settings of Compromise will look better on CP2077 than they did in TW3, given general advances on the software and asset side as well.
|
# ? Jun 29, 2020 19:18 |
|
v1ld posted:
DLSS is going to be a very good boost for FPS and folks with cards that have it will have breathing room at higher resolutions. But I personally don't see them designing high end visual features that are only accessible with DLSS since that's work that none but a small set of folks will be able to experience.

Well, two things here: I think you're right that the console hardware bit is relevant--the next-gen consoles (which are obviously more of the target than the current-gen ones) are considerably more powerful relative to average PCs right now than the PS4/XBox1 were when they launched. So that's likely gonna bump things up a notch for target systems, but that it'll at least release on the PS4/XBox1 says that at least it'll run on charity-case hardware.

The other part is that CD Projekt Red absolutely designs high-end visual features and settings knowing that they're only going to be usable by a small segment of the population at release. They did this with TW3 as well. HairWorks by itself cost a Titan X almost 20FPS at 1080p. Foliage visibility had a similar impact of around 20fps per step, HBAO cost 10FPS, Shadow quality could eat 10FPS per step, etc. Turn half those things on at the same time and your Titan was suddenly struggling to stay at 1080p@60. And the impacts were more pronounced on lower-end hardware, so most people didn't play with that stuff on Ultra--especially because it generally looked great even on Medium/Low.

With that in mind--and the level of interest NVidia has taken with the game--I don't see how you come away thinking that the game won't heavily leverage DLSS, RTX, etc., to really push the bounds of what you can do with it. I also have no doubt that it'll look great without them, since they do have to keep the consoles in mind, but everything points to it being a "this is why you should buy a 30-series card" title so far.
|
# ? Jun 29, 2020 19:45 |
|
Subjunctive posted:
I think it’s also quite possible that Medium Settings of Compromise will look better on CP2077 than they did in TW3, given general advances on the software and asset side as well.

I'd much rather lower resolution scale first than settings; TAA/TXAA and image sharpening/scaling have come a long way. Doom Eternal, for example, looks a million times clearer and sharper at 1080p than Destiny 2 does at 4k, simply because the latter has obsolete AA.

I will likely set Cyberpunk to standard High settings, set resolution to 1440p, and lower the resolution scale down gradually until it's a solid 60 at the most taxing point I can find in the first area. It's what I do for every game. So far I've not had anything where it had to go below 75%, so touch wood. There's always one or two settings that are ridiculously expensive; I usually ding those down to Medium as well. MHW's volumetric rendering and Witcher 3's HairWorks come to mind.

Zedsdeadbaby fucked around with this message at 19:56 on Jun 29, 2020
|
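That tuning procedure is simple enough to sketch as a loop. A minimal version in Python, where `measure_min_fps` is a hypothetical stand-in (not a real API) for manually checking the fps counter at the most taxing spot you can find:

```python
def find_resolution_scale(measure_min_fps, target_fps=60, start=1.0, floor=0.5, step=0.05):
    """Walk the render-scale slider down from `start` until the worst-case
    fps meets `target_fps`, giving up once the scale hits `floor`."""
    scale = start
    while scale >= floor:
        if measure_min_fps(scale) >= target_fps:
            return scale
        scale = round(scale - step, 2)
    return floor  # still short of target: stay at the floor and lower settings instead

# Toy stand-in benchmark: min fps climbs roughly linearly as the scale drops.
fake_bench = lambda s: 45 + (1.0 - s) * 100
print(find_resolution_scale(fake_bench))  # settles at 0.85 with this toy curve
```

In practice you'd do the "measurement" by eye against the in-game fps counter, but the stopping rule is the same: first scale from the top that holds your target at the worst-case spot wins.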
# ? Jun 29, 2020 19:53 |
|
DrDork posted:
With that in mind--and the level of interest NVidia has taken with the game--I don't see how you come away thinking that the game won't heavily leverage DLSS, RTX, etc., to really push the bounds of what you can do with it. I also have no doubt that it'll look great without them, since they do have to keep the consoles in mind, but everything points to it being a "this is why you should buy a 30-series card" title so far.

I can agree with this (minus Hairworks), but there are two different meanings to "DLSS is required" here, since the original statement on DLSS as a requirement, made by Taima not you, is considerably stronger than yours, which I can get behind:

Taima posted:
The lack of DLSS 2.0 support will make your GPU look like poo poo in 2077. It will run, and if that's your concern then more power to you, but it will be absolutely nothing compared to even a 2000 series, let alone a 3 series.

The difference is in whether the game will look good or like poo poo without DLSS. I think the game will look great on current high-end PCs without DLSS, but may run at 45 fps close to Ultra even without the additional beauty/fidelity of RTX. And it will look even better on the next-gen consoles, even without DLSS.

Re: Hairworks. I turned that off as soon as I installed the game and never thought of it again. It was purely in the "cool experiment, bro" feature set for me, and I never considered it part of the core visual features of the game until the last two pages made it clear I was quite alone in that thinking. It's possible that there may be similar features in CP2077 if NVidia is this actively involved again.
|
# ? Jun 29, 2020 20:02 |
|
v1ld posted:
The difference is in whether the game will look good or like poo poo without DLSS. I think the game will look great on current high end PCs but may run at 45 fps close to Ultra even without the additional beauty/fidelity of RTX without DLSS. And that it will look even better on the next gen consoles even without DLSS.

I guess that depends on what your definition of "looks like poo poo by comparison" means. I don't think a Pascal or earlier card is gonna be able to handle RTX options any better than they do with current games (read: basically unable to do them at good framerates), and in that sense there'll be a noticeable visual difference between card generations. But I don't think TW3 looks like poo poo, and that game is how old now? So yeah, it's all about frame of reference in that sense.

I don't think HairWorks was a "core" visual feature, but it certainly was there, in the same way that pushing a setting to Ultra isn't required, but it's presented as an option. In that vein, RTX options won't be required or make the game just terrible to live without, but they'll be included and probably heavily promoted because it's an excellent way for NVidia to convince people to finally move off 9/10-series cards. And while RTX options certainly make more of an impact on the overall scene than HairWorks ever did, it's also way more expensive. At 1080p, BF V's RTX High option dropped a 1080Ti from ~170FPS to 55, with mins in the low 30's. So, yeah, if you don't have a 20/30-series card, you're turning that poo poo off immediately.

Honestly, one thing I'm real interested to see is how they do RTX-style options on the next-gen consoles, given that AMD has promised to support ray-tracing. Obviously they can't simply use RTX, since that's NVidia only, so what sort of translation is going to be done and how effective will it be?
|
# ? Jun 29, 2020 20:22 |
|
DrDork posted:
Honestly, one thing I'm real interested to see is how they do RTX-style options on the next-gen consoles, given that AMD has promised to support ray-tracing. Obviously they can't simply use RTX, since that's NVidia only, so what sort of translation is going to be done and how effective will it be?

"RTX" is just branding; there isn't really anything Nvidia-specific about the raytracing implementations we've seen so far, other than that they supported it first. Games with "RTX" features are actually using the standardized DX12 DXR interface, which AMD (and probably Intel) will have participated in developing.

repiv fucked around with this message at 20:47 on Jun 29, 2020
|
# ? Jun 29, 2020 20:29 |
|
There's a decent amount of footage of ray traced ps5 games, it's mostly for reflections and ambient occlusion, nothing fancy like global illumination though. They seem to be using it to complement rasterized graphics, which is about what can be expected of consoles.
|
# ? Jun 29, 2020 20:30 |
|
repiv posted:
"RTX" is just branding, there isn't really anything Nvidia specific about the raytracing implementations we've seen so far other than they supported it first.

Ah, gotcha. For some reason I'd thought they'd taken the DX12 parts and then tacked on NVidia-only extensions/features. But if that's not the case, then all the better for the industry in general, since having consoles and PCs unable to share implementations would hobble adoption considerably.
|
# ? Jun 29, 2020 20:50 |
|
DrDork posted:
Ah, gotcha. For some reason I'd thought they'd taken the DX12 parts and then tacked on NVidia-only extensions / features. But if that's not the case, then all's well the better for the industry in general, since having consoles and PCs not be able to share implementations would hobble adoption considerably.

Yeah, Nvidia actually played nice and avoided lock-in as much as was feasible. They did make a proprietary raytracing extension for Vulkan, but that was more out of necessity: Vulkan has a much larger committee, so everything takes longer, and they still haven't fully finalized their official raytracing extension as of today. That means Wolfenstein Youngblood's raytracing probably won't work on AMD since it uses the NV extension, but that game sucks so it's not a huge loss.
|
# ? Jun 29, 2020 20:54 |
|
Maybe that's what it was: I was crossing the streams between their Vulkan extensions and the DX12/RTX ones. In any event, that's good for moving forward with a less fragmented industry. I do think that the PS5/XBoxXXX won't really have the oomph to do much with RT stuff other than the reflections and minor window dressings we've seen so far. It's still a step in the right direction, but there's only so much you can do with that hardware.
|
# ? Jun 29, 2020 21:02 |
|
I wonder if Nvidia making early moves to get standardized RT into DirectX actually pushed AMD to start developing their own raytracing hardware earlier than they would have otherwise. AMD may not have known about Turing but NV kramering into the DirectX working group and going "HEY WOULDN'T IT BE COOL IF WE HAD A RAYTRACING ABSTRACTION" probably gave the game away.
|
# ? Jun 29, 2020 21:07 |
|
I'm still very, very curious to learn more about Crytek's software ray tracing.
|
# ? Jun 29, 2020 21:14 |
|
The extended gameplay trailer from last week didn’t include any RTX effects, so it sounds like it’s going to come in hot. I’m still skeptical about how ubiquitous RTX/DLSS 2 will be in the short term - people should also look at whether other AAA games, e.g. Halo Infinite, end up supporting it. If we’re talking about a gradual rollout, that just shifts focus to when the 4-series launches.
|
# ? Jun 29, 2020 21:28 |
|
shrike82 posted:
The extended gameplay trailer from last week didn’t include any RTX effects so it sounds like it’s going to come in hot.

Is the RTX stuff supported on the streaming thing they used?
|
# ? Jun 29, 2020 22:51 |
|
shrike82 posted:
The extended gameplay trailer from last week didn’t include any RTX effects so it sounds like it’s going to come in hot.

Short term we'll see little of either (I'm assuming you mean ray tracing in general when you say RTX), but they are both too revolutionary and too much of a paradigm shift for Nvidia/AMD to simply let wither away. In the long term they will both become the standard for sure. You can bet AMD software engineers will be beavering away at their own implementation of DLSS. There is an impetus on them to develop their own upscaling/reconstruction method, especially as PS5 Pro/XSX refreshes are expected to push 8k output (a bit like how the PS4 Pro/X1X pushed 4k output).

Zedsdeadbaby fucked around with this message at 23:08 on Jun 29, 2020
|
# ? Jun 29, 2020 23:04 |
|
Yeah. I get the skepticism for DLSS 2 (though to be fair that is primarily from just a few people, in this thread) because the history of graphics cards is filled to the brim with techniques that more or less turned out to be nonsense.

I really do get that, but DLSS 2 is substantial and, crucially, multi-purpose in a way that we haven't really seen from an in-house GPU tech. It's not niche; it benefits almost literally everyone, from the high-end gamer to the people still rocking 960s or whatever, along with Nvidia on a marketshare level. Its use intersects perfectly between user and platform. Which is to say, both Nvidia and the end user of Nvidia cards have extremely strong interest in making this tech proliferate. That stands in stark contrast to somewhat similar initiatives like PhysX and Hairworks, where Nvidia had a far, far greater stake in their adoption than users ever did.

Therefore we can expect that Nvidia will spend obscene amounts of money to help AAA developers implement the tech. Hell, they already did that with DLSS 1.0, until it turned out to be poo poo, but that was clearly their game plan from the get-go. If it was just a tech that devs had to painstakingly implement, maybe things would be different, but Nvidia's developer outreach arm is absurdly well funded, and that puts this thing over the edge from merely "amazing but niche" to "this is totally loving happening, at the VERY least on the AAA level".
|
# ? Jun 29, 2020 23:42 |
|
Something worth noting for the CP2077 discussion - one of the people who got to play the preview said that he was getting 60-80 FPS at 1080p on a 2080Ti. Don't know if that's DLSS on or off, but that's a ballpark for how demanding the game is with things cranked up. On the other hand, yes, it also has to run ~30 fps-ish on what is essentially a decade-old laptop CPU and a Radeon 7950 or whatever.
|
# ? Jun 29, 2020 23:48 |
|
Wasn’t there some confusion about whether that was through GeForce Now, which is limited to 1080p?

The 3-series is going to end up another transitional generation given how immature the RTX and DLSS implementations are. 3060 and 3070 aren’t going to run games with RTX effects at high settings, and I suspect 3080 and 3090 buyers would have bought them regardless of their feature set.

Anyway, pricing should be interesting. People have been claiming they’re going to cut prices vs the 2-series, so that’d be a much better sales pitch.
|
# ? Jun 29, 2020 23:55 |
|
Taima posted:
Man, this is the worst general timeframe in graphics cards - that stagnant middle ground where you know much, MUCH better stuff will be available in a few months, but you're just forced to wait with no information (especially since Nvidia seems to be playing chicken with AMD and holding their cards (as in playing cards, not graphics cards) till the last possible second), creating a scenario where we know next to nothing for sure.

As a 970 owner waiting to upgrade... I feel this post in the depths of my bones.
|
# ? Jun 29, 2020 23:55 |
|
|
Yeah, I'm eager to upgrade my PC, but it's just a bad time, between the RTX 30 series and Zen 3 coming out as (at least rumored) big leaps forward. Also, everything is out of stock or price-hiked because of covid anyway.
|
# ? Jun 30, 2020 00:30 |