|
Also, your daily dose of WCCFTECH rumor goodness: https://wccftech.com/nvidia-geforce-rtx-30-ampere-gaming-graphics-card-rtx-3080-ti-rtx-3080-launching-in-24-gb-20-gb-10-gb-variants/ (for the record these have been fairly accurate/corroborated recently as we move so close to launch, relatively speaking)
|
# ? Aug 12, 2020 19:05 |
|
|
I can't help but think the "10GB at launch, 20GB a month later" thing is horseshit. Why would they sabotage their own launch like that? I could see "10GB at launch, 20GB 6-9 months later," but not a month.
|
# ? Aug 12, 2020 19:06 |
|
I'm not sure, they have a chart up that shows the 3080Ti being 20gb and the 3080 being 10. Where are you referring to specifically? Definitely not carrying water for WCCF of all places, just curious.
|
# ? Aug 12, 2020 19:11 |
|
Taima posted:I'm not sure, they have a chart up that shows the 3080Ti being 20gb and the 3080 being 10. Where are you referring to specifically? https://videocardz.com/newz/nvidia-might-also-launch-geforce-rtx-ampere-graphics-card-with-20gb-of-memory
|
# ? Aug 12, 2020 19:12 |
|
BIG HEADLINE posted:I can't help but think the "10GB at launch, 20GB a month later" thing is horseshit. Why would they sabotage their own launch like that? I can't see either, honestly. That's a pretty big chunk to be adding to the BOM, and it's unlikely that GDDR6X prices are going to halve in even 6 months--maybe 24 months, at best. Gaps that big point to different cards entirely--I think 20GB sounds a bit large for a 3080Ti, frankly, especially with new VRAM, and I'd expect 16GB to be a bit more likely. 10GB would be fine for a 3080, as it'd still be a step up from the 2080 without breaking the bank. They might be right in the sense that a 24GB Titan/3090 and 10GB 3080 could be launch-day, with a 16-20GB 3080Ti somewhere down the line.
|
# ? Aug 12, 2020 19:12 |
|
BIG HEADLINE posted:I can't help but think the "10GB at launch, 20GB a month later" thing is horseshit. Why would they sabotage their own launch like that? Titan: 24GB, 3080 Ti: 20GB (comes out later), 3080: 10GB. That seems reasonably in line with how they've always done it; doesn't seem like they'd be sabotaging anything....
|
# ? Aug 12, 2020 19:13 |
|
Paul MaudDib posted:Intel launching enthusiast+midrange GPUs with raytracing in 2021 based on "external foundry". Didn't want to let that die on the last page. We'll see what happens when you give Raja infinite dollars!
|
# ? Aug 12, 2020 19:28 |
|
I’m in for the 24GB Titan SKU. The Ti doubling to 20GB while the 3080 goes up by only 25% leads me to suspect they’re going to increase the price of the Ti to lie more evenly between the Titan and the x80. Bad news for people who always go for the Ti, I guess.
|
# ? Aug 12, 2020 19:32 |
|
shrike82 posted:Bad news for people who always go for the Ti I guess. Or it's just wccftech fud and "based on their sources" they are "full of poo poo" as they "often are." I mean, that's part of their modus operandi: make a leak story about every possible combination of options so they can look back later and say they got it right. For real, though, NVidia's official announcements can't come soon enough so we can stop speculating and start figuring out if we've all saved enough for the card we want.
|
# ? Aug 12, 2020 19:39 |
|
I have a 144hz 1440p monitor which has g-sync. I suppose the g-sync means that I don't have to worry too much about always being able to hit some kind of fps target, once I get a new video card that supports it. But anyway: if I want to play games with everything on ultra, how many GB of video card RAM do I need? Of course everyone wants the best always, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8GB of RAM'? The only game I played that showed me how much video RAM I had vs how much I needed to turn on any option was Resident Evil 2. At 1080p, I think I was able to put most stuff, but not all of it, on max on my GeForce 980. According to Google, the 980 has 4GB of RAM. I upgraded my monitor recently and just have not been touching any FPS-type game because I doubt my 980 can really handle 1440p. So when I read about the new cards having 10GB, but then later there will be a 20 or 24GB card... should I care? Is 10 enough for me?
|
# ? Aug 12, 2020 19:40 |
|
BIG HEADLINE posted:https://videocardz.com/newz/nvidia-might-also-launch-geforce-rtx-ampere-graphics-card-with-20gb-of-memory It's hard to follow all of the rumor trails but I don't think that's being pushed anymore. At least, it doesn't seem to be in the newest writeups. I agree that it was weird. For a week or two there they had some really odd, conflicting SKU rumors. The one they've settled on seems more realistic though, as far as I can tell.
|
# ? Aug 12, 2020 19:40 |
|
redreader posted:I have a 144hz 1440p monitor which has g-sync. I suppose the g-sync means that I don't have to worry too much about always being able to hit some kind of fps target, once I get a new video card that supports it. But anyway: if I want to play games with everything on ultra, how many gb of video card ram do I need? Like of course everyone wants the best always, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8gb of ram'? For 1440p, anything from 8GB up should be overkill.
|
# ? Aug 12, 2020 19:43 |
|
the comedy pairing of a Ryzen CPU and Xe GPU
|
# ? Aug 12, 2020 19:48 |
|
redreader posted:if I want to play games with everything on ultra, how many gb of video card ram do I need? Like of course everyone wants the best always, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8gb of ram'? These days it's less about "how much VRAM do you need" and a lot more about "how much GPU horsepower do you need." And, mostly, these things are tied together, anyhow--it's exceptionally rare that you get much of a choice in terms of VRAM at a given performance level. Frankly, VRAM is not the thing you should be looking at first (or at all, honestly), as you will almost never run out of VRAM before you've long since run out of GPU processing power (some funky low-end cards excepted). A 1440p@144Hz screen should be ably served by a 2070 Super or a 2080 Super, depending on how high you want to push the frames. You won't be hitting 144Hz with everything to Ultra in a lot of newer games, but that's fine--GSync will take a bunch of the sting out of that, and you honestly don't need to be running everything at Ultra, especially for fast-moving FPS games where you're not really sitting there and basking in the glory of every last option being shoved all the way to the right. So for what you "should" get, if you're happy with your 980 for now, wait until reviews actually drop of the 30-series. I'd be looking specifically at whatever the xx70 part is, as that should not only be powerful enough to give you a good experience, but is often the sweet-spot in terms of price:performance before pricing really starts to get crazy with the xx80 and above levels. If Ampere really is as sweet as it sounds, you might even be ok with the xx60 part if you're willing to turn a few visually-unimportant things down and accept a somewhat lower framerate.
|
# ? Aug 12, 2020 19:49 |
|
redreader posted:I have a 144hz 1440p monitor which has g-sync. I suppose the g-sync means that I don't have to worry too much about always being able to hit some kind of fps target, once I get a new video card that supports it. But anyway: if I want to play games with everything on ultra, how many gb of video card ram do I need? Like of course everyone wants the best always, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8gb of ram'? 8GB is most likely enough, though that might change over the next few years if consoles use more; I don't think that's a giant risk, though. And PC GPU sizes are going to drive GPU memory usage more than the other way around. I don't think any games will release that will see significant impact without 10+ GB of VRAM, because that will only be available to a relatively small % of the userbase. If you are on 4K then VRAM might be a little more of a worry, but even then I don't think VRAM is going to be a big concern for a while yet.
|
# ? Aug 12, 2020 19:50 |
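[editor's aside] As a loose sanity check on why resolution alone doesn't chew through VRAM the way people assume, here's a rough back-of-envelope sketch. The target count and bytes-per-pixel below are illustrative assumptions, not any engine's real numbers; actual usage is dominated by textures, which is why an HD texture pack, not the monitor, is what fills 8GB.

```python
# Rough estimate of VRAM consumed by full-resolution render targets alone.
# Assumptions (hypothetical, engine-dependent): 6 full-res buffers
# (a G-buffer plus depth, say) averaging 8 bytes per pixel.

def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    """Return the MB used by `targets` full-resolution buffers."""
    return width * height * bytes_per_pixel * targets / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
```

Even at 4K this lands well under half a gigabyte, so the jump from 1080p to 1440p adds on the order of tens of MB of framebuffer pressure; the rest of an 8-10GB card is texture and geometry budget.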
|
1440p should be the sweet spot for PC gaming on a mid-range card given developers are going to target 4K30/60/120 for next-gen consoles.
|
# ? Aug 12, 2020 19:51 |
|
The only time I’m really hitting near the limit of my 8GB of VRAM is when I’m playing something like Monster Hunter World with the high def texture packs. We’re talking textures that double the install size of the game there, and it still fits in 8GB.
|
# ? Aug 12, 2020 19:52 |
|
NewFatMike posted:Didn't want to let that die on the last page. You get a 2080 (Super if you're lucky) that's 2-3 years late and an unproven driver team.
|
# ? Aug 12, 2020 20:08 |
|
the shroud is gonna be so cool though. raja already used yinmn blue, so next he's gotta use vantablack
|
# ? Aug 12, 2020 20:13 |
|
Ugly In The Morning posted:The only time I’m really hitting near the limit of my 8GB of VRAM is when I’m playing something like Monster Hunter World with the high def texture packs. We’re talking textures that double the install size of the game there, and it still fits in 8GB. Yeah I think I have 11gb of VRAM and I don't think half of it gets used.
|
# ? Aug 12, 2020 20:13 |
|
I expect the Xe "enthusiast" card to be a wet fart. Also, I want that Ampere FE. That cooler looks p. nice. Even tho I won't look at it all right after installing it.
|
# ? Aug 12, 2020 20:14 |
|
The funniest outcome for Intel would be for Raja to become CEO.
|
# ? Aug 12, 2020 20:14 |
|
Zedsdeadbaby posted:I haven't actually read anyone's GPU lists here Me either, it's so boring and narcissistic that they think anyone cares. On another note, even when much improved, DLSS performance comparisons will require an asterisk for the foreseeable future. It's awesome, but not perfect, and does cause problems. Most of it is small, hardly noticeable stuff, but other things are obvious artifacts. https://www.youtube.com/watch?v=9ggro8CyZK4&t=942s
|
# ? Aug 12, 2020 20:30 |
|
repiv posted:the shroud is gonna be so cool though they already teased concepts for the shroud and they're total xxxtreme gamer poo poo https://wccftech.com/intel-xe-graphics-card-design-concepts-computex-2019/ Paul MaudDib fucked around with this message at 21:29 on Aug 12, 2020 |
# ? Aug 12, 2020 21:16 |
|
LRADIKAL posted:Me either, it's so boring and narcissistic that they think anyone cares. Woah there, I think it's just people remembering the old stuff they had, for fun, let's not make a big deal out of it; I imagine car guys share lists of cars, this is just a nerd version. It happens now and then..
|
# ? Aug 12, 2020 21:29 |
|
I assume everyone posting GPU lists has read everyone else's list with great interest, so there is actually a point for some people in posting it. I haven't read any but let people have their fun. My first GPU was a zx spectrum 48k (upgraded from 16k)
|
# ? Aug 12, 2020 21:31 |
|
HalloKitty posted:Woah there, I think it's just people remembering the old stuff they had, for fun, let's not make a big deal out of it; I imagine car guys share lists of cars, this is just a nerd version. It happens now and then.. Yeah, I'm pretty sure it's just people going, "Hey, remember when [old thing] was a thing?"
|
# ? Aug 12, 2020 21:31 |
|
sean10mm posted:Yeah, I'm pretty sure it's just people going, "Hey, remember when [old thing] was a thing?" I’ve been reading the lists and been like “oh man, the [insert card here], drat, I wonder how that was, I always wanted one of those”. It’s not like there’s a finite number of posts people can make in a thread, it’s just kinda fun.
|
# ? Aug 12, 2020 21:32 |
|
LRADIKAL posted:On another note, even when much improved, DLSS performance comparisons will require an asterix for the foreseeable future. It's awesome, but not perfect, and does cause problems. Most of it is small, hardly noticeable stuff, but other things are obvious artifacts. While you're right, I find it funny that--at least to me--the "wrong" DLSS implementation there almost looks more sensible for what's going on in the scene than the original version. I'm sure we'll find other one-offs where it's more jarringly wrong, though.
|
# ? Aug 12, 2020 21:32 |
|
FuturePastNow posted:the comedy pairing of a Ryzen CPU and Xe GPU I shed a tear for kaby lake g the greatest cpu no one cared about
|
# ? Aug 12, 2020 21:39 |
|
There's a latency cost as well that, while minor, is significant for some people. I am, of course, excited for it. My 1070 runs Death Stranding at high detail at 60 fps @ 1440p, but I would love that level of detail at higher fps.
|
# ? Aug 12, 2020 21:42 |
|
[citation needed] on DLSS increasing latency
|
# ? Aug 12, 2020 21:47 |
|
Paul MaudDib posted:they already teased concepts for the shroud and they're total xxxtreme gamer poo poo I thought those were all fan renders?
|
# ? Aug 12, 2020 21:47 |
|
Malcolm XML posted:I shed a tear for kaby lake g the greatest cpu no one cared about kinda sad that the driver situation has turned ugly, it sounds like Intel more or less wanted to stop paying to support them? shrike82 posted:The funniest outcome for Intel would be for Raja to become CEO. objectively correct
|
# ? Aug 12, 2020 21:51 |
|
Ugly In The Morning posted:I’ve been reading the lists and been like “oh man, the [insert card here], drat, I wonder how that was, I always wanted one of those”. It’s not like there’s a finite number of posts people can make in a thread, it’s just kinda fun. yeah I thought it was a good break to just talk about some old cards and see what people did? weird to get all worked up over but i dunno. i guess some people want to keep arguing DLSS forever instead of talkin about cool cards like the Kyro 2 (this thread should be dedicated to only talking about the kyro 2 cause it kicked so much rear end) Cygni fucked around with this message at 22:09 on Aug 12, 2020 |
# ? Aug 12, 2020 22:06 |
|
The next logical step will be to name-and-shame anyone who plans on buying an Ampere FE card and then ripping the heatsink off it to slap an AIO on there.
|
# ? Aug 12, 2020 22:10 |
|
LRADIKAL posted:Me either, it's so boring and narcissistic that they think anyone cares. That's just KojiPro failing to put motion vectors on something that's moving. You can't really blame DLSS for that. The reality is that DLSS in general does look better than native, and the places where it looks worse than native are generally not noticeable because they're high motion., The increased framerate you're getting is far more important than minor sampling imperfections that disappear as soon as something as moving slowly enough for you to see that detail. Assuming a competent implementation, I would use DLSS even when I have nothing to gain framerate wise because the increased detail in typical scenarios is so worth it. LRADIKAL posted:There's a latency cost as well, that, while minor, is significant for some people. I am, of course excited for it. My 1070 runs death stranding high detail at 60 fps @ 1440p, but I would love that level of detail at higher fps. The latency cost usually not a cost because of the framerate increase. If you go from 100 to 130 FPS, your frame to frame interval is dropping by 2.3ms. With DLSS processing taking just over 1ms at 1440p on a 2060S, you're still coming out ahead in terms of how fresh the image your eyes are looking at at any given point, plus the significant benefits of extra temporal resolution. The one place where it could become a cost would be if you were using DLSS quality mode for increased detail when you're already capping your framerate. Even then it's probably quite small because you're still reducing your render time before you tack the DLSS processing time on, so you're probably looking at well under 1ms of latency and that may be worth the tradeoff in some competitive games for increased perception. repiv posted:[citation needed] on DLSS increasing latency There is processing time, but it's rarely going to be an actual latency increase. K8.0 fucked around with this message at 22:24 on Aug 12, 2020 |
# ? Aug 12, 2020 22:21 |
|
One thing I'm REALLY into is the idea of ray-tracing. I do want to be able to play everything with ray tracing turned on. I suppose if dlss2 becomes widely adopted I won't really have any problems with ray-tracing even with a 3060 but I suppose it's worth waiting for reviews rather than just buying the first thing that comes out. I've got a big enough backlog and ps4 game list that I should be fine for a while longer!
|
# ? Aug 12, 2020 22:24 |
|
Paul MaudDib posted:they already teased concepts for the shroud and they're total xxxtreme gamer poo poo Those are all deeply embarrassing. Just one step up from female on blower.
|
# ? Aug 12, 2020 22:25 |
|
|
Yeah exactly, the frametime spent on DLSS is more than offset by the decrease in frametime spent shading pixels. That's the whole point. Reducing the total frametime means latency is decreased; the only exception would be if DLSS buffered an extra frame ahead, but it doesn't do that.
|
# ? Aug 12, 2020 22:41 |