Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Also, your daily dose of WCCFTECH rumor goodness:
https://wccftech.com/nvidia-geforce-rtx-30-ampere-gaming-graphics-card-rtx-3080-ti-rtx-3080-launching-in-24-gb-20-gb-10-gb-variants/

(for the record these have been fairly accurate/corroborated recently as we move so close to launch, relatively speaking)


BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I can't help but think the "10GB at launch, 20GB a month later" thing is horseshit. Why would they sabotage their own launch like that?

I could see "10GB at launch, 20GB 6-9 months later," but not a month.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
I'm not sure; they have a chart up that shows the 3080Ti being 20GB and the 3080 being 10GB. Where are you referring to specifically?

Definitely not carrying water for WCCF of all places, just curious.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Taima posted:

I'm not sure; they have a chart up that shows the 3080Ti being 20GB and the 3080 being 10GB. Where are you referring to specifically?

Definitely not carrying water for WCCF of all places, just curious.

https://videocardz.com/newz/nvidia-might-also-launch-geforce-rtx-ampere-graphics-card-with-20gb-of-memory

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

BIG HEADLINE posted:

I can't help but think the "10GB at launch, 20GB a month later" thing is horseshit. Why would they sabotage their own launch like that?

I could see "10GB at launch, 20GB 6-9 months later," but not a month.

I can't see either, honestly. That's a pretty big chunk to be adding to the BOM, and it's unlikely that GDDR6X prices are going to halve in even 6 months--maybe 24 months, at best. Gaps that big point to different cards entirely--I think 20GB sounds a bit large for a 3080Ti, frankly, especially with new VRAM, and I'd expect 16GB to be a bit more likely. 10GB would be fine for a 3080, as it'd still be a step up from the 2080 without breaking the bank. They might be right in the sense that a 24GB Titan/3090 and 10GB 3080 could be launch-day, with a 16-20GB 3080Ti somewhere down the line.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

BIG HEADLINE posted:

I can't help but think the "10GB at launch, 20GB a month later" thing is horseshit. Why would they sabotage their own launch like that?

I could see "10GB at launch, 20GB 6-9 months later," but not a month.

Titan-24GB
3080ti-20GB (comes out later)
3080-10GB

That seems reasonably in line with how they've always done it; doesn't seem like they'd be sabotaging anything...

NewFatMike
Jun 11, 2015


Didn't want to let that die on the last page.

We'll see what happens when you give Raja infinite dollars!

shrike82
Jun 11, 2005

I’m in for the 24GB Titan SKU.
The Ti doubling to 20GB while the 3080 only goes up by 25% leads me to suspect they're going to increase the price of the Ti so it lies more evenly between the Titan and the x80.

Bad news for people who always go for the Ti I guess.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

shrike82 posted:

Bad news for people who always go for the Ti I guess.

Or it's just wccftech fud and "based on their sources" they are "full of poo poo" as they "often are."

I mean, that's part of their modus operandi: make a leak story about every possible combination of options so they can look back later and say they got it right.

For real, though, NVidia's official announcements can't come soon enough so we can stop speculating and start figuring out if we've all saved enough for the card we want.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
I have a 144Hz 1440p monitor with G-Sync. I suppose the G-Sync means I don't have to worry too much about always hitting some kind of FPS target once I get a new video card that supports it. But anyway: if I want to play games with everything on ultra, how many GB of video card RAM do I need? Of course everyone wants the best, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8GB of VRAM'?

The only game I've played that showed me how much video RAM I had versus how much I needed for each option was Resident Evil 2. At 1080p I was able to put most settings, but not all of them, on max on my GeForce 980. According to Google, the 980 has 4GB of RAM. I upgraded my monitor recently and haven't been touching any FPS-type games because I doubt my 980 can really handle 1440p.

So when I read about the new cards having 10GB, but then later there will be a 20 or 24GB card... should I care? Is 10GB enough for me?

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

It's hard to follow all of the rumor trails, but I don't think that's being pushed anymore. At least, it doesn't seem to be in the newest writeups. I agree that it was weird. For a week or two there they had some really odd, conflicting SKU rumors. The one they've settled on seems more realistic, though, as far as I can tell.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

redreader posted:

I have a 144Hz 1440p monitor with G-Sync. I suppose the G-Sync means I don't have to worry too much about always hitting some kind of FPS target once I get a new video card that supports it. But anyway: if I want to play games with everything on ultra, how many GB of video card RAM do I need? Of course everyone wants the best, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8GB of VRAM'?

The only game I've played that showed me how much video RAM I had versus how much I needed for each option was Resident Evil 2. At 1080p I was able to put most settings, but not all of them, on max on my GeForce 980. According to Google, the 980 has 4GB of RAM. I upgraded my monitor recently and haven't been touching any FPS-type games because I doubt my 980 can really handle 1440p.

So when I read about the new cards having 10GB, but then later there will be a 20 or 24GB card... should I care? Is 10GB enough for me?

For 1440p, anything with 8GB or more should be overkill.

FuturePastNow
May 19, 2014


the comedy pairing of a Ryzen CPU and Xe GPU

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

redreader posted:

if I want to play games with everything on ultra, how many GB of video card RAM do I need? Of course everyone wants the best, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8GB of VRAM'?

These days it's less about "how much VRAM do you need" and a lot more about "how much GPU horsepower do you need." And, mostly, these things are tied together, anyhow--it's exceptionally rare that you get much of a choice in terms of VRAM at a given performance level. Frankly, VRAM is not the thing you should be looking at first (or at all, honestly), as you will almost never run out of VRAM before you've long since run out of GPU processing power (some funky low-end cards excepted).

A 1440p@144Hz screen should be ably served by a 2070 Super or a 2080 Super, depending on how high you want to push the frames. You won't be hitting 144Hz with everything to Ultra in a lot of newer games, but that's fine--GSync will take a bunch of the sting out of that, and you honestly don't need to be running everything at Ultra, especially for fast-moving FPS games where you're not really sitting there and basking in the glory of every last option being shoved all the way to the right.

So for what you "should" get: if you're happy with your 980 for now, wait until reviews of the 30-series actually drop. I'd be looking specifically at whatever the xx70 part is, as that should not only be powerful enough to give you a good experience, but is also often the sweet spot in terms of price:performance, before pricing really starts to get crazy at the xx80 level and above. If Ampere really is as sweet as it sounds, you might even be OK with the xx60 part if you're willing to turn a few visually unimportant things down and accept a somewhat lower framerate.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

redreader posted:

I have a 144Hz 1440p monitor with G-Sync. I suppose the G-Sync means I don't have to worry too much about always hitting some kind of FPS target once I get a new video card that supports it. But anyway: if I want to play games with everything on ultra, how many GB of video card RAM do I need? Of course everyone wants the best, but I'm not planning on buying the best thing that comes out. Does anyone know if there's some kind of rule of thumb like 'if you're doing 1440p you need at least 8GB of VRAM'?

The only game I've played that showed me how much video RAM I had versus how much I needed for each option was Resident Evil 2. At 1080p I was able to put most settings, but not all of them, on max on my GeForce 980. According to Google, the 980 has 4GB of RAM. I upgraded my monitor recently and haven't been touching any FPS-type games because I doubt my 980 can really handle 1440p.

So when I read about the new cards having 10GB, but then later there will be a 20 or 24GB card... should I care? Is 10GB enough for me?

8GB is most likely enough, though that might change over the next few years if consoles use more; I don't think that's a giant risk, though. And the installed base of PC GPU memory sizes is going to drive game VRAM usage more than the other way around: I don't think any game will ship that sees a significant impact without 10+ GB of VRAM, because that would only be available to a relatively small % of the userbase.

If you are on 4K then VRAM might be a little more of a worry, but even then I don't think VRAM is going to be a big concern for a while yet.

shrike82
Jun 11, 2005

1440p should be the sweet spot for PC gaming on a mid-range card given developers are going to target 4K30/60/120 for next-gen consoles.

Ugly In The Morning
Jul 1, 2010
Pillbug
The only time I’m really hitting near the limit of my 8GB of VRAM is when I’m playing something like Monster Hunter World with the high def texture packs. We’re talking textures that double the install size of the game there, and it still fits in 8GB.

sauer kraut
Oct 2, 2004

NewFatMike posted:

Didn't want to let that die on the last page.

We'll see what happens when you give Raja infinite dollars!

You get a 2080 (Super if you're lucky) that's 2-3 years late and an unproven driver team.

repiv
Aug 13, 2009

the shroud is gonna be so cool though

raja already used yinmn blue so next he's gotta use vantablack

VelociBacon
Dec 8, 2009

Ugly In The Morning posted:

The only time I’m really hitting near the limit of my 8GB of VRAM is when I’m playing something like Monster Hunter World with the high def texture packs. We’re talking textures that double the install size of the game there, and it still fits in 8GB.

Yeah, I think I have 11GB of VRAM and I don't think half of it gets used.
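If anyone actually wants to check that rather than eyeball it, here's a quick sketch that reads nvidia-smi while a game is running. This is just an illustration: it assumes an NVIDIA card with nvidia-smi on the PATH, the parsing is deliberately minimal, and the "used" number is what's allocated, which isn't always what the game strictly needs.

```python
# Quick sketch: print current VRAM usage per GPU via nvidia-smi.
# Assumes an NVIDIA card and that nvidia-smi is on the PATH.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

for idx, line in enumerate(out.strip().splitlines()):
    used_mib, total_mib = (int(v) for v in line.split(","))
    print(f"GPU {idx}: {used_mib} MiB / {total_mib} MiB "
          f"({used_mib / total_mib:.0%} allocated)")
```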

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I expect the Xe "enthusiast" card to be a wet fart.

Also, I want that Ampere FE. That cooler looks p. nice. Even tho I won't look at it at all right after installing it.

shrike82
Jun 11, 2005

The funniest outcome for Intel would be for Raja to become CEO.

LRADIKAL
Jun 10, 2001

Fun Shoe

Zedsdeadbaby posted:

I haven't actually read anyone's GPU lists here

Me either, it's so boring and narcissistic that they think anyone cares.

On another note, even when much improved, DLSS performance comparisons will require an asterisk for the foreseeable future. It's awesome, but not perfect, and does cause problems. Most of it is small, hardly noticeable stuff, but other things are obvious artifacts.

https://www.youtube.com/watch?v=9ggro8CyZK4&t=942s

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

the shroud is gonna be so cool though

raja already used yinmn blue so next he's gotta use vantablack

they already teased concepts for the shroud and they're total xxxtreme gamer poo poo


https://wccftech.com/intel-xe-graphics-card-design-concepts-computex-2019/

Paul MaudDib fucked around with this message at 21:29 on Aug 12, 2020

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

LRADIKAL posted:

Me either, it's so boring and narcissistic that they think anyone cares.

Woah there, I think it's just people remembering the old stuff they had, for fun; let's not make a big deal out of it. I imagine car guys share lists of cars, and this is just the nerd version. It happens now and then.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
I assume everyone posting GPU lists has read everyone else's list with great interest, so there is actually a point for some people in posting it. I haven't read any but let people have their fun.

My first GPU was a zx spectrum 48k (upgraded from 16k)

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

HalloKitty posted:

Woah there, I think it's just people remembering the old stuff they had, for fun; let's not make a big deal out of it. I imagine car guys share lists of cars, and this is just the nerd version. It happens now and then.

Yeah, I'm pretty sure it's just people going, "Hey, remember when [old thing] was a thing?" :v:

Ugly In The Morning
Jul 1, 2010
Pillbug

sean10mm posted:

Yeah, I'm pretty sure it's just people going, "Hey, remember when [old thing] was a thing?" :v:

I’ve been reading the lists and been like “oh man, the [insert card here], drat, I wonder how that was, I always wanted one of those”. It’s not like there’s a finite number of posts people can make in a thread, it’s just kinda fun.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

LRADIKAL posted:

On another note, even when much improved, DLSS performance comparisons will require an asterisk for the foreseeable future. It's awesome, but not perfect, and does cause problems. Most of it is small, hardly noticeable stuff, but other things are obvious artifacts.

While you're right, I find it funny that--at least to me--the "wrong" DLSS implementation there almost looks more sensible for what's going on in the scene than the original version. I'm sure we'll find other one-offs where it's more jarringly wrong, though.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

FuturePastNow posted:

the comedy pairing of a Ryzen CPU and Xe GPU

I shed a tear for kaby lake g the greatest cpu no one cared about

LRADIKAL
Jun 10, 2001

Fun Shoe
There's a latency cost as well that, while minor, is significant for some people. I am, of course, excited for it. My 1070 runs Death Stranding at high detail at 60 fps @ 1440p, but I would love that level of detail at higher fps.

repiv
Aug 13, 2009

[citation needed] on DLSS increasing latency

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Paul MaudDib posted:

they already teased concepts for the shroud and they're total xxxtreme gamer poo poo


https://wccftech.com/intel-xe-graphics-card-design-concepts-computex-2019/

I thought those were all fan renders?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Malcolm XML posted:

I shed a tear for kaby lake g the greatest cpu no one cared about

kinda sad that the driver situation has turned ugly; it sounds like Intel more or less wanted to stop paying to support them?

shrike82 posted:

The funniest outcome for Intel would be for Raja to become CEO.

objectively correct

Cygni
Nov 12, 2005

raring to post

Ugly In The Morning posted:

I’ve been reading the lists and been like “oh man, the [insert card here], drat, I wonder how that was, I always wanted one of those”. It’s not like there’s a finite number of posts people can make in a thread, it’s just kinda fun.

yeah I thought it was a good break to just talk about some old cards and see what people did? weird to get all worked up over but i dunno.

i guess some people want to keep arguing DLSS forever instead of talkin about cool cards like the Kyro 2 (this thread should be dedicated to only talking about the kyro 2 cause it kicked so much rear end)

Cygni fucked around with this message at 22:09 on Aug 12, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
The next logical step will be to name-and-shame anyone who plans on buying an Ampere FE card and then ripping the heatsink off it to slap an AIO on there.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

LRADIKAL posted:

Me either, it's so boring and narcissistic that they think anyone cares.

On another note, even when much improved, DLSS performance comparisons will require an asterisk for the foreseeable future. It's awesome, but not perfect, and does cause problems. Most of it is small, hardly noticeable stuff, but other things are obvious artifacts.

https://www.youtube.com/watch?v=9ggro8CyZK4&t=942s

That's just KojiPro failing to put motion vectors on something that's moving. You can't really blame DLSS for that.

The reality is that DLSS in general does look better than native, and the places where it looks worse than native are generally not noticeable because they're high motion. The increased framerate you're getting is far more important than minor sampling imperfections that disappear as soon as something is moving slowly enough for you to see that detail. Assuming a competent implementation, I would use DLSS even when I have nothing to gain framerate-wise, because the increased detail in typical scenarios is so worth it.

LRADIKAL posted:

There's a latency cost as well that, while minor, is significant for some people. I am, of course, excited for it. My 1070 runs Death Stranding at high detail at 60 fps @ 1440p, but I would love that level of detail at higher fps.

The latency cost usually isn't a cost at all because of the framerate increase. If you go from 100 to 130 FPS, your frame-to-frame interval is dropping by 2.3ms. With DLSS processing taking just over 1ms at 1440p on a 2060S, you're still coming out ahead in terms of how fresh the image your eyes are looking at is at any given point, plus the significant benefit of extra temporal resolution. The one place where it could become a cost would be if you were using DLSS quality mode for increased detail when you're already capping your framerate. Even then it's probably quite small, because you're still reducing your render time before you tack the DLSS processing time on, so you're probably looking at well under 1ms of added latency, and that may be worth the tradeoff in some competitive games for the increased perception.
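Back-of-the-envelope version of that math, for anyone who wants to poke at the numbers. This is a rough sketch only: the 100 -> 130 FPS jump is the example above, and the ~1ms DLSS pass is the quoted 2060S-at-1440p figure, not something I measured.

```python
# Rough frame-time arithmetic for the 100 -> 130 FPS DLSS example above.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

base = frametime_ms(100)        # 10.0 ms per frame without DLSS
with_dlss = frametime_ms(130)   # ~7.7 ms per frame with DLSS enabled
dlss_pass_ms = 1.0              # ~1 ms DLSS pass (quoted 2060S @ 1440p figure)

saved = base - with_dlss        # ~2.3 ms shorter frame-to-frame interval
print(f"{base:.1f} ms -> {with_dlss:.1f} ms per frame ({saved:.1f} ms saved)")
# The ~2.3 ms saved per frame more than covers the ~1 ms DLSS pass,
# so the image on screen is fresher, not staler: net latency goes down.
print(f"net gain even counting the DLSS pass separately: ~{saved - dlss_pass_ms:.1f} ms")
```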

repiv posted:

[citation needed] on DLSS increasing latency

There is processing time, but it's rarely going to be an actual latency increase.

K8.0 fucked around with this message at 22:24 on Aug 12, 2020

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
One thing I'm REALLY into is the idea of ray tracing. I do want to be able to play everything with ray tracing turned on. I suppose if DLSS 2 becomes widely adopted I won't really have any problems with ray tracing even with a 3060, but it's worth waiting for reviews rather than just buying the first thing that comes out. I've got a big enough backlog and PS4 game list that I should be fine for a while longer!

FuturePastNow
May 19, 2014


Paul MaudDib posted:

they already teased concepts for the shroud and they're total xxxtreme gamer poo poo


https://wccftech.com/intel-xe-graphics-card-design-concepts-computex-2019/

Those are all deeply embarrassing. Just one step up from female on blower.


repiv
Aug 13, 2009


Yeah exactly, the frametime spent on DLSS is more than offset by the decrease in frametime spent shading pixels. That's the whole point.

Reducing the total frametime means latency is decreased; the only exception would be if DLSS buffered an extra frame ahead, but it doesn't do that.
