spunkshui
Oct 5, 2011



Ihmemies posted:

So I tried DOOM with 3080 TuF OC, gpu usage was maybe 50%. Power draw from wall was around 400W. Civ6 ingame was maybe 60% gpu usage, 430W.

I am using a 500W passive PSU. I am slightly concerned...

I'm shocked it does not immediately turn off when you fire up a game.

It probably will depending on the game.


Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
that extra ram gonna matter?

Truga
May 4, 2014
Lipstick Apathy
some years down the line, 100%

right now? extremely does not matter.

i'm on a 980Ti, and firefox performance goes to poo poo while i'm playing some games, just 720p youtube vids stuttering to gently caress with more than 2-3 tabs open, because 6gb is just not enough anymore. i either have to close the game or close a tab, and it's super annoying. if i had 8, that wouldn't be an issue

i imagine 10 vs 16 is gonna see similar poo poo ~5 years down the road :v:

Cygni
Nov 12, 2005

raring to post

Truga posted:

i'm on a 980Ti, and firefox performance goes to poo poo while i'm playing some games, just 720p youtube vids stuttering to gently caress with more than 2-3 tabs open, because 6gb is just not enough anymore. i either have to close the game or close a tab, and it's super annoying. if i had 8, that wouldn't be an issue

i imagine 10 vs 16 is gonna see similar poo poo ~5 years down the road :v:

You’ve got other issues tbh. That’s not vram, and certainly not 6gb vram problems.

repiv
Aug 13, 2009

it's hard to project how VRAM requirements are going to scale this generation; assets continue to get bigger, but with the push towards faster and finer-grained streaming we won't need as many of them in VRAM at any one time

Ihmemies
Oct 6, 2012

spunkshui posted:

I'm shocked it does not immediately turn off when you fire up a game.

It probably will depending on the game.

Well isn't the PSU supposed to provide 500W if the label states 500W?

I have to try out some more games.

Maybe undervolt the card a bit with Afterburner. Some articles have reported ~50W power savings with negligible fps drop from undervolting.

6800XT sure has a lot better perf/watt. 3080 is just.. bad.. ancient... legacy tech compared to 6800XT.
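Rough napkin math on why 500W is cutting it close. Every figure below is an assumption for illustration (spec board power plus guesses for the rest of the system), not a measurement from my box:

```python
# Back-of-the-envelope power budget against a 500W PSU.
# All component figures are assumptions for illustration, not measurements.

PSU_RATED_W = 500
PSU_EFFICIENCY = 0.90   # assume ~90% efficiency at this load

components_w = {
    "RTX 3080 board power (spec)": 320,   # transient spikes go higher
    "CPU under game load":         100,   # assumption, varies a lot by CPU
    "Board/RAM/SSD/fans":           50,   # assumption
}

dc_load = sum(components_w.values())   # what the PSU has to deliver
wall_draw = dc_load / PSU_EFFICIENCY   # roughly what a wall meter reads

print(f"Estimated DC load:   {dc_load} W of {PSU_RATED_W} W rated")
print(f"Estimated wall draw: ~{wall_draw:.0f} W")
print(f"Headroom before transient spikes: {PSU_RATED_W - dc_load} W")
```

Under those assumptions a fully loaded GPU leaves maybe 30W of headroom before any transient spikes, which is why a heavier game could still trip the PSU's protection even though lighter loads run fine.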

Truga
May 4, 2014
Lipstick Apathy

Cygni posted:

You’ve got other issues tbh. That’s not vram, and certainly not 6gb vram problems.

:shrug: it started happening as i replaced my dead 1600p monitor with a 4k one, with nothing else changing. i've since upgraded to 3950x and 64gb of ram and reinstalled windows, the only thing still in common is the gpu

Numinous
May 20, 2001

College Slice
Just received my EVGA email for ordering.

XC3 Ultra (should have gone with the base model but I didn't hit the notify for that one)

I signed up on 9/17/2020 8:58:48 AM PT

Based on the spreadsheet on reddit it looks like they're starting to speed up a little??

shrike82
Jun 11, 2005

https://twitter.com/PlayGodfall/status/1329095241333432323?s=20

lmao

v1ld
Apr 16, 2012


Won't that be automatic for every game that has RT features on the new consoles?

Truga
May 4, 2014
Lipstick Apathy
glad to see RT is gonna be a crapshoot for another 5 years so i don't have to care about it lmao

Twain of Pain
Dec 14, 2006

Numinous posted:

Just received my EVGA email for ordering.

XC3 Ultra (should have gone with the base model but I didn't hit the notify for that one)

I signed up on 9/17/2020 8:58:48 AM PT

Based on the spreadsheet on reddit it looks like they're starting to speed up a little??

I've got a 9/16 notify for the base model and still haven't received the email so don't feel too bad about not going with it.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Ihmemies posted:

Well isn't the PSU supposed to provide 500W if the label states 500W?

The number on a PSU box can mean a lot of things. On a quality unit it basically means what it says; dodgier units have been known to have an underperforming 12V rail and get tricky with the other rails to add up to a nice number.

A passive psu is probably a quality one, but if you're buying white box computer hardware down the line for whatever reason...
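To illustrate how that label game works (the rail figures below are invented for the example, not taken from any real unit):

```python
# How a sketchy "500W" label can be assembled from per-rail ratings.
# These rail figures are made up for the example, not from a real PSU.

rails_w = {
    "+12V":  30.0 * 12,    # 30A -> 360W: the rail the GPU and CPU actually draw from
    "+5V":   18.0 * 5,     # 18A ->  90W
    "+3.3V": 15.0 * 3.3,   # 15A -> ~50W
}

print(f"Combined label rating: ~{sum(rails_w.values()):.0f} W")
print(f"Usable +12V capacity:  {rails_w['+12V']:.0f} W")
# A modern GPU+CPU load lives almost entirely on +12V, so the 360W figure
# matters far more than the number printed on the box.
```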

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.
Surely that is as expected? They'd made absolutely zero reference to it at any point, a clear sign it was going to be unchanged to any non-trivial degree. Best case scenario is they're working on a major rework that couldn't be delivered in time, rather than putting in a minor incremental improvement for the sake of marketing.

Os Furoris
Aug 19, 2002

For those of us that don’t have the patience to monitor drops, is the EVGA step up worth it? It seems like I can buy whatever then immediately apply to step up to a 3070.

Any suggestions for an EVGA card to tide me over in the meantime?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."
Yikes at this Hardware Unboxed review of the 6800xt. Completely dismisses raytracing (two titles tested, Shadow of the Tomb Raider and Dirt 5), no mention of DLSS. You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively, is bizarre.

hobbesmaster
Jan 28, 2008

Happy_Misanthrope posted:

Yikes at this Hardware Unboxed review of the 6800xt. Completely dismisses raytracing (two titles tested, Shadow of the Tomb Raider and Dirt 5), no mention of DLSS. You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively, is bizarre.

When discussing RT on the consoles: anything the consoles can do, RDNA2 will do better on PC, because it's the exact same hardware, just more of it.

The Gadfly
Sep 23, 2012

Cygni posted:

You’ve got other issues tbh. That’s not vram, and certainly not 6gb vram problems.

Hardware acceleration in browsers can use a lot of vram

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Happy_Misanthrope posted:

You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively, is bizarre.

Some people really want AMD to be a legitimate competitor again, and aren't letting anything stop them.

Which is weird, because they're legit decent cards for a lot of games. But, uh, yeah. Raytracing is A Thing these days, guys.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

v1ld posted:

Seems like NVidia's 30xx pricing anticipated this level of performance from the AMD cards even if no one else did after the Jensen presentation.

Wonder if this increases the likelihood of a 3080 Ti release as early as Jan. They don't have much room to play with between the 3080 and 3090 other than price, memory bus width, and size I think, but I haven't been following closely. So it'll be interesting to see how they price it if they do release one.

there is also the BOM cost angle. NVIDIA is on Samsung, AMD is on TSMC and it's much more expensive (some rumors say twice as much). Getting aggressive with price hurts AMD more than it does NVIDIA, but AMD pretty much has to undercut NVIDIA at least a little unless they completely blow NVIDIA away in performance. The cache cuts BOM for the memory chips themselves, but AMD has more VRAM (albeit cheaper, and with simpler PCBs), and probably pays for it with some additional die area for the cache (although you save some area on the memory PHYs as well).

which is to say, they not only anticipated the level of performance, they analyzed the economics of AMD's launch: AMD cuts their own throat more at any given price point and has to undercut anyway, so an aggressive price point favors NVIDIA. Like, I'm sure AMD knows that fifty bucks off for similar raster, no DLSS, and weak RT performance probably isn't a winner, but they can't cut their own throat much further than knocking 50 bucks off, and the economics of using (nearly) GA102-sized silicon for a 3070 competitor is not great at all, which is why the 6800 non-XT is priced so badly. NVIDIA is using a smaller GA104 die to compete in that segment, while AMD has to use their bigboi GA102 competitor.

So aside from the marketing war of who has the best card at X price, I'm not sure AMD is ultimately winning the economic war here. NVIDIA probably still runs better margins, and every GPU AMD produces is a bunch of CPUs that didn't get produced, and they likely make more profit on those, and can insta-sell anything they produce. To some extent I'm surprised they're going forward this aggressively (Kyle Bennett thinks AIB drops are going to be large) but I guess they kinda have to if they don't want to lose mindshare.

to some extent the feature deficit is part of "BOM cost reduction" as well - ultimately NVIDIA simply spends more transistors on tensor cores and RTX than AMD does, so they get more RT and ML performance. NVIDIA spent about 10% of the die on RTX (tensor + RT cores) last generation; if AMD implemented those features fully their dies would be almost 10% bigger (probably roughly equalizing the perf/mm2 gap) and correspondingly more expensive to manufacture. So you are paying with features, in a sense.

I still wonder if GA102 is as big as it's going to get - it's big but it's not at the reticle limit (or at least not at TSMC). You could still do a TU102-style behemoth chip at TSMC with a maybe 30-40% larger die (700-750mm2), although I'm sure the economics would be 2080 Ti-style bad (consumers only get cutdowns and it's still $1500-2000). Arguing against that is the really poor scaling showing up even at the 3090 level, I suppose - 40% more shaders doesn't mean 40% more fps.
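To put rough numbers on the BOM point: the die areas below are approximate public figures, and the wafer costs are rumor-tier assumptions, so treat this as a sketch of the argument rather than real accounting.

```python
# Sketch of the wafer-economics argument. Die areas are approximate public
# figures; wafer costs and the yield-free model are placeholder assumptions.
import math

WAFER_DIAMETER_MM = 300
WAFER_COST = {"TSMC N7": 9000, "Samsung 8N": 5000}  # $/wafer, assumed

def gross_dies(die_area_mm2, wafer_d=WAFER_DIAMETER_MM):
    """Crude gross-die-per-wafer estimate, ignoring defect yield entirely."""
    return int(math.pi * (wafer_d / 2) ** 2 / die_area_mm2
               - math.pi * wafer_d / math.sqrt(2 * die_area_mm2))

chips = {
    "GA102":   ("Samsung 8N", 628),   # ~628 mm^2
    "GA104":   ("Samsung 8N", 392),   # ~392 mm^2
    "Navi 21": ("TSMC N7",    520),   # ~520 mm^2
}

for chip, (node, area) in chips.items():
    per_wafer = gross_dies(area)
    per_die = WAFER_COST[node] / per_wafer
    print(f"{chip:8s} {area} mm^2 on {node}: ~{per_wafer} dies/wafer, ~${per_die:.0f}/die before yield")
```

Under those assumptions the mid-sized TSMC die already costs more per candidate die than the huge Samsung one, which is the "aggressive pricing hurts AMD more" point in one number.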

Paul MaudDib fucked around with this message at 19:28 on Nov 18, 2020

VorpalFish
Mar 22, 2007
reasonably awesometm

Ihmemies posted:

Well isn't the PSU supposed to provide 500W if the label states 500W?

I have to try out some more games.

Maybe undervolt the card a bit with Afterburner. Some articles have reported ~50W power savings with negligible fps drop from undervolting.

6800XT sure has a lot better perf/watt. 3080 is just.. bad.. ancient... legacy tech compared to 6800XT.

I mean if you want to go by labels, doesn't the label on the 3080 say it requires a 750w psu?

(Yes those are always pretty conservative...)

terrorist ambulance
Nov 5, 2009

Happy_Misanthrope posted:

Yikes at this Hardware Unboxed review of the 6800xt. Completely dismisses raytracing (two titles tested, Shadow of the Tomb Raider and Dirt 5), no mention of DLSS. You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively, is bizarre.

Other weird poo poo going on in that review. He has SAM giving pretty significant performance benefits whereas other reviews have it giving very slight boosts, or larger boosts but where the 6800xt is already struggling.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

DrDork posted:

Some people really want AMD to be a legitimate competitor again, and aren't letting anything stop them.

I mean hell, I'm one of them! I don't think I would even be using RT that much based on current performance in most games, and I think 'DLSS or bust' is still a little premature until we see more games with it (and its quality, while vastly improved with Ver 2+, can still vary somewhat). But this was just so blatant in disregarding them out of hand it's amazing, it's like a review from a two-year-old time capsule.

VorpalFish
Mar 22, 2007
reasonably awesometm

Paul MaudDib posted:

there is also the BOM cost angle. NVIDIA is on Samsung, AMD is on TSMC and it's much more expensive (some rumors say twice as much). Getting aggressive with price hurts AMD more than it does NVIDIA, but AMD pretty much has to undercut NVIDIA at least a little unless they completely blow NVIDIA away in performance.

which is to say, they not only anticipated the level of performance, they analyzed the economics of AMD's launch: AMD cuts their own throat more at any given price point and has to undercut anyway, so an aggressive price point favors NVIDIA. Like, I'm sure AMD knows that fifty bucks off for similar raster, no DLSS, and weak RT performance probably isn't a winner, but they can't cut their own throat much further than knocking 50 bucks off, and the economics of using (nearly) GA102-sized silicon for a 3070 competitor is not great at all, which is why the 6800 non-XT is priced so badly.

to some extent the feature deficit is part of "BOM cost reduction" as well - ultimately NVIDIA simply spends more transistors on tensor cores and RTX than AMD does, so they get more RT and ML performance. NVIDIA spent about 10% of the die on RTX (tensor + RT cores) last generation; if AMD implemented those features fully their dies would be almost 10% bigger (probably roughly equalizing the perf/mm2 gap) and correspondingly more expensive to manufacture. So you are paying with features, in a sense.

I still wonder if GA102 is as big as it's going to get - it's big but it's not at the reticle limit (or at least not at TSMC). You could still do a TU102-style behemoth chip at TSMC with a maybe 30-40% larger die (700-750mm2), although I'm sure the economics would be 2080 Ti-style bad (consumers only get cutdowns and it's still $1500-2000). Arguing against that is the really poor scaling showing up even at the 3090 level, I suppose - 40% more shaders doesn't mean 40% more fps.

Flip side of that is AMD probably could have launched $100 more expensive than they did and not undercut nvidia at all and still sold every card. It's kind of weird they've both opted for aggressive pricing combined with 0 availability.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

VorpalFish posted:

Flip side of that is AMD probably could have launched $100 more expensive than they did and not undercut nvidia at all and still sold every card. It's kind of weird they've both opted for aggressive pricing combined with 0 availability.

NVidia, at least, I think legitimately did not understand the amount of demand for their cards, or they probably would have launched higher. AMD has no choice but to follow suit, because the PR hit from launching at prices above Ampere with performance below it would inevitably dog the card's reputation for the entire cycle, which could end up in a net loss for them once inventory stabilizes.

b0ner of doom
Mar 17, 2006
the problem with dlss is that few games support it and even fewer are actually worth playing - I can understand dismissing it at this point cause most gamers just wanna play games at good framerates and don't care about a technology that's been really slow to adopt

repiv
Aug 13, 2009


lol every DXR game to date works on RDNA2, despite the developers not having any RDNA2 hardware to test against, but AMD partners start shipping raytracing and suddenly cross-vendor is too hard

that's one way to avoid losing to nvidia in benchmarks

spunkshui
Oct 5, 2011



VorpalFish posted:

I mean if you want to go by labels, doesn't the label on the 3080 say it requires a 750w psu?

(Yes those are always pretty conservative...)

600 feels like trying to skirt by. 500 just seems like a bad idea.

I mean if it works it works but don’t complain if the computer turns off suddenly when you are playing a game.

Nfcknblvbl
Jul 15, 2002

b0ner of doom posted:

the problem with dlss is that few games support it and even fewer are actually worth playing - I can understand dismissing it at this point cause most gamers just wanna play games at good framerates and don't care about a technology that's been really slow to adopt

DLSS is being added to already-existing games right now, War Thunder being my most played one. It's a very relevant feature that improves quality and raises frame rates. 4K60+ won't be realistically achievable without it.
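Quick pixel math behind that claim; the per-axis scale factors here are the commonly quoted DLSS 2.x presets, so treat them as approximate:

```python
# Pixel-count arithmetic behind the DLSS framerate claim. The per-axis scale
# factors are the commonly quoted DLSS 2.x presets (approximate).

TARGET = (3840, 2160)   # 4K output
presets = {
    "Native":      1.0,
    "Quality":     1 / 1.5,
    "Balanced":    1 / 1.72,
    "Performance": 1 / 2.0,
}

target_px = TARGET[0] * TARGET[1]
for name, scale in presets.items():
    w, h = round(TARGET[0] * scale), round(TARGET[1] * scale)
    px = w * h
    print(f"{name:12s}: renders {w}x{h} internally "
          f"({px / 1e6:.2f} MP, ~{px / target_px:.0%} of native 4K shading work)")
```

Shading roughly a quarter to a half of the pixels and reconstructing the rest is where the frame-rate headroom comes from.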

space marine todd
Nov 7, 2014



b0ner of doom posted:

the problem with dlss is that few games support it and even fewer are actually worth playing - I can understand dismissing it at this point cause most gamers just wanna play games at good framerates and don't care about a technology that's been really slow to adopt

It's weird to talk about how few games support DLSS and then talk about "most gamers"...who are mostly playing a few games that do support it and are extremely framerate sensitive, like Fortnite and Call of Duty (although the new one's implementation of DLSS isn't great?). Most gamers in this market are playing just a handful of multiplayer FPS games and want as many frames per second as possible; that's going to be the market that Nvidia and AMD will be fighting over in terms of video card upgrades.

The success metric I would measure for DLSS is "total hours played across all games that support DLSS", not "how many games support DLSS".

space marine todd fucked around with this message at 19:41 on Nov 18, 2020

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Os Furoris posted:

For those of us that don’t have the patience to monitor drops, is the EVGA step up worth it? It seems like I can buy whatever then immediately apply to step up to a 3070.

Any suggestions for an EVGA card to tide me over in the meantime?

Good luck getting one to use for Step Up

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

space marine todd posted:

It's weird to talk about how few games support DLSS and then talk about "most gamers"...who are mostly playing a few games that do support it and are extremely framerate sensitive, like Fortnite and Call of Duty (although the new one's implementation of DLSS isn't great?). Most gamers in this market are playing just a handful of multiplayer FPS games and want as many frames per second as possible.

The metric I would be measuring is "total hours played across all games that support DLSS", not "how many games support DLSS".

Ehh, if you're going to look at Fortnite's numbers you have to factor in how many Fortnite players are using DLSS-capable hardware. I can't imagine "people with a $500+ gpu" (or even $300, really) comprise a significant portion of the playerbase.

hobbesmaster
Jan 28, 2008

Some Goon posted:

Ehh, if you're going to look at Fortnite's numbers you have to factor in how many Fortnite players are using DLSS-capable hardware. I can't imagine "people with a $500+ gpu" (or even $300, really) comprise a significant portion of the playerbase.

Everyone who thinks they're playing competitively at a high level is going to be running the fastest they can afford to get above whatever their monitor can do (144/240).

Kunabomber
Oct 1, 2002


Pillbug
So I guess 3070 is the way to go for mid-range? I was waiting for the 6800 benchmarks to drop and it looks like you get a bump in standard performance but lose a lot in terms of featureset for $80 more. I have a 4k60 monitor but I think I'll be happy with going 1440p60 on high settings by futzing with the render res if necessary - running a 1660 super right now and it dips frames a bit too much. Add in DLSS and now you got a stew going.

hobbesmaster
Jan 28, 2008

Kunabomber posted:

So I guess 3070 is the way to go for mid-range? I was waiting for the 6800 benchmarks to drop and it looks like you get a bump in standard performance but lose a lot in terms of featureset for $80 more. I have a 4k60 monitor but I think I'll be happy with going 1440p60 on high settings by futzing with the render res if necessary - running a 1660 super right now and it dips frames a bit too much. Add in DLSS and now you got a stew going.

Those leaked 3060 Ti numbers looked pretty nutty for $300, but they're probably fake, so the 3070 is it for the moment.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
Hah. Watching this interview with RTG people concerning 6000 series.
https://www.youtube.com/watch?v=FvE7GeaPjjA

When asked about their reply to DLSS, they said their solution is being developed in partnership with the console vendors and game developers. The bullet points were:

- RTG don't want any performance hit
- Want really good image quality / scaling
- RTG claimed developers beg them not to make an AMD-specific solution
- Goal is for it to work on all GPUs including Intel and Nvidia (suggesting a shader-based solution)

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

hobbesmaster posted:

Everyone who thinks they're playing competitively at a high level is going to be running the fastest they can afford to get above whatever their monitor can do (144/240).

Sure, but the millions of teenagers that are the bulk of the playerbase just don't have the money. The people you're talking about are out there, but I question their statistical significance. They're also most likely playing at 1080p for that sweet 240, which even a 1650 can hit at low, as would-be competitive players are wont to do.

repiv
Aug 13, 2009

Riflen posted:

When asked about their reply to DLSS, they said their solution is being developed in partnership with the console vendors and game developers. The bullet points were:

- RTG don't want any performance hit
- Want really good image quality / scaling
- RTG claimed developers beg them not to make an AMD-specific solution
- Goal is for it to work on all GPUs including Intel and Nvidia (suggesting a shader-based solution)

The comedy option would be if AMD comes up with an ML-based solution but implements it in vanilla compute shaders instead of DirectML, so Nvidia can't take advantage of their tensor cores

LGD
Sep 25, 2004

hobbesmaster posted:

Everyone who thinks they're playing competitively at a high level is going to be running the fastest they can afford to get above whatever their monitor can do (144/240).

people who think they're playing competitively at a high level and going for MAXIMUM FRAMES over all else are also exactly the sort of people who are amenable to arguments that superior performance at rasterization trumps pretty much anything to do with RT

DLSS is a big advantage for Nvidia, but everything RT related is inherently predicated on a quality argument at this point in time


VorpalFish
Mar 22, 2007
reasonably awesometm

Kunabomber posted:

So I guess 3070 is the way to go for mid-range? I was waiting for the 6800 benchmarks to drop and it looks like you get a bump in standard performance but lose a lot in terms of featureset for $80 more. I have a 4k60 monitor but I think I'll be happy with going 1440p60 on high settings by futzing with the render res if necessary - running a 1660 super right now and it dips frames a bit too much. Add in DLSS and now you got a stew going.

The way to go at this point is whatever card you can successfully check out with, then make a sacrifice to the elder gods so your order doesn't get cancelled.
