|
Ihmemies posted:So I tried DOOM with 3080 TUF OC, gpu usage was maybe 50%. Power draw from wall was around 400W. Civ6 ingame was maybe 60% gpu usage, 430W. I'm shocked it does not immediately turn off when you fire up a game. It probably will depending on the game.
|
# ? Nov 18, 2020 18:23 |
|
|
that extra ram gonna matter?
|
# ? Nov 18, 2020 18:23 |
|
some years down the line, 100%. right now? extremely does not matter. i'm on a 980Ti, and firefox performance goes to poo poo while i'm playing some games, just 720p youtube vids stuttering to gently caress with more than 2-3 tabs open, because 6gb is just not enough anymore. i either have to close the game or close a tab, and it's super annoying. if i had 8, that wouldn't be an issue. i imagine 10 vs 16 is gonna see similar poo poo ~5 years down the road
|
# ? Nov 18, 2020 18:27 |
|
Truga posted:i'm on a 980Ti, and firefox performance goes to poo poo while i'm playing some games, just 720p youtube vids stuttering to gently caress with more than 2-3 tabs open, because 6gb is just not enough anymore. i either have to close the game or close a tab, and it's super annoying. if i had 8, that wouldn't be an issue You’ve got other issues tbh. That’s not vram, and certainly not 6gb vram problems.
|
# ? Nov 18, 2020 18:30 |
|
it's hard to project how VRAM requirements are going to scale this generation, assets continue to get bigger but with the push towards faster and finer-grained streaming we won't need as many of them in VRAM at any one time
|
# ? Nov 18, 2020 18:30 |
|
spunkshui posted:I'm shocked it does not immediately turn off when you fire up a game. Well isn't the PSU supposed to provide 500W if the label states 500W? I have to try out some more games. Maybe undervolt the card a bit with Afterburner. Some articles have gotten 50W power savings with negligible fps drop with undervolting. 6800XT sure has a lot better perf/watt. 3080 is just.. bad.. ancient... legacy tech compared to 6800XT.
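The undervolting trade-off is easy to sanity-check as perf-per-watt arithmetic. All of the fps and wattage numbers below are illustrative assumptions (roughly matching the "50W saved, negligible fps drop" articles mentioned above), not measurements:

```python
# Compare efficiency at stock vs. undervolted settings.
# A small fps loss with a large power cut still raises perf/watt.

def perf_per_watt(fps, watts):
    """Frames per second delivered per watt of GPU board power."""
    return fps / watts

# Illustrative numbers: ~50 W saved for ~2% fps lost.
stock = perf_per_watt(fps=100, watts=320)
undervolted = perf_per_watt(fps=98, watts=270)

gain = undervolted / stock - 1  # fractional efficiency improvement, ~16% here
```

Under these assumed numbers the undervolt improves efficiency by about 16%, which is why a modest voltage/frequency-curve tweak in Afterburner can be worth the fiddling.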
|
# ? Nov 18, 2020 18:31 |
|
Cygni posted:You’ve got other issues tbh. That’s not vram, and certainly not 6gb vram problems. it started happening as i replaced my dead 1600p monitor with a 4k one, with nothing else changing. i've since upgraded to 3950x and 64gb of ram and reinstalled windows, the only thing still in common is the gpu
|
# ? Nov 18, 2020 18:33 |
|
Just received my EVGA email for ordering. XC3 Ultra (should have gone with the base model but I didn't hit the notify for that one) I signed up on 9/17/2020 8:58:48 AM PT Based on the spreadsheet on reddit it looks like they're starting to speed up a little??
|
# ? Nov 18, 2020 18:35 |
|
https://twitter.com/PlayGodfall/status/1329095241333432323?s=20 lmao
|
# ? Nov 18, 2020 18:37 |
|
Won't that be automatic for every game that has RT features on the new consoles?
|
# ? Nov 18, 2020 18:39 |
|
glad to see RT is gonna be crapshoot for another 5 years so i don't have to care about it lmao
|
# ? Nov 18, 2020 18:40 |
|
Numinous posted:Just received my EVGA email for ordering. I've got a 9/16 notify for the base model and still haven't received the email so don't feel too bad about not going with it.
|
# ? Nov 18, 2020 18:41 |
|
Ihmemies posted:Well isn't the PSU supposed to provide 500W if the label states 500W? The number on a psu box can mean a lot of things. On a quality unit it basically does mean that; dodgier units have been known to have an underperforming 12v rail and get tricky with the other rails to add up to a nice number. A passive psu is probably a quality one, but if you're buying white box computer hardware down the line for whatever reason...
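The 12v-rail point can be made concrete with rough arithmetic: the GPU and CPU draw almost entirely from the 12 V rail, so what matters is that rail's amp rating, not the label wattage. All wattage and amperage figures below are invented for illustration:

```python
# Rough 12 V rail headroom check. A "500 W" unit whose 12 V rail is
# only rated for 40 A can deliver at most 480 W on that rail, and the
# big loads (GPU + CPU) all land there.

def psu_headroom(rail_12v_amps, gpu_watts, cpu_watts, other_watts=50):
    """Remaining 12 V budget in watts; negative means overloaded."""
    rail_watts = 12 * rail_12v_amps
    return rail_watts - (gpu_watts + cpu_watts + other_watts)

# Illustrative: 40 A rail (480 W), a 3080 spiking to ~320 W,
# a ~150 W CPU, ~50 W of fans/drives/board.
budget = psu_headroom(rail_12v_amps=40, gpu_watts=320, cpu_watts=150)
```

With these assumed numbers the budget comes out negative, and that's before counting the 3080's millisecond-scale transient spikes, which is exactly the "turns off suddenly mid-game" scenario.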
|
# ? Nov 18, 2020 18:54 |
|
Surely that is as expected? They'd made absolutely zero reference to it at any point, a clear sign it was going to be unchanged to any non-trivial degree. Best case scenario is they're working on a major rework that couldn't be delivered in time rather than putting in a minor incremental improvement for the sake of marketing.
|
# ? Nov 18, 2020 18:57 |
|
For those of us that don’t have the patience to monitor drops, is the EVGA step up worth it? It seems like I can buy whatever then immediately apply to step up to a 3070. Any suggestions for an EVGA card to tide me over in the meantime?
|
# ? Nov 18, 2020 18:57 |
|
Yikes at this Hardware Unboxed review of the 6800xt. Completely dismisses raytracing (two titles tested, Shadow of the Tomb Raider and Dirt 5), no mention of DLSS. You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively, is bizarre.
|
# ? Nov 18, 2020 19:00 |
|
Happy_Misanthrope posted:Yikes at this Hardware Unboxed review of the 6800xt. Completely dismisses raytracing (two titles tested, Shadow of the Tomb Raider and Dirt 5), no mention of DLSS. You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively is bizarre. When discussing RT on the consoles anything the consoles can do RDNA2 will do better on PC because it's the exact same hardware except more of it.
|
# ? Nov 18, 2020 19:03 |
|
Cygni posted:You’ve got other issues tbh. That’s not vram, and certainly not 6gb vram problems. Hardware acceleration in browsers can use a lot of vram
|
# ? Nov 18, 2020 19:08 |
|
Happy_Misanthrope posted:You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively is bizarre. Some people really want AMD to be a legitimate competitor again, and aren't letting anything stop them. Which is weird, because they're legit decent cards for a lot of games. But, uh, yeah. Raytracing is A Thing these days, guys.
|
# ? Nov 18, 2020 19:10 |
|
v1ld posted:Seems like NVidia's 30xx pricing anticipated this level of performance from the AMD cards even if no one else did after the Jensen presentation. there is also the BOM cost angle. NVIDIA is on Samsung, AMD is on TSMC and it's much more expensive (some rumors say twice as much). Getting aggressive with price hurts AMD more than it does NVIDIA, but AMD pretty much has to undercut NVIDIA at least a little unless they completely blow NVIDIA away in performance. The cache cuts BOM for the memory chips themselves but AMD has more VRAM (albeit cheaper, and with simpler PCBs), but probably at the cost of some additional die area for the cache (although you save some area on the memory PHYs as well).

which is to say, they not only anticipated the level of performance but they analyzed the economics of AMD's launch: AMD cuts their own throat more at any given price point and they have to undercut, so an aggressive price point favors NVIDIA. Like, I'm sure AMD knows that fifty bucks for similar raster, no DLSS, and weak RT performance probably isn't a winner but they can't cut their own throat much further than knocking 50 bucks off, and the economics of using (nearly) GA102-sized silicon for a 3070 competitor is not great at all, that is why the 6800 non-XT is priced so bad. NVIDIA is using a smaller GA104 die to compete in that segment, AMD has to use their bigboi GA102 competitor.

So aside from the marketing war of who has the best card at X price, I'm not sure AMD is ultimately winning the economic war here. NVIDIA probably still runs better margins, and every GPU AMD produces is a bunch of CPUs that didn't get produced, and they likely make more profit on those, and can insta-sell anything they produce. To some extent I'm surprised they're going forward this aggressively (Kyle Bennett thinks AIB drops are going to be large) but I guess they kinda have to if they don't want to lose mindshare.

to some extent the feature deficit is part of "BOM cost reduction" as well - ultimately NVIDIA simply spends more transistors on tensor cores and RTX than AMD does, so they get more RT and ML performance. NVIDIA spent about 10% of the die on RTX (tensor+RT cores) last generation, if AMD implemented those features fully their dies would be almost 10% bigger (probably roughly equalizing the perf/mm2 gap) and the die would be correspondingly more expensive to manufacture. So you are paying with features, in a sense.

I still wonder if GA102 is as big as it's going to get - it's big but it's not reticle-limit (or at least not at TSMC). You could still do a TU102-style behemoth chip at TSMC with maybe 30-40% larger die (700-750mm2), although I'm sure the economics would be 2080 Ti-style bad (consumers only get cutdowns and it's still $1500-2000). Arguing against that is the really poor scaling showing up even at the 3090 level, I suppose - 40% more shaders doesn't mean 40% more fps. Paul MaudDib fucked around with this message at 19:28 on Nov 18, 2020 |
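The die-area economics above can be sketched with the textbook Poisson yield model, Y = exp(-A·D0). The wafer cost, defect density, and die areas below are invented round numbers purely for illustration, not actual TSMC or Samsung figures:

```python
import math

def poisson_yield(area_mm2, defects_per_mm2):
    """Classic Poisson die-yield model: Y = exp(-A * D0)."""
    return math.exp(-area_mm2 * defects_per_mm2)

def cost_per_good_die(area_mm2, defects_per_mm2, wafer_cost):
    """Wafer cost spread over good dies (ignores edge loss for simplicity)."""
    wafer_area = math.pi * (300 / 2) ** 2  # 300 mm wafer, ~70,700 mm^2
    dies_per_wafer = wafer_area / area_mm2
    good_dies = dies_per_wafer * poisson_yield(area_mm2, defects_per_mm2)
    return wafer_cost / good_dies

# Illustrative: same process, one die 10% larger (the "RTX silicon" delta).
base = cost_per_good_die(520, 0.001, wafer_cost=10000)
bigger = cost_per_good_die(520 * 1.10, 0.001, wafer_cost=10000)
cost_penalty = bigger / base  # exceeds 1.10: extra area PLUS extra yield loss
```

The point of the sketch: a 10% bigger die costs more than 10% more per good chip, because yield falls off exponentially with area, which is why "spend 10% of the die on tensor/RT cores" is a real BOM decision and not a rounding error.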
# ? Nov 18, 2020 19:11 |
|
Ihmemies posted:Well isn't the PSU supposed to provide 500W if the label states 500W? I mean if you want to go by labels, doesn't the label on the 3080 say it requires a 750w psu? (Yes those are always pretty conservative...)
|
# ? Nov 18, 2020 19:11 |
|
Happy_Misanthrope posted:Yikes at this Hardware Unboxed review of the 6800xt. Completely dismisses raytracing (two titles tested, Shadow of the Tomb Raider and Dirt 5), no mention of DLSS. You can perhaps downplay those two features until more games support them, but to basically hand-wave them away at this point, especially as the new consoles will utilize RT extensively is bizarre. Other weird poo poo going on in that review. He has SAM giving pretty significant performance benefits whereas other reviews have it giving very slight boosts, or larger boosts but where the 6800xt is already struggling.
|
# ? Nov 18, 2020 19:12 |
|
DrDork posted:Some people really want AMD to be a legitimate competitor again, and aren't letting anything stop them. I mean hell, I'm one of them! I don't think I would even be using RT that much based on current performance in most games, and I think 'DLSS or bust' is still a little premature until we see more games with it (and it's quality, while vastly improved with Ver 2+, can still vary somewhat). But this was just so blatant in disregarding them out of hand it's amazing, it's like a review from a 2-year old time capsule.
|
# ? Nov 18, 2020 19:15 |
|
Paul MaudDib posted:there is also the BOM cost angle. NVIDIA is on Samsung, AMD is on TSMC and it's much more expensive (some rumors say twice as much). Getting aggressive with price hurts AMD more than it does NVIDIA, but AMD pretty much has to undercut NVIDIA at least a little unless they completely blow NVIDIA away in performance. Flip side of that is AMD probably could have launched $100 more expensive than they did and not undercut nvidia at all and still sold every card. It's kind of weird they've both opted for aggressive pricing combined with 0 availability.
|
# ? Nov 18, 2020 19:15 |
|
VorpalFish posted:Flip side of that is AMD probably could have launched $100 more expensive than they did and not undercut nvidia at all and still sold every card. It's kind of weird they've both opted for aggressive pricing combined with 0 availability. NVidia, at least, I think legitimately did not understand the amount of demand for their cards, or they probably would have launched higher. AMD has no choice but to follow suit, because the PR hit from launching at prices above Ampere with performance below it would inevitably dog the card's reputation for the entire cycle, which could end up in a net loss for them once inventory stabilizes.
|
# ? Nov 18, 2020 19:20 |
|
the problem with dlss is that few games support it and even fewer are actually worth playing - I can understand dismissing it at this point cause most gamers just wanna play games at good framerates and don't care about a technology that's been really slow to adopt
|
# ? Nov 18, 2020 19:21 |
|
lol every DXR game to date works on RDNA2, despite the developers not having any RDNA2 hardware to test against, but AMD partners start shipping raytracing and suddenly cross-vendor is too hard that's one way to avoid losing to nvidia in benchmarks
|
# ? Nov 18, 2020 19:28 |
|
VorpalFish posted:I mean if you want to go by labels, doesn't the label on the 3080 say it requires a 750w psu? 600 feels like trying to skirt by. 500 just seems like a bad idea. I mean if it works it works but don’t complain if the computer turns off suddenly when you are playing a game.
|
# ? Nov 18, 2020 19:29 |
|
b0ner of doom posted:the problem with dlss is that few games support it and even fewer actually worth playing - I can understand dismissing it at this point cause most gamers just wanna play games at good framerates and don't care about a technology that's been really slow to adopt DLSS is being added to already-existing games right now, War Thunder being my most-played one. It's a very relevant feature that improves quality and raises frame rates. 4K60+ won't be realistically achievable without it.
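For context on why DLSS helps hit 4K60: it renders internally at a fraction of the output resolution and upscales. The per-mode scale factors below are the commonly reported per-axis ratios (Quality ~2/3, Balanced ~0.58, Performance 1/2); treat them as approximations rather than guarantees for every title:

```python
# Internal render resolution and shading savings per DLSS mode.
# Scale factors are commonly reported approximations, not spec values.

SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w, out_h, mode):
    """Approximate internal render resolution for a given output res."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

def pixel_savings(out_w, out_h, mode):
    """Fraction of output pixels that no longer need full-rate shading."""
    w, h = internal_res(out_w, out_h, mode)
    return 1 - (w * h) / (out_w * out_h)
```

So 4K Performance mode shades a 1080p image (75% fewer pixels) and reconstructs the rest, which is where the big framerate headroom comes from.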
|
# ? Nov 18, 2020 19:30 |
|
b0ner of doom posted:the problem with dlss is that few games support it and even fewer actually worth playing - I can understand dismissing it at this point cause most gamers just wanna play games at good framerates and don't care about a technology that's been really slow to adopt It's weird to talk about how few games support DLSS and then talk about "most gamers"...who are mostly playing a few games that do support it and are extremely framerate sensitive, like Fortnite and Call of Duty (although the new one's implementation of DLSS isn't great?). Most gamers in this market are playing just a handful of multiplayer FPS games and want as many frames per second as possible; that's going to be the market that Nvidia and AMD will be fighting over in terms of video card upgrades. The success metric I would measure for DLSS is "total hours played across all games that support DLSS", not "how many games support DLSS". space marine todd fucked around with this message at 19:41 on Nov 18, 2020 |
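The "hours played, not game count" metric from the post above is simple to express. The catalog and playtime numbers below are invented for illustration (not real player statistics); the point is only that the two metrics can diverge sharply:

```python
# Compare DLSS adoption measured by game count vs. by playtime.
# A few huge multiplayer titles dominate total hours, so hours-weighted
# coverage can be much higher than the raw supported-game count suggests.

catalog = {
    "Fortnite":      {"hours_m": 300, "dlss": True},   # millions of hours, invented
    "Call of Duty":  {"hours_m": 250, "dlss": True},
    "Niche Indie A": {"hours_m": 2,   "dlss": True},
    "Popular RPG":   {"hours_m": 120, "dlss": False},
}

def dlss_share_by_count(games):
    """Fraction of titles that support DLSS."""
    return sum(g["dlss"] for g in games.values()) / len(games)

def dlss_share_by_hours(games):
    """Fraction of total playtime spent in DLSS-capable titles."""
    total = sum(g["hours_m"] for g in games.values())
    covered = sum(g["hours_m"] for g in games.values() if g["dlss"])
    return covered / total
```

With these made-up numbers, 3 of 4 titles support DLSS but those titles account for over 80% of hours played, which is the asymmetry the post is arguing for.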
# ? Nov 18, 2020 19:37 |
|
Os Furoris posted:For those of us that don’t have the patience to monitor drops, is the EVGA step up worth it? It seems like I can buy whatever then immediately apply to step up to a 3070. Good luck getting one to use for Step Up
|
# ? Nov 18, 2020 19:38 |
|
space marine todd posted:It's weird to talk about how few games support DLSS and then talk about "most gamers"...who are mostly playing a few games that do support it and are extremely framerate sensitive, like Fortnite and Call of Duty (although the new one's implementation of DLSS isn't great?). Most gamers in this market are playing just a handful of multiplayer FPS games and want as many frames per second as possible. Ehh, if you're going to look at Fortnite's numbers you have to factor in how many fortniters are using DLSS capable hardware. I can't imagine "people with a $500+ gpu" (or even $300, really) comprise a significant portion of the playerbase.
|
# ? Nov 18, 2020 19:44 |
|
Some Goon posted:Ehh, if you're going to look at fortnites numbers you have to factor in how many fortniters are using DLSS capable hardware. I can't imagine "people with a $500+ gpu" (or even $300, really) comprise a significant portion of the playerbase. Everyone that is thinking they're playing competitively at a high level is going to be running the fastest they can afford to get above whatever their monitor can do (144/240).
|
# ? Nov 18, 2020 19:45 |
|
So I guess 3070 is the way to go for mid-range? I was waiting for the 6800 benchmarks to drop and it looks like you get a bump in standard performance but lose a lot in terms of featureset for $80 more. I have a 4k60 monitor but I think I'll be happy with going 1440p60 on high settings by futzing with the render res if necessary - running a 1660 super right now and it dips frames a bit too much. Add in DLSS and now you got a stew going.
|
# ? Nov 18, 2020 19:46 |
|
Kunabomber posted:So I guess 3070 is the way to go for mid-range? I was waiting for the 6800 benchmarks to drop and it looks like you get a bump in standard performance but lose a lot in terms of featureset for $80 more. I have a 4k60 monitor but I think I'll be happy with going 1440p60 on high settings by futzing with the render res if necessary - running a 1660 super right now and it dips frames a bit too much. Add in DLSS and now you got a stew going. Those leaked 3060 Ti numbers looked pretty nutty for $300 but are probably fake so 3070 is it for the moment.
|
# ? Nov 18, 2020 19:47 |
|
Hah. Watching this interview with RTG people concerning 6000 series. https://www.youtube.com/watch?v=FvE7GeaPjjA When asked about their reply to DLSS, they said their solution is being developed in partnership with the console vendors and game developers. The bullet points were:
- RTG don't want any performance hit
- Want really good image quality / scaling
- RTG claimed developers beg them not to make an AMD-specific solution
- Goal is for it to work on all GPUs including Intel and Nvidia (suggesting a shader-based solution)
|
# ? Nov 18, 2020 19:51 |
|
hobbesmaster posted:Everyone that is thinking they're playing competitively at a high level is going to be running the fastest they can afford. to get above whatever their monitor can do (144/240) Sure, but the millions of teenagers that are the bulk of the playerbase just don't have the money. The people you're talking about are out there, but I question their statistical significance. They're also most likely playing at 1080p for that sweet 240, which even a 1650 can hit at low, as would-be competitive players are wont to do.
|
# ? Nov 18, 2020 19:51 |
|
Riflen posted:When asked about their reply to DLSS, they said their solution is being developed in partnership with the console vendors and game developers. The bullet points were: The comedy option would be if AMD comes up with an ML based solution but implements it in vanilla compute shaders instead of DirectML, so Nvidia can't take advantage of their tensor cores
|
# ? Nov 18, 2020 19:56 |
|
hobbesmaster posted:Everyone that is thinking they're playing competitively at a high level is going to be running the fastest they can afford. to get above whatever their monitor can do (144/240) people who think they're playing competitively at a high level and going for MAXIMUM FRAMES over all else are also exactly the sort of people who are amenable to arguments that superior performance at rasterization trumps pretty much anything to do with RT DLSS is a big advantage for Nvidia, but everything RT related is inherently predicated in a quality argument at this point in time
|
# ? Nov 18, 2020 19:57 |
|
|
Kunabomber posted:So I guess 3070 is the way to go for mid-range? I was waiting for the 6800 benchmarks to drop and it looks like you get a bump in standard performance but lose a lot in terms of featureset for $80 more. I have a 4k60 monitor but I think I'll be happy with going 1440p60 on high settings by futzing with the render res if necessary - running a 1660 super right now and it dips frames a bit too much. Add in DLSS and now you got a stew going. The way to go at this point is whatever card you can successfully check out with, then make a sacrifice to the elder gods so your order doesn't get cancelled.
|
# ? Nov 18, 2020 20:01 |