Truga
May 4, 2014
Lipstick Apathy

akadajet posted:

I have a 750 Watt Power supply with a Ryzen 3900X. I'm guessing I'm still good for grabbing any of the cards being announced?

yes, 3900x is like 150w if you *really* push it.

akadajet
Sep 14, 2003

I see all these articles going "oh my god you need 850 watts" and then I look at power supply calculators showing like 470 watts being what I actually need.

VorpalFish
Mar 22, 2007
reasonably awesometm

Reminder that recommended wattage is/was often made assuming the cheapest, garbage PSU you could possibly buy, which may or may not even hit its rated specs.

If you have a good platform made by a good oem you can probably get away with much less.

Like I'd have zero qualms running a 400w gpu on a 750w seasonic unit.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

akadajet posted:

I have a 750 Watt Power supply with a Ryzen 3900X. I'm guessing I'm still good for grabbing any of the cards being announced?

Absolutely. As noted, a 3900X should stay under 150W unless you're gettin' real weird, and even if the 3090 can actually run at like 400W under load, that's still 200W you've got to play with for everything else. A normal motherboard is like ~75W, and RAM + SSDs is like...10W. You won't be at the peak efficiency for your PSU, but that's pretty much irrelevant.
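
If you want to sanity-check your own build the same way, here's a quick back-of-envelope sketch in Python using rough numbers like the ones above (every draw in it is a ballpark assumption, not a measurement, and the fan/pump line is just a fudge factor I threw in):

code:
# Rough PSU headroom check using ballpark component draws (assumptions, not measurements).
psu_watts = 750

draws = {
    "Ryzen 3900X (heavy load)": 150,   # assumed worst-case CPU draw
    "RTX 3090 (rumored)": 400,         # rumored GPU board power
    "Motherboard": 75,                 # typical board draw
    "RAM + SSDs": 10,                  # a few watts total
    "Fans / pump / USB devices": 25,   # generous fudge factor (my own assumption)
}

total = sum(draws.values())
headroom = psu_watts - total

print(f"Estimated load: {total} W")
print(f"Headroom on a {psu_watts} W PSU: {headroom} W ({headroom / psu_watts:.0%})")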

Most quality PSUs can actually support transient loads above their rated wattage, too, so a momentary spike shouldn't be a problem, but obviously you don't want to be running it continuously at/very near its max limit.

I think a lot of people worried about this sort of thing would benefit from getting a simple $20 Kill A Watt or similar, checking how much power their current rig actually uses, and working from there.

MikeC
Jul 19, 2004
BITCH ASS NARC

Mr.PayDay posted:

So how many more weeks or months will AMD be stalling leaks and news?

Will they openly allow losing the 3070 and 3080 target group? Because they will, if the rumors are true that they won't launch new AMD GPUs before November.

That’s like 2 months of Nvidia selling next gen GPUs without any competition.

They won't say a word until Jensen takes the stage. There is no reason to tip your hand first if you can't launch first.

It's not like cards are on sale right now. It is unlikely they will take the performance crown anyways so they can let the 3090 have its day. They just need to counter punch before the 3070 hits.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Mr.PayDay posted:

Will they openly allow losing the 3070 and 3080 target group?

100%

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MikeC posted:

They won't say a word until Jensen takes the stage. There is no reason to tip your hand first if you can't launch first.

I disagree. While I wouldn't be dropping price points or specific performance notes yet, they could certainly have started priming the hype train by now and at least keeping themselves in people's minds so they don't automatically just buy NVidia for a lack of any other options. They did this before, albeit poorly, with the "Poor volta" stuff. Instead there's been...nothing.

It's gonna be real hard for them to do anything useful once Ampere launches unless they're ready to commit to both price and performance targets like the day after Ampere is revealed, and even then they'd have to have cards near ready to go, which by all accounts they do not.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum
Is the 3070 supposed to come later? I figure with a 2k 144hz monitor, I'll just get a 3070 and call it a day. Over my 980 I'll get 4 more gb of vram, ray tracing and dlss compatibility, and g-sync for my monitor. Rather than getting the 3080 for the extra 2gb of vram, I'll just save money and get the next series if I need to. Same as my ryzen 3600, I'll get the cheaper option and upgrade when I feel like I need to.

MikeC
Jul 19, 2004
BITCH ASS NARC

DrDork posted:

I disagree. While I wouldn't be dropping price points or specific performance notes yet, they could certainly have started priming the hype train by now and at least keeping themselves in people's minds so they don't automatically just buy NVidia for a lack of any other options. They did this before, albeit poorly, with the "Poor volta" stuff. Instead there's been...nothing.

It's gonna be real hard for them to do anything useful once Ampere launches unless they're ready to commit to both price and performance targets like the day after Ampere is revealed, and even then they'd have to have cards near ready to go, which by all accounts they do not.

It's a damned-if-you-do, damned-if-you-don't situation. If they are vague, then any effort they put in will just be swept away by Jensen the moment he gets onstage. They might even be openly mocked because people will say they are too scared to show what they have.

If they say "this is what Big Navi can do," then Jensen can react appropriately on stage AND still launch first, so why would you do that?

It's not like the cards are going on sale the moment Jensen walks out. Dodge the hype train and counter punch once you know the pricing and specs you are competing against.

FuturePastNow
May 19, 2014


akadajet posted:

I see all these articles going "oh my god you need 850 watts" and then I look at power supply calculators showing like 470 watts being what I actually need.

VorpalFish posted:

Reminder that recommended wattage is/was often made assuming the cheapest, garbage PSU you could possibly buy, which may or may not even hit its rated specs.

The recommendations basically have to keep two things in mind:

1) People who, as VorpalFish said, have trash quality gray PSUs that can't deliver half their advertised output

2) Worst case scenario users with an overclocked Threadripper, 8 overclocked DIMMs, an overkill huge pump for liquid cooling and twenty hard drives full of porn

the latter is probably a more likely target audience for the 3090 but you never know

FuturePastNow fucked around with this message at 18:43 on Aug 28, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MikeC posted:

It's a damned-if-you-do, damned-if-you-don't situation. If they are vague, then any effort they put in will just be swept away by Jensen the moment he gets onstage. They might even be openly mocked because people will say they are too scared to show what they have.

If they say "this is what Big Navi can do," then Jensen can react appropriately on stage AND still launch first, so why would you do that?

It's not like the cards are going on sale the moment Jensen walks out. Dodge the hype train and counter punch once you know the pricing and specs you are competing against.

They've done same-day (or near enough) launches before. Paper launches, to be sure, but that's good enough for people with $1000 looking for a new card.

I agree with you that it would be a bad play to show actual specs/performance/prices prior to Ampere's reveal. But to have pretty much not even mentioned that they're working on new cards for the last 6 months or so does not suggest much confidence in their product. Maybe that's the take, then: they know even Big Navi is gonna struggle against anything above the 3070 (and maybe even there), so why waste marketing dollars on trying to get hype and mindshare and attention when they're just gonna have to play the value card again?

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
The 3080/3090 being on 7nm is great news. That was basically my final concern. 3090 it is.

That also dulls the prospect of a superior 7nm super refresh that people were worried about. IMO the 3090 will be the best card until Hopper. I don't think they'll surpass it (on purpose or just because the 3090 silicon is pushed to its limits), though I would assume they would let a theoretical 3080 super get within 10-20%.

Very pleased rn.

Taima fucked around with this message at 18:59 on Aug 28, 2020

repiv
Aug 13, 2009

It's hard to make out but it looks like there's only two 8pins on this board

https://twitter.com/sfdxpro/status/1299403085043957768

Cygni
Nov 12, 2005

raring to post

Some bigger pics, def 8-pins (2 or 3 depending on the model) and not the 12s.

I kinda feel like the only downside to the 12 is the adapter cable won't be as purdy as my cablemod extenders?

https://wccftech.com/zotac-geforce-rtx-3090-geforce-rtx-3080-geforce-rtx-3070-custom-graphics-cards-pictured/

Worf
Sep 12, 2017

If only Seth would love me like I love him!



that is not a better alternative to a new 12 pin cable imho

MikeC
Jul 19, 2004
BITCH ASS NARC

DrDork posted:

But to have pretty much not even mentioned that they're working on new cards for the last 6 months or so does not suggest much confidence in their product. Maybe that's the take, then: they know even Big Navi is gonna struggle against anything above the 3070 (and maybe even there), so why waste marketing dollars on trying to get hype and mindshare and attention when they're just gonna have to play the value card again?

What are you talking about? RDNA 2.0 has been on every roadmap and presentation. They publicly talk about the fact that it will be on an enhanced 7nm node. You'd have to be under a rock to not know RDNA 2 exists.

They are just keeping their marketing powder dry for reasons already stated. And as stated before, something is very wrong if they can only match a 3070 on an 80 CU chip when the Xbox gets close on a 56 CU SoC.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

MikeC posted:

What are you talking about? RDNA 2.0 has been on every roadmap and presentation. They publicly talk about the fact that it will be on an enhanced 7nm node. You'd have to be under a rock to not know RDNA 2 exists.

They are just keeping their marketing powder dry for reasons already stated. And as stated before, something is very wrong if they can only match a 3070 on an 80 CU chip when the Xbox gets close on a 56 CU SoC.

which AMD GPUs at what price points and specs is RDNA2 on the 7nm enhanced node going to be in

it would probably shock even the most staunch AMD fans if they were able to compete with a 3070 any time soon.

it would shock me if they could compete with Turing any time soon.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MikeC posted:

They are just keeping their marketing powder dry for reasons already stated. And as stated before, something is very wrong if they can only match a 3070 on an 80 CU chip when the Xbox gets close on a 56 CU SoC.

Thing is, none of that's actually been shown. We have TFLOPs out of the XBox, which is...uh...vaguely useful in a general sense, but as we've seen before, AMD cannot always translate raw FLOPs into meaningful gaming performance.

And, sure, RDNA2 "exists." And that's...about it. No indication of how big they're even making it--the 80 CU part is a rumor, that's it. The only thing they've actually put much behind is the claim that RDNA 2 should offer up to 50% better perf/watt than RDNA 1, which is cool, but doesn't say much about final performance.

AMD has said pretty much diddly about any actual PC cards using RDNA2, and instead has let Sony/Microsoft chatter about FLOPs and leave everything else as a wild unknown.

Statutory Ape posted:

it would probably shock even the most staunch AMD fans if they were able to compete with a 3070 any time soon.

Eh. If an 80-CU part does exist, and if we can assume a flat 50% increase in performance (lol), it could very well beat a 3070. Of course, an 80 CU part would also be hilariously expensive, so it might be a case of a $1000+ AMD card trying to trade punches with a $600-$800 NVidia card. :iiam:

DrDork fucked around with this message at 19:27 on Aug 28, 2020

ijyt
Apr 10, 2012

Statutory Ape posted:



that is not a better alternative to a new 12 pin cable imho

What do you mean dude, this is perfectly fine and tenable and will serve us well for another 20 years.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

ijyt posted:

What do you mean dude, this is perfectly fine and tenable and will serve us well for another 20 years.

in 20 years there will just be 3 more power cables and it's going to look like locutus of gpu

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

if it's 7nm after all and they still went balls-to-the-wall on die size and wattage...

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

if it's 7nm after all and they still went balls-to-the-wall on die size and wattage...

Then I will finally not have to justify to y'all hecklers why I want to watercool GPUs!

Though in all seriousness I do wonder how the GDDR6X heat bit is gonna work out. My current case has a big ol 8" fan pointed generally at the GPU area, so I'm not too worried as long as some mild airflow with stick-on heatsinks will be enough, but I'll be mighty annoyed if they need more than that.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Paul MaudDib posted:

if it's 7nm after all and they still went balls-to-the-wall on die size and wattage...

I really think it's the GDDR6X, though they are obviously going nuts on everything else. There's an enormous jump in TDP between the 3080 and 3070, 100W, with the 3080 using GDDR6X and the 3070 using GDDR6. The difference between a 2070 and a 2080 was 45W, both on GDDR6.

My guess is that 24GB of GDDR6X is probably eating close to 100W of power on its own.

axeil
Feb 14, 2006

Not on the list :negative:

Guess I'm stuck with AMD then.

To be fair, I run at 1080p so I don't think it really matters whether I go Nvidia or AMD. The only reason I'm looking into upgrading is because I ordered an HP Reverb G2 VR set and would like to run it at full resolution rather than half resolution.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

axeil posted:

Not on the list :negative:

Guess I'm stuck with AMD then.

To be fair, I run at 1080p so I don't think it really matters whether I go Nvidia or AMD. The only reason I'm looking into upgrading is because I ordered an HP Reverb G2 VR set and would like to run it at full resolution rather than half resolution.

You can use any DisplayPort FreeSync monitor with Nvidia cards now. There's a list of "G-Sync Compatible" monitors, which are monitors whose VRR implementation Nvidia has specifically tested and approved. But you can just click a box in the NVCP and tell it to enable G-Sync on your FreeSync monitor even if it hasn't been validated. My monitor (LG 34GK950F-B) is not on the approved list, but works perfectly with G-Sync on Nvidia.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

axeil posted:

Not on the list :negative:

Guess I'm stuck with AMD then.

To be fair, I run at 1080p so I don't think it really matters whether I go Nvidia or AMD. The only reason I'm looking into upgrading is because I ordered an HP Reverb G2 VR set and would like to run it at full resolution rather than half resolution.

just quit beating around the bush and tell us which monitor and which GPU you've got/are looking to get

Paul MaudDib fucked around with this message at 19:48 on Aug 28, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Beautiful Ninja posted:

I really think it's the GDDR6X, though they are obviously going nuts on everything else. There's an enormous jump in TDP between the 3080 and 3070, 100W, with the 3080 using GDDR6X and the 3070 using GDDR6. The difference between a 2070 and a 2080 was 45W, both on GDDR6.

My guess is that 24GB of GDDR6X is probably eating close to 100W of power on its own.

That'd be crazy if it turns out to be the case. It's hard to find actual wattage numbers for GDDR6 dies, but some stuff I found said that 8GB (8x 8Gb packages) of GDDR5X ran about 20W. GDDR6 is ~15% more power efficient than GDDR5, so if the same held for the X variants we'd expect 8GB of GDDR6X at ~17W. But if we assume 50W of the jump from the 3070 to the 3080 is due to the GDDR6X, that's an extra 40W per 8GB, which would put it at something like 57-60W per 8GB total? Roughly 3x the GDDR5X figure? Juicing them a little I can see, but that's crazy.
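
Spelled out as a tiny sketch, using the same rough inputs (the 20W GDDR5X figure, the ~15% efficiency gain, the 50W share of the 3070-to-3080 jump, and the 3080's rumored 10GB of GDDR6X are all guesses from above, not measured values):

code:
# Back-of-envelope GDDR6X power guess; every input is an assumption from the post above.
gddr5x_per_8gb = 20.0                        # W per 8GB of GDDR5X (rough figure)
gddr6x_eff_per_8gb = gddr5x_per_8gb * 0.85   # if the ~15% GDDR5->GDDR6 gain held for the X variants

tdp_jump = 100.0                             # W, rumored 3070 -> 3080 TDP difference
memory_share = 50.0                          # W of that jump attributed to GDDR6X (pure guess)
gddr6x_gb_on_3080 = 10.0                     # rumored GB of GDDR6X on the 3080

extra_per_8gb = memory_share / gddr6x_gb_on_3080 * 8     # ~40W extra per 8GB
implied_per_8gb = gddr6x_eff_per_8gb + extra_per_8gb     # ~57W per 8GB total

print(f"Implied GDDR6X draw: ~{implied_per_8gb:.0f}W per 8GB "
      f"(~{implied_per_8gb / gddr5x_per_8gb:.1f}x the GDDR5X figure)")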

e; while I think some power increase due to GDDR6X is likely, it makes me wonder even more if they're gonna do some software segmentation, too: like the 3080/90 out of the box come with all the previous power-target sliders slammed all the way to the right, so they're operating at max power immediately, while the 3070 on down come set with a more "normal" power target that may or may not be user-adjustable upward. I mean, if we consider that a maxed-out 2080 Ti under load would hit 330+W no problem, the reported 3080/3090 TDPs seem a lot more reasonable.

DrDork fucked around with this message at 19:52 on Aug 28, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

Then I will finally not have to justify to y'all hecklers why I want to watercool GPUs!

Though in all seriousness I do wonder how the GDDR6X heat bit is gonna work out. My current case has a big ol 8" fan pointed generally at the GPU area, so I'm not too worried as long as some mild airflow with stick-on heatsinks will be enough, but I'll be mighty annoyed if they need more than that.

I think that adds up, you'll note that it's only the 3090 that is rumored to come with the special bigboi double-sided cooler right? Because that's the only one that has memory modules on the back...

If true, the earlier comment that G10/G12/Morpheus-style coolers are probably not going to be compatible is correct: a little airflow isn't going to cut it, you'll want at least a small amount of real cooling there, and you may even need a second waterblock module on the back.

100W of GDDR6X would be pretty spicy.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

I think that adds up, you'll note that it's only the 3090 that is rumored to come with the special bigboi double-sided cooler right? Because that's the only one that has memory modules on the back...

Yeah, even if they just need to dissipate ~2.5W/package, I could see wanting active cooling on the backside for the YOLO crowd who will try to find a way to shove it in an SFF case.

Which, if I'm being perfectly honest, would probably be me if I had any reason to go top-of-the-line. I'll probably end up with a 3080 if the pricing rumors are anywhere near accurate, as I'd prefer smaller and quieter over the extra power these days.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

DrDork posted:

as I'd prefer smaller and quieter over the extra power these days.

as always i am extremely excited to see what kind of sick performance we get at the slot powered and 10/2060 level



considering the difference in power req, i was very pleased with how the 1650 compared to the 1060.

the super version benchmarks higher, but iirc needs more than the PCIe slot for power

VorpalFish
Mar 22, 2007
reasonably awesometm

Re: gddr6x, anandtech's article has some power information.

https://www.anandtech.com/show/15978/micron-spills-on-gddr6x-pam4-signaling-for-higher-rates-coming-to-nvidias-rtx-3090

Basically slightly better efficiency per bit than GDDR6, but once you account for the much higher data rate they expect ~25% higher consumption in absolute terms. Then factor in more than double the amount of memory the 2080 Ti had and...
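
For a rough sense of scale, here's that relative math as a tiny sketch (the ~25% per-device figure is the Anandtech/Micron estimate summarized above; the 11GB and 24GB capacities are the 2080 Ti and the rumored 3090; the 2080 Ti's actual memory wattage is left as an unknown, and this assumes device count scales with capacity):

code:
# Rough relative scaling of total memory power: 2080 Ti (GDDR6) -> rumored 3090 (GDDR6X).
# All inputs are assumptions from the discussion above, not measured values.
per_device_increase = 1.25   # ~25% higher absolute draw per device at the higher data rate
capacity_2080ti_gb = 11      # GDDR6 on a 2080 Ti
capacity_3090_gb = 24        # rumored GDDR6X on a 3090 (assumes device count scales with capacity)

scaling = per_device_increase * (capacity_3090_gb / capacity_2080ti_gb)
print(f"Total memory power could be ~{scaling:.1f}x whatever the 2080 Ti's memory drew")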

akadajet
Sep 14, 2003


this long

this thick

axeil
Feb 14, 2006

Beautiful Ninja posted:

You can use any DisplayPort FreeSync monitor with Nvidia cards now. There's a list of "G-Sync Compatible" monitors, which are monitors whose VRR implementation Nvidia has specifically tested and approved. But you can just click a box in the NVCP and tell it to enable G-Sync on your FreeSync monitor even if it hasn't been validated. My monitor (LG 34GK950F-B) is not on the approved list, but works perfectly with G-Sync on Nvidia.

Oh dang, that's really good to know. Okay, I'm now Very Interested in what Nvidia is going to say next week, especially if AMD isn't going to put anything out until Xmas. My monitor is a weird AOC variant (AOC 2460 G4).

Guessing I'd probably want the 3070 for VR stuff, right?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It still depends on WHAT Freesync monitor you have, so like Paul said you should tell us. There is a huge variety of monitors with varying levels of Freesync support, some of them are not even worth bothering with and others are really good.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

axeil posted:

Guessing I'd probably want the 3070 for VR stuff, right?

Depends on how balls to the wall you want to go, but yeah, I'd probably start there and then move up depending on your budget/need to run everything at MAXXX.

axeil
Feb 14, 2006

K8.0 posted:

It still depends on WHAT Freesync monitor you have, so like Paul said you should tell us. There is a huge variety of monitors with varying levels of Freesync support, some of them are not even worth bothering with and others are really good.

Yeah fair enough. It's an AOC G2460PF: https://www.amazon.com/AOC-G2460PF-1920x1080-Adjustable-DisplayPort/dp/B01BV1XBEI

A reddit thread seems to indicate it'll work, but I trust y'all's opinions more than theirs.

https://old.reddit.com/r/pcmasterrace/comments/ago5e3/enabling_freesync_on_aoc_g2460pf_for_nvidia_new_g/

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
that's listed as FreeSync Premium, which is AMD's version of G-Sync Compatible, so it should be fine. It'll run Adaptive Sync on any 10-series card or newer.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

VorpalFish posted:

Re: gddr6x, anandtech's article has some power information.

https://www.anandtech.com/show/15978/micron-spills-on-gddr6x-pam4-signaling-for-higher-rates-coming-to-nvidias-rtx-3090

Basically slightly better efficiency per bit than GDDR6, but once you account for the much higher data rate they expect ~25% higher consumption in absolute terms. Then factor in more than double the amount of memory the 2080 Ti had and...

quote:

PAM4 itself is not a new technology, but up until now it’s been the domain of ultra-high-end networking standards like 200G Ethernet, where the amount of space available for more physical channels is even more limited. As a result, the industry already has a few years of experience working with the signaling standard, and with their own bandwidth needs continuing to grow, the PCI-SIG has decided to bring it inside the chassis by basing the next generation of PCIe upon it.
...
Thus far, PAM4 signaling has only been used for networking and expansion buses, so using it for a memory bus, though a logical extension, would represent a big leap in technology. Now Micron has to develop memory that can not only do clean PAM4 modulation – which is not a simple task – but NVIDIA needs a matching memory controller on the other end. It’s doable, and probably inevitable, but it’s a big change from how memory buses have traditionally operated – even high-speed buses like those used for GDDR.

this is a really interesting bit in light of NVIDIA's acquisition of Mellanox. They may be getting synergies with their enterprise hardware, reusing parts of (or whole) PHYs designed around multi-level signalling.

Paul MaudDib fucked around with this message at 20:54 on Aug 28, 2020

FuturePastNow
May 19, 2014


Does the "gsync on a freesync" monitor thing still require use of DisplayPort? I don't think it works over the HDMI connection even if that will support freesync with a Radeon.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FuturePastNow posted:

Does the "gsync on a freesync" monitor thing still require use of DisplayPort? I don't think it works over the HDMI connection even if that will support freesync with a Radeon.

correct, unless it's HDMI VRR. FreeSync over HDMI is a proprietary Radeon extension and NVIDIA doesn't implement that. Long term that's going away and HDMI VRR is the new standard for that.
