repiv
Aug 13, 2009

Taima posted:

It's dawning on me that if I want to have an appropriate cable ready for when this GPU comes I'm probably going to have to roll the dice on some bullshit and hope for the best. sigh. At least I can return it if it doesn't work I guess.

i would be very surprised if this random company that primarily makes phone cases has managed to make an actually working hdmi 2.1 optical cable before any reputable brand

Mercrom
Jul 17, 2009

repiv posted:

https://twitter.com/Dachsjaeger/status/1301234503386570757

Performance is quite a bit better now, and they added some more sample RTX worlds to gawk at

Is the view distance still really limited compared to normal Minecraft? What's the bottleneck there anyway?

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Zero VGS posted:

I mean that all sounds wishy-washy and I don't care what's going on in Cinebench or whatever. I just wanna see like, how many watts is each system pulling from the wall while running newest Call of Duty benchmark in 4K, and what are the average FPS or whatever. So I can get close to real life gaming. How many watts and what am I really getting out of each.

What those graphs show is that, all else being equal, Intel guzzles way more power than AMD, and as you overclock the gap widens.

It's not the exact use case of a game, it's actually a WAY tougher one for the CPU. But it proves out the basic principle that Intel burns more power to do the same work.

People don't really test the exact case you're describing because extreme gamer websites don't care about differences in CPU power consumption when the GPU is the thing doing all the real work and guzzling most of the power anyway.

Like, if you're concerned with 4K game performance, you're stuck using the GPU that guzzles 320-350 watts. In that scenario the intel won't guzzle 235 watts, but the AMD won't need to use 160 or whatever either - both will be less by some amount because it's less work for the CPU either way.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

sean10mm posted:

Like, if you're concerned with 4K game performance, you're stuck using the GPU that guzzles 320-350 watts. In that scenario the intel won't guzzle 235 watts, but the AMD won't need to use 160 or whatever either - both will be less by some amount because it's less work for the CPU either way.

Yeah, the end information you're looking for here is:

A Ryzen 3600 system will eat about 150-175W gaming, plus whatever the hell the GPU is doing. If you're looking at a 3080 and its 320W sticker, that means you'll need at least a 550W PSU to run everything. Might be able to get away with a 500W one if you're ok with undervolting to bring power use down a little. If you've got some 450W SFF PSU, then you should be looking at the 3070 and be ecstatic that it'll get you 2080Ti-level performance for $600.
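
If you want to sanity-check that, here's a rough back-of-the-envelope sketch in Python using the figures above (the ~10% headroom factor and the ~290W undervolted figure are just assumptions, not hard numbers):

code:

# Rough PSU sizing: CPU/platform draw plus GPU board power, plus some headroom.
def recommended_psu_watts(system_w, gpu_w, headroom=1.1):
    """Estimate a sensible PSU rating; headroom covers spikes and capacitor aging."""
    return (system_w + gpu_w) * headroom

# Ryzen 3600 system (~150-175W while gaming) + 320W RTX 3080
print(recommended_psu_watts(175, 320))  # ~545W -> a 550W unit is the floor
# Low-end system draw with an undervolted 3080 (~290W assumed)
print(recommended_psu_watts(150, 290))  # ~484W -> a 500W unit is borderline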

repiv
Aug 13, 2009

Mercrom posted:

Is the view distance still really limited compared to normal Minecraft? What's the bottleneck there anyway?

It's still limited but raising the draw distance doesn't kill performance nearly as much as it used to. The max draw distance they allow with RT enabled is 24 chunks.

I'm running around one of the sample worlds and getting 70-80fps at 1440p with 24 chunk draw distance and DLSS enabled, on a 2070 super

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Taima posted:

It's dawning on me that if I want to have an appropriate cable ready for when this GPU comes I'm probably going to have to roll the dice on some bullshit and hope for the best. sigh. At least I can return it if it doesn't work I guess.

If that's how you wanna do it, you should stick with Amazon (who I believe are cutting some slack on return windows due to Covid, but you can ask in their chat to make sure) or Monoprice, who do honor their warranties with regards to performing to spec (but maybe wait a few weeks to see if they actually push out a 2.1).

sean10mm posted:

What those graphs show is that, all else being equal, Intel guzzles way more power than AMD, and as you overclock the gap widens.

It's not the exact use case of a game, it's actually a WAY tougher one for the CPU. But it proves out the basic principle that Intel burns more power to do the same work.

People don't really test the exact case you're describing because extreme gamer websites don't care about differences in CPU power consumption when the GPU is the thing doing all the real work and guzzling most of the power anyway.

Like, if you're concerned with 4K game performance, you're stuck using the GPU that guzzles 320-350 watts. In that scenario the intel won't guzzle 235 watts, but the AMD won't need to use 160 or whatever either - both will be less by some amount because it's less work for the CPU either way.

I hear ya, it's just that my total budget is 480 watts and if the CPU spikes a few seconds longer or x watts higher than it should, my PC will shut off from an overcurrent condition.

Comedy option is that I could go full :homebrew: and run a second power supply in parallel so that I have 12v at 80 amps, assuming my silver-coated copper wires don't melt, lol

This is my 12v 40a fanless PSU, by the way: https://www.digikey.com/product-detail/en/mean-well-usa-inc/HEP-600-12/1866-2285-ND/7703841

DrDork posted:

If you're looking at a 3080 and its 320W sticker, that means you'll need at least a 550W PSU to run everything. Might be able to get away with a 500W one if you're ok with undervolting to bring power use down a little. If you've got some 450W SFF PSU, then you should be looking at the 3070 and be ecstatic that it'll get you 2080Ti-level performance for $600.

Yeah so looking at the whitepaper for my PSU, it is rated for 480w constant, and claims to have a "105%-125% overcurrent limit". I take that range to mean it is dependent on its temperature, like as long as it stays cool it might be able to get close to 600w.

The other thing in my favor is that I believe normal PSUs are rated for how much wattage they can pull from the wall, i.e. a 600w ATX power supply at 90% efficiency can actually only supply 540w to the system, whereas my PSU, which is more of a bench power supply, is made to maintain the watts it is rated for.
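
For what it's worth, here's a quick sketch of that arithmetic (the 105-125% range comes straight from the overcurrent spec quoted above, everything else is just the 12V x 40A rating):

code:

# Mean Well HEP-600-12: a 12V / 40A fanless supply
RAIL_VOLTS = 12.0
RAIL_AMPS = 40.0
rated_watts = RAIL_VOLTS * RAIL_AMPS    # 480W continuous

# "105%-125% overcurrent limit" from the datasheet
ocp_floor = rated_watts * 1.05          # ~504W: shutdown could come this early
ocp_ceiling = rated_watts * 1.25        # ~600W: best case before it trips

# Comedy option: two of them in parallel
parallel_watts = 2 * rated_watts        # 960W at 12V / 80A

print(rated_watts, ocp_floor, ocp_ceiling, parallel_watts)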

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Gamers Nexus had exactly this kind of analysis just before the Ampere release actually. https://www.youtube.com/watch?v=X_wtoCBahhM&t=576s tl;dw: doesn't seem like there's much to it once the GPU gets involved.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zero VGS posted:

Yeah so looking at the whitepaper for my PSU, it is rated for 480w constant, and claims to have a "105%-125% overcurrent limit". I take that range to mean it is dependent on its temperature, like as long as it stays cool it might be able to get close to 600w.

The other thing in my favor is that I believe normal PSUs are rated for how much wattage they can pull from the wall, i.e. a 600w ATX power supply at 90% efficiency can actually only supply 540w to the system, whereas my PSU, which is more of a bench power supply, is made to maintain the watts it is rated for.

That overcurrent limit is supposed to be for transient loads. Like so that if your CPU spikes for 1/10th of a second to 490W and then goes back down below 480W it doesn't shut off. You shouldn't be running >480W for any extended time.

You're welcome to try and find out, but also keep in mind that in the past NVidia's TDP numbers have been, uh, something of a white lie. That "280W" 2080Ti would regularly run at 330W, for example. So you could probably keep a 3080 at 320W, but you'd almost certainly be leaving performance on the table by doing so.

e; You're also incorrect about the power labeling. PC PSUs are labeled according to what they can supply to the computer, not by how much they draw from the wall. If you take a look at the label on a PC PSU, you'll see it has the various voltage rails and max load amps, and a combined wattage output section. That wattage output is what gets slapped on the side as a "550W" or whatever PSU. Input power is impossible for them to calculate since it depends on load and whether you're using 115 or 230v mains. A Bronze 80% efficient 500W PSU running full tilt would be pulling ~625W from the wall. So I don't think you're getting any help there.

DrDork fucked around with this message at 23:39 on Sep 2, 2020
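
In case the direction of that math isn't obvious, a minimal sketch (the 83% figure is an approximation for a Bronze unit at full load, not a spec):

code:

# PC PSU ratings are DC output; wall draw is output divided by efficiency.
def wall_draw_watts(dc_output_w, efficiency):
    return dc_output_w / efficiency

print(wall_draw_watts(500, 0.83))  # ~602W from the wall to deliver 500W of DC
print(wall_draw_watts(600, 0.90))  # ~667W for a 600W unit at 90% efficiency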

Llamadeus
Dec 20, 2005

Zero VGS posted:

The other thing in my favor is that I believe normal PSUs are rated for how much wattage they can pull from the wall, i.e. a 600w ATX power supply at 90% efficiency can actually only supply 540w to the system

Pretty sure it works the opposite way, otherwise people would have to be looking up efficiency curves to figure out what PSU they actually needed.

Cinara
Jul 15, 2007

Llamadeus posted:

Pretty sure it works the opposite way, otherwise people would have to be looking up efficiency curves to figure out what PSU they actually needed.

Correct. The efficiency rating is about how much power the PSU pulls from the wall to deliver its rated output at different load levels. So a 600w PSU at 90% efficiency would actually be pulling around 666w from the wall at max load.

E: Yea what DrDork said.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Llamadeus posted:

Pretty sure it works the opposite way, otherwise people would have to be looking up efficiency curves to figure out what PSU they actually needed.

Ah okay guess I was misremembering. At least I can run a parallel PSU if this doesn't cut it. Because the PSU puts out a very precise 12.0 volts I could even wire one PSU directly to the GPU pins with an on/off switch and the other to the PC.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Cinara posted:

666w at max load.

THE WATTAGE OF THE BEAST

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

Is there any appreciable difference between gaming in 1440p and 4K to justify jumping to 4K monitors? I thought 1440p was very impressive compared to 1080p and have been quite happy with it, but everyone seems to be talking up 4K now.

krysmopompas
Jan 17, 2004
hi

repiv posted:

who knows, it's a random unaccountable chinese brand that could be buying cables with hdmi 2.0 optical transceivers and selling them as hdmi 2.1 capable

by the time anyone actually gets a hdmi 2.1 source device and realises they got scammed it's too late

I’ve got several 10 meter runs of the Cosemi optical cables, both DisplayPort and HDMI (2.0), and I've never had a problem. No issues using random power injectors for unpowered HDMI sources either.

I’d try their 2.1 cables based on my experience so far.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Kraftwerk posted:

Is there any appreciable difference between gaming in 1440p and 4K to justify jumping to 4K monitors? I thought 1440p was very impressive compared to 1080p and have been quite happy with it, but everyone seems to be talking up 4K now.

It's a giant difference if you will be gaming on a large format HDMI 2.1 display. Diminishing returns on small displays.

ufarn
May 30, 2009

Kraftwerk posted:

Is there any appreciable difference between gaming in 1440p and 4K to justify jumping to 4K monitors? I thought 1440p was very impressive compared to 1080p and have been quite happy with it, but everyone seems to be talking up 4K now.

If more games had a native render scale feature in their graphics settings, you'd probably be fine at 1440p.

DLSS 2.0 makes 4K very feasible in the like five games that support it, but 1440p is still the best high-end pick for both quality and performance.

4K people might also have Ultrawide and other weird setups beyond 27".

Gaming on a TV is pretty cool, but it's kind of a mess to set up, and Steam Link has been pretty wonky when I tried it.

repiv
Aug 13, 2009

4K high-hz monitors are still meh because DisplayPort 1.4 doesn't have enough bandwidth to run them without compromising image quality

I'd wait for models with HDMI 2.1 to arrive before investing in one
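
Rough numbers on the bandwidth, if anyone's curious (a sketch; the ~8% blanking overhead is an assumption based on reduced-blanking timings, exact figures vary by monitor):

code:

# DisplayPort 1.4: 4 lanes x 8.1 Gbps (HBR3), minus 8b/10b encoding overhead
DP14_EFFECTIVE_GBPS = 4 * 8.1 * 0.8     # 25.92 Gbps usable

def required_gbps(width, height, hz, bits_per_channel, blanking=1.08):
    """Uncompressed RGB bandwidth; 'blanking' approximates reduced-blanking timings."""
    return width * height * hz * blanking * bits_per_channel * 3 / 1e9

print(required_gbps(3840, 2160, 120, 8))   # ~25.8 Gbps - barely squeaks under DP 1.4
print(required_gbps(3840, 2160, 120, 10))  # ~32.2 Gbps - needs DSC or chroma subsampling
print(required_gbps(3840, 2160, 144, 10))  # ~38.7 Gbps - well beyond DP 1.4, fine for HDMI 2.1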

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zero VGS posted:

Ah okay guess I was misremembering. At least I can run a parallel PSU if this doesn't cut it. Because the PSU puts out a very precise 12.0 volts I could even wire one PSU directly to the GPU pins with an on/off switch and the other to the PC.

If you do this please post a picture. Mostly because I've never seen the style of PSU you linked before, and I would be intrigued to see what your actual setup is.

I say this as someone who has so far missed the SFF train because I could never jam everything I wanted into the cases and power envelopes available. I'd love me a Corsair One style deal, but I need 10Gb networking, and their boards don't support TB3 or an additional PCIe card, so...

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

repiv posted:

4K high-hz monitors are still meh because DisplayPort 1.4 doesn't have enough bandwidth to run them without compromising image quality

I'd wait for models with HDMI 2.1 to arrive before investing in one

What cutoff do you feel is appropriate here? I'm only seeking 4k/120 because my display is 65 inches, where it makes absolute 100% sense (I have another display for work and desktop-specific games).

I have to imagine that for a regular, say, 24-30 inch monitor, the diminishing returns would be horrifying. But I must confess that I have not gamed on a 4K monitor of that size, so that's just speculation.

repiv
Aug 13, 2009

Taima posted:

What cutoff do you feel is appropriate here? I'm only seeking 4k/120 because my display is 65 inches, where it makes absolute 100% sense (I have another display for work and desktop-specific games).

I have to imagine that for a regular, say, 24-30 inch monitor, the diminishing returns would be horrifying. But I must confess that I have not gamed on a 4K monitor of that size, so that's just speculation.

The janky bandwidth limitations are automatically disqualifying for me so I haven't given it much thought. I'll decide whether 4K is worth it once there's 4K monitors without the jank :shrug:

shrike82
Jun 11, 2005

It looks like the 3090 will be impossible to buy at launch - the reliable twitter leakers have pretty much said it's a paper launch for it.
Given the dumb 8K gaming tagline and how little emphasis they've placed on it, it makes you wonder why Nvidia allowed AIBs to sell 3090s. You have to recall that the Titan SKU has previously been an in-house thing.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
I'm very curious about reviews of the 3080, mostly for power consumption. I watched the GN video, but I'm still leery about it. My system can probably handle it: it's an 8086K/2070S running under a 2 year old Seasonic Focus Plus Gold 650W. My old Steam machine ran a Haswell i5 and a 780 with a 450W PSU, so that helps ease my fears.

shrike82
Jun 11, 2005

Be prepared not to get your card till next year

quote:

https://www.tweaktown.com/news/7491..._medium=twitter
an industry source who told me that post-launch there will be "no stock will be available till the end of the year". The first wave of cards is said to be small, very, very small -- possibly the smallest launch in many years.

Another source had something much more damning to say, but I want to flesh that out before I write it. For now, I'm being told stock will be extremely low for the next couple of months. Why? Samsung 8nm yields are unknown at this point, NVIDIA might not want to make too many before the yields improve.

lol, going to be lots of pissed-off people trying to get it ahead of CP2077

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:
I mean, if you're someone that's been ignoring the ongoing global pandemic that caused Chinese and Taiwanese manufacturers to completely stop for a period of time, you deserve what you get.

cheesetriangles
Jan 5, 2011

Didn't want that card anyway.

shrike82
Jun 11, 2005

i think the issue is more for people on Pascal or older cards who've been holding off a GPU purchase because of Ampere - buying a Turing card would have been an even worse choice given pricing

this might actually drive console sales, lol. doing a paper launch right before CP2077

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

shrike82 posted:

Be prepared not to get your card till next year


lol, going to be lots of pissed-off people trying to get it ahead of CP2077

I loving knew it. I was hoping for the best but this pretty much seals my fate.
I'll make a last ditch attempt to grab one on launch day and if I get out-sniped by the bots and other hardcorers then I'll just have to tough it out on the 1070.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
What's the point of having a big hoopla launch event if you won't have poo poo to sell?

It's one thing to have problems with 3090 yields limiting it to a trickle at first, but if they can't even manufacture a 3070 it's really a joke.

e: I mean it's just one dumb rumor so there's no point in getting :emo: over it.

sean10mm fucked around with this message at 00:50 on Sep 3, 2020

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

sean10mm posted:

What's the point of having a big hoopla launch event if you won't have poo poo to sell?

It's one thing to have problems with 3090 yields limiting it to a trickle at first, but if they can't even manufacture a 3070 it's really a joke.

If AMD is going to release cards that at least compete at the 20xx level, and they’re working with suppliers who can actually supply, then this means Nvidia can get out in front of them with a product that people will wait for.

Cygni
Nov 12, 2005

raring to post

Space Gopher posted:

If AMD is going to release cards that at least compete at the 20xx level, and they’re working with suppliers who can actually supply,

uhhhh let me just stop you there

shrike82
Jun 11, 2005

if it's driven by concerns about yields on a "new" node, better to launch sooner rather than later and shake the bugs out

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

sean10mm posted:

What's the point of having a big hoopla launch event if you won't have poo poo to sell?

It's one thing to have problems with 3090 yields limiting it to a trickle at first, but if they can't even manufacture a 3070 it's really a joke.

e: I mean it's just one dumb rumor so there's no point in getting :emo: over it.

Yeah, time to take a deep breath and realize that, in the absence of information from Nvidia, people are trying to cash in on all the hype (and our worst-case-scenario assumptions) to drive as many clicks as possible and profit from the buzz.

shrike82
Jun 11, 2005

https://twitter.com/JayzTwoCents/status/1301271705503936515?s=20

lol

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

MSI said something similar in their stream earlier today.

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking


Are there services that loan you a 2080 until you can get the 3080?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
So in other words, if you ARE lucky enough to get two, congrats on your "free" Ampere card, as I'm sure the Saudi Princes and billionaire kids of the world will happily pay +200% to have the new hotness.

MarcusSA
Sep 23, 2007


I mean my 1080Ti is perfectly fine but still I wanted the new card lol

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

I don't see what difference this makes literally at all. If there's only X units then there's only X units, it doesn't matter whether NVIDIA makes you pay for them on launch day or 2 weeks before. If they did preorders then people would be f5'ing the page to slam in a preorder at the first second too before they ran through their shipment allocation and sold out.

shrike82
Jun 11, 2005

i think it's more an indication of how limited launch quantities are going to be if they don't think allowing preorders makes sense

MarcusSA
Sep 23, 2007

shrike82 posted:

i think it's more an indication of how limited launch quantities are going to be if they don't think allowing preorders makes sense

I guess I don't understand why they wouldn't just do backorders, kinda like how Apple does with their launches.
