Happy Noodle Boy
Jul 3, 2002


AirRaid posted:

Yeah but conversely -

:stare:

God drat it, off to Micro Center this weekend for a loving PSU.


TheCoach
Mar 11, 2014
So if you also run an overclocked Intel CPU, you are really getting close to drawing 1kW from the wall.
Vacuum cleaner levels of power, lmao.

Craptacular!
Jul 9, 2001

Fuck the DH
I guess this is because the 3070 is further out, but AIBs do not want to talk dimensions on it, which is disappointing. So far only ASUS has shown a separate tier for the 3070. MSI and EVGA are showing cards wearing the same coolers, but the 3080 cards are all really long. The exception is PNY, who want you to know that, yes, the 3070 cards are all going to be nearly a whole foot in length, just like the 3080 will be.

Whether or not I buy a new case to accommodate a more expensive GPU, I'm 99% certain to buy the Founders Edition. Even though I air cool my CPU, the FE simply isn't ugly. Everybody else's design is ugly as sin this time.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

AirRaid posted:

Yeah but conversely -

:stare:

Yeah, but that's the power level a 2080 Ti gets to, so I'm not too worried.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Since this has never happened to me before, what happens if your power supply can't draw enough power from the wall? Does your CPU/GPU automatically downclock? Does your system explode? Does Jensen Huang pop out of your I/O port and flip you the bird?

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Cutting it mighty close for those of us with 750W PSUs though. I reckon I have 450-500W of overhead for my GPU, so I might not let it do any hilarious OCs.

exquisite tea posted:

Since this has never happened to me before, what happens if your power supply can't draw enough power from the wall? Does your CPU/GPU automatically downclock? Does your system explode? Does Jensen Huang pop out of your I/O port and flip you the bird?

You mean if your actual wall supply falters/fails? The system will crash/shut down. My friend who lives in the sticks and sometimes has intermittent power keeps a UPS for this reason.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Scarecow posted:

Yeah, but that's the power level a 2080 Ti gets to, so I'm not too worried.

What normal 2080 Ti uses 480W of power? Most BIOSes limit it to between 300W and 330W max.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Scarecow posted:

Yeah, but that's the power level a 2080 Ti gets to, so I'm not too worried.

What the hell are you doing with your 2080 Ti to get it to suck 479 Watts? I've one under water and at 2025 MHz it only draws ~300 Watts.

GA102 is definitely going to draw a lot more than TU102.

shrike82
Jun 11, 2005

AirRaid posted:

Yeah but conversely -

:stare:

:rip:

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Haven't seen this article posted -

https://videocardz.com/newz/nvidia-provides-further-details-on-geforce-rtx-30-series

Nvidia answered a bunch of questions on Reddit, clarifying things such as why the CUDA core count suddenly doubled. Nothing mind-blowing, but some interesting bits.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

Riflen posted:

What the hell are you doing with your 2080 Ti to get it to suck 479 Watts? I've one under water and at 2025 MHz it only draws ~300 Watts.

GA102 is definitely going to draw a lot more than TU102.

Whoops, the review I was looking at was talking about total system power. I'm going to guess that GDDR6X is power hungry.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Scarecow posted:

Whoops, the review I was looking at was talking about total system power. I'm going to guess that GDDR6X is power hungry.

Was going to say, you'd have to be hardware modding and probably using LN2.

Yes, it looks like GDDR6X does like the juice.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
so that RTX IO thing that they're plugging - it allows game data (if the game is written to support it using the proper API) to be read directly by/into the GPU with less (or no?) CPU overhead, which is kinda similar to the thing that next-gen consoles were supposed to have been bragging about...

... but it's only RTX cards that support it/are capable of it? at least that we know of?
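In very rough terms, the pitch seems to boil down to something like this — purely a conceptual sketch in Python, none of these functions are Nvidia's actual API, and the GPU-side decompression is just stubbed out:

code:

import zlib

def load_asset_today(path):
    """Classic path: the CPU reads the file, the CPU decompresses it,
    then the driver copies the result into VRAM."""
    with open(path, "rb") as f:
        compressed = f.read()            # disk -> system RAM
    data = zlib.decompress(compressed)   # CPU burns cycles here
    return data                          # then gets uploaded to the GPU

def load_asset_rtx_io_style(path):
    """Hypothetical RTX IO-style path: the CPU only issues the request;
    the compressed blocks go (more or less) straight to the GPU,
    which decompresses them itself."""
    with open(path, "rb") as f:
        compressed = f.read()            # in reality: DMA, not a Python read
    return gpu_decompress(compressed)    # hypothetical on-GPU decompressor

def gpu_decompress(blob):
    # Stand-in for the GPU-side decompression RTX IO advertises.
    return zlib.decompress(blob)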

ijyt
Apr 10, 2012

Man the power consumption kinda sucks. Got my 1080 overclocked and undervolted even.

Shipon
Nov 7, 2005

AirRaid posted:

Haven't seen this article posted -

https://videocardz.com/newz/nvidia-provides-further-details-on-geforce-rtx-30-series

Nvidia answered a bunch of questions on Reddit, clarifying things such as why the CUDA core count suddenly doubled. Nothing mind-blowing, but some interesting bits.

quote:

New ultra performance mode for 8K gaming. Delivers 8K gaming on GeForce RTX 3090 with a new 9x scaling option.

So it's upscaling from sub-4K to get to "8K"?

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Well yeah, a lot of their big FPS numbers at high res include DLSS upscaling. But what you have to remember is DLSS isn't just straight upscaling with a loss in quality; it's essentially witchcraft and can produce a full-res image with more detail than the original had.

Josh Lyman
May 24, 2009


Zedsdeadbaby posted:

I went nvme just to do away with clunky storage

Less power consumption and better airflow too
You need heatsinks for NVMe drives, which I think a lot of people overlook.

Josh Lyman
May 24, 2009


What’s with this proliferation of 450W PSUs? My 3570K/970 system from 2012 has a 650W Antec and that’s only because I never planned to get a flagship GPU. I have 4 HDDs though.

cheesetriangles
Jan 5, 2011

480 watts of GPU power draw and I won't need heat in the winter.

MarcusSA
Sep 23, 2007

gradenko_2000 posted:

so that RTX IO thing that they're plugging - it allows game data (if the game is written to support it using the proper API) to be read directly by/into the GPU with less (or no?) CPU overhead, which is kinda similar to the thing that next-gen consoles were supposed to have been bragging about...

... but it's only RTX cards that support it/are capable of it? at least that we know of?

No one actually knows.

It’s a bit vague.

Ugly In The Morning
Jul 1, 2010
Pillbug

gradenko_2000 posted:

so that RTX IO thing that they're plugging - it allows game data (if the game is written to support it using the proper API) to be read directly by/into the GPU with less (or no?) CPU overhead, which is kinda similar to the thing that next-gen consoles were supposed to have been bragging about...

... but it's only RTX cards that support it/are capable of it? at least that we know of?

If it can actually pull off the doubling they're talking about, a 4th gen NVMe drive (approx 5GB/sec raw) would put it at 10GB a second-ish, which would be a smidge faster than the 9GB a second the PS5 claims. I figure it won't be 2:1, at least not right away, but with a good 4th gen NVMe drive and the RTX IO stuff I would bet you can get close. Even with current NVMe drives you should be able to at least get some crazy speeds with it, eventually. I'm gonna bet it's gonna be like DLSS, where there's an eh first version and then a crazy witchcraft later one.
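Napkin math for that, if anyone wants it — the 2:1 ratio is the claimed best case and the drive speeds are ballpark figures, not benchmarks:

code:

# Rough throughput math; every figure here is a ballpark, not a measurement.
def effective_gbps(raw_gbps, compression_ratio):
    return raw_gbps * compression_ratio

GEN4_NVME = 5.0   # GB/s, a fast PCIe 4.0 drive
GEN3_NVME = 3.5   # GB/s, a fast PCIe 3.0 drive
PS5_CLAIM = 9.0   # GB/s, Sony's headline compressed figure

print(effective_gbps(GEN4_NVME, 2.0))  # 10.0 -> a smidge past the PS5 claim
print(effective_gbps(GEN4_NVME, 1.5))  # 7.5  -> if the 2:1 doesn't pan out
print(effective_gbps(GEN3_NVME, 2.0))  # 7.0  -> current drives, still quick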

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

MarcusSA posted:

No one actually knows.

It’s a bit vague.

Disagree. It's not vague, they're being very clear.

https://www.nvidia.com/en-gb/geforce/news/rtx-io-gpu-accelerated-storage-technology/

Supported by Turing and Ampere GPUs.

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer
The 3080 can't need a 750W minimum PSU, right? I mean my old 5820K OC'ed at 4.5GHz and Vega 64 (don't laugh) drew like 400W at torture test loads.

I've got an EVGA Platinum 750W unit; surely that should be enough? Or if I end up buying one of those Haswell-E 8-core Xeons, is it gonna die? Obviously the answer is to wait for reviews, because there's no way I'll be able to get a card on launch anyway.

ufarn
May 30, 2009

track day bro! posted:

The 3080 can't need a 750W minimum PSU, right? I mean my old 5820K OC'ed at 4.5GHz and Vega 64 (don't laugh) drew like 400W at torture test loads.

I've got an EVGA Platinum 750W unit; surely that should be enough? Or if I end up buying one of those Haswell-E 8-core Xeons, is it gonna die? Obviously the answer is to wait for reviews, because there's no way I'll be able to get a card on launch anyway.
You'll be fine as long as you don't use that weird dodgy OC tool they mention in the latency video.

Cantide
Jun 13, 2001
Pillbug
Can anyone take a guess whether RTX IO will work with drive pools like https://stablebit.com/DrivePool? I'm currently using it to pool 4 SSDs into one virtual drive to install all my games on. I fear the answer may be no.

repiv
Aug 13, 2009

DirectStorage/RTXIO needs to copy bits verbatim from the physical drive to the GPU, so storage abstraction layers will probably break it. Those need the CPU to piece the data back together.

That means soft-RAID or other pooling won't work unless the implementation is DirectStorage-aware and can remap the commands to point to their physical location.

That's theoretically possible at least, but software full-disk encryption and filesystem compression will probably never be compatible.
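As a toy illustration of the pooling problem — the striping layout and block size here are invented, this is not how DrivePool actually arranges data — a single logical request has to be split and remapped into per-disk commands before any direct drive-to-GPU transfer could even be issued:

code:

# Toy model of why a pooled "virtual drive" gets in the way of direct reads.
# The striping scheme is invented purely for illustration.

def pool_map(logical_block, disks=4):
    """Simple striping: logical block -> (disk index, block on that disk)."""
    return logical_block % disks, logical_block // disks

def remap_request(first_block, count, disks=4):
    """What a DirectStorage-aware pool would have to do: split one logical
    request into per-disk requests the hardware can actually service."""
    per_disk = {}
    for lb in range(first_block, first_block + count):
        disk, physical_block = pool_map(lb, disks)
        per_disk.setdefault(disk, []).append(physical_block)
    return per_disk

print(remap_request(100, 8))
# {0: [25, 26], 1: [25, 26], 2: [25, 26], 3: [25, 26]}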

repiv fucked around with this message at 13:29 on Sep 3, 2020

Vir
Dec 14, 2007

Does it tickle when your Body Thetans flap their wings, eh Beatrice?
It would be kind of funny if they make it work in WINE/Proton. Then I can have one regular NVMe PCIe Gen 3 drive for the Linux OS, and one Gen 4 drive formatted for Windows with the textures on it. How soon can we plug the SSD into the GPU directly?

Cantide
Jun 13, 2001
Pillbug

repiv posted:

DirectStorage/RTXIO needs to copy bits verbatim from the physical drive to the GPU, so storage abstraction layers will probably break it. Those need the CPU to piece the data back together.

That means no software full disk encryption, no filesystem compression, no soft-RAID or other pooling I think

Yeah, that sadly makes a lot of sense. Well, we'll see if and when the first titles start using it and how much of a difference it makes. Maybe large SSDs (>=3TB) will have come down in price a bit by then. Who am I kidding :emo:

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Listen here nerds I don't need a fancy pants lecture, just tell me if it makes the video game number go up or down!!

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Vir posted:

How soon can we plug the SSD into the GPU directly?

aaaaaand we're circling right back to the GPU just being a computer all by itself :v:

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

exquisite tea posted:

Listen here nerds I don't need a fancy pants lecture, just tell me if it makes the synthetic benchmark number go up or down!!

Cancelbot
Nov 22, 2006

Canceling spam since 1928


I have a TX650W and an 8086K. It's 100W below "recommended", but the 10900K draws 100W more than an 8700 does and that's what they base the recommended PSU on, therefore nothing will go wrong :v:
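The napkin version of that logic, with rough illustrative CPU wattages rather than measurements:

code:

# Ballpark arithmetic; the CPU figures are illustrative, not measured.
RECOMMENDED_PSU = 750   # W, what Nvidia lists for the 3080
MY_PSU          = 650   # W, the TX650
WORST_CASE_CPU  = 250   # W, a 10900K-class chip flat out
MY_CPU          = 150   # W, an 8086K flat out (roughly 8700-level)

shortfall   = RECOMMENDED_PSU - MY_PSU   # 100W "missing" vs. the recommendation
cpu_savings = WORST_CASE_CPU - MY_CPU    # ~100W I'm not actually drawing
print(shortfall - cpu_savings)           # ~0 -> therefore nothing will go wrong :v: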

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


We're all PC gamers here, the idea of video games just meaning benchmark performance is already implied.

Vir
Dec 14, 2007

Does it tickle when your Body Thetans flap their wings, eh Beatrice?
Unless each pixel emits only one photon per frame, the FPS isn't high enough.

Cancelbot posted:

therefore nothing will go wrong :v:
If you're still within warranty on the PSU, that is. Wouldn't want the PSU to destroy your PC without a warranty to fall back on.

Cancelbot
Nov 22, 2006

Canceling spam since 1928

Vir posted:

Unless each pixel emits only one photon per frame, the FPS isn't high enough.

If you're still within warranty on the PSU, that is. Wouldn't want the PSU to destroy your PC without a warranty to fall back on.

Oh crap, I got it in 2013... yep that's new PSU time.

Rolo
Nov 16, 2005

Hmm, what have we here?
So wait, if I'm running a stock Ryzen 7 3700X and a 3080, will I maybe not be fine with 750W?

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack
Reposting this again

https://www.youtube.com/watch?v=X_wtoCBahhM

You'll be fine.

E: For your use case specifically, they actually ran a 3700X with a Radeon RX 5700 XT. That GPU is rated at 276W, so add ~50W to the results for that setup in the video to get an estimate of your usage with a 3080. Generally not even topping 400W total.
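Spelled out, that estimate goes roughly like this — the measured total is a ballpark stand-in, not a figure pulled from the video:

code:

# Ballpark arithmetic only; the measured total is a stand-in value.
RX_5700_XT = 276   # W, the rating quoted above
RTX_3080   = 320   # W, advertised total graphics power
delta      = RTX_3080 - RX_5700_XT        # ~44W, i.e. the "add ~50W"

measured_total_5700xt = 350               # W at the wall, rough gaming load
estimated_total_3080  = measured_total_5700xt + delta
print(estimated_total_3080)               # ~394W -> comfortably inside a 750W PSU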

But really, watch the video. It's really informative and features cats at several points.

AirRaid fucked around with this message at 14:12 on Sep 3, 2020

Cancelbot
Nov 22, 2006

Canceling spam since 1928

Oh yeah I'll wait for system tests and wattages to come out, but I figure a 7+ year old PSU needs changing as the warranty expired 2 years ago.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
https://www.youtube.com/watch?v=A7nYy7ZucxM

Another performance indicator.


Vir
Dec 14, 2007

Does it tickle when your Body Thetans flap their wings, eh Beatrice?
The PSU might still keep running for 20 years, but risking a whole new PC and GPU just isn't worth it.

Turn your old PC into a protein folding space heater with some cheap second hand GPUs; we have a goon team. That is, unless literal :catdrugs: makes the exercise unnecessary.
