DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

tehinternet posted:

In laptops wouldn’t a 1070 be better than a 2060 since it sips power by comparison? Or is Turing not that much more power hungry?

Turing is still more power efficient in absolute performance-per-watt terms. Its power-hungry reputation comes from NVIDIA shoving the sliders to 11, and as a consequence many same-tier parts (e.g. 1080 vs 2080) saw a TDP increase.

If you look at the 1080Ti vs the 2080, the raw performance is pretty comparable, but the 1080Ti draws ~280W vs the 2080's ~225W, a 20% power savings. You also have to remember the difference between "desktop" and "Max-Q" laptop parts: the desktop 1070 is ~150W, but the 1070 Max-Q is ~115W.

Raw clocks suggest the 2060 may be closer to the 1070Ti. If that's true, and both get similar power savings from the down-clocked and binned Max-Q versions (which seems fair), a 2060 Max-Q should probably come in around 120W, or almost the same as the 1070 Max-Q, while offering 15-20% better performance, which is right in line with the 2080 vs 1080Ti comparison.
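
If you want to sanity-check that math, here's the back-of-the-envelope version (the TDPs are the rough numbers from above; the desktop 2060 TDP is my assumption, not a published figure):

code:

# Back-of-the-envelope math using the rough TDPs quoted above. The desktop
# 2060 TDP is an assumption, as is the idea that the 2060 takes the same
# Max-Q down-bin that the 1070 did.
tdp_1080ti, tdp_2080 = 280, 225            # ~comparable raw performance
print(f"2080 savings vs 1080Ti: {(1 - tdp_2080 / tdp_1080ti) * 100:.0f}%")  # ~20%

tdp_1070, tdp_1070_maxq = 150, 115
maxq_factor = tdp_1070_maxq / tdp_1070     # ~0.77 down-bin factor

assumed_tdp_2060 = 160                     # assumption: desktop 2060 TDP
print(f"2060 Max-Q estimate: ~{assumed_tdp_2060 * maxq_factor:.0f}W")  # ~123W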


B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Laptop 1070s are a little odd too, as they actually have more CUDA cores than the desktop variant: 2048 vs 1920.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

B-Mac posted:

Laptop 1070s are a little odd too, as they actually have more CUDA cores than the desktop variant: 2048 vs 1920.

That’s super weird.

DrDork posted:

Turing is still more power efficient in absolute performance-per-watt terms. Its power-hungry reputation comes from NVIDIA shoving the sliders to 11, and as a consequence many same-tier parts (e.g. 1080 vs 2080) saw a TDP increase.

If you look at the 1080Ti vs the 2080, the raw performance is pretty comparable, but the 1080Ti draws ~280W vs the 2080's ~225W, a 20% power savings. You also have to remember the difference between "desktop" and "Max-Q" laptop parts: the desktop 1070 is ~150W, but the 1070 Max-Q is ~115W.

Raw clocks suggest the 2060 may be closer to the 1070Ti. If that's true, and both get similar power savings from the down-clocked and binned Max-Q versions (which seems fair), a 2060 Max-Q should probably come in around 120W, or almost the same as the 1070 Max-Q, while offering 15-20% better performance, which is right in line with the 2080 vs 1080Ti comparison.

Super good post, thanks!

EdEddnEddy
Apr 5, 2012



Yep, and while the non-Max-Q 1070 runs a bit hotter, it is downright fast for a laptop chip and performs well enough that the jump from it to a Max-Q 1080 is pretty much a wash.

After watching laptop GPUs go from the GeForce 2 Go (I still have a laptop with one in it) to the 980M, the 10XX series was a beautiful breath of fresh air for portable gaming and portable VR.

I can play games like Destiny 2 and Sea of Thieves in 4K(ish) at 60FPS on a laptop. It's still baffling to me some days.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

EdEddnEddy posted:

I can play games like Destiny 2 and Sea of Thieves in 4K(ish) at 60FPS on a laptop. It's still baffling to me some days.

Yeah, that I can do Overwatch at 1080p@60FPS locked on a ~4lbs laptop I bought second-hand for $600 is thrilling to me. I remember having a T430 with the NVS 5400M and it...struggled. A lot. To do much of anything graphically intensive.

You kids these days don't know how good you have it! :bahgawd:

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
I got a Prostar laptop with a 1060 6GB in it 18 months ago for under $1000 and it's still a really nice 1080p machine, portable or no. The Pascal architecture was really, really good.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE
https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html

quote:

It is a bit of a remarkable story really, but users have enabled RTX mode on an Nvidia Titan V, and it works quite well, performing as fast as the RTX 2080 Ti. The Titan V, however, is Volta, and Volta does not have any RT cores.

Of course, Volta is a very powerful card with loads of headroom and tensor cores. DXR mode, in theory, can work on any compatible DirectX 12 graphics card as long as it is fast enough, as DirectX Raytracing is merely an extension to the DX12 API. If the hardware is not supported, it can run in a software mode. However, the findings raise the question: are dedicated RT cores in hardware actually needed?

In the example, the Titan V with its Volta GV100 GPU would, according to users in this thread over at 3dcenter, produce framerates almost equal to those of the GeForce RTX 2080 Ti. One user reports 69 fps at Ultra HD resolution with settings at Ultra on both cards. Both graphics cards were overclocked and water-cooled, with equal framerates and image quality. Yet another user reports playing at 60-100 fps at 1440p ultra, with an 80 fps average. In our own review, the RTX 2080 Ti with DXR raytracing averaged 84.3 fps, which is pretty drat close to our results as tested in our Battlefield V article.

These new findings do make us wonder what actual effect the RT cores have on performance. It has to be stated that these are merely user reports and we cannot verify the test methodologies used here (testing in an RT-enhanced environment would be one concern that comes to mind).

Regardless, these are interesting findings, but for now please do take them with caution and a few grains of salt. Below are the two settings screenshots the users used, as well as a screengrab with RTX enabled on the Volta card.

:thunk:

what the gently caress nvidia, if this is true (and tbh, it's pretty likely that it isn't) what on earth are those rt cores actually doing?

TheFluff fucked around with this message at 05:19 on Jan 3, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

selling hope

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
When Guru3D, the website that's never given a bad review on anything, ever, says something controversial about nVidia, it's cause to raise one's eyebrows.

lDDQD
Apr 16, 2006

Volta has tensor cores, which are more or less the same thing as "RT" cores under a different name. On the other hand, Mr. Huang could not shut the gently caress up about how just one 2080ti has the same number of gigarays as a $40,000 workstation with four (4) Volta-based Teslas...
e: article seems to be taken down, so maybe they figured out how idiotic it is before I got the chance to read it
e2: never mind, it's just the url parser that doesn't like brackets

lDDQD fucked around with this message at 05:24 on Jan 3, 2019

repiv
Aug 13, 2009

Developers' own comparisons between Volta and Turing running the same DXR code showed a clear win for Turing, so something doesn't add up.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

TheFluff posted:

https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html


:thunk:

what the gently caress nvidia, if this is true (and tbh, it's pretty likely that it isn't) what on earth are those rt cores actually doing?

Isn't it only one scene in BF V where the difference was 8FPS? The rest had a 20-30 FPS difference.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
Also remember these are FPS reported in BF V, rather than raw performance values, so there's the alternate explanation that the BF V RTX mode is a jumbled mess and not properly utilizing Turing--maybe by forcing everything through some amount of software rendering regardless of the tensor cores.

repiv
Aug 13, 2009

I think GamersNexus still have their Titan V, someone go @ steve to ask for some proper benchmarks :cheeky:

Seamonster
Apr 30, 2007

IMMER SIEGREICH

DrDork posted:

Also remember these are FPS reported in BF V, rather than raw performance values, so there's the alternate explanation that the BF V RTX mode is a jumbled mess and not properly utilizing Turing--maybe by forcing everything through some amount of software rendering regardless of the tensor cores.

Somebody redo the :dice: but make it shiny.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Seamonster posted:

Somebody redo the :dice: but make it shiny.

also reduce the framerate

betterinsodapop
Apr 4, 2004

64:3
So, does Agreed's old boost 2.0 overclocking for dummies guide still basically apply for Pascal?

iirc:
open Precision/Afterburner
set voltage to max
set power target as high as it will go
set temp target as high as it will go
prioritize temp target
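
For reference, that power-target slider drives a driver-level limit you can also read programmatically. A minimal sketch using the pynvml bindings (assuming the nvidia-ml-py package is installed; raising the limit needs admin rights, so this only reads it):

code:

# Minimal pynvml sketch of the "power target" knob those tools drive.
# Assumes the nvidia-ml-py (pynvml) bindings are installed; raising the
# limit needs admin rights, so this sketch only reads it.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    print(f"power limit: {current_mw / 1000:.0f}W "
          f"(range {min_mw / 1000:.0f}-{max_mw / 1000:.0f}W)")
    # "set power target as high as it will go" would be (admin only):
    # pynvml.nvmlDeviceSetPowerManagementLimit(handle, max_mw)
finally:
    pynvml.nvmlShutdown()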

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
GPU Boost is literally just about finding a reasonable temp range and cooling to it as aggressively as possible.
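
In toy-model form (illustrative only, not NVIDIA's real algorithm -- the 13 MHz bin is Pascal's actual clock step and the base/boost clocks are GTX 1080 FE numbers, but the temperature behavior is made up):

code:

# Toy model of what GPU Boost is doing. The 13 MHz bin is Pascal's actual
# clock step and 1607/1733 are GTX 1080 FE base/boost clocks, but the
# temperature thresholds here are invented for illustration.
BASE_MHZ, BOOST_MHZ, BIN_MHZ = 1607, 1733, 13
TEMP_START_C, DEG_PER_BIN = 50, 4   # assumed: lose one bin per 4 C over 50 C

def sustained_clock(core_temp_c: float) -> int:
    """Estimated sustained clock: the cooler the card, the fewer bins lost."""
    if core_temp_c <= TEMP_START_C:
        return BOOST_MHZ
    bins_lost = int((core_temp_c - TEMP_START_C) / DEG_PER_BIN)
    return max(BASE_MHZ, BOOST_MHZ - bins_lost * BIN_MHZ)

for temp in (45, 60, 75, 84):
    print(f"{temp} C -> ~{sustained_clock(temp)} MHz")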

VelociBacon
Dec 8, 2009

betterinsodapop posted:

So, does Agreed's old boost 2.0 overclocking for dummies guide still basically apply for Pascal?

iirc:
open Precision/Afterburner
set voltage to max
set power target as high as it will go
set temp target as high as it will go
prioritize temp target

After that you set clock offsets and see the most you can stably get.

craig588
Nov 19, 2005

by Nyc_Tattoo
Depending on what you're doing, raising the voltage might not be worth it. I can get 26 more MHz on my current card by adding 0.05V. I added 0.05V to a 680 (raising the voltage to 770 stock levels) and got 300MHz. My 980 was even more disappointing: with the voltage increase maxed out, I gained one 13MHz step.

betterinsodapop
Apr 4, 2004

64:3

VelociBacon posted:

After that you set clock offsets and see the most you can stably get.
Alright, thanks! Trial and error time.

VelociBacon
Dec 8, 2009

betterinsodapop posted:

Alright, thanks! Trial and error time.

Once you have a clock offset that is stable you'll want to do the same for your memory offset. I'd give it a week or so of playing games etc at a certain clock offset without touching memory offset so that if you get a crash you know what caused it and you aren't stuck with two variables to mess with.

So here I have a +150 offset on my gpu clock and a +700 offset on my memory. The memory clock offset makes very little difference relative to the GPU clock.
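
In sketch form, the search for each offset is just this loop (is_stable is a stand-in for whatever stress test you trust, not a real API; the 13 MHz step matches Pascal's clock granularity):

code:

# Sketch of the one-variable-at-a-time search. is_stable() is a stand-in
# for whatever stress test you trust (a week of gaming, a 3DMark loop);
# it is not a real API. 13 MHz matches Pascal's clock granularity.
STEP_MHZ = 13

def find_max_offset(is_stable, limit=300, step=STEP_MHZ):
    """Raise the offset one step at a time; back off on the first failure."""
    offset = 0
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# Lock in the core offset first, live with it a while, then repeat the
# same search for memory so any crash points at exactly one variable:
#   core_offset = find_max_offset(test_core_stability)
#   mem_offset  = find_max_offset(test_memory_stability, limit=1000, step=50)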

1gnoirents
Jun 28, 2014

hello :)
Uh oh, I think my 2080ti is done for. I'm down to reinstalling Windows as far as diagnostic steps go. I started seeing low GPU utilization in BF5, assumed it was just the game, then it started spreading to other games very suddenly, and there is very noticeable, extreme stuttering under any GPU load. What's strange is that it starts fine, but within a minute it degrades until it's unplayable. 3DMark is showing me 25% of the score I had when I first got the card.

Another FE bites the dust, maybe. I'll see how the reinstall goes, but my hopes aren't high

1gnoirents fucked around with this message at 02:49 on Jan 6, 2019

LRADIKAL
Jun 10, 2001

Fun Shoe
Sounds like a heat issue, almost. I assume you've checked all that though.

1gnoirents
Jun 28, 2014

hello :)

LRADIKAL posted:

Sounds like a heat issue, almost. I assume you've checked all that though.

Yeah, I actually just watercooled the thing, but the problem existed both before and after that. This is such a weird one to me with the degradation behavior that I'm not ruling anything out. I hope that by posting it's going bad I just reverse-jinxed it and the Windows reinstall will work

edit: drat, same behavior after the Windows reinstall. Its power draw sits roughly in the 50% range of the limit. I messed with a lot of stuff and rebuilt the whole computer into another case; I wonder if something else is loving up, but drat, what a downer

1gnoirents fucked around with this message at 02:48 on Jan 6, 2019

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Have you tossed it in an entirely different computer yet? Different mobo/PSU? Tried a different video card in your current system?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

TheFluff posted:

https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html


:thunk:

what the gently caress nvidia, if this is true (and tbh, it's pretty likely that it isn't) what on earth are those rt cores actually doing?

It would be the most :dice: thing ever if they made the game always use software raytracing.

SwissArmyDruid
Feb 14, 2014

by sebmojo
https://www.tomshardware.com/news/nvidia-geforce-gtx-1160,38301.html

Lenovo leaked a listing for a GTX 1160 to go into their gaming laptops.

Unconfirmed reports say it looks like an RTX 2060, minus the tensor cores, with a 40W TDP drop.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Techspot article benchmarking used video cards. So, valid for only another...... twelve hours before prices move in response, making the article completely worthless again.

https://www.techspot.com/article/1775-guide-buying-used-graphics-card/

craig588
Nov 19, 2005

by Nyc_Tattoo
The photo of all of the video cards is impressive; I could probably make only one of those stacks if I emptied out all my computers.

Cygni
Nov 12, 2005

raring to post

That's a friggen fantastic resource for "should I upgrade" questions, especially with AT Bench getting way out of date lately.

I am pretty unimpressed with the 1030 when you put it in the context of the used market, assuming that's the GDDR5 and not the DDR4 model. I guess that's always true of the value end of the spectrum, but it seems pretty stark. They price the 1030 at $62, too, when the cheapest GDDR5 model I can find on Newegg is actually $85. Oof.

Also, I feel like AMD's lovely rebranding horseshit drove used prices down, 'cause nobody has any idea what is what or what to search for. :v:

Cygni fucked around with this message at 21:51 on Jan 5, 2019

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
I'm pretty sure the point of the 1030 is that it's the easiest way in Nvidia's lineup to get an HDMI 2.0 port if you want to connect a 4K TV to a computer, and half of them are fanless so you couldn't hear them in an HTPC either. The gaming performance is secondary at best.

1gnoirents
Jun 28, 2014

hello :)

Statutory Ape posted:

Have you tossed it in an entirely different computer yet? Different mobo/PSU? Tried a different video card in your current system?

Well, I shouldn't have blamed the GPU so quickly. I'm sitting in a pool of parts from Amazon today, poo poo literally everywhere, trying every combo of parts I can think of. Sometimes 1 RAM stick would let the computer boot, sometimes 2 worked, but performance was always poor, and sometimes it didn't boot at all, flagging VGA, RAM, and CPU as the culprit at various times across two motherboards. I finally figured it must be the CPU somehow. I should mention that by this point I had cleaned both sides of the CPU and even the tops of the motherboard socket pins.

As a last step I put the CPU in a bowl of rubbing alcohol for 10 minutes while contemplating a CPU RMA. But I don't have to, because now it works. It works with both motherboards and all new and old RAM in every configuration, both my CPU and GPU immediately scored exactly as well as they should in 3DMark, and there are no further signs of issues. My guess is there was some tiny debris shorting a surface-mount device on the underside of the CPU, in the center where all those are, and I simply washed it away.

What an ordeal. But I'm happy to say I finally have a working watercooled 2080ti / 9700k combo packed in an SG13 case, all neat, tidy, and cute

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Yet Another RTX 2080ti: works just fine

Glad to hear it man

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
The SG13 is such an adorable yet awesome case. I wish it had the ability to mount an SFX PSU in front, though. That would provide so much more room for a larger air cooler.

betterinsodapop
Apr 4, 2004

64:3

VelociBacon posted:

Once you have a clock offset that is stable you'll want to do the same for your memory offset. I'd give it a week or so of playing games etc at a certain clock offset without touching memory offset so that if you get a crash you know what caused it and you aren't stuck with two variables to mess with.

So here I have a +150 offset on my gpu clock and a +700 offset on my memory. The memory clock offset makes very little difference relative to the GPU clock.


Started with trial-and-error on the GPU offset, going up by 13 each time. I hit a wall at 208, so backed off a step to 195. It seems happy here so far. Gonna let it sit for a while and then start messing around w/ the memory offset. Thank you for the advice.

VelociBacon
Dec 8, 2009

betterinsodapop posted:

Started with trial-and-error on the GPU offset, going up by 13 each time. I hit a wall at 208, so backed off a step to 195. It seems happy here so far. Gonna let it sit for a while and then start messing around w/ the memory offset. Thank you for the advice.

Sounds good! +195 is quite an offset, what card is it again? If that's stable in games you've got a good one I guess.

1gnoirents
Jun 28, 2014

hello :)

buglord posted:

The SG13 is such an adorable yet awesome case. I wish it had the ability to mount an SFX PSU in front, though. That would provide so much more room for a larger air cooler.

It is, I like it more than I thought I would. However, building in it has been an epic pain in the rear end. My previous small RVZ02B was arguably easier to build in than a regular-sized case because of its layout; the SG13 put me at the other end of the pain spectrum. Granted, I am watercooling a 2080ti in it with an AIO, which was basically just theoretically possible rather than recommended by anyone. I'll be fully done with it on Tuesday and I'll post a pic so others may bear witness to the suffering. A lot more dremel work was needed than I was hoping for (which was none, ideally)

Even in its 90%-finished, hacked-together state, it's clear the payoff will be worth it though. It's so drat tiny, and with the giant bouncy rubber feet you can pick it up so easily, and it gurgles a bit when you do too

Stickman
Feb 1, 2004

buglord posted:

The SG13 is such an adorable yet awesome case. I wish it had the ability to mount an SFX PSU in front, though. That would provide so much more room for a larger air cooler.

The biggest problem with the SG13 is that they don't make the pink version anymore :argh:


EdEddnEddy
Apr 5, 2012



So I just discovered how to actually do software overclocking with EVGA Precision Mobile on my SC17. (Literally couldn't find any sort of guide until I stumbled on a post about how to activate the tabs on the left; finally I can tweak things outside of the BIOS. Go figure..)
Anyway, I'm finally tinkering with the GPU overclock. Precision X1 sees my 1070 but can't do much more than that.

Precision Mobile has the two offsets you need, though, and so far I have tested up to +1000 on the memory and +220 on the core, which hits ~1800MHz on the core and 5003MHz on the memory. While I haven't had time to test that with games, Heaven and 3DMark have passed with flying colors, which is a bit surprising even though those aren't full-time stress tests. Pushing the memory that hard while also bringing the core up to desktop 1070 levels seems a bit crazy, but I suppose figuring out whether it can hold this in games is the next step.
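
Quick sanity check on those memory numbers, assuming Precision reports the GDDR5 double-data-rate clock so the effective transfer rate is twice the displayed figure (display conventions vary by tool, so treat this as a guess):

code:

# Sanity check on the numbers above, assuming the tool displays the GDDR5
# double-data-rate clock (effective transfer rate = 2x displayed -- this
# convention is an assumption, it varies between tools).
stock_mhz = 4003                   # ~8 Gbps GDDR5 as the tool displays it
offset_mhz = 1000
oc_mhz = stock_mhz + offset_mhz    # 5003 MHz, as reported above

print(f"effective: ~{oc_mhz * 2 / 1000:.1f} Gbps "
      f"({offset_mhz / stock_mhz * 100:.0f}% over stock)")   # ~10 Gbps, ~25%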
