|
tehinternet posted:In laptops wouldn’t a 1070 be better than a 2060 since it sips power by comparison? Or is Turing not that much more power hungry?

Turing is still more power efficient in absolute terms of performance-per-watt. The label of Turing as power-hungry comes from them shoving the sliders to 11: as a consequence, many of the same-tier (e.g. 1080 vs 2080) parts saw a TDP increase. If you look at the 1080Ti vs the 2080, the raw performance is pretty comparable, but the 1080Ti draws ~280W vs the 2080's ~225W, a ~20% power savings. You also have to remember that there's a difference between "desktop" and "Max-Q" laptop parts: the desktop 1070 is ~150W, but the 1070 Max-Q is ~115W. Raw clocks suggest the 2060 may be closer to the 1070Ti. If that's true, and both get similar power savings from the down-clocked and binned Max-Q versions (which seems fair), a 2060 Max-Q should probably come in around 120W, almost the same as the 1070 Max-Q, while offering 15-20% better performance, which is right in line with the 2080 vs 1080Ti comparison.
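A quick sanity check on the arithmetic above. The wattages are the ones quoted in the post; the ~160W desktop 2060 TDP used for the Max-Q estimate is an assumption, not a figure from the post.

```python
# Rough perf-per-watt / TDP arithmetic from the post above.
# 280W/225W/150W/115W are the post's figures; 160W for the
# desktop 2060 is an assumed TDP used only for illustration.

def power_savings(old_watts: float, new_watts: float) -> float:
    """Fractional power saved at (roughly) equal performance."""
    return 1 - new_watts / old_watts

# 1080Ti (~280W) vs 2080 (~225W), comparable raw performance:
print(f"{power_savings(280, 225):.0%}")  # 20%

# Max-Q derate observed on the 1070: 115W vs 150W desktop
maxq_ratio = 115 / 150
# Applying the same derate to an assumed ~160W desktop 2060:
print(f"{160 * maxq_ratio:.0f}W")  # 123W, close to the post's ~120W guess
```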
|
# ? Jan 3, 2019 03:20 |
|
|
Laptop 1070s were a little odd too as they actually have more CUDA cores than the desktop variant, 2048 vs 1920.
|
# ? Jan 3, 2019 03:28 |
|
B-Mac posted:Laptop 1070s were a little odd too as they actually have more CUDA cores than the desktop variant, 2048 vs 1920.

That’s super weird.

DrDork posted:Turing is still more power efficient in absolute terms of performance-per-watt. The label of Turing as being power-hungry is that they shoved the sliders to 11 and, as a consequence, many of the same-tier (eg 1080 vs 2080) parts saw a TDP increase.

Super good post, thanks!
|
# ? Jan 3, 2019 03:31 |
|
Yep, and while the non-Max-Q 1070 is a bit hotter, it is downright fast for a laptop chip and performs really well, to the point that the jump from it to a Max-Q 1080 is pretty much a wash. After watching laptop GPUs go from the GeForce2 Go (I still have a laptop with this in it) to the 980M, the 10XX series was a beautiful breath of fresh air for portable gaming and portable VR. I can play games like Destiny 2 and Sea of Thieves in 4K(ish) at 60FPS on a laptop. It's still baffling to me some days.
|
# ? Jan 3, 2019 04:01 |
|
EdEddnEddy posted:I can play games like Destiny 2 and Sea of Thieves in 4K(ish) at 60FPS on a laptop. It's still baffling to me some days.

Yeah, that I can do Overwatch at 1080p@60FPS locked on a ~4lbs laptop I bought second-hand for $600 is thrilling to me. I remember having a T430 with the NVS 5400M and it...struggled. A lot. To do much of anything graphically intensive. You kids these days don't know how good you have it!
|
# ? Jan 3, 2019 04:27 |
|
I got a Prostar laptop with a 1060 6GB in it 18 months ago for under $1000 and it's still a really nice 1080p machine, portable or no. The Pascal architecture was really, really good.
|
# ? Jan 3, 2019 04:28 |
|
https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html

quote:It is a bit of a remarkable story really, but users have enabled RTX mode on a Nvidia Titan V, which works quite well and performs as fast as the RTX 2080 Ti. Titan V, however, is Volta, and Volta does not have any RT cores.

what the gently caress nvidia, if this is true (and tbh, it's pretty likely that it isn't) what on earth are those rt cores actually doing?

TheFluff fucked around with this message at 05:19 on Jan 3, 2019
# ? Jan 3, 2019 05:05 |
|
selling hope
|
# ? Jan 3, 2019 05:08 |
|
When Guru3D, the website that's never given a bad review on anything, ever, says something controversial about nVidia it's cause to raise one's eyebrows.
|
# ? Jan 3, 2019 05:11 |
|
TheFluff posted:https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html

Volta has tensor cores, which are more or less the same thing as "RT" cores, under a different name. On the other hand, Mr. Huang could not shut the gently caress up about how just one 2080ti has the same amount of gigarays as a $40,000 workstation with four (4) Volta-based Teslas...

e: article seems to be taken down, so maybe they figured out how idiotic it is before I got the chance to read it
e2: never mind, it's just the url parser that doesn't like brackets

lDDQD fucked around with this message at 05:24 on Jan 3, 2019
# ? Jan 3, 2019 05:17 |
|
Developers' own comparisons between Volta and Turing running the same DXR code showed a clear win for Turing, so something doesn't add up.
|
# ? Jan 3, 2019 05:27 |
|
TheFluff posted:https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html

Isn't it only one scene in BF V where the difference was 8FPS? The rest had a 20 to 30 FPS difference.
|
# ? Jan 3, 2019 05:44 |
|
Also remember these are FPS reported in BF V, rather than raw performance values, so there's the alternate explanation that the BF V RTX mode is a jumbled mess and not properly utilizing Turing--maybe by forcing everything through some amount of software rendering regardless of the tensor cores.
|
# ? Jan 3, 2019 05:45 |
|
I think GamersNexus still have their Titan V, someone go @ steve to ask for some proper benchmarks
|
# ? Jan 3, 2019 05:50 |
|
DrDork posted:Also remember these are FPS reported in BF V, rather than raw performance values, so there's the alternate explanation that the BF V RTX mode is a jumbled mess and not properly utilizing Turing--maybe by forcing everything through some amount of software rendering regardless of the tensor cores.

Somebody redo the benchmark but make it shiny.
|
# ? Jan 3, 2019 06:15 |
|
Seamonster posted:Somebody redo the but make it shiny.

also reduce the framerate
|
# ? Jan 3, 2019 06:22 |
|
So, does Agreed's old boost 2.0 overclocking for dummies guide still basically apply for Pascal? iirc:

open Precision/Afterburner
set voltage to max
set power target as high as it will go
set temp target as high as it will go
prioritize temp target
|
# ? Jan 3, 2019 20:44 |
|
GPU Boost is literally about just finding a reasonable temp range and aggressively cooling to that as hard as possible.
|
# ? Jan 3, 2019 20:50 |
|
betterinsodapop posted:So, does Agreed's old boost 2.0 overclocking for dummies guide still basically apply for Pascal?

After that you set clock offsets and see the most you can stably get.
|
# ? Jan 3, 2019 21:14 |
|
Depending on what you're doing, raising the voltage might not be worth it. I can get 26 more MHz with my current card by adding .05V. I added .05V to a 680 (raising the voltage to 770 stock levels) and got 300MHz. My 980 was even more disappointing: maxed out, the voltage increase was worth one 13MHz step.
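The shrinking payoff from overvolting is easier to see as MHz gained per extra volt. A trivial sketch using only the numbers from the post above:

```python
# MHz gained per extra volt of overvoltage, using the post's figures.
def mhz_per_volt(mhz_gained: float, extra_volts: float) -> float:
    return mhz_gained / extra_volts

print(round(mhz_per_volt(300, 0.05)))  # 680 pushed to 770 stock voltage: ~6000 MHz/V
print(round(mhz_per_volt(26, 0.05)))   # current card: ~520 MHz/V
print(round(mhz_per_volt(13, 0.05)))   # 980, one 13MHz step: ~260 MHz/V
```

Roughly an order of magnitude less headroom per volt across two generations, which is why maxing the voltage slider stopped being the obvious first move.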
|
# ? Jan 3, 2019 21:25 |
|
VelociBacon posted:After that you set clock offsets and see the most you can stably get.

Alright, thanks! Trial and error time.
|
# ? Jan 3, 2019 21:45 |
|
betterinsodapop posted:Alright, thanks! Trial and error time.

Once you have a clock offset that is stable you'll want to do the same for your memory offset. I'd give it a week or so of playing games etc at a certain clock offset without touching the memory offset, so that if you get a crash you know what caused it and you aren't stuck with two variables to mess with. So here I have a +150 offset on my GPU clock and a +700 offset on my memory. The memory clock offset makes very little difference relative to the GPU clock.
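The one-variable-at-a-time trial and error described above is essentially a step-up search. A minimal sketch, where `is_stable` is a hypothetical stand-in for a real stress test (games, Heaven, 3DMark), not any vendor API, and 13 MHz is the step size mentioned downthread:

```python
# Walk a clock offset up in fixed steps until instability, then keep the
# last stable value. Only touch one offset (core OR memory) at a time.
from typing import Callable

def find_stable_offset(is_stable: Callable[[int], bool],
                       step: int = 13, max_offset: int = 390) -> int:
    best = 0
    offset = 0
    while offset <= max_offset and is_stable(offset):
        best = offset       # last offset that passed the stress test
        offset += step
    return best

# Toy example: pretend the card falls over at +208
print(find_stable_offset(lambda off: off < 208))  # 195, one step below the wall
```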
|
# ? Jan 3, 2019 22:04 |
|
Another FE bites the dust, maybe. I'll see how the reinstall goes but my hopes aren't high

1gnoirents fucked around with this message at 02:49 on Jan 6, 2019
# ? Jan 3, 2019 22:51 |
|
Sounds like a heat issue, almost. I assume you've checked all that though.
|
# ? Jan 3, 2019 22:54 |
|
LRADIKAL posted:Sounds like a heat issue, almost. I assume you've checked all that though.

Yeah, I actually just watercooled the thing but the problem started both before and after that, though this is such a weird one to me with the degradation behavior that I'm not ruling anything out. I hope by posting it's going bad I just reverse jinxed it and the Windows reinstall will work

edit: drat, same behavior after the Windows reinstall. Its power limit is roughly in the 50% range. I messed with a lot of stuff and rebuilt the whole computer into another case, I wonder if something else is loving up but drat what a downer

1gnoirents fucked around with this message at 02:48 on Jan 6, 2019
# ? Jan 3, 2019 23:33 |
|
Have you tossed it in a different computer yet entirely? different mobo/psu? tried a different video card in your current system?
|
# ? Jan 4, 2019 00:42 |
|
TheFluff posted:https://www.guru3d.com/news-story/nvidia-titan-v-raytraces-battelfield-v-in-rtx-mode-at-proper-perf-(but-does-not-have-any-rt-cores).html

It would be the most thing ever if they made the game always use software raytracing.
|
# ? Jan 4, 2019 01:37 |
|
https://www.tomshardware.com/news/nvidia-geforce-gtx-1160,38301.html

A leaked Lenovo listing shows a GTX 1160 going into their gaming laptops. Unconfirmed reports say it looks like an RTX 2060, minus the tensor cores, with a 40W TDP drop.
|
# ? Jan 4, 2019 02:18 |
|
Techspot article benchmarking used video cards. So, valid for only another... twelve hours before prices move in response, thus making the article completely worthless again. https://www.techspot.com/article/1775-guide-buying-used-graphics-card/
|
# ? Jan 5, 2019 19:05 |
|
The photo of all of the videocards is impressive, I could probably only make 1 of those stacks if I emptied out all my computers.
|
# ? Jan 5, 2019 19:35 |
|
That's a friggen fantastic resource for "should I upgrade" questions, especially with AT Bench getting way out of date lately. I am pretty unimpressed with the 1030 when you put it into context with the used market, assuming that's the GDDR5 and not the DDR4 model. I guess that's always true of the value end of the spectrum, but it seems pretty stark. They price the 1030 at $62, too, when on Newegg the cheapest GDDR5 model I can find is actually $85. Oof. Also I feel like AMD's lovely rebranding horseshit made the used prices lower, cause nobody has any idea what is what and what to search for, and prices are way down because of it.

Cygni fucked around with this message at 21:51 on Jan 5, 2019
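Dollars-per-frame is the usual way to put a new budget card like the 1030 "into context with the used market". A toy sketch of the comparison; all prices and fps figures below are made-up placeholders, not the article's data:

```python
# Crude price-performance metric for new-vs-used card shopping.
# Every number here is a hypothetical placeholder for illustration only.
def dollars_per_fps(price: float, avg_fps: float) -> float:
    return price / avg_fps

cards = {
    "GT 1030 GDDR5 (new)": (85.0, 30.0),   # hypothetical price / avg fps
    "used midrange card":  (90.0, 60.0),   # hypothetical price / avg fps
}
for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_fps(price, fps):.2f} per fps")
```

With numbers like these, a used card at a similar price but double the framerate costs roughly half as much per frame, which is the "pretty stark" gap the post is describing.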
# ? Jan 5, 2019 21:49 |
|
I'm pretty sure the point of the 1030 is that it's the easiest way in Nvidia's lineup to get an HDMI 2.0 port if you want to connect a 4K TV to a computer, and half of them are fanless so you couldn't hear them in an HTPC either. The gaming performance is secondary at best.
|
# ? Jan 5, 2019 22:08 |
|
Statutory Ape posted:Have you tossed it in a different computer yet entirely? different mobo/psu? tried a different video card in your current system?

Well, I shouldn't have blamed the gpu so quickly. I'm sitting in a pool of parts from Amazon today, poo poo literally everywhere, trying every combo of parts I can think of. Sometimes 1 ram stick would allow the computer to boot, sometimes 2 worked, performance was always poor though, and sometimes it didn't boot, flagging VGA, RAM, and CPU as the culprit at various times across two motherboards. I finally figured it must be the CPU somehow. I should mention I had cleaned the CPU on both sides at this point and even cleaned the tops of the motherboard socket pins. As a last step I put the CPU in a bowl of rubbing alcohol for 10 minutes while contemplating a CPU RMA. But I don't have to, because now it works. It works with both motherboards, all new and old ram in every configuration, and immediately both my CPU and GPU scored exactly as well as they should have in 3DMark, and there are no further signs of issues. My guess is there was some tiny debris shorting a surface mount device on the underside of the CPU, in the center where all those are, and I simply washed it away. What an ordeal. But I'm happy to say I finally have a working watercooled 2080ti / 9700k combo packed in an SG13 case, all neat, tidy, and cute
|
# ? Jan 6, 2019 02:57 |
|
Yet Another RTX 2080ti: works just fine

Glad to hear it man
|
# ? Jan 6, 2019 03:02 |
|
The SG13 is such an adorable yet awesome case. I wish it had the ability to mount a SFX PSU in front though. That would provide so much more room for a larger air cooler.
|
# ? Jan 6, 2019 03:04 |
|
VelociBacon posted:Once you have a clock offset that is stable you'll want to do the same for your memory offset. I'd give it a week or so of playing games etc at a certain clock offset without touching memory offset so that if you get a crash you know what caused it and you aren't stuck with two variables to mess with.

Started with trial and error GPU offset, and kept going up by 13. I hit a wall at 208, so backed off a step to 195. It seems happy here so far. Gonna let it sit here for a while and then start messing around w/the memory offset. Thank you for the advice.
|
# ? Jan 6, 2019 03:50 |
|
betterinsodapop posted:Started with trial and error GPU offset, and kept going up by 13. I hit a wall at 208, so backed off a step to 195. It seems happy here so far. Gonna let it sit here for a while and then start messing around w/the memory offset. Thank you for the advice.

Sounds good! +195 is quite an offset, what card is it again? If that's stable in games you've got a good one I guess.
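The reason the offsets above move in 13s is that these cards' boost clocks land on roughly 13 MHz bins, so +208 and +195 are exactly one step apart. A tiny helper that snaps a requested offset to the nearest bin (a sketch, assuming the 13 MHz granularity the posts describe):

```python
# Snap a requested clock offset to the nearest 13 MHz boost bin.
def snap_to_bin(offset_mhz: int, bin_mhz: int = 13) -> int:
    return round(offset_mhz / bin_mhz) * bin_mhz

print(snap_to_bin(208))  # 208 (16 bins)
print(snap_to_bin(195))  # 195 (15 bins)
print(snap_to_bin(200))  # 195 (nearest bin, so +200 buys nothing over +195)
```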
|
# ? Jan 6, 2019 03:57 |
|
buglord posted:The SG13 is such an adorable yet awesome case. I wish it had the ability to mount a SFX PSU in front though. That would provide so much more room for a larger air cooler.

It is, I like it more than I would have thought. However, building in it has been an epic pain in the rear end. My previous small RVZ02B was arguably easier to build in than a regular sized case because of its layout; the SG13 put me at the other end of the pain spectrum. Granted, I am water cooling a 2080ti in it with an AIO, which was basically just theoretically possible rather than recommended by anyone. I'll be fully done with it on Tuesday and I'll post a pic so others may bear witness to the suffering. A lot more dremel work was needed than I was hoping for (which was none, ideally). Even in its 90% finished, hacked-together state it's clear the payoff will be worth it though. It's so drat tiny, and with giant bouncy rubber feet you can pick it up so easily, and it gurgles a bit when you do too
|
# ? Jan 6, 2019 04:55 |
|
buglord posted:The SG13 is such an adorable yet awesome case. I wish it had the ability to mount a SFX PSU in front though. That would provide so much more room for a larger air cooler.

The biggest problem with SG13 is that they don't make the pink version anymore
|
# ? Jan 6, 2019 05:10 |
|
|
So I just discovered how to actually do software overclocking with EVGA Precision Mobile on my SC17. (I literally didn't find any sort of guide until I stumbled on a post about how to activate the tabs on the left, and finally I can tweak things outside of the bios. Go figure..) Anyway, I'm finally tinkering with the GPU overclock; Precision X1 sees my 1070 but can't do much more than that, while Precision Mobile has the two offsets you need. So far I have tested up to +1000 on the memory and +220 on the core, which hits ~1800mhz on the core and 5003mhz on the memory, and while I haven't had time to test that with games, Heaven and 3DMark have passed with flying colors, which, even though those aren't real full-time stress tests, is a bit surprising. Pushing the memory that hard while also bringing the core up to desktop 1070 levels seems a bit crazy, but I suppose figuring out if it can hold this in games is the next step.
|
# ? Jan 6, 2019 05:13 |