|
I'm having a crack at overclocking my EVGA 1080 FTW with PrecisionXOC, and I'm stable in the Superposition benchmark at 120% power target, +100 core / +1000 memory, with a max temp of 70 °C and barely audible fans (running at 1700 RPM). This gets me an extra 5 FPS on the Extreme preset, going from 30 to 35, which is kind of a big deal. So the card is running at a 2037 MHz boost clock and 6003 MHz memory ... is that normal? If I raise the core clock any more it crashes, but the memory just accepted the maximum possible setting of one extra GHz without breaking a sweat; I'm really impressed. Do you think I could keep this 24/7, considering that I often have 3-4 hour sessions of continuous gaming? Is there any danger of loving things up? Not that I *need* this, mind you; I already max out every game I have at stock settings (1440p), but the extra headroom for future, heavier games is pretty nice TorakFade fucked around with this message at 09:49 on Oct 17, 2018 |
# ? Oct 17, 2018 09:45 |
|
|
|
Sounds like you got a pretty good card. Take a picture of your case and stuff! Have you tried raising your voltage further? You have a lot of thermal headroom at the moment. You're probably still unstable; it's a question of how frequent hangs are acceptable to you. You're in a good place to wait and see how it behaves across a bunch of different loads.
|
# ? Oct 17, 2018 09:50 |
|
IIRC, any 1080 that does over 2000 MHz is a good card. 2050 is great, 2100 is a silicon-lottery winner.
|
# ? Oct 17, 2018 09:52 |
|
LRADIKAL posted:Sounds like you got a pretty good card. Take a picture of your case and stuff! Pictures? Gladly! Hope you like gaudy LEDs, because... well... (yes, I went full retard on this build, but I think it looks astonishing; pics don't do it justice, especially not these lovely phone pictures) [image: side view with "minimal" lighting] [image: my desk is a mess] I haven't touched voltages, and honestly I'm scared; despite building PCs since 1998 or so, I'm still afraid of electricity and frying things. What could be a completely safe voltage to try? If I could reach 2100 MHz it'd be awesome, as long as it doesn't add too much stress; I want this card to last as long as possible. BTW, EVGA's GPU warranty is unaffected as long as I only use EVGA's own PrecisionXOC program to overclock it and nothing else, right? TorakFade fucked around with this message at 10:32 on Oct 17, 2018 |
# ? Oct 17, 2018 10:28 |
|
Nvidia cards have their voltages locked down by the factory BIOS these days, so you can't actually damage them with the voltage slider in Afterburner/Precision XOC. It's to the point where the voltage slider is kind of a placebo; the card decides on its own how much voltage it'll use depending on temperature and power budget. Just drag the voltage and power sliders to the max and let the card figure things out on its own. It's quite possible the voltage slider will do almost nothing, though. I run my Asus 1080 at ~2030-2050 core clock and +450 MHz mem, I think? Could maybe go higher, but probably not much.
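To make the "the card decides on its own" behavior concrete, here's a toy sketch of how a Boost-style algorithm might pick a clock bin from a voltage/frequency curve under power and temperature caps. This is a simplified illustration only, not NVIDIA's actual GPU Boost logic; the curve points, the power model, and the limits below are all made up.

```python
# Toy model of Boost-style clock selection. NOT NVIDIA's real algorithm;
# the V/F points, the 180 W base power, and the 83 C limit are invented
# for illustration.

V_F_CURVE = [  # (clock in MHz, voltage in V), hypothetical bins
    (1898, 0.950),
    (1962, 1.000),
    (2012, 1.043),
    (2050, 1.062),
    (2088, 1.093),
]

def pick_boost_clock(power_budget_w, temp_c, temp_limit_c=83, base_power_w=180.0):
    """Return the highest clock whose estimated draw fits the budget."""
    best = V_F_CURVE[0][0]  # fall back to the lowest bin
    for clock, volts in V_F_CURVE:
        # Dynamic power scales roughly with f * V^2 (standard CMOS rule of
        # thumb), normalized to the first curve point.
        est_power = base_power_w * (clock / 1898) * (volts / 0.950) ** 2
        if est_power <= power_budget_w and temp_c < temp_limit_c:
            best = clock
    return best

# A bigger power budget lets the same curve reach a higher bin:
stock_clock = pick_boost_clock(180, 70)   # 100% of a 180 W budget
raised_clock = pick_boost_clock(216, 70)  # 120% power target
```

Raising the power target just hands the same selection logic a bigger budget, which is why maxing the power slider alone often buys a bin or two even with the voltage slider untouched.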
|
# ? Oct 17, 2018 10:37 |
|
Green Gloves posted:Can I expect any FPS gains going from 1600 MHz to 2400 MHz RAM with a 4790k/1080 Ti combo? I managed to find 16 GB locally for $65. Might be a good cheap upgrade before I build a new system next year. For that price, do it. 1600 to 2400 is significant, and that's cheap. Performance increases differ from workload to workload: https://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k HalloKitty fucked around with this message at 10:40 on Oct 17, 2018 |
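For a feel of why 1600 to 2400 is significant: peak memory bandwidth scales linearly with the transfer rate. A quick back-of-the-envelope calculation, assuming standard 64-bit DDR3 channels in a dual-channel configuration (real-world throughput is lower than these theoretical peaks):

```python
# Theoretical peak bandwidth for DDR3: transfers/s * bytes per transfer
# * number of channels. Assumes standard 64-bit (8-byte) channels.

def peak_bandwidth_gbs(mt_per_s, bus_width_bits=64, channels=2):
    """Peak transfer rate in GB/s for a given memory speed in MT/s."""
    return mt_per_s * 1e6 * (bus_width_bits // 8) * channels / 1e9

print(peak_bandwidth_gbs(1600))  # 25.6 GB/s
print(peak_bandwidth_gbs(2400))  # 38.4 GB/s
```

That's a 50% jump in peak bandwidth, though as the linked article shows, how much of that turns into FPS depends heavily on the game.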
# ? Oct 17, 2018 10:37 |
|
EVGA will cover you all the way up until you physically remove chips from the board or solder new bits on. Third-party software is totally fine.
|
# ? Oct 17, 2018 10:40 |
|
LRADIKAL posted:Thanks for that post, very helpful. Changing gears; what else could be done to get path tracing into our games? Is RTX the only logical "next step" that will be iterated and added to as performance increases? I guess, along with de-noising improvement. Seems like ray and path tracing are pretty fundamental simulations that just take a lot of transistors to perform quickly. Hope these are coherent questions, thanks!

Well, in one way path tracing is in your games today, in the sense that a lot of game developers use path tracing or something like it to precompute and store the indirect lighting that their game can't compute in real time. Real-time path tracing of the entire primary image is kinda hard, since brute force typically takes minutes to hours per frame. There are some promising techniques where you essentially use temporal anti-aliasing and some very strong denoising filters, which work OK. The state of the art looks roughly like this, but note that even 1 sample/pixel means around 5 FPS in reality. In addition, those approaches are not robust, are hard for artists to control, and have very objectionable corner-case blurring artifacts.

So one possibility for real-time path tracing is to fix those problems: trace rays faster, with better and more robust noise-removal filters that don't have errors. Another option is to use path tracing in a game for something other than the primary image. You could, for example, do the aforementioned light baking -- using path tracing, or light cuts, or progressive photon mapping, or some other ray-trace-based global illumination technique -- progressively in the background while the game is running, to update the stored data in real time based on changes to the environment. Ray tracing also isn't necessarily path tracing and other global-illumination-type techniques; I'm just a bit blinkered and tend to associate the two because, well, I work on a path tracer and trace a lot of rays.

The potential is pretty exciting to me because raster graphics are sort of fundamentally a hack: your GPU draws a ton of 3D triangles by throwing away all surrounding context, individually turning each 3D triangle into a 2D triangle in the vertex shader, then turning the 2D triangles into points and throwing away even more context in the raster pass, then going "oh poo poo" and trying to hack it all back in to do 3D lighting in the pixel shader, where the only things you can really inspect are the immediately surrounding pixels. Doing basic tests for things like "is this light source over there occluded" or "what does this piece of metal reflect" in a game is difficult and can require doing an entire separate rendering of the scene from some other perspective first, just to have the data available later in the primary draw. The ideal is that if you really need to figure out those things in your shader, you now have the option to just cast a ray in the direction of the light source or reflection and get the data directly. It's possible to use that to substantially improve how shadows, reflections, refractions and other sorts of "non-local" effects work.

I doubt that change is going to be revolutionary right now, given that even with RTX it's still expensive and people are talking about budgets of 1 or 2 rays per pixel with current hardware. We're going to keep using shadow maps and screen-space reflections, even though they're poo poo, because they're known poo poo: people have done the legwork of writing the overly complex code for them, and they're relatively cheap compared to spending 1 of your 2 rays for the pixel. For stuff like complex glass objects, where games don't really have any good answers at all, a budget of 1-2 rays per pixel is just about enough to match this in a non-demoscene environment. Xerophyte fucked around with this message at 10:48 on Oct 17, 2018 |
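A minimal sketch of the "just cast a ray in the direction of the light source" occlusion test described above. The scene setup and sphere-only geometry are illustrative, not from any real engine; a production shadow ray would also clip hits beyond the light's distance.

```python
# Minimal shadow-ray occlusion test against sphere occluders.
# Illustrative only: ignores hits past the light and self-intersection
# handling beyond a small epsilon.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Ray/sphere intersection test; direction must be normalized."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # distance to nearest intersection
    return t > 1e-6  # only count hits in front of the origin

def light_visible(point, light_pos, occluders):
    """Shadow ray: True if no occluder sits between point and light."""
    d = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in d))
    direction = [x / dist for x in d]
    return not any(ray_hits_sphere(point, direction, c, r)
                   for c, r in occluders)

# A sphere directly between the shading point and the light blocks it;
# one off to the side does not.
blocked = light_visible((0, 0, 0), (0, 0, 10), [((0, 0, 5), 1.0)])
clear = light_visible((0, 0, 0), (0, 0, 10), [((5, 0, 5), 1.0)])
```

The raster-era equivalent of this few-line test is an entire shadow-map pass: render the scene from the light, store depths, then compare during the main draw, which is exactly the "separate rendering from some other perspective" workaround the post describes.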
# ? Oct 17, 2018 10:41 |
|
TheFluff posted:Nvidia cards have their voltages locked down by the factory BIOS these days so you can't actually damage them with the voltage slider in Afterburner/Precision XOC. It's to the point where the voltage slider is kinda placebo; the card will decide on its own how much voltage it'll use depending on temperature and power budget. Just drag the voltage and power sliders to the max and let the card figure things out on its own. It's quite possible the voltage slider will do almost nothing though. I raised the voltage all the way to 100%, leaving the offsets at +100/+1000; it boosts slightly higher (2075 MHz instead of 2037), but in 1 out of 3 benchmark runs it crashed. Set the voltage back to default with the offsets still at +100/+1000, and it happily boosts all the way to 2037/2050 MHz without a care in the world, even over 5 consecutive benchmark runs. I'll have to try it in actual gaming when I finish working, but I guess this is the max it'll go. Not too shabby anyway
|
# ? Oct 17, 2018 10:51 |
|
Looking forward to when I can just use EVGA’s Precision X1 for overclocking on Pascal, seems like quite the might-as-well overclocking tool.
ufarn fucked around with this message at 12:38 on Oct 17, 2018 |
# ? Oct 17, 2018 11:08 |
|
If you've got the power/case cooling budget, there's no reason not to max out Pascal's power target. It should figure out clocks itself and remain stable; free performance in high-load situations, when it matters most. That might be one of the later 1080s from when memory clocks for GDDR5X really exploded: early ones did around +400, but later ones did +1000 so consistently that they raised the stock speed by 500.
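For reference, the power target slider is just a percentage of the card's rated board power, so the watt numbers are easy to work out. The 215 W figure used below for an EVGA 1080 FTW is an assumption; check your card's spec sheet or a monitoring tool for the real value.

```python
# Power target percentage -> watts. The 215 W board power is an assumed
# figure for a 1080 FTW, used purely for illustration.

def power_limit_watts(board_power_w, target_percent):
    """Power limit in watts for a given target percentage."""
    return board_power_w * target_percent / 100.0

for pct in (100, 110, 120):
    print(f"{pct}% target -> {power_limit_watts(215, pct):.0f} W")
```

So a 120% target on a 215 W card means the boost algorithm gets to spend up to 258 W before it starts pulling clocks back, which is where the "free performance under load" comes from.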
|
# ? Oct 17, 2018 12:27 |
|
Thanks everybody for your opinions and support. I am now running more benchmarks (Time Spy and Fire Strike from 3DMark); +100 clock would give me a crash in Fire Strike but not in the more modern Time Spy ... hmm. Anyway, I prefer stability over pure performance, so I went down to +80 and everything seems to run smoothly; max clocks still stay between 2010 and 2050 MHz, which is perfectly fine.
|
# ? Oct 17, 2018 13:04 |
|
LRADIKAL posted:65 bucks? Buy it, sell your old RAM. It makes some difference. HalloKitty posted:For that price, do it. 1600 to 2400 is significant, and that's cheap. Thanks for the responses. I'll pick it up today.
|
# ? Oct 17, 2018 14:27 |
|
What's our opinion on 2070s? Worth it at ~550 USD? I'd be upgrading from a 970. https://www.newegg.com/Product/Prod...em-_-14-487-412 *edit* gently caress it, I did it anyway Doh004 fucked around with this message at 15:04 on Oct 17, 2018 |
# ? Oct 17, 2018 14:54 |
|
So, a small update re: my overclocking in real-life loads (Forza Horizon 4). The game just won't start if I put the power limit at 120%; it crashes back to desktop while loading. If I put it at 100% or 110%, it starts. Literally nothing else is changed; offsets are +100/+1000 in both cases, and once the game is started I can go back to PrecisionXOC, raise the power limit back to 120%, and it works flawlessly. It's only on startup. Real clocks are consistent with the program, running at 2037/12000, even if I have the power limit at 100%. Edit: at 110% it will start 2 out of 3 times, while at 100% it starts up every time. Clock speeds have nothing to do with this, since the GPU boosts to the same levels anyway. Would it be OK to leave the power target at 100% and the offsets at +100/+1000? TorakFade fucked around with this message at 15:12 on Oct 17, 2018 |
# ? Oct 17, 2018 15:05 |
|
Doh004 posted:What's our opinion on 2070s? Worth it at ~550 USD? I'd be upgrading from a 970. I don't think so. I'm also looking to upgrade from a 970, and at that price I'd rather take an eBay 1080 Ti over a new 2070.
|
# ? Oct 17, 2018 15:06 |
|
TorakFade posted:
my next case will have 0 cutouts. I want the opposite of this rainbow vomit nightmare. Stop putting RGB LEDs on everything, component makers, ffs
|
# ? Oct 17, 2018 15:20 |
|
zer0spunk posted:my next case will have 0 cutouts. I want the opposite of this rainbow vomit nightmare. Stop putting RGB leds on everything component makers ffs I used to be like this. Then a midlife crisis happened or something, and now I enjoy it, and I hope manufacturers start putting MORE LEDs into stuff. (You can always turn them off if you don't like them, but you can't turn them on if they're not there. And really, they cost basically nothing, so if they become more commonplace they'll also be much cheaper; right now you pay a hell of a premium for anything with LEDs in it because it's "rare" and "elite" and whatever crap, just like not-beige cases were 15 years ago.) Edit: out of curiosity, why so much hate for colored lights? Honest question; I "hated" them for ... basically no reason, looking back. Now I think there's really not much point, but in the same way there is no point in having a house full of nice fancy furniture or whatever; you don't *need* it, but it's just ... nice to look at TorakFade fucked around with this message at 15:32 on Oct 17, 2018 |
# ? Oct 17, 2018 15:25 |
|
There's a middle ground where the RGB vomit is terrible. You either have to go all in or not at all.
|
# ? Oct 17, 2018 15:31 |
|
I have a windowless Define S case to hide the RGB that is nearly inevitable with higher end PC hardware. I don't care that a unicorn could be barfing in my case now, I don't ever see it.
|
# ? Oct 17, 2018 15:33 |
|
The future is powered cases that open the side panel when it gets too hot inside, revealing all the internal RGB. Higher-end cases will have spotlights and a smoke machine inside for when this happens.
|
# ? Oct 17, 2018 15:36 |
|
TorakFade posted:Edit: now at 110% it will start 2 out of 3 times, while at 100% it starts up every time. Clock speeds have nothing to do with this , since the GPU boosts at the same levels anyway. Would it be OK to leave power target to 100% and offsets to +100/+1000? Totally fine. It won't be quite as good from a clocks perspective, but it won't hurt the card or anything. Like someone pointed out, you can't do anything in overclocking software to hurt Pascal. Worst-case scenario it runs slower, but just using the standard power target definitely won't do that. E: I have a windowless Define R5, and I left the motherboard RGB on because I think it looks cool and mysterious to walk behind it and see the colors through the exhaust fan. Arivia fucked around with this message at 15:41 on Oct 17, 2018 |
# ? Oct 17, 2018 15:39 |
|
EVGA Precision X1 sucks rear end. Half the time I open it I find that my settings have reverted back to default. I have no idea how to fix this or even if there is a fix.
|
# ? Oct 17, 2018 15:43 |
|
I'd cut the offsets in half and keep the power target maxed. The card shouldn't crash from power target adjustments alone, and they do much more for speed than offset adjustments.
|
# ? Oct 17, 2018 15:43 |
|
Beautiful Ninja posted:I have a windowless Define S case to hide the RGB that is nearly inevitable with higher end PC hardware. I don't care that a unicorn could be barfing in my case now, I don't ever see it. I hide all of my barfing unicorn RGB crap behind a Corsair Carbide 600Q
|
# ? Oct 17, 2018 15:44 |
|
Soldering iron and flush cutters: remove the LEDs at the source. Show them who's the boss. Edit: I'm only half kidding. If they're dim or can be disabled in software, okay, but if it's a permanent thing boring into my brain, it's gotta go. Why does a monitor need a status light brighter than the screen? I know when the screen is on, and the light turns off if the cable is unplugged, so it doesn't even work as a status light for a brand-new monitor before you plug it in. craig588 fucked around with this message at 15:55 on Oct 17, 2018 |
# ? Oct 17, 2018 15:48 |
|
TorakFade posted:I used to be like this. Then midlife crisis happened or something and now I enjoy it, and I hope manufacturers start putting MORE leds into stuff (you can always turn them off if you don't like them, but you can't turn them on if they're not there. And really, they cost basically nothing so if they become more commonplace they'll also be much cheaper, right now you pay a hell of a premium for anything with LEDs in it because it's "rare" and "elite" and whatever crap, just like not-beige cases were 15 years ago) I don't have an office where the desktop lives, so the fewer light sources blasting in my bedroom, the better. It's also tacky as poo poo, but that's a personal thing. Ironically, I'm a department head for lighting film/tv/music videos etc., and the trend in the last 3-4 years has been RGB LEDs, and now every shoot is a clusterfuck of random colors because it doesn't cost anything extra to achieve like it used to, and everyone thinks they're the first person to do it. The latest tech for us is pixel mapping on tube-style LEDs so you can get color chases and washes and blah blah blah. You'll see these in every shot pretty soon; it's the new dumb trend (look up Travis Scott on SNL a few weeks ago to see what I'm talking about; those are from one of the first companies to make the tubes available)
|
# ? Oct 17, 2018 15:49 |
|
TorakFade posted:Edit: out of curiosity, why so much hate for colored lights? Honest question, I "hated" them for ... basically no reason looking back, now I think there's really not much point, but in the same way there is no point in having a house full of nice fancy furniture or whatever; you don't *need* it, but it's just ... nice to look at I wouldn't mind manufacturers sticking LEDs everywhere if they were off by default.
|
# ? Oct 17, 2018 16:03 |
|
zer0spunk posted:my next case will have 0 cutouts. I want the opposite of this rainbow vomit nightmare. Stop putting RGB leds on everything component makers ffs i did this and wont look back (or at lights)
|
# ? Oct 17, 2018 16:04 |
|
Krailor posted:I don't think so. I'm also looking to upgrade from a 970 and at that price I think I'd rather take an eBay 1080ti vs new 2070. You're probably right! I went with it anyway because I haven't actually bought a new GPU since 2012 (just kept buying used) and figured I'd pull the trigger. We shall see. TorakFade posted:Edit: out of curiosity, why so much hate for colored lights? Honest question, I "hated" them for ... basically no reason looking back, now I think there's really not much point, but in the same way there is no point in having a house full of nice fancy furniture or whatever; you don't *need* it, but it's just ... nice to look at Because MY FIANCE (now that we're all older) would kill me if the computer was constantly emulating an EDM concert. My gaming keyboard is too much for her as it is. Also, we need to hold onto some sort of minimalism while justifying spending loads of cash on nerd toys.
|
# ? Oct 17, 2018 16:12 |
|
TorakFade posted:you can always turn them off if you don't like them They're getting better about this, but most of the time you still need to install the manufacturer's crapware to turn them off, which I'd prefer not to do. Luckily, Fractal Design still makes good cases that only let minimal amounts of light shine out.
|
# ? Oct 17, 2018 16:17 |
|
Doh004 posted:What's our opinion on 2070s? Worth it at ~550 USD? I'd be upgrading from a 970. If you were in the market, it's a good card to buy, don't worry. The main issue is that it's not compelling enough to pull people into the market if they weren't already.
|
# ? Oct 17, 2018 18:36 |
|
TorakFade posted:Edit: out of curiosity, why so much hate for colored lights? Honest question, I "hated" them for ... basically no reason looking back, now I think there's really not much point, but in the same way there is no point in having a house full of nice fancy furniture or whatever; you don't *need* it, but it's just ... nice to look at They make your actual gaming sessions worse by ruining immersion, and they make your non-gaming life worse by being gaudy attention grabbers. Lighting is the dumbest trend to ever happen in computing.
|
# ? Oct 17, 2018 18:52 |
|
I kind of love it precisely because of that reaction it causes for some people. RGB everything all day!
|
# ? Oct 17, 2018 18:57 |
|
My next case will probably have a window in it because I'm a child who likes looking at shiny things, but I did the colored cathode thing in 2006 and I'm no longer interested in my office looking like a disco. Still using the 2006 case minus the lights. It was surprisingly forward thinking in a lot of ways, but it's literally rusting out at this point and needs to go.
|
# ? Oct 17, 2018 19:05 |
|
Xerophyte, your tracing posts are great!
|
# ? Oct 17, 2018 19:12 |
|
I'm that guy who hated lights fifteen years ago but enjoys them now. It was stupid for people to cut holes for acrylic windows and stick lights in to illuminate dark components. Now that the components themselves light up and we have edge-to-edge glass panels, go nuts. I'd buy a motherboard where all the traces had LEDs.
|
# ? Oct 17, 2018 19:23 |
|
my computer is a big stupid toy that i use to intricately animate realtime bouncy asscheek physics in games where i pretend to be a neat robot or a mobster or a robot mobster the fact that the computer itself also looks super dumb and ridiculous is a nod to the fact that i realize this whole thing is dumb as hell but i still like it
|
# ? Oct 17, 2018 19:35 |
|
Anyone gone from a 1080 Ti to a 2080? Is the difference really that negligible?
|
# ? Oct 17, 2018 19:43 |
|
|
|
Ulio posted:anyone who went from 1080 ti to 2080 ? Is the difference really that negligible ? It'd be a pointless swap at this point. Their performance is pretty much the same; the 2080 is better in some games, and the 1080 Ti is better in others. The obvious difference between the two is the inclusion of RTX components for features that aren't yet implemented in games; that additional hardware isn't enough on its own to warrant the upgrade. Hopefully DLSS and ray tracing will be implemented in some games soon, so we can see how well the RTX cards deal with that stuff. The step up from a 1080 Ti to a 2080 Ti is only about a 35% improvement, depending on the game. That's also probably not worth the upgrade at this point, unless you are dying for 60 FPS at max settings on a 4K monitor.
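As pure arithmetic on that 35% figure (actual scaling varies per game), the uplift only matters if it crosses a threshold you care about, e.g. a hypothetical 45 FPS scene landing right around 60 FPS:

```python
# Rough arithmetic for what a ~35% uplift means in frame rates.
# Purely illustrative; the 45 FPS base figure is hypothetical.

def scaled_fps(base_fps, improvement_pct):
    """Frame rate after a given percentage improvement."""
    return base_fps * (1.0 + improvement_pct / 100.0)

uplifted = scaled_fps(45, 35)  # hypothetical 45 FPS 1080 Ti scene
```

45 * 1.35 works out to about 60.8, which is exactly the kind of jump that matters on a 60 Hz 4K monitor and not much anywhere else.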
|
# ? Oct 17, 2018 20:00 |