|
Radio Talmudist posted:I hope that's true. I hear the last Max Payne game was crazy well optimized, which is a world away from what GTA4 was on the PC. I really hope that the recent AC: Unity performance fiasco opens people's eyes with regards to optimization. It's one thing to tax a top-end PC to the limit, and another issue altogether whether the resulting visuals are actually good enough over previous games to justify yet another increase in hardware demands. I really like Wolfenstein: TNO, but I sure as hell take issue that its 2006-era-looking graphics choke on my HD 7950 with obvious texturing artifacts just from rotating the camera.
|
# ? Dec 18, 2014 09:01 |
|
|
|
Kornjaca posted:I was only using ElvUI, BigWigs, Master Plan, and one more addon for the garrison. Decided to delete everything, including the WTF folder, and the problem still persisted until I started using DSR. As I mentioned before, I re-installed the same addons after I got it working properly and it works perfectly now, at 3840 x 2160. Try changing the power saving mode from "adaptive" to "prefer maximum performance" in the Nvidia control panel. It's very unusual for increasing GPU load to increase performance unless the launch drivers have adaptive power management set up too aggressively. I have a feeling everyone else in the thread (me included) just changed that by default, so we have no experience with what you're encountering. craig588 fucked around with this message at 09:06 on Dec 18, 2014 |
# ? Dec 18, 2014 09:03 |
|
The reason I responded to you at all is because I had read all your posts, and at that point you had multiple people telling you to do x and y and you kept going on about z. Thanks for finally doing what people asked, and it sucks that it didn't fix your problem.

Has anyone got the Omega drivers to install on a 280x? I tried to install them on my Asus one, but the installer would never actually install the display driver component* so far as I could tell. I didn't see anything about the drivers themselves not being compatible with the 280x.

*During a custom install, it didn't have an entry for the display driver. Also, after the installation, I lost all my resolution options besides the low-res default, as if I had fallen back to a generic Windows display driver. jkyuusai fucked around with this message at 14:13 on Dec 18, 2014 |
# ? Dec 18, 2014 13:52 |
|
Could someone please update the original post? I'm looking for a new GPU, and have no idea what to get at the more-or-less-$200 mark.
|
# ? Dec 18, 2014 14:21 |
|
Use this thread instead.
|
# ? Dec 18, 2014 14:23 |
|
Angry Fish posted:Could someone please update the original post? I'm looking for a new GPU, and have no idea on what to get for the more-or-less-$200 mark. You were pointed at the right thread, but right now I'm sure it's going to be this: http://www.ncixus.com/products/?usaffiliateid=1000031504&sku=91053&vpn=R9-280X-TDFD&manufacture=XFX&promoid=1265
|
# ? Dec 18, 2014 18:47 |
|
AMD is working on a frame rate limiter to get some power savings
|
# ? Dec 18, 2014 19:05 |
|
Wouldn't somebody using Vsync already experience those same power savings? I suppose this finally lets us limit frame output through the driver without the potential for vsync's 60->30->60 fps stutters.
|
# ? Dec 18, 2014 19:12 |
|
[edit] ^ Nvidia's Adaptive V-Sync auto turns it off if you're below 60 fps. Isn't frame limiting already a feature on Afterburner? Also who really cares if your GPU costs 0.02c more per hour to run? Ak Gara fucked around with this message at 19:15 on Dec 18, 2014 |
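For what it's worth, the per-hour cost is easy to ballpark yourself; the wattage and electricity rate below are made-up example numbers, not anyone's actual card:

```python
def extra_cost_per_hour(extra_watts, dollars_per_kwh):
    """Added running cost, in dollars per hour, of drawing
    extra_watts continuously."""
    return extra_watts / 1000.0 * dollars_per_kwh

# e.g. 100W of extra GPU draw at a hypothetical $0.12/kWh rate:
cents_per_hour = extra_cost_per_hour(100, 0.12) * 100  # about 1.2 cents
```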
# ? Dec 18, 2014 19:13 |
|
True frame limiting isn't just about power savings. It reduces stuttering, hitching, and frame drops. This isn't an accurate comparison, but the results are the same: if your GPU isn't running full tilt at all times, you have more "power" on tap when it suddenly has a lot of poo poo to render on screen.

Vsync does not provide that, in that way. Vsync is harder on the GPU for the same framerate. Vsync is (marginally) better at reducing screen tearing than plain frame limiting, and it will also help with some of the stuttering. But since it's really slamming the GPU all the time (apples to apples), you don't gain that "power on tap" to soak momentary instances in games where the GPU is overwhelmed. Also you get input lag, which bothers some people (me included), along with some other quirks.

The power savings really can be incredible if you're running a game where you can push something way too high, like 150+ fps. But not a lot of people are going to give a poo poo about the power cost in money; it's all about what you gain when your GPU can handle more when needed. Especially with AMD cards that can tend to run very hot, you may avoid throttling altogether this way. Throttling is bad.

I don't see any point in rendering fps higher than your refresh rate. It's a waste in every sense. It comes up here quite often, but why haven't Nvidia or AMD included this as an inherent global feature all this time? It seems almost trivial, and it probably is. Perhaps now it will finally be included. Afterburner does allow it, through RivaTuner. The older Precision software even had it on the main screen.

Despite all that, it's not some magic button either. Some games will stutter regardless, because the GPU can't handle it no matter what. But there is virtually no downside to frame limiting. It will help (a lot, sometimes) or at worst do nothing at all.

edit: I want to add that while I think rendering fps above your refresh rate is a waste, being *able* to isn't a waste. Say I *can* run BF4 at 80 fps; it's true that a 15 fps dip when someone direct hits me with a tank round will still keep me above my refresh (sort of). But if you frame limit at 60 fps, that doesn't mean you'll suddenly drop to 45 fps in the same instance. It just stays at 60. I know this is simplifying, since in reality that 15 fps drop is really a momentary drop all the way to zero fps (likely many times), but frame limiting will help minimize that regardless. 1gnoirents fucked around with this message at 20:00 on Dec 18, 2014 |
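For anyone curious what a frame limiter actually does under the hood, the core idea is roughly this (a toy Python sketch, not how RivaTuner/Afterburner really implements it; `render_frame` is a stand-in for the real GPU work):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # time budget per frame at the cap

def run_limited(render_frame, num_frames):
    """Render num_frames, sleeping away whatever is left of each
    frame's time budget so the loop never exceeds TARGET_FPS."""
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()  # stand-in for the actual rendering work
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_TIME:
            # This is the power saving: the GPU idles here instead of
            # immediately starting the next frame.
            time.sleep(FRAME_TIME - elapsed)
    return num_frames / (time.perf_counter() - start)  # effective fps

# An instant "renderer" still gets held to roughly 60 fps:
fps = run_limited(lambda: None, 12)
```

The "power on tap" point falls out of this: a frame that takes longer than the budget just skips the sleep, so heavy moments get the GPU's full attention.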
# ? Dec 18, 2014 19:29 |
|
Ak Gara posted:Also who really cares if your GPU costs 0.02c more per hour to run? I thought this was much more about keeping it within thermal targets. Current AMD reference GPUs are limited by heat more than anything else, and making less heat in simple scenes would let them avoid throttling in complex scenes.
|
# ? Dec 18, 2014 19:29 |
|
Ak Gara posted:Also who really cares if your GPU costs 0.02c more per hour to run? It's not only the cost of the energy but also the heat it puts out that is reduced.
|
# ? Dec 18, 2014 19:30 |
|
calusari posted:AMD is working on a frame rate limiter to get some power savings The hell, hasn't afterburner and the EVGA oc utility done this poo poo on nvidia cards forever?
|
# ? Dec 18, 2014 19:49 |
|
Don Lapre posted:The hell, hasn't afterburner and the EVGA oc utility done this poo poo on nvidia cards forever? I'm just glad it will be a mainstream concept now. Add "dynamic" to it and poof, it's a brand new AMD feature. I wonder what word Nvidia is going to use. Zsync Funonmic Boost 3.2 GPU Excelar8er
|
# ? Dec 18, 2014 19:51 |
|
Wooper posted:Use this thread instead. 1gnoirents posted:You were pointed at the right thread im sure but right now im sure its going to be this Thanks a lot!
|
# ? Dec 18, 2014 19:56 |
|
Angry Fish posted:Thanks a lot! Post what you have (the PSU especially) and you'll get a wider range of advice and options.
|
# ? Dec 18, 2014 20:01 |
|
1gnoirents posted:Post what you have (PSU especially there) and you'll get a wider range of advice and options Doing it piecemeal right now:
- Llano-based APU (A8-3870K), currently drawing ~110 watts when hitting its 3.6 GHz / 1.56 V ceiling
- Sapphire passively-cooled 6670, Dual-Graphics'd
- An FM1 motherboard with an A55 southbridge
- 8 gigs of DDR3-1600 RAM
- Samsung SSD (250 gig)
- 1TB Western Digital at 7200 rpm
- Generic and geriatric DVD drive
- 550W 80+ Gold rated power supply from NZXT, a recent grab

I plan to upgrade the GPU this week, see where I stand on bottlenecking the ancient Llano CPU/APU, and move on to an Intel i7 4790K with that generic Asus x97(?) mobo within ~2 months, throwing the old stuff into another case and handing it to my nephew. I don't think I'll need a new power supply unless I'm going to crossfire another GPU in the future, and I'm pretty sure I won't be doing that.
|
# ? Dec 18, 2014 20:10 |
|
You can run any cpu/gpu combo on that power supply short of a 295x2
|
# ? Dec 18, 2014 20:20 |
|
Lol yeah, I meant post that in parts picking, but yes, you're good to go for a 280x. And yes, the APU will be your bottleneck, significantly in some games nowadays. Well... pretty much everything. Since I happened to just look at some Far Cry 4 benchmarks: you would gain 50%+ fps from a CPU change alone.
1gnoirents fucked around with this message at 20:33 on Dec 18, 2014 |
# ? Dec 18, 2014 20:30 |
|
Don Lapre posted:You can run any cpu/gpu combo on that power supply short of a 295x2 Oh, good. Newegg has this PSU calculator that told me I'm going to need an upgrade, and I figured they were just trying to upsell. Thanks.
|
# ? Dec 18, 2014 20:30 |
|
1gnoirents posted:Lol yeah I meant post that in parts picking, but yes you're good to go for a 280x. And yes the APU will be your bottleneck, significantly in some games nowadays. Well... pretty much everything. I'm not "with it" anymore. What's this parts picker postin'? I play Crusader Kings II and some 2012-13 action/shooter games. I can play the new Wolfenstein at medium settings at 1080p for maybe three hours before the Sapphire decides it's tired of running at 90C and throttles, either regulating its clocks down to a crawl or choking and giving me a BSOD. I can't play Metro Last Light, but that's why I'm getting the new GPU. Is the CPU really that important? The Llano is roughly equal to a Core 2 Quad from 2007/8 in terms of instructions per cycle, and it's clocked a tiny bit higher. It's old but it's not dead.
|
# ? Dec 18, 2014 20:37 |
|
Angry Fish posted:I'm not "with it" anymore. What's this parts picker postin'? Wooper posted:Use this thread instead.
|
# ? Dec 18, 2014 21:01 |
|
Angry Fish posted:Oh, good. Newegg has this PSU calculator that told me I'm going to need an upgrade, and I figured they were just trying to upsell. I run a 550w with a 780ti @ 1254 and a 4790k @ 4.8ghz and it only pulls around 520 from the wall which is probably 490 or so load on the psu itself (at max load)
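The rough math behind the wall-draw vs. PSU-load numbers, for anyone following along (the efficiency figure here is an assumption; 80 Plus units vary with load):

```python
def psu_dc_load(wall_watts, efficiency):
    """DC power actually delivered to the components, given the AC
    draw measured at the wall. The difference becomes heat in the PSU."""
    return wall_watts * efficiency

# 520W at the wall through a unit that's ~92% efficient at that load:
load_w = psu_dc_load(520, 0.92)  # roughly 480W on the PSU itself
```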
|
# ? Dec 18, 2014 21:03 |
|
Don Lapre posted:...a 4790k @ 4.8ghz... How lucky you are. What mobo and BIOS settings are you using?
|
# ? Dec 18, 2014 21:23 |
|
Don Lapre posted:You can run any cpu/gpu combo on that power supply short of a 295x2 This isn't as true as it used to be. With an overclocked R9 290 and a decent OC on a 2500K, I made a 520W 80+ PSU pull 660W from the wall and be electrically noisy enough to trip my arc fault circuit interrupter (AFCI). I replaced it with a 750W 80+ Gold one and it now only draws 570 from the wall and no longer trips the AFCI.
|
# ? Dec 18, 2014 21:36 |
|
I'm pretty sure 4.8 is supposed to be the upper norm for 4790Ks made after the first batch (whatever that was). Very much unlike the 4770K, of course.
|
# ? Dec 18, 2014 21:36 |
|
r0ck0 posted:How lucky you are. What mobo and BIOS settings are you using? Asus Maximus VII Gene, delidded, with an H100i. Stock settings except for voltage; I'd have to look at what voltage I'm running, as I haven't messed with it since I got my 4790K.
|
# ? Dec 18, 2014 21:39 |
|
Twerk from Home posted:This isn't as true as it used to be. With an overclocked R9-290 and a decent OC on a 2500K I made a 520W 80+ PSU pull 660W from the wall and be electrically noisy enough to trigger my arc fault circuit interrupt. I replaced it with a 750W 80+ Gold one and it now only draws 570 from the wall and no longer trips AFCI. You must have been doing some crazy overclocking on the gpu. hardocp's build only pulled 439w from the wall when overclocked with a 4.6ghz 3770k http://www.hardocp.com/article/2014/05/19/asus_radeon_r9_290_directcu_ii_oc_video_card_review/9#.VJM7sdXF-70
|
# ? Dec 18, 2014 21:41 |
|
Don Lapre posted:You must have been doing some crazy overclocking on the gpu. hardocp's build only pulled 439w from the wall when overclocked with a 4.6ghz 3770k 4 hard drives, an SSD, and I'm pretty sure that I was over capacity on that 6-year-old PSU and its efficiency was garbage. The AFCI thing may have also been related to voltage droop; I was only getting ~112V from the wall. Twerk from Home fucked around with this message at 21:55 on Dec 18, 2014 |
# ? Dec 18, 2014 21:52 |
|
That seems really high. With a 290 at stock voltage and a 2600k at 4.6ghz & 1.38v I pull about 420W max from the wall per the kill-a-watt, with 6 hard drives and an SSD. Maybe the oversized gold PSU helps, but I can't imagine it making up that much of a difference (other than running fanless at load which is neat).
|
# ? Dec 18, 2014 22:20 |
|
Yea, the PSU would have to be running at really really low efficiency to do that.
|
# ? Dec 18, 2014 22:23 |
|
Don Lapre posted:Yea, the PSU would have to be running at really really low efficiency to do that. I'll take another look at it tonight. I was reading these numbers from an APC UPS, and I know for a fact that the old 520W was doing something at those load levels that was triggering AFCI left and right, and the new 750W isn't.
|
# ? Dec 18, 2014 22:45 |
|
Don Lapre posted:I run a 550w with a 780ti @ 1254 and a 4790k @ 4.8ghz and it only pulls around 520 from the wall which is probably 490 or so load on the psu itself (at max load) I have to put entirely too much voltage to mine to get past 4.6ghz on my 4770k, but I'm not exactly cpu bottlenecked at my resolution.
|
# ? Dec 19, 2014 00:34 |
|
I'm home now and checked, those power consumption numbers are including my monitor. Boy is my face red.
|
# ? Dec 19, 2014 03:08 |
|
im so mad at you
|
# ? Dec 19, 2014 04:00 |
|
My monitor is exactly 10w at minimum brightness and 20w at max, what do you use that pulls anything substantial?
|
# ? Dec 19, 2014 14:02 |
|
Zero VGS posted:My monitor is exactly 10w at minimum brightness and 20w at max, what do you use that pulls anything substantial? Monoprice Zero-G cheapo 2560x1440 panel. AnandTech saw 61W usage at minimum brightness. I've got a T-amp on there as well for some speakers, but I think its wall wart is only 12V 2A, so it can't be pulling more than 24W.
|
# ? Dec 19, 2014 14:45 |
|
How do you guys manage to measure your power usage levels? You're buying a special tool just to see where you stand?
|
# ? Dec 19, 2014 16:22 |
|
Angry Fish posted:How do you guys manage to measure your power usage levels? You're buying a special tool just to see where you stand? Kill-A-Watt: http://www.amazon.com/P3-P4400-Electricity-Usage-Monitor/dp/B00009MDBU

This one uses non-volatile memory, so it will save the energy usage of a device plugged into it even if the device or the Kill-A-Watt itself is unplugged: http://www.amazon.com/P3-International-P4460-Electricity-Monitor/dp/B000RGF29Q SlayVus fucked around with this message at 16:35 on Dec 19, 2014 |
# ? Dec 19, 2014 16:31 |
|
|
|
Angry Fish posted:How do you guys manage to measure your power usage levels? You're buying a special tool just to see where you stand? You can get a Kill-A-Watt, or lots of UPSes now will show you power usage.
|
# ? Dec 19, 2014 16:50 |