Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HappyCapybaraFamily posted:

I recently upgraded to a GTX 760 and am using my old GTX 480 as a dedicated PhysX card (and electric bill increaser). I obviously have more dollars than sense, and now I am thinking of replacing the GTX 480 with a used GTX 650 Ti for PhysX duty, but will I see a performance increase? I will at least see my electric bill go down a dollar or so a month, probably, and gaming won't give me heat stroke in the Texas summer, at least.

I went from a GTX 580 to a 650 Ti for PhysX and have not seen a performance decrease, but I also haven't seen a performance increase. About the only PhysX workload I've found that can even really get the card to stretch its legs is FluidMark, and that's sort of to PhysX what Prime95 or whatever is to CPU overclocking: it's got a mode specifically made to run the card as hard as possible. There, my GTX 650 Ti looks like a puny lil dude compared to my 780 Ti, but that's to be expected.

In games, the only difference going from a 580 to a 650Ti was power savings. You could probably save even more by going with one of the Maxwell low power cards that have come out recently.

PhysX in games is not a hard job for cards to do; you'd have to have a pretty much bottom of the barrel card for it to suck at PhysX.

HappyCapybaraFamily
Sep 16, 2009


Roger Baolong Thunder Dragon has been fascinated by this sophisticated and scientifically beautiful industry since childhood, and has shown his talent in the design and manufacture of watches.
Thanks for the advice! I would definitely go for the 750 if I could find it used for the prices I'm seeing for a used 650 (less than $100).

Though taking into account the power savings, it might be worth it. v:shobon:v

HappyCapybaraFamily fucked around with this message at 16:51 on Mar 5, 2014

Ignoarints
Nov 26, 2010
After like 2 hours of loving with vsync settings for SLI, simply capping the FPS at 59 and turning it all off is far superior to vsync (input lag), vsync smooth (random drops to 30 fps, input lag), and adaptive (utter garbage). Just tested for BF4, but I wonder why that isn't just an option to begin with.

veedubfreak
Apr 2, 2005

by Smythe

BurritoJustice posted:

If you can wait and want to go crazy with an overbuilt digital power supply, Corsair announced their upcoming AX1500i recently. It's even 80 Plus Titanium so you can save on those power bills :shepface:

This might be an option, especially since no one has any of the drat PSUs I'm looking at in stock at MSRP. If anything I think I'm going with the EVGA 1300, assuming I can find it in stock, because it's only $229 at EVGA. gently caress Newegg and their price gouging; they want $359 for it. As for the current power supply being worn out, I have only had it in service for the last year or so. It literally would not power on with the EVGA X58 board as it thought there was a fault. So it sat on a shelf for 3ish years. Cool thing is, I can actually disable the third card on my board using the DIP switches until I get a better PSU.

Also, Titanfall comes out next Tuesday. Wewt.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Ignoarints posted:

After like 2 hours of loving with vsync settings for SLI, simply capping the FPS at 59 and turning it all off is far superior to vsync (input lag), vsync smooth (random drops to 30 fps, input lag), and adaptive (utter garbage). Just tested for BF4, but I wonder why that isn't just an option to begin with.

I'm always hearing this back and forth about whether you're supposed to cap FPS at 59 or at 60. Some say 59 because the last frame is part of the next second or something, but that doesn't sound right to me. What's the official word for what I should be doing in Precision X?

Ignoarints
Nov 26, 2010

Zero VGS posted:

I'm always hearing this back and forth about whether you're supposed to cap FPS at 59 or at 60. Some say 59 because the last frame is part of the next second or something, but that doesn't sound right to me. What's the official word for what I should be doing in Precision X?

I set 59 just to have it below 60 for vsync, based on the internet. Since vsync is off I'm not sure it matters if it's 60 or 59. I know for a fact I wouldn't notice the difference though, and if there is any chance for it to cause a problem it isn't worth it, of course. I'll try it out though. One thing I didn't take into account is how much less work my GPUs are doing now, whereas vsync was intense on them for a worse result. I'm going to start bumping my memory clock now.

I've heard 59, 60, and 61 for vsync off actually
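
For what it's worth, the actual difference between those caps is tiny:

code:
1000 ms / 59 = ~16.95 ms per frame
1000 ms / 60 = ~16.67 ms per frame
1000 ms / 61 = ~16.39 ms per frame

As I understand it, the argument for 59 is just to keep the cap a hair under a 60Hz refresh so the limiter never fights vsync, not that a quarter of a millisecond per frame is something anyone can feel.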

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf

Ignoarints posted:

I set 59 just to have it below 60 for vsync, based on the internet. Since vsync is off I'm not sure it matters if it's 60 or 59. I know for a fact I wouldn't notice the difference though, and if there is any chance for it to cause a problem it isn't worth it, of course. I'll try it out though. One thing I didn't take into account is how much less work my GPUs are doing now, whereas vsync was intense on them for a worse result. I'm going to start bumping my memory clock now.

I've heard 59, 60, and 61 for vsync off actually

How do you limit the frame rate? Is it a third party app or something you can do in the nvidia control panel?

Ignoarints
Nov 26, 2010

r0ck0 posted:

How do you limit the frame rate? Is it a third party app or something you can do in the nvidia control panel?

I'm sure there are other ways to do it, but I just created "user.cfg" in the main BF4 directory and added "gametime.maxvariablefps 59"
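
In case anyone wants to copy it, the whole file is literally just that one line:

code:
gametime.maxvariablefps 59

If I remember right the same command also works typed into the in-game console, but the cfg file applies it automatically every time the game launches.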

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I use EVGA Precision X to limit frames; if I recall it's the only one that works on both 32- and 64-bit games. Stupidly, it only goes up to 120 when plenty of monitors do 144Hz.

Edit: enabling Vsync usually caps any game to 60fps as well, but with Vsync off a lot of cards will stupidly try to push 200+ fps in Call of Duty, etc.

Zero VGS fucked around with this message at 18:04 on Mar 5, 2014

Ignoarints
Nov 26, 2010

Zero VGS posted:

I use EVGA Precision X to limit frames; if I recall it's the only one that works on both 32- and 64-bit games. Stupidly, it only goes up to 120 when plenty of monitors do 144Hz.

Edit: enabling Vsync usually caps any game to 60fps as well, but with Vsync off a lot of cards will stupidly try to push 200+ fps in Call of Duty, etc.

I was getting 120-180 fps with some occasional stutters with vsync off in BF4, so that's what led me to give vsync a try. Immediately impressed, then immediately disappointed. I'm just glad frame limiting accomplished the exact same thing without any of the consequences so far.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Yeah, if you can hit a solid 60fps you want to frame limit; Vsync will add input delay and lower the overall framerate.

Frame limiting is absolutely awesome for SLI / Crossfire; I like to adjust the settings so I'm getting 60 fps with both cards at about 75% load. That way the cards are ready to crank themselves up and prevent any frame drops, but they're also comparatively cooler/quieter together than a single more powerful card would be.
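
If anyone's curious what a limiter is actually doing, under the hood it's basically just a pacing loop. A toy sketch in Python (illustrative only, obviously not Precision X's actual code):

code:
import time

TARGET_FPS = 59
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.95 ms budget per frame

def limited_loop(render_frame):
    deadline = time.perf_counter()
    while True:
        render_frame()                      # simulate + draw one frame
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # idling here is why GPU load drops
        else:
            deadline = time.perf_counter()  # running behind; don't try to catch up

That sleep is exactly the headroom I'm talking about: the cards sit partly idle every frame, so they have slack to absorb a heavy scene without dropping below the cap.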

Ignoarints
Nov 26, 2010

Zero VGS posted:

Yeah, if you can hit a solid 60fps you want to frame limit; Vsync will add input delay and lower the overall framerate.

Frame limiting is absolutely awesome for SLI / Crossfire; I like to adjust the settings so I'm getting 60 fps with both cards at about 75% load. That way the cards are ready to crank themselves up and prevent any frame drops, but they're also comparatively cooler/quieter together than a single more powerful card would be.

That's interesting, do you mean in-game settings?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Right, I set the frame limit in Precision X, then I lower/raise the game settings to hit 60 FPS at around 75% load.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Zero VGS posted:

Frame limiting is absolutely awesome for SLI / Crossfire

Vsync is pretty much required for me with Crossfire. The frame pacing fix did gently caress all for me; it is still plagued by microstuttering.

FAT32 SHAMER
Aug 16, 2012



Right now I have an MSI Radeon HD 6850 Cyclone, but I'm looking to upgrade so that I can play BF4/new games without much issue, since I've had this GPU for four years and it's starting to show. Which would be a better upgrade: buying a second matching GPU, or buying one different GPU to replace the 6850? My budget will probably be $250 max, and if I remember right, most games can't really use Crossfire very well anyway...

Ignoarints
Nov 26, 2010

Jan posted:

Vsync is pretty much required for me with Crossfire. The frame pacing fix did gently caress all for me; it is still plagued by microstuttering.

I know it's different tech and all, but have you tried vsync off and setting a frame limit? Basically all the info online talks about combining the two, but very little suggests trying just the one. Frame limiting plus vsync was no different for me than plain vsync, contrary to the improvements a lot of people have reported. I tried to convince myself the input lag was in my head, but I just couldn't get over it.

Tusen Takk posted:

Right now I have an MSI Radeon HD 6850 Cyclone, but I'm looking to upgrade so that I can play BF4/new games without much issue, since I've had this GPU for four years and it's starting to show. Which would be a better upgrade: buying a second matching GPU, or buying one different GPU to replace the 6850? My budget will probably be $250 max, and if I remember right, most games can't really use Crossfire very well anyway...


Since you will still be limited to 1GB of VRAM, I don't think it would be worth Crossfiring that. The best card I can think of for $250 is the GTX 760.

Ignoarints fucked around with this message at 22:50 on Mar 5, 2014

FAT32 SHAMER
Aug 16, 2012



Ignoarints posted:

Since you will still be limited to 1GB of VRAM, I don't think it would be worth Crossfiring that. The best card I can think of for $250 is the GTX 760.

Good stuff, cheers!

Now, I've never swapped out video cards before, so is there an easy way to uninstall the old drivers and install the new ones? I'm pretty sure video cards are plug and play so it should install the drivers on its own once the system is fired up, right?

Nephilm
Jun 11, 2009

by Lowtax

Tusen Takk posted:

Right now I have an MSI Radeon HD 6850 Cyclone, but I'm looking to upgrade so that I can play BF4/new games without much issue, since I've had this GPU for four years and it's starting to show. Which would be a better upgrade: buying a second matching GPU, or buying one different GPU to replace the 6850? My budget will probably be $250 max, and if I remember right, most games can't really use Crossfire very well anyway...

Ignoarints posted:

Since you will still be limited to 1GB of VRAM, I don't think it would be worth Crossfiring that. The best card I can think of for $250 is the GTX 760.

Echoing that and adding that yours is a question for the parts picking thread.

FAT32 SHAMER
Aug 16, 2012



Nephilm posted:

Echoing that and adding that yours is a question for the parts picking thread.

Woops, thanks for the heads up

Ignoarints
Nov 26, 2010

Tusen Takk posted:

Good stuff, cheers!

Now, I've never swapped out video cards before, so is there an easy way to uninstall the old drivers and install the new ones? I'm pretty sure video cards are plug and play so it should install the drivers on its own once the system is fired up, right?

You should always download the latest drivers from Nvidia (or AMD, whichever you get); don't use the CD. It will give you a picture when you boot, but that's it (if you just put a new card in and nothing else). I'm pretty sure Windows Update will get you drivers, but you really should get them from the manufacturer's website. When I swap cards I always download the new drivers first, completely remove the old drivers (sometimes this isn't the simplest thing), then install the new card, and then apply the new drivers.

And just for options' sake, the R9 270X is similar to the 760, but since you specifically said Battlefield 4, I have never seen a benchmark that shows the 270X doing better head to head. Plus it's generally a little more money. Others probably know better than me on AMD cards, but yeah, I guess parts picking thread for that stuff.

Ignoarints fucked around with this message at 23:02 on Mar 5, 2014

veedubfreak
Apr 2, 2005

by Smythe
Just a reminder that Newegg is a bunch of greedy fucks.
http://www.newegg.com/Product/Product.aspx?Item=N82E16817438011
http://www.evga.com/Products/Product.aspx?pn=120-G2-1300-XR

FAT32 SHAMER
Aug 16, 2012



Oh sweetness I just realised I get 35% off EVGA products.

:getin:

Ignoarints
Nov 26, 2010

Tusen Takk posted:

Oh sweetness I just realised I get 35% off EVGA products.

:getin:

dude wut

veedubfreak
Apr 2, 2005

by Smythe
Do tell. I'm just waiting for them to come back in stock at EVGA.

FAT32 SHAMER
Aug 16, 2012




The company I work for gets employee discounts on basically everything under the sun (I get 50% off Speck cases and a shitload of free OS X software). One of the discounts is for EVGA. I guess it's up to 35% off, but still, I can get this card for $235.

Ignoarints
Nov 26, 2010

http://www.evga.com/Products/Product.aspx?pn=04G-P4-3776-KR

-35% (off the ~$410 list price) is $266.49 - $20 MIR = $246.49 :lol: there you go. Gonna guess it's going to be out of place in a 4-year-old computer.

edit: ^^ oh ah. Can you pull off a 770 of any flavor for $250? Still, that 760 is certainly better than any other 760 you'll get for $235. Also gonna have plenty of VRAM, to say the least.

Ignoarints fucked around with this message at 23:16 on Mar 5, 2014

FAT32 SHAMER
Aug 16, 2012



I emailed our rep at EVGA to ask for a full list of what's available, since the employee portal seems to be hosed. I'll report back as soon as I know.

Whatever card I end up with is going to make my Phenom X4 955 feel really silly (I hate my CPU)

Ignoarints
Nov 26, 2010

Tusen Takk posted:

I emailed our rep at EVGA to ask for a full list of what's available, since the employee portal seems to be hosed. I'll report back as soon as I know.

Whatever card I end up with is going to make my Phenom X4 955 feel really silly (I hate my CPU)

Hey man... I got a lot of mileage out of my 965 BE :P. Overclock it if you haven't; it makes a huge difference for online multiplayer, especially if your GPU is no longer the bottleneck.

In fact, I got up to 60 or 70 fps in BF4 at ultra with 0x MSAA on that AMD. It would dip into the 40s sometimes, but I was impressed. That was with a GPU with less memory bandwidth, too. Getting a better CPU certainly helped keep it out of the 40s and 50s, though. That was at 3.8 GHz I believe, a fairly mild overclock for it.

Ignoarints fucked around with this message at 23:31 on Mar 5, 2014

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
That's the same price as the 4GB 760 "B-Stock" on EVGA's site. If you can apply the discount to B-Stock then that'd be really insane.

Edit: I kinda really want a part-time job at Microcenter just to be able to buy graphics cards at cost.

FAT32 SHAMER
Aug 16, 2012



Ignoarints posted:

Hey man... I got a lot of mileage out of my 965 BE :P. Overclock it if you haven't; it makes a huge difference for online multiplayer, especially if your GPU is no longer the bottleneck.

In fact, I got up to 60 or 70 fps in BF4 at ultra with 0x MSAA on that AMD. It would dip into the 40s sometimes, but I was impressed. That was with a GPU with less memory bandwidth, too. Getting a better CPU certainly helped keep it out of the 40s and 50s, though. That was at 3.8 GHz I believe, a fairly mild overclock for it.
Only thing about OC'ing it would be needing a way better heatsink; otherwise I guess it'd be doable, though I feel like this CPU is made of glass based on the number of times it's locked up on me in the past.

Zero VGS posted:

That's the same price as the 4GB 760 "B-Stock" on EVGA's site. If you can apply the discount to B-Stock then that'd be really insane.

Edit: I kinda really want a part-time job at Microcenter just to be able to buy graphics cards at cost.

When I leave my company for a big boy programming job I may try to get a job at MC just for that (unless they, too, have an "all your code are belong to us" contract for you to sign. Doubtful, though).

Ignoarints
Nov 26, 2010

Tusen Takk posted:

Only thing about OC'ing it would be needing a way better heatsink; otherwise I guess it'd be doable, though I feel like this CPU is made of glass based on the number of times it's locked up on me in the past.



I had a Hyper Evo on it at one point and that was more than enough. It ran really cool. I also had something bigger on it, but it was unnecessary. I never had any kind of lockups with mine, stock or overclocked; there is probably something else going on there. But this is definitely for another thread now lol

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

veedubfreak posted:

Do tell. I'm just waiting for them to come back in stock at EVGA.

Honestly, it seems like it would make more sense just to get two somewhat smaller, high-efficiency power supplies. It's not a huge deal, but bumping up the efficiency would also give you a bit more room on your circuit before you pop a breaker. As an example, the Rosewill Tachyon 1000W is $199.99 at Newegg (though currently out of stock), and while Rosewill isn't a high-end brand, the Tachyon is a decent SuperFlower model with good component selection and build quality. I think one of these should power two R9 290Xs alone no matter how hard you push them.
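
Rough math on the breaker point, assuming something like a 1000W DC load on the cards and CPU:

code:
at ~90% efficiency (80 Plus Gold at half load):     1000 / 0.90 = ~1111 W from the wall
at ~94% efficiency (80 Plus Titanium at half load): 1000 / 0.94 = ~1064 W from the wall

On a standard US 15A/120V circuit (1800W) that difference is real headroom for everything else plugged into the same breaker.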

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Ignoarints posted:

I know it's different tech and all, but have you tried vsync off and setting a frame limit?

I certainly did. But there's a reason Vsync has been the best way (and only way, for a while) to prevent micro stuttering in Crossfire configurations. It's pretty much the only way to guarantee that one of the cards won't be issuing a runt frame.

To be fair, I think that my issue isn't so much Crossfire itself as my motherboard's hilariously uneven PCIe bandwidth. As I've mentioned earlier in the thread, the secondary PCIe slot goes through the PCH and so only runs at x4 1.1... Alongside the primary PCIe slot at x16 2.0. Given the fact that the micro stuttering (and worse stalls in general) normally happens in situations when new data gets streamed in, I wouldn't be surprised if Crossfire is somehow failing to account for that disparity. At least Vsync seems to force them in lockstep.
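
Back-of-the-envelope, that's roughly an 8x disparity (per-lane rates from the PCIe 1.x and 2.0 specs):

code:
PCIe 1.1 x4:   4 lanes x 250 MB/s = ~1 GB/s
PCIe 2.0 x16: 16 lanes x 500 MB/s = ~8 GB/s

So it's no wonder the stalls show up exactly when new data gets streamed in.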

I don't mind, as I've never even noticed the dreaded input lag that apparently comes with Vsync.

GWBBQ
Jan 2, 2005


Zero VGS posted:

Yeah, you might take a look and see if you need to take a can of duster to it and/or redo the paste. Duster and paste are both really cheap so you're not out much and you might keep the thing alive longer.

I thought I had cleaned it sufficiently with a few blasts of air from one side and the vacuum up to the exhaust a few months ago, but I popped it apart to redo the thermal paste and :psypop:

The aluminum fins on the heatsink were so clogged with dust that I had to go at them from both sides 3 or 4 times just to get the dust bunnies out, and I went through half a can of air cleaning the whole thing. The thermal paste was pitted and brittle; I'm not sure what the stuff looks like when it overheats, but I'll bet that's what happened. I was getting 12fps at 99°C on the FurMark burn-in test before; I'm getting 19fps at 69°C now.

I'm going to be a lot more proactive about cleaning my computer now and at least check the heat sink I can't see into with a small mirror next time.

FAT32 SHAMER
Aug 16, 2012



Daww, bummer. I guess the discount doesn't apply to the 760/770 series due to extremely high demand :(

FAT32 SHAMER fucked around with this message at 05:51 on Mar 6, 2014

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
So buy a 780ti

veedubfreak
Apr 2, 2005

by Smythe

Alereon posted:

Honestly, it seems like it would make more sense just to get two somewhat smaller, high-efficiency power supplies. It's not a huge deal, but bumping up the efficiency would also give you a bit more room on your circuit before you pop a breaker. As an example, the Rosewill Tachyon 1000W is $199.99 at Newegg (though currently out of stock), and while Rosewill isn't a high-end brand, the Tachyon is a decent SuperFlower model with good component selection and build quality. I think one of these should power two R9 290Xs alone no matter how hard you push them.

The EVGA is pretty much 90% efficient. I don't want to deal with the hassle of putting multiple PSUs in my case. Plus I'll be able to sell this one to recoup a bit of my cost.

Ignoarints
Nov 26, 2010
... then sell it, don't tell anyone, add your $250 to the profit and get something nice


or put it towards computer parts


or just get the 760 lol

norg
Jul 5, 2006

Factory Factory posted:

I used a Dwood bracket, which is functionally identical to a G10, and I sawed off part of it with a Dremel.



If you have a windowed side panel, you will need a low-profile fan, too. A vented side panel will work fine with the G10's stock 25mm-thick fan. I actually replaced my panel's window with some mesh from a cheap letter holder, so that's a third option, I guess.

OK, I'm actually gonna do this, I think. I have a vented side panel so the stock fan should work. Is that fan OK, or is it better to replace it anyway? And is the VRAM cooled adequately with this setup?

Ignoarints
Nov 26, 2010

Jan posted:

I certainly did. But there's a reason Vsync has been the best way (and only way, for a while) to prevent micro stuttering in Crossfire configurations. It's pretty much the only way to guarantee that one of the cards won't be issuing a runt frame.

To be fair, I think that my issue isn't so much Crossfire itself as my motherboard's hilariously uneven PCIe bandwidth. As I've mentioned earlier in the thread, the secondary PCIe slot goes through the PCH and so only runs at x4 1.1... Alongside the primary PCIe slot at x16 2.0. Given the fact that the micro stuttering (and worse stalls in general) normally happens in situations when new data gets streamed in, I wouldn't be surprised if Crossfire is somehow failing to account for that disparity. At least Vsync seems to force them in lockstep.

I don't mind, as I've never even noticed the dreaded input lag that apparently comes with Vsync.

Ah yeah, I guess so then. I have two PCIe 3.0 slots, but they both go down to x8 if both slots are used. I don't think it's a bottleneck for me though.

I set the FPS cap to 60 yesterday, not expecting much, and it was actually smoother than 59 fps in that there were virtually no stutters, but I experienced obvious, reproducible tearing for the first time. Also, very occasionally it would tank to 30 fps or something for a second regardless of how much or little was going on - something that would never happen with everything off (or at 59 fps) but would happen constantly with most vsync modes. I dunno, confusing. I'll just leave it at 59.
