KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.
I've just built a custom case and it's a drat tight fit inside. There's about 3mm of clearance between the back of the card (the side without the heatsink) and the case. I've got a lot of air blowing through, so it should be okay, right?


cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

Sir Unimaginative posted:

It might not be just the driver, but that's probably the cheapest/least-effort thing to check. Reports pretty consistently indicate that dropping to reference clocks doesn't help much (although you might get lucky; give it a week of uptime on 314 and see).

So yeah, I set my card to its reference clock speed and played a bunch of Crysis 3 at very high 1600x900 and BioShock Infinite at ultra 1920x1080, and neither of them crashed. Maybe I'm one of the lucky ones? If anyone has a way to set it to reference speeds permanently, that'd be awesome, because I have to do it every time I boot.
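
On setting reference speeds at every boot: one low-effort route is saving an underclocked profile in MSI Afterburner and applying it at logon. A minimal sketch, assuming Afterburner is installed at its default path, that a reference-clock profile has been saved to slot 1, and that your version supports the -Profile1 command-line switch (check its readme); the path and slot number here are illustrative, not prescriptive.

code:
# reapply_reference_clocks.py - run from the Startup folder at logon.
# Assumes an Afterburner profile with reference clocks saved in slot 1.
import subprocess

AFTERBURNER = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

# -Profile1 tells Afterburner to apply saved profile slot 1 on launch.
subprocess.run([AFTERBURNER, "-Profile1"], check=True)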

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Stanley Pain posted:

There aren't enough single heatsinks for the memory, but you can more than make do with some of the other heatsinks you get in the kit. You also have to use one of the small ones on the memory heatsink closest to the PCIe connector, or else the mounting bracket won't fit. Outside of that I didn't have to "mod" anything.

The installation is very straightforward. My only word of caution would be to make absolutely, positively sure you aren't shorting anything near the VRMs. :)
I think it's probably a good idea to go with copper VRM heatsinks if you're going with an Accelero cooler for one of these cards. I still have the aluminum heatsinks from my kit; I never used them since they're so light. Figured there wasn't much point using the stock ones, since it would've been annoying to swap them out if they turned out to be insufficient. I used a custom Thermalright heatpipe VRM kit for my old 4870, but those are kind of a losing proposition since they're card-specific. I'd probably wait for a custom-cooled 290/X if I went with one, although I never spend more than ~$300 on a card, so I'll probably skip this generation or go with a 280 instead.

veedubfreak
Apr 2, 2005

by Smythe

Sir Unimaginative posted:

[This double post is brought to you by not wanting to cram two subjects in the same reply.]

What's the deal with the two DirectCU II variants of Asus's R9-280X (R9280X-DC2T-3GD5)?

One appears to be a two-slot, 6+8, lateral exhaust and the other (suffix -V2, though allegedly an earlier model) appears to be a three-slot, 8+8, blower exhaust. They seem to be the same price - around a $310 floor - but the V2 is rare (possibly discontinued?). I'm curious why it draws more power when it has a slower memory clock (1500/6000 vs. 1600/6400), and ASUS hasn't actually declared either model's TDP; since the 280X has a 250W TDP on its own, it doesn't seem like the V2 should need 375W of headroom.

Some cases don't actually work well with lateral exhaust because of their design assumptions, but besides that, what is the functional difference between these things?

Lateral fans can be bigger and spin slower while doing the same amount of cooling as a squirrel-cage blower. The drawback is that they exhaust into the case.

As far as memory usage goes, most tests are showing that in single-card setups, 3GB really isn't an issue, because the cards are still not fast enough to turn on the stuff that eats up your memory. But... here's where things change a bit. At super-high resolutions (4K/Eyefinity) and in crossfire, the memory starts to get used. My three-monitor setup is already using over 3GB of memory on my 290 in certain maps.
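
For the power question in the quote above, the PCIe budget math is easy to total up: the slot itself is specified for 75W, a 6-pin connector for 75W, and an 8-pin for 150W. A quick sketch:

code:
# PCIe power budgets per the spec: slot 75 W, 6-pin 75 W, 8-pin 150 W.
CONNECTOR_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_budget(*aux):
    # total available power: slot plus auxiliary connectors
    return CONNECTOR_W["slot"] + sum(CONNECTOR_W[c] for c in aux)

print(board_budget("6-pin", "8-pin"))  # 300 W: the standard 6+8 card
print(board_budget("8-pin", "8-pin"))  # 375 W: the 8+8 V2's ceiling

So the V2's 375W is a connector ceiling, not a draw figure; the second 8-pin most likely buys overclocking headroom rather than implying the card pulls 375W at stock.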

Captain von Trapp
Jan 23, 2006

I don't like it, and I'm sorry I ever had anything to do with it.
I was all set to buy an R9 280x for a new build, and just noticed that different manufacturers have different outputs on their versions of the cards. Some of them seem to have 2 DVI and 1 DisplayPort, others seem to have 1 DVI and 2 DisplayPort, most of them look to have 1 HDMI in addition to the previous kinds, and so on. Does this make much difference? I've never done multi-monitor before and won't be doing it initially, but eventually I'd like to add a second monitor and I'd like to know how my choices would be constrained depending on what I buy.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Captain von Trapp posted:

I was all set to buy an R9 280x for a new build, and just noticed that different manufacturers have different outputs on their versions of the cards. Some of them seem to have 2 DVI and 1 DisplayPort, others seem to have 1 DVI and 2 DisplayPort, most of them look to have 1 HDMI in addition to the previous kinds, and so on. Does this make much difference? I've never done multi-monitor before and won't be doing it initially, but eventually I'd like to add a second monitor and I'd like to know how my choices would be constrained depending on what I buy.

The R9-280X can do up to three monitors on DVI and/or HDMI (two if using non-identical monitors) and up to six monitors off DisplayPort (including daisy-chaining and MST hubs), for a maximum of six monitors total across all connection types. All told, an AMD card can do at least as many monitors as it has plugs, and as many as six if the plugs allow and/or DisplayPort becomes widespread enough. Using an active DisplayPort adapter, you can also use each DisplayPort to connect one HDMI or DVI monitor.

Digital is digital when it comes to monitor signals, so it does not matter whether you connect a screen by HDMI, DVI, or DisplayPort as far as video quality goes. The only questions are whether you want to hook up more than two monitors and which connections those monitors have available/require.

If you wish to carry sound with your video, e.g. to a TV, HDMI and DisplayPort both do this but DVI does not.

The only scenario where this could possibly matter to you with two monitors would be if they were both 2560x1440+ screens. That resolution requires either DisplayPort or DVI-DL (dual-link). DP to DVI-DL active adapters are much more expensive than standard DVI or HDMI adapters, so dual DVI on the card or DisplayPort on the monitor would save the $100 per adapter price tag.

Factory Factory fucked around with this message at 17:09 on Nov 14, 2013
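
The dual-link requirement above comes down to pixel clock: single-link DVI tops out at 165 MHz, and 2560x1440 at 60 Hz needs well beyond that. A back-of-envelope check, using a ~10% blanking overhead as a rough stand-in for reduced-blanking timings (an approximation, not an exact timing calculation):

code:
def pixel_clock_mhz(w, h, hz, blanking=1.10):
    # ~10% blanking overhead approximates CVT reduced-blanking timings
    return w * h * hz * blanking / 1e6

print(pixel_clock_mhz(1920, 1080, 60))  # ~137 MHz: fits single-link DVI (165 MHz cap)
print(pixel_clock_mhz(2560, 1440, 60))  # ~243 MHz: needs dual-link DVI or DisplayPort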

thingul
Mar 2, 2005
There was a sale on an Asus GTX 780 DirectCU II here in Norway last week, which I went for. I got the card today, and after running a few games and benchmarks there was a distinct buzzing/whirring sound coming from my computer. I think most people refer to it as 'coil whine'.

Having read up on this phenomenon online, various posts mentioned that an interaction between the PSU and GPU might be causing this incredibly annoying sound.

Thankfully I had earlier gotten my hands on a mispriced Fractal Design 600W PSU that was intended to go with my upcoming Haswell upgrade.
I quickly popped the covers open, yanked the old 'Mist 1000W' out, and put the new one in. After a few moments of suspense I opened a few games to see if it was any better, and to my joy the noise was all gone.

My case no longer sounds like it's infested by a swarm of mosquitoes; not sure if I got lucky or if my old PSU was really lovely.
On a side note, my internet fixed itself today. Should I take this as a sign that the hardware gods are favoring me right now?

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Sidesaddle Cavalry posted:

I'm definitely okay with waiting and I hope Ghostpilot's friend is okay with waiting too.

He's okay with waiting, barring some great deal on Black Friday or Cyber Monday. :)

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

God drat is it hard to find a 780Ti for sale at the moment :v:

I had EVGA set to auto-notify me and I was going to nab it immediately, but no dice; by the time I got to the page it was out of stock. That whole process took about two minutes. But as soon as I can lock down a purchase, the full chain of events happens - my getting a 780Ti isn't contingent on the rest, the rest is just recouping some of the expense after the fact. Soon as I can find one, it's mine, and my Really Good 780 that I'm foolishly selling like an idiot will be ready to go. I'll be springing for overnight shipping, so once it happens, it can happen fast.

Edit: Question, what does the contingency look like on your end? I'm finding some pretty good evidence of generally fast clockers and may not wait for an aftermarket cooler, since the stock one is a stone-cold badass anyway and there are a variety of regular 780 Ti cards around right now. They're all reference boards anyway, I doubt much binning has gone into them as of yet, and hallelujah, nVidia has a quiet and effective cooler, so... I could do this as soon as today or tomorrow; when would you be able to work things out on your end? Is this a top-down or bottom-up transaction for you - do you have to sell your current card to get the new one, or will either direction be fine?

Let me know and I can act accordingly.

Agreed fucked around with this message at 22:19 on Nov 14, 2013

Captain von Trapp
Jan 23, 2006

I don't like it, and I'm sorry I ever had anything to do with it.

Thanks, that makes sense. Good to know I'll be able to upgrade screens smoothly - at least, whenever I have a few more hundred to blow. :toot:

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Agreed posted:

God drat is it hard to find a 780Ti for sale at the moment :v:

I had EVGA set to auto-notify me and I was going to nab it immediately, but no dice; by the time I got to the page it was out of stock. That whole process took about two minutes. But as soon as I can lock down a purchase, the full chain of events happens - my getting a 780Ti isn't contingent on the rest, the rest is just recouping some of the expense after the fact. Soon as I can find one, it's mine, and my Really Good 780 that I'm foolishly selling like an idiot will be ready to go. I'll be springing for overnight shipping, so once it happens, it can happen fast.

Edit: Question, what does the contingency look like on your end? I'm finding some pretty good evidence of generally fast clockers and may not wait for an aftermarket cooler, since the stock one is a stone-cold badass anyway and there are a variety of regular 780 Ti cards around right now. They're all reference boards anyway, I doubt much binning has gone into them as of yet, and hallelujah, nVidia has a quiet and effective cooler, so... I could do this as soon as today or tomorrow; when would you be able to work things out on your end? Is this a top-down or bottom-up transaction for you - do you have to sell your current card to get the new one, or will either direction be fine?

Let me know and I can act accordingly.

Assuming Sidesaddle Cavalry can do PayPal, it should go quickly on my end. The only real hang time is from his end to yours, due to the banking situation. I have to admit a small bit of trepidation with his being relatively new and all, but he seems like a decent guy.

Oh, by the way, I didn't get around to thanking you earlier, Agreed! Still racking my brain about it (video cards are the hardest purchases for me to justify to myself), but such is gaming at 1440p.

Ghostpilot fucked around with this message at 02:42 on Nov 15, 2013

Animal
Apr 8, 2003

Agreed posted:

big baby and his money

I am so jealous. My plan to get a second 780 was thwarted by buying a brand-new MacBook Pro with a 750M so I can game during my travels. It runs games surprisingly well, some even at "retina" resolution.


...but maybe I should get that second 780 after all so I can do the hack to plug it to the Thunderbolt port and :pcgaming:

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Was away the entire day today, just checking in now. If Agreed is able to do overnight then we can handle this bottom-up. I don't want to get in the way of him getting exactly what he needs though, and I'm going to take tomorrow afternoon to pop my card out and send pics to Ghostpilot, by which time it'll be getting close to the weekend and everything stops again until Monday. What's our decision, then?

e:got my direction reversed

e2: just to make it clear, I'm only a few states up the Mississippi river from Agreed, and I'm confident I can figure that payment service of his out. Also I need to get around to emailing you both, one sec...

vvvv e3: sent, also i need to collapse and then go to work

Sidesaddle Cavalry fucked around with this message at 06:45 on Nov 15, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

EVGA seems to be making them at a pretty healthy rate; I've been getting a notification every day or night - in fact I could hit confirm right now, but I'd like to ensure that everything else is in place first if possible. Email me :) A postal money order is exactly what it sounds like: you go to the post office, tell them you want X amount in a money order, they issue it in money-order form, and you mail it to the recipient. If anything happens to it in the mail, it's automatically insured, and sending it in a standard envelope going Priority from a few states away means I'll likely have it in two days for extremely cheap shipping.

It ain't PayPal, but it's the fastest and safest I can do in PayPal's absence.

Remember, username at gmail.com, make sure the subject says something about SA and the GTX 780 so I don't miss it.

Edit: Email responded to :toot: We'll get it all figured out, and then we'll all have the coolest party ever, with GPUs here and GPUs there and... well that's not much of a party the point is shiny poo poo wooo

Agreed fucked around with this message at 06:51 on Nov 15, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Animal posted:

I am so jealous. My plan to get a second 780 was thwarted by buying a brand-new MacBook Pro with a 750M so I can game during my travels. It runs games surprisingly well, some even at "retina" resolution.


...but maybe I should get that second 780 after all so I can do the hack to plug it to the Thunderbolt port and :pcgaming:

Don't be jealous, pity me: I'd go to such lengths as buying two cards based on the same god damned chip within a single generation just to get a fully unlocked version, hoping I win the OC lottery twice, so that I can have performance like a 690/7990 on one GPU instead of performance that's about 20%-25% lower. I am bad, don't do this, folks.

For anyone else who is bad, giving up my spot in line as it were: evga.com CURRENTLY HAS GTX 780 TI SC ACX units for sale, so if you've been pining for one too, strike while the iron is hot. But at least they're making a bunch of them; this was product-in-stock update number two for me.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



I'm going to have to go and buy the MSI 780 Lightning and join in on this awesome party!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Speaking of other brands, this is really the hot poo poo from what I can tell for GTX 780Ti overclocking:

http://www.amazon.com/Gigabyte-GTX780-GDDR5-3GB-Graphics-GV-N78TOC-3GD/dp/B00GFZPE5A/ref=pd_sim_sbs_pc_3

Those are some smooth features, although adding like 200W for the cooler might be a bit much. But drat, I bet that thing will hit 1300 on air if any card will.

Still going EVGA, that warranty :drat:

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
Email replied to by the way. I had to check it a couple of times due to being half-asleep, but at least I know for sure my buddy's info is correct (just copy / pasted that).

Seems that the dominoes are about set!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Ghostpilot posted:

Email replied to by the way. I had to check it a couple of times due to being half-asleep, but at least I know for sure my buddy's info is correct (just copy / pasted that).

Seems that the dominoes are about set!

This is going to rule. It's like the part where you're climbing the roller-coaster to build up speed and and and :allears:

Edit: Found a better source for this thing being basically a 690/7990 but with one GPU:

http://www.techpowerup.com/reviews/EVGA/GTX_780_Ti_SC_ACX_Cooler/26.html

:aaaaa:

Agreed fucked around with this message at 07:25 on Nov 15, 2013

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

I'm going to make this my last post tonight for real this time, but I look forward to my cries of help on this thread next week when I can't make Agreed's card break a sweat in War Thunder on my monitor no matter how hard I try :neckbeard: (well I guess I could try downsampling but anyways BED)

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Agreed posted:

This is going to rule. It's like the part where you're climbing the roller-coaster to build up speed and and and :allears:

Edit: Found a better source for this thing being basically a 690/7990 but with one GPU:

http://www.techpowerup.com/reviews/EVGA/GTX_780_Ti_SC_ACX_Cooler/26.html

:aaaaa:

You know you'll want to SLI that 780 Ti ;)

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

SourKraut posted:

You know you'll want to SLI that 780 Ti ;)

My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha.

Gonkish
May 19, 2004

Wives < MOAR GRAPHICS. Scientifically proven.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

Agreed posted:

My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha.

Some days I don't know whether to cheer for you or against you

sethsez
Jul 14, 2006

He's soooo dreamy...

Agreed posted:

My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha.

I'm sure it's not the first time you've said this, and I'm equally sure it won't be the last. :)

Stanley Pain
Jun 16, 2001

by Fluffdaddy
So apparently my Corsair AX850 does not have enough power for my 290s in crossfire. Each card works flawlessly by itself; if I put both in, they'll run fine until I start running some stress tests, at which point they'll just black-screen. Going to pick up an AX1200i tonight. My other options are the Seasonic 1250W or 1000W Platinum-rated beasties.


For anyone on the fence about the default heatsinks that come with the Accelero Xtreme III: they are more than adequate for cooling, and having the fan run off 12V cools the VRMs effectively (max temp seen on any VRM was 57C). Running BF4 on ultra (minus MSAA) with the render setting at 150% makes the already stellar-looking game even more stellar :haw:
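
Rough numbers behind why an AX850 can fall over with two 290s: stress tests push each card well past its typical gaming draw. A ballpark sketch; every figure here is an assumption for illustration, not a measurement:

code:
# Assumed worst-case draws, in watts (illustrative only).
loads_w = {
    "R9 290 #1, stress load": 300,
    "R9 290 #2, stress load": 300,
    "overclocked CPU": 150,
    "motherboard/RAM/drives/fans": 75,
}
total = sum(loads_w.values())
print(total)        # 825 W: right at an 850 W unit's limit
print(total / 0.8)  # ~1031 W: rating that keeps the PSU near 80% load

Which is roughly why a 1000-1200W unit is the comfortable choice for this setup.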

dataupload
Nov 4, 2013
I've seen it mentioned a couple of times that 2GB (e.g. on the GTX 670/680) is "not quite enough" anymore. What are these scenarios, and is there any evidence that that amount of VRAM can cause issues?

Wistful of Dollars
Aug 25, 2009

Agreed posted:

My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha.

The lurking potential of G-Sync and the occasional 3GB VRAM bottleneck of the 780 are the biggest things stopping me from hopping over to the 290 team from my 670 right now. I fear my newfound love of 1440p+ will force my hand sooner rather than later.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

dataupload posted:

I've seen it mentioned a couple of times that 2GB (e.g. on the GTX 670/680) is "not quite enough" anymore. What are these scenarios, and is there any evidence that that amount of VRAM can cause issues?

Resolutions above 1080p, high levels of anti-aliasing, etc. can eat through that much VRAM and cause serious performance issues. It'll depend on the game as well.
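
A worked example of the part of VRAM that scales directly with those settings: render targets. Textures usually dominate the total, but multisampled color and depth buffers grow linearly with resolution and sample count, and deferred renderers multiply the cost across several buffers.

code:
def render_targets_mb(w, h, samples, bytes_per_px=4):
    # one multisampled color target plus one multisampled depth target
    return 2 * w * h * samples * bytes_per_px / 2**20

print(render_targets_mb(1920, 1080, 1))  # ~15.8 MB at 1080p, no MSAA
print(render_targets_mb(1920, 1080, 4))  # ~63.3 MB at 1080p with 4x MSAA
print(render_targets_mb(2560, 1440, 4))  # ~112.5 MB at 1440p with 4x MSAA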

veedubfreak
Apr 2, 2005

by Smythe
Got my second 290 installed, but after leak testing it was too late for me to care about putting the case back together and powering it up.




gwarm01
Apr 27, 2010

MWO 30 fps

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai
Looks like there's another reason to not get a 290X. :effortless:

I wonder if this applies to other reference 290s. I don't see why it wouldn't.

If there is any truth to this, that is.

Wistful of Dollars
Aug 25, 2009

veedubfreak posted:

Got my second 290 installed, but after leak testing it was too late for me to care about putting the case back together and powering it up.



How much radiator are you using?

Wistful of Dollars fucked around with this message at 17:48 on Nov 15, 2013

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Rahu X posted:

Looks like there's another reason to not get a 290X. :effortless:

I wonder if this applies to other reference 290s. I don't see why it wouldn't.

If there is any truth to this, that is.

Seems to be true: more people over at OCN are reporting success with it. I was thinking of waiting until custom coolers came around but maybe it would be good to get in now while the getting is good?

Josh Lyman
May 24, 2009


This is probably a stupid question, but with both the PS4 and Xbone using Radeon graphics, should we expect desktop Radeons to give us better performance for the next half dozen years?

edit: v Is that because GeForces have a performance lead or because it just doesn't matter?

Josh Lyman fucked around with this message at 18:24 on Nov 15, 2013

Gwaihir
Dec 8, 2009
Hair Elf
Nope.

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Ghostpilot posted:

I was thinking of waiting until custom coolers came around but maybe it would be good to get in now while the getting is good?

Only if you're willing to get your hands dirty and install a custom cooler later. At least in my opinion.

Besides, any minor differences in performance between the 290 and 290X can be leveled with a nice OC. For example, in most cases, a 290 running at the 290X's default clock speed is pretty much identical to a 290X in terms of performance.

You can see this in that OCN thread. The moderator who managed to get his MSI to work with the flash compared the new BIOS to the original 290 BIOS OC'd to the same clocks as the 290X, and the difference in performance was extremely minimal. While this could also mean that maybe the flash didn't work after all, I'm willing to chalk it up more to the 290 and 290X being so close in performance in the first place.
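
The arithmetic backs this up. Peak shader throughput is stream processors x 2 FLOPs (a fused multiply-add) x clock, and the published shader counts are 2560 for the 290 and 2816 for the 290X; memory bandwidth is identical on both cards. A quick check:

code:
def peak_tflops(shaders, mhz):
    # 2 FLOPs per shader per clock (fused multiply-add)
    return shaders * 2 * mhz * 1e6 / 1e12

print(peak_tflops(2560, 1000))  # 5.12 TFLOPS: a 290 OC'd to 290X clocks
print(peak_tflops(2816, 1000))  # 5.63 TFLOPS: a stock 290X in Uber mode

That ~10% theoretical gap shrinks further in practice wherever the (identical) memory bandwidth is the limit.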

Josh Lyman posted:

This is probably a stupid question, but with both the PS4 and Xbone using Radeon graphics, should we expect desktop Radeons to give us better performance for the next half dozen years?

MAYBE in console ports, but I seriously doubt it.

Besides, NVIDIA would just make up any difference by pushing their GPUs harder. That's what AMD basically does in the face of NVIDIA-optimized games.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Josh Lyman posted:

This is probably a stupid question, but with both the PS4 and Xbone using Radeon graphics, should we expect desktop Radeons to give us better performance for the next half dozen years?

edit: v Is that because GeForces have a performance lead or because it just doesn't matter?

Doesn't matter at all; PC hardware doesn't have a lot to do with consoles. The APIs are different, the hardware is different, and the console GPUs will become antiquated just like the ATI X1600 variant in the Xbox 360 did.

Nvidia doesn't really have a performance lead right now. As they always have, the two manufacturers are using cutthroat pricing, so both sides have good bang-for-buck cards.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Rahu X posted:

Only if you're willing to get your hands dirty and install a custom cooler later. At least in my opinion.

Besides, any minor differences in performance between the 290 and 290X can be leveled with a nice OC. For example, in most cases, a 290 running at the 290X's default clock speed is pretty much identical to a 290X in terms of performance.

You can see this in that OCN thread. The moderator who managed to get his MSI to work with the flash compared the new BIOS to the original 290 BIOS OC'd to the same clocks as the 290X, and the difference in performance was extremely minimal. While this could also mean that maybe the flash didn't work after all, I'm willing to chalk it up more to the 290 and 290X being so close in performance in the first place.


MAYBE in console ports, but I seriously doubt it.

Besides, NVIDIA would just make up any difference by pushing their GPUs harder. That's what AMD basically does in the face of NVIDIA-optimized games.

Turns out that the "Quiet/Uber" switch position matters when flashing the BIOS. Seems that the memory type also matters, as it won't work on cards that have Elpida memory (which apparently is what Sapphire uses exclusively).

You can test which memory your 290 has with [url=http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar]this[/url].

airisom2 posted:

Although I don't have the card, I'm thinking that the flash will only work when it's in the "Uber" mode position, i.e. switched in the direction away from the display inputs. The quiet mode position is the one closest to the display inputs. The 290 has no Uber mode, so the "Uber" position is probably a copy of the "Quiet" mode BIOS.

So, for the people trying this out: try using the Uber-mode 290X BIOS and flashing it with the switch in the Uber position on the 290.

Mustrum posted:

Just did another Valley run with the same clocks. This time 59.1 FPS. The card definitely is a lot faster since I flashed the XFX X BIOS onto the memory bank with the switch TOWARDS the power connectors.

I don't know if it is unlocked shaders or not, but I changed nothing else!

Edit: for those who did not read from the start: I was at 54.5 FPS on stock. It's an XFX 290 with Hynix memory - watercooled and not throttling.

Those who reported it not working: please flash the X BIOS with the switch towards the power plugs and report back. Thank you.
Edited by mustrum - Today at 9:49 am

Ghostpilot fucked around with this message at 19:26 on Nov 15, 2013
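
For anyone following along with the flash itself: the usual workflow is to dump and keep the stock BIOS before programming anything. A hedged sketch; the flag names follow atiflash's commonly documented usage (-i for info, -s to save, -p to program), but verify against your own copy's help output, and the adapter index and .rom filenames here are placeholders:

code:
import subprocess

ADAPTER = "0"  # assumption: single-card system; confirm via the -i listing

# List adapters and confirm which card (and memory vendor) you're touching.
subprocess.run(["atiflash", "-i"], check=True)
# Save the stock BIOS from the currently selected bank before writing anything.
subprocess.run(["atiflash", "-s", ADAPTER, "stock_290_backup.rom"], check=True)
# Program the 290X BIOS onto the bank selected by the physical switch.
subprocess.run(["atiflash", "-p", ADAPTER, "290x.rom"], check=True)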


future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

gwarm01 posted:

MWO 30 fps
That's an awful lot of effort to play a bad game.
