|
I've just built a custom case and it's a drat tight fit inside. There's about 3mm of clearance between the back of the card (the side without the heatsink) and the case. I've got a lot of air blowing through, so it should be okay, right?
|
# ? Nov 14, 2013 11:34 |
|
Sir Unimaginative posted:It might not be just the driver, but that's probably the cheapest/least-effort thing that can be done to check it. Reports pretty consistently point out that dropping to reference clocks doesn't help much (although you might be lucky, but give it a week of uptime on 314 and see.) So yeah, I set my card to its reference clock speed and played a bunch of Crysis 3 at very high 1600x900 and BioShock Infinite at ultra 1920x1080, and neither of them crashed. Maybe I'm one of the lucky ones? If anyone has a way to set it to reference speeds permanently, that'd be awesome, because I have to do it every time I boot.
|
# ? Nov 14, 2013 11:55 |
|
Stanley Pain posted:There aren't enough single heatsinks for the memory, but you can more than make do with some of the other heatsinks you get in the kit. You also have to use one of the small ones for the memory chip that's closest to the PCIe connector, or else the mounting bracket won't fit. Outside of that I didn't have to "mod" anything.
|
# ? Nov 14, 2013 15:16 |
|
Sir Unimaginative posted:[This double post is brought to you by not wanting to cram two subjects in the same reply.] Axial fans can be bigger and spin slower while doing the same amount of cooling as a squirrel-cage blower. The drawback is that they exhaust into the case. As far as memory usage goes, most tests are showing that in single-card setups, 3GB really isn't an issue because the cards are still not fast enough to be able to turn on the stuff that eats up your memory. But...here's where things change a bit. At super high resolutions (4K/Eyefinity) and CrossFire, the memory starts to get used. My 3-monitor setup is using over 3GB of memory on my 290 in certain maps already.
|
# ? Nov 14, 2013 15:45 |
|
I was all set to buy an R9 280x for a new build, and just noticed that different manufacturers have different outputs on their versions of the cards. Some of them seem to have 2 DVI and 1 DisplayPort, others seem to have 1 DVI and 2 DisplayPort, most of them look to have 1 HDMI in addition to the previous kinds, and so on. Does this make much difference? I've never done multi-monitor before and won't be doing it initially, but eventually I'd like to add a second monitor and I'd like to know how my choices would be constrained depending on what I buy.
|
# ? Nov 14, 2013 16:49 |
|
Captain von Trapp posted:I was all set to buy an R9 280X for a new build, and just noticed that different manufacturers have different outputs on their versions of the cards. Some of them seem to have 2 DVI and 1 DisplayPort, others seem to have 1 DVI and 2 DisplayPort, most of them look to have 1 HDMI in addition to the previous kinds, and so on. Does this make much difference? I've never done multi-monitor before and won't be doing it initially, but eventually I'd like to add a second monitor and I'd like to know how my choices would be constrained depending on what I buy. The R9 280X can do up to three monitors on DVI and/or HDMI (two if using non-identical monitors) and up to six monitors off DisplayPort via daisy-chaining and MST hubs, for up to six total monitors across all connection types. All told, an AMD card can do at least as many monitors as it has plugs, and as many as six if the plugs allow and/or DisplayPort becomes widespread enough. Using an active DisplayPort adapter, you can also use each DisplayPort to connect one HDMI or DVI monitor. Digital is digital when it comes to monitor signals, so it does not matter whether you connect a screen by HDMI, DVI, or DisplayPort as far as video quality goes. The only questions are whether you want to hook up more than two monitors and which connections those monitors have available/require. If you wish to carry sound with your video, e.g. to a TV, HDMI and DisplayPort both do this but DVI does not. The only scenario where this could possibly matter to you with two monitors would be if they were both 2560x1440+ screens. That resolution requires either DisplayPort or DVI-DL (dual-link). DP to DVI-DL active adapters are much more expensive than standard DVI or HDMI adapters, so dual DVI on the card or DisplayPort on the monitor would save the $100-per-adapter price tag. Factory Factory fucked around with this message at 17:09 on Nov 14, 2013 |
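The output rules described in that post can be sketched as a toy calculation. This is an illustration only: the function name and the simple cap logic are my own assumptions based on the post, not AMD documentation, and real limits depend on the specific card, driver, and monitors.

```python
# Toy sketch of the AMD multi-monitor rules quoted above (assumption:
# 6-display total cap, one display per DVI/HDMI plug, DisplayPort able
# to fan out via MST hubs / daisy-chaining).

def max_displays(dvi=0, hdmi=0, displayport=0, total_cap=6):
    """Upper bound on simultaneous displays under the quoted rules
    (illustrative only, not a hardware guarantee)."""
    if displayport > 0:
        # With any DisplayPort present, MST can make up the difference
        # up to the card's total display cap.
        return total_cap
    # Without DisplayPort, each DVI/HDMI plug drives at most one display.
    return min(dvi + hdmi, total_cap)

print(max_displays(dvi=2, hdmi=1))                  # 2 DVI + 1 HDMI card
print(max_displays(dvi=1, hdmi=1, displayport=2))   # DP present: MST fan-out
```

So for the buyer's question, any connector mix on a 280X reaches two monitors fine; the connector layout only starts to matter at three or more displays, or with dual-link-DVI-only 1440p monitors.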
# ? Nov 14, 2013 17:05 |
|
There was a sale on an Asus GTX 780 DirectCU II last week here in Norway which I went for. I got the card today, and after running a few games and benchmarks there was a distinct buzzing/whirring sound coming from my computer. I think most people refer to it as 'coil whine'. Having read up on this phenomenon online, various posts mentioned it might be a problem between the PSU and GPU causing this incredibly annoying sound. Thankfully I had gotten my hands on a mispriced Fractal Design 600W PSU earlier, which was intended to go with my coming Haswell upgrade. I quickly popped the covers open, yanked the old 'Mist 1000W' out, and put the new one in. After a few moments of suspense I opened a few games to see if it was any better, and to my joy it was all gone. My cabinet no longer sounded like it was infested by a swarm of mosquitoes; not sure if I got lucky or my old PSU was really lovely. On a side note, my internet fixed itself today. Should I take this as a sign that the hardware gods are favoring me right now?
|
# ? Nov 14, 2013 18:58 |
|
Sidesaddle Cavalry posted:I'm definitely okay with waiting and I hope Ghostpilot's friend is okay with waiting too. He's okay with waiting, barring some great deal on Black Friday or Cyber Monday.
|
# ? Nov 14, 2013 20:17 |
|
God drat is it hard to find a 780Ti for sale at the moment. I had EVGA set to auto-notify me and I was going to nab it immediately, but no dice; by the time I got to the page it was out of stock. This process took about 2 minutes. But as soon as I can lock down a purchase, the full chain of events happens - my getting a 780Ti isn't contingent on the rest, it's just recouping some of the expense afterward. As soon as I can find one, it's mine, and my Really Good 780 that I'm foolishly selling like an idiot will be ready to go. I'll be springing for overnight shipping, so once it happens, it can happen fast. Edit: Question, what does the contingency look like on your end? I'm finding some pretty good evidence of generally fast clockers and may not wait for an aftermarket cooler since the stock one is a stone-cold badass anyway, and there are a variety of regular 780 Ti cards around right now. They're all reference boards anyway; I doubt much binning has gone into them as of yet, and hallelujah, nVidia has a quiet and effective cooler, so... I could do this as soon as today or tomorrow. When would you be able to work things out on your end? Is this a top-down or bottom-up transaction for you - do you have to sell your current card to get the new one, or will either direction be fine? Let me know and I can act accordingly. Agreed fucked around with this message at 22:19 on Nov 14, 2013 |
# ? Nov 14, 2013 21:25 |
|
Thanks, that makes sense. Good to know I'll be able to upgrade screens smoothly - at least, whenever I have a few more hundred to blow.
|
# ? Nov 14, 2013 23:21 |
|
Agreed posted:God drat is it hard to find a 780Ti for sale at the moment Assuming Sidesaddle Cavalry can do PayPal, it should go quickly on my end. The only real hang time is from his end to yours due to the banking situation. I have to admit a small bit of trepidation with his being relatively new and all, but he seems like a decent guy. Oh, by the way, I didn't get around to thanking you earlier, Agreed! Still racking my brain about it (video cards are the hardest purchases for me to justify to myself), but such is gaming at 1440p. Ghostpilot fucked around with this message at 02:42 on Nov 15, 2013 |
# ? Nov 15, 2013 02:33 |
|
Agreed posted:big baby and his money I am so jealous. My plan to get a second 780 was thwarted by buying a brand new Macbook Pro with a 750m so I can game during my travels. It runs games surprisingly well, some even at "retina" resolution. ...but maybe I should get that second 780 after all so I can do the hack to plug it to the Thunderbolt port and
|
# ? Nov 15, 2013 05:57 |
|
Was away the entire day today, just checking in now. If Agreed is able to do overnight then we can handle this bottom-up. I don't want to get in the way of him getting exactly what he needs though, and I'm going to take tomorrow afternoon to pop my card out and send pics to Ghostpilot, by which time it'll be getting close to the weekend and everything stops again until Monday. What's our decision, then? e:got my direction reversed e2: just to make it clear, I'm only a few states up the Mississippi river from Agreed, and I'm confident I can figure that payment service of his out. Also I need to get around to emailing you both, one sec... vvvv e3: sent, also i need to collapse and then go to work Sidesaddle Cavalry fucked around with this message at 06:45 on Nov 15, 2013 |
# ? Nov 15, 2013 06:03 |
|
EVGA seems to be making them at a pretty healthy rate; I'm getting a notification every day or night - in fact I could hit confirm right now, but I'd like to ensure that everything else is in place first if possible. Email me. A postal money order is exactly what it sounds like: you go to the post office, tell them you want X amount in a money order, they put the money into the form of a money order, and you mail it to the recipient. If anything happens to it in the mail, it's automatically insured, and sending it in a standard envelope going Priority from a few states away means I'll likely have it in 2 days for extremely cheap shipping. It ain't PayPal, but it's the fastest and safest I can do in PayPal's absence. Remember, username at gmail.com, and make sure the subject says something about SA and the GTX 780 so I don't miss it. Edit: Email responded to. We'll get it all figured out, and then we'll all have the coolest party ever, with GPUs here and GPUs there and... well, that's not much of a party. The point is shiny poo poo wooo Agreed fucked around with this message at 06:51 on Nov 15, 2013 |
# ? Nov 15, 2013 06:41 |
|
Animal posted:I am so jealous. My plan to get a second 780 was thwarted by buying a brand new Macbook Pro with a 750m so I can game during my travels. It runs games surprisingly well, some even at "retina" resolution. Don't be jealous; pity me, that I'd go to such lengths as buying two cards based on the same god damned chip within a single generation just to get a fully unlocked version, and hope I win the OC lottery twice, so that I can have performance like a 690/7990 on one GPU instead of performance that's about 20%-25% lower. I am bad, don't do this folks. For anyone else who is bad, giving up my spot in line as it were: evga.com CURRENTLY HAS GTX 780 TI SC ACX units for sale, so if you've been pining for one too, strike while the iron is hot. At least they're making a bunch of them; this was product-in-stock update number two for me.
|
# ? Nov 15, 2013 06:57 |
|
I'm going to have to go and buy the MSI 780 Lightning and join in on this awesome party!
|
# ? Nov 15, 2013 06:58 |
|
Speaking of other brands, this is really the hot poo poo from what I can tell for GTX 780Ti overclocking: http://www.amazon.com/Gigabyte-GTX780-GDDR5-3GB-Graphics-GV-N78TOC-3GD/dp/B00GFZPE5A/ref=pd_sim_sbs_pc_3 Those are some smooth features, although adding like 200W for the cooler might be a bit much. But drat, I bet that thing will hit 1300 on air if any card will. Still going EVGA, that warranty
|
# ? Nov 15, 2013 07:14 |
|
Email replied to by the way. I had to check it a couple of times due to being half-asleep, but at least I know for sure my buddy's info is correct (just copy / pasted that). Seems that the dominoes are about set!
|
# ? Nov 15, 2013 07:14 |
|
Ghostpilot posted:Email replied to by the way. I had to check it a couple of times due to being half-asleep, but at least I know for sure my buddy's info is correct (just copy / pasted that). This is going to rule. It's like the part where you're climbing the roller-coaster to build up speed and and and Edit: Found a better source for this thing being basically a 690/7990 but with one GPU: http://www.techpowerup.com/reviews/EVGA/GTX_780_Ti_SC_ACX_Cooler/26.html Agreed fucked around with this message at 07:25 on Nov 15, 2013 |
# ? Nov 15, 2013 07:15 |
|
Agreed and Ghostpilot posted:excitement I'm going to make this my last post tonight for real this time, but I look forward to my cries of help on this thread next week when I can't make Agreed's card break a sweat in War Thunder on my monitor no matter how hard I try (well I guess I could try downsampling but anyways BED)
|
# ? Nov 15, 2013 07:25 |
|
Agreed posted:This is going to rule. It's like the part where you're climbing the roller-coaster to build up speed and and and You know you'll want to SLI that 780 Ti
|
# ? Nov 15, 2013 07:26 |
|
SourKraut posted:You know you'll want to SLI that 780 Ti My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha.
|
# ? Nov 15, 2013 07:28 |
|
Wives < MOAR GRAPHICS. Scientifically proven.
|
# ? Nov 15, 2013 08:07 |
|
Agreed posted:My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha. Some days I don't know whether to cheer for you or against you
|
# ? Nov 15, 2013 14:13 |
|
Agreed posted:My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha. I'm sure it's not the first time you've said this, and I'm equally sure it won't be the last.
|
# ? Nov 15, 2013 14:31 |
|
So apparently my Corsair AX850 does not have enough power for my 290s in CrossFire. Each card works flawlessly by itself; if I put both in, they'll run fine until I start running some stress tests, at which point they'll just black screen. Going to pick up an AX1200i tonight. My other option is the Seasonic 1250W or 1000W Platinum-rated beasties. For anyone on the fence about the default heatsinks that come with the Accelero Xtreme III: they are more than adequate for cooling, and having the fan run off 12V cools the VRMs effectively (max temp seen on any VRM was 57C). Running BF4 on ultra (minus MSAA) with the render setting at 150% makes the already stellar-looking game even more stellar.
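The AX850 result above is roughly what a back-of-the-envelope power budget predicts. The wattages below are ballpark assumptions (stock R9 290 board power, a guessed CPU and platform draw), not measurements from the post, and the 80% loading margin is a common rule of thumb rather than a Corsair specification:

```python
# Rough PSU headroom check: why an 850 W unit can fall short with two
# R9 290s under stress tests. All component wattages are assumptions.

PARTS_W = {
    "R9 290 #1": 275,                    # ~stock board power; spikes higher
    "R9 290 #2": 275,
    "CPU (overclocked)": 150,
    "motherboard/RAM/drives/fans": 100,
}

def headroom(psu_watts, parts=PARTS_W, margin=0.8):
    """Watts left after loading the PSU to `margin` of its rating
    (running a PSU near 100% continuously is not advisable)."""
    usable = psu_watts * margin
    return usable - sum(parts.values())

print(headroom(850))    # negative: over budget at 80% loading
print(headroom(1200))   # comfortable headroom for the AX1200i route
```

Stress tests like FurMark or the BF4 150% render setting push both cards toward their power limits simultaneously, which is exactly when a borderline unit black-screens even though either card alone runs fine.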
|
# ? Nov 15, 2013 14:38 |
|
I've seen it mentioned a couple of times that 2GB (e.g. with the GTX 670/680) is "not quite enough" anymore. What are these scenarios, and is there any evidence that such an amount of VRAM can cause issues?
|
# ? Nov 15, 2013 14:52 |
|
Agreed posted:My wife would kill me with a 780Ti. After this, I'm pretty much bound to wait for Maxwell's 20nm shrink to get another card. But a 30%-ish increase over my 780 is amazing and coupled with a G-Sync monitor I really don't think I'm going to be moaning too much about how slow my graphics are for at LEAST a year, haha. The lurking potential of G-Sync and the occasional 3GB VRAM bottleneck of the 780 are the biggest things stopping me from hopping over to the 290 team from my 670 right now. I fear my newfound love of 1440p+ will force my hand sooner rather than later.
|
# ? Nov 15, 2013 15:17 |
|
dataupload posted:I've seen it mentioned a couple of times about 2GB (e.g. with the GTX 670/680) "not being quite enough" any-more. What are these scenarios and is there any evidence that such an amount of VRam can cause issues? Resolutions above 1080p, or high levels of anti-aliasing, etc. can cause serious performance issues once you exceed the card's VRAM, since the driver starts swapping over PCIe. It'll depend on the game as well.
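Some rough arithmetic shows why resolution and MSAA are the levers here. This is a sketch only: the buffer count and bytes-per-pixel are simplifying assumptions, and in real games textures usually dominate VRAM, so treat these numbers as the floor that grows with resolution, not a total.

```python
# Back-of-the-envelope VRAM math for render targets, sketching why
# >1080p and MSAA eat into a 2 GB budget. Assumptions: 32-bit color,
# three buffers (e.g. double-buffered swapchain + depth); textures and
# geometry come on top of this and usually dominate.

def render_target_mb(width, height, bytes_per_px=4, msaa=1, buffers=3):
    """Approximate MB consumed by screen-sized buffers."""
    return width * height * bytes_per_px * msaa * buffers / (1024 ** 2)

for (w, h), msaa in [((1920, 1080), 1), ((2560, 1440), 4), ((5760, 1080), 4)]:
    mb = render_target_mb(w, h, msaa=msaa)
    print(f"{w}x{h} @ {msaa}x MSAA: ~{mb:.0f} MB in screen buffers alone")
```

Even before textures, 1440p with 4x MSAA needs roughly 7x the buffer memory of plain 1080p, which is why the same 2GB card that is fine at 1080p can hitch badly at higher resolutions or AA levels.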
|
# ? Nov 15, 2013 15:18 |
|
Got my 2nd 290 installed. But after leak testing it was too late for me to care about putting the case back together and powering it up.
|
# ? Nov 15, 2013 15:25 |
|
MWO 30 fps
|
# ? Nov 15, 2013 16:15 |
|
Looks like there's another reason to not get a 290X. I wonder if this applies to other reference 290s. I don't see why it wouldn't. If there is any truth to this, that is.
|
# ? Nov 15, 2013 16:52 |
|
veedubfreak posted:Got my 2nd 290 installed. But after leak testing it was too late for me to care about putting the case back together and powering it up. How much radiator are you using? Wistful of Dollars fucked around with this message at 17:48 on Nov 15, 2013 |
# ? Nov 15, 2013 17:12 |
|
Rahu X posted:Looks like there's another reason to not get a 290X. Seems to be true: more people over at OCN are reporting success with it. I was thinking of waiting until custom coolers came around but maybe it would be good to get in now while the getting is good?
|
# ? Nov 15, 2013 18:09 |
|
This is probably a stupid question, but with both the PS4 and Xbone using Radeon graphics, should we expect desktop Radeons to give us better performance for the next half dozen years? edit: v Is that because GeForces have a performance lead or because it just doesn't matter? Josh Lyman fucked around with this message at 18:24 on Nov 15, 2013 |
# ? Nov 15, 2013 18:14 |
|
Nope.
|
# ? Nov 15, 2013 18:20 |
|
Ghostpilot posted:I was thinking of waiting until custom coolers came around but maybe it would be good to get in now while the getting is good? Only if you're willing to get your hands dirty and install a custom cooler later. At least in my opinion. Besides, any minor differences in performance between the 290 and 290X can be leveled with a nice OC. For example, in most cases, a 290 running at the 290X's default clock speed is pretty much identical to a 290X in terms of performance. You can see this in that OCN thread. The moderator who managed to get his MSI to work with the flash compared the new BIOS to the original 290 BIOS OC'd to the same clocks as the 290X, and the difference in performance was extremely minimal. While this could also mean that maybe the flash didn't work after all, I'm willing to chalk it up more to the 290 and 290X being so close in performance in the first place. Josh Lyman posted:This is probably a stupid question, but with both the PS4 and Xbone using Radeon graphics, should we expect desktop Radeons to give us better performance for the next half dozen years? MAYBE in console ports, but I seriously doubt it. Besides, NVIDIA would just make up any difference by pushing their GPUs harder. That's what AMD basically does in the face of NVIDIA-optimized games.
|
# ? Nov 15, 2013 18:29 |
|
Josh Lyman posted:This is probably a stupid question, but with both the PS4 and Xbone using Radeon graphics, should we expect desktop Radeons to give us better performance for the next half dozen years? Doesn't matter at all, PC hardware doesn't have a lot to do with consoles. The APIs are different, the hardware is different, and the console GPUs will become antiquated just like the ATI X1600 variant in the Xbox 360 did. Nvidia doesn't really have a performance lead right now. As they always have before, the two manufacturers are using cutthroat pricing so that both parties have good bang-for-buck cards.
|
# ? Nov 15, 2013 18:32 |
|
Rahu X posted:Only if you're willing to get your hands dirty and install a custom cooler later. At least in my opinion. Turns out that the "Quiet / Uber" switch position matters when flashing the BIOS. Seems that the memory type also matters, as it won't work on cards that have Elpida memory (which apparently is what Sapphire uses exclusively). You can test which memory your 290 has with [url=http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar]this[/url]. airisom2 posted:Although I don't have the card, I'm thinking that the flash will only work when it's in the "Uber" mode position, when you switch it in the direction away from the display inputs. The quiet mode is the one closest to the display inputs. The 290 has no uber mode, so the "Uber" position is probably a copy of the "Quiet" mode bios. Mustrum posted:Just did another valley run with same clocks. This time 59,1 FPS. The card definatly is a lot fastr since i flashed the XFX X bios on the memory bank with the switch TOWARDS the power connectors. Ghostpilot fucked around with this message at 19:26 on Nov 15, 2013 |
# ? Nov 15, 2013 18:59 |
|
gwarm01 posted:MWO 30 fps
|
# ? Nov 15, 2013 18:59 |