|
To distract from the Haswell chat, here's a bit of a dumb question: I have a Radeon card with Catalyst 12.8 installed and am upgrading to a newer Radeon this weekend. Will I need to do anything other than just swap the cards in and out of the case?
|
# ¿ Sep 13, 2012 00:20 |
|
Speaking of 7850s, got mine up and running last night, though it didn't go as smoothly as I was expecting - swapped in the new card and Windows said that my driver was invalid, so I uninstalled and reinstalled Catalyst 12.8 and then it was fine. Thing is a beast (to me) and what really blows me away is how efficient it is compared to my 6850 - way more power at similar wattage, and it runs about 20 degrees cooler. Plus, even though it has dual fans (on Sapphire's card at least), I swear it's slightly quieter than my 6850 was. In related news, while I doubt anyone in this thread is interested, I'll be throwing my lightly-used 6850 up on SA Mart this weekend.
|
# ¿ Sep 15, 2012 00:52 |
|
MixMasterMalaria posted:How is the value on the GTX 660? Mercury engine playback support for Premiere CS6 would be nice, and it seems to be quite a bit faster than the 7850, but I'm not seeing it recommended here. Decent, but I think people are hesitant to recommend it because, for a little more, you can currently get a 7870 that comes with a copy of Sleeping Dogs.
|
# ¿ Sep 16, 2012 15:42 |
|
If AMD really can get cards out with those specs in ~6 months, that'll really gently caress nVidia over (having just launched the GTX 660). I'm curious how exactly they would respond to that.
|
# ¿ Sep 17, 2012 00:14 |
|
I remember 5 or 6 years ago there used to be actual dedicated PhysX cards. Not that anyone actually bought one, but are those still useful at all, or are they entirely obsolete by now compared to previous-gen spare Geforces?
|
# ¿ Sep 20, 2012 03:29 |
|
AMD hasn't given an ETA on Catalyst 12.11, have they? I'm loving the performance boost but the frequent blue screens in Win 8 aren't quite worth it.
|
# ¿ Nov 17, 2012 03:28 |
|
HalloKitty posted:Easily more logical than the NVIDIA line, which is saying something, because both naming systems are not great. I'd agree with that. Both companies are guilty of excessive rebadging at the low end of their lines, but AMD has at least had a consistent numbering scheme for the past five years. AMD has offered a full range of cards from the 1xxx series (and xxx series before) on up to our current 7xxx series, whereas Nvidia offered the 1xx and 3xx series only as lovely OEM parts, with the 2xx and 4xx lines as full-fledged offerings. Anyway, as a longtime team red member, it is a little funny that we've come full circle on GPU naming schemes. I was half tempted to skip the 7xxx series and hold out for the 8xxx series, since the GPU in my last gaming rig was a Radeon 8500LE.
|
# ¿ Dec 20, 2012 03:25 |
|
Good lord the 8800GT runs hot. Didn't those have some issue with faulty heat sinks over time as well, or am I misremembering?
|
# ¿ Dec 25, 2012 16:41 |
|
DrSunshine posted:Perhaps this is the best place to ask this. What are the other temp sensors in your PC reading? 55 at idle seems a bit on the high side; you might want to check that your case has proper airflow, and also run something like GPU-Z to check your fan speed. As mentioned above, staying in the low 90s under load isn't necessarily bad, but consistently hitting triple digits definitely is.
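Put as a rough rule of thumb - and these thresholds are just this thread's conventional wisdom, not vendor specs, so check your card's rated maximum:

```python
# Rough GPU temperature rule of thumb; thresholds are assumptions
# drawn from thread consensus, not any vendor's rated limits.
def assess_gpu_temp(temp_c, under_load):
    if under_load:
        if temp_c >= 100:
            return "way too hot - fix your cooling"
        if temp_c >= 90:
            return "warm but generally tolerable"
        return "fine"
    # idle readings
    if temp_c >= 55:
        return "high for idle - check case airflow and fan speed"
    return "fine"

print(assess_gpu_temp(55, under_load=False))
print(assess_gpu_temp(92, under_load=True))
```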
|
# ¿ Dec 26, 2012 00:11 |
|
DrSunshine posted:Hm. Well, in that case maybe I should just think about getting a new computer? Assuming these are your specs, then yes, absolutely. Even getting the penny-pincher system in the system-building thread OP and slapping a 7750 in there would run circles around that thing - and for a little more money, the value gamer system would give you a lot more flexibility down the road. Also, how are you playing Guild Wars 2 on a Pentium 4?
|
# ¿ Dec 26, 2012 01:12 |
|
Alereon posted:This is what should have been called the Radeon HD 7930, a second shader cluster and 1/3 of the memory channels disabled. This is probably intended to help them clear inventory, they just delayed the Radeon HD 8000-series from late March to Q2 to clear out 7000-series inventory. Looks like a decent card if the pricing is right. A lot of PS4/Xbox 720 rumors have suggested them using an AMD HD 7000-series GPU, is there any realistic chance of this being a consumer version of that (presumably custom) part? The 7950 seems too expensive for them to realistically use that in a console costing ~$400, but I imagine they'd also want something beefier than a vanilla 7850/7870.
|
# ¿ Jan 5, 2013 22:31 |
|
Alereon posted:There's basically no way the 7870 LE is a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch. Yeah, I didn't think so, but after reading some new console rumor roundups and seeing that review all around the same time, my brain just kind of went wild. I remember the original Xbox basically used off-the-shelf parts, but that bit Microsoft in the rear end financially later in the Xbox's life.
|
# ¿ Jan 6, 2013 00:17 |
|
Catalyst 13.1 has really fixed a range of issues for me in Win 8, and performance seems better/more consistent across the board too with the games I've tried (Skyrim, Borderlands 2, and even Far Cry 3 seems a bit improved). Not bad!
|
# ¿ Jan 20, 2013 17:34 |
|
Alereon posted:Have there ever been any statements confirming backwards compatibility? My assumption would be that they would port most recent/popular games to the new platforms then just give you a free copy if you owned it before, not that they would try to make them play games from the last generation, but I could be wrong. None whatsoever, considering they haven't even said a word about the next-gen consoles' existence, let alone any broader details. I'd expect if anything we'll see something resembling the 360's BC capability - software-based emulation of some really popular titles, but nothing more. Sony's philosophy of 'all PlayStation games playable on all PlayStations' died with the 40 GB PS3 and the realities of cost-cutting. I'm really curious to know if Eurogamer's leaked PS4 specs are accurate - using the same CPU in both the Xbox and PS4 is good for gamers in general, but there could be quite a performance gulf down the road if the GPU specs are accurate for both systems. Also very curious how the 4 GB GDDR5 vs. 8 GB DDR3 memory tradeoff will fare.
|
# ¿ Jan 22, 2013 03:08 |
|
Shaocaholica posted:That reminds me that I haven't been watching the news on next gen consoles for a while now. Are there any leaks that are confirmed or any actual statements? Are we going to finally get 8GB+ of main memory and force all game devs to go 64bit? The closest thing to 'factual' information we've had was the leaked design document from 2010 back in June, and the Durango dev kit that got onto eBay a couple of months later with specs "similar to a high-end PC". That said, it seems we're at the point where next-gen specs can be safely speculated on given currently available PC hardware and what's reasonable to expect in a ~$400 console, unless Sony pulls another Sony and has some overengineered Cell processor-type technology waiting in the wings. One thing to note about Wozbo's post - while I'm in no way endorsing the journalistic chops of most gaming sites, the faked rumor actually matched the majority of the previously leaked specs for the next-gen Xbox, only embellishing the CPU's clock speed. The software and X-Surface parts were obviously complete bullshit on the hoaxer's part, but you can't completely discredit the (mostly reasonable) specs just because some guy made up a rumor about an Xbox tablet. Though, again, given the state of the majority of games 'journalism', it's pretty advisable to take all of this with a grain of salt.
|
# ¿ Jan 25, 2013 02:43 |
|
William Bear posted:I've been looking into an upgrade; I would be more comfortable with advice. Those are some pretty strict criteria - are you currently running an integrated GPU? Because a GPU that meets all the criteria listed isn't going to be much of an upgrade in general, unless you're currently using something awful like a GMA 950. Like Don Lapre said, what are you hoping to do with this that you can't already do? Anyway, give the Radeon 7750 a look. It fits the bill in every category except price; I think it goes for around $100 or $110, but it'll be noticeably (albeit not dramatically) better than the 6670.
|
# ¿ Feb 2, 2013 02:46 |
|
Not to get too into a semantic argument, but it looks about 50/50 to me. Sapphire dual-fan 7850 owner chiming in here: it is really quiet in general and also runs very cool; I don't think I've ever seen it go over 58 degrees under load. Dual-fan setups are increasingly common (especially among the higher-quality brands); you pretty much have to go 7770/650 or below before 'nearly all' models are single-fan solutions.
|
# ¿ Feb 3, 2013 23:37 |
|
Factory Factory posted:E: Uh, hey, now that I just thought of that, I'm gonna bet that there's gonna be an AMD-capable GPU physics API out soon, because hey the consoles will soon have access to that compute. Out of curiosity, is Havok horribly unoptimized compared to PhysX? I kind of assumed it would just be the standard bearer for next-gen console physics after it was featured in the PS4 presentation.
|
# ¿ Mar 1, 2013 05:38 |
|
redeyes posted:I recently got a Gigabyte HD7770 v2 gfx card. Since that time I have gotten a few blue screens pointing to the catalyst drivers. I think I was doing HD video playback.. but this makes me unhappy. Any idea if the drivers are unstable? Maybe the card is broken. I don't have issues with corrupted textures or stuff like that. What version of Catalyst are you using? The latest, 13.1, has been smooth as butter in Win 8 for me, even across multiple Radeons. IIRC, 12.10 was the only previous WHQL release for Win 8, so make sure you're not using a different release than those two.
|
# ¿ Mar 2, 2013 19:19 |
|
Endymion FRS MK1 posted:Speaking of 7770's, AMD's newest card is a 7790. Based on a new 28nm (article says 22, the source says 28. 28 is correct) process and GCN 2.0, I guess it's a testbed for the higher end 8000 series? Sounds like this was supposed to be the 8770 and they decided to ship it now rather than wait till the fall/winter when the rest of the 8000 series rollout is supposed to happen. Interesting little card, I'm curious how the pricing will shake out. If they can ship this for $130-$140 with 90% of the 7850's performance, that'll be a pretty compelling budget card. Speaking of budget cards, I was toying around with the idea of building a cheap living room Steambox-esque rig and was putting some parts together, but I was having a hard time figuring out what video card would be optimal. We have a 50" 1366x768 plasma display in the living room, so I was figuring on tossing in a 7770 GHz edition and calling it a day, but I wasn't sure if even that would be pushing past the point of diminishing returns for that low a resolution. Would a 7750 be more appropriate for such a build, or will the upcoming generation of DX11 games merit a little more horsepower, even at 720P? e: Sorry if this crosses over too much into parts-picking, I'm months away from any purchasing decisions and this seemed a more appropriate place to figure out the proper cards for a 720P-only situation. The Illusive Man fucked around with this message at 03:33 on Mar 12, 2013 |
# ¿ Mar 12, 2013 03:15 |
|
Animal posted:It seems Bioshock Infinite is using all 2gb on my 670 at 1440p and causing some stuttering. I'm seeing about 1.6 GB usage on my 7950 @ 1080P. Pretty impressive for an unmodded game, but I guess with the next gen consoles staring us in the face we'll be seeing more of this as time goes on (and, sadly, more 15+ GB downloads).
|
# ¿ Mar 27, 2013 06:28 |
|
Alereon posted:On the 660 Ti I wouldn't really consider it an issue, but yes 2GB of VRAM will be the limiting factor on high-end cards before anything else. Before I played Bioshock: Infinite, I remember thinking my card's 3GB of memory was silly. Now I'm regularly seeing 2+ GB of usage at 1080P. That said, the game is absolutely gorgeous for a UE3 game.
|
# ¿ Apr 2, 2013 05:54 |
|
zenintrude posted:It fits the theme park-esque vibe I get (and love) from Irrational's worlds, both Columbia and Rapture. Are you playing with Vsync on, or off? I get consistently higher framerates with it off (but also get horrible screen tearing), while with it on it's pretty common for the frame rate to dip to around 30 fps in more intense scenes. Anyway, I recently sold my 7850 and upgraded to a 7950, so I don't have a basis of comparison for Bioshock, but your experience sounds about right. The 7850 is a great little card but it's going to really be pushed to the limit by modern games, and as mentioned Bioshock is much more demanding than you'd expect from a UE3 game.
|
# ¿ Apr 4, 2013 02:46 |
|
slidebite posted:According to the box, 3x quieter! Yeah, seriously - that'd be a monster for a HTPC, but I have to imagine *some* compromise went into it compared to the vanilla 670.
|
# ¿ Apr 5, 2013 03:46 |
|
Dogen posted:I know a lot of people in the last month or two have said the same thing. Maybe they figured out it was costing them some sales, finally. Nothing justifies the cost of a $200-$300 card like ~$135 worth of free, new AAA games. Of course, I'm curious how significantly this promo is eating into AMD's profit margins.
|
# ¿ Apr 24, 2013 04:56 |
|
Factory Factory posted:It's because data has to be duplicated between cards. Every texture on Card A must also be in memory on Card B, etc. The actual SLI/CF link is very slow and doesn't do much besides shuffle the frame buffer around. Just for curiosity's sake, is this the same on cards like the 7990 and 690? I.e., even though the 7990 is a 'single' card with 6 GB of VRAM, is only 3 GB effectively usable? The Illusive Man fucked around with this message at 05:02 on Apr 26, 2013 |
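To check my own understanding of the duplication point, a toy sketch of how I think the math shakes out (assuming assets are fully mirrored on each GPU, as with alternate-frame rendering - purely illustrative):

```python
# With SLI/CrossFire-style alternate-frame rendering, every GPU keeps
# its own copy of the assets, so VRAM capacity doesn't add up.
def vram_breakdown(per_gpu_gb, num_gpus, mirrored=True):
    """Return (advertised, effectively usable) VRAM in GB."""
    advertised = per_gpu_gb * num_gpus
    usable = per_gpu_gb if mirrored else advertised
    return advertised, usable

# Radeon HD 7990: two GPUs with 3 GB each, sold as a "6 GB" card
advertised, usable = vram_breakdown(3, 2)
print(f"advertised: {advertised} GB, effectively usable: {usable} GB")
```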
# ¿ Apr 26, 2013 05:00 |
|
Guni posted:So just out of curiosity, would a GTX 780 run on my Seasonic M12II 520W Modular PSU alongside an i5-3570 (not overclocked) and a couple of hard drives? Absolutely. The only reason for PSUs higher than 600W or so is for SLI/CF setups. e: Okay, stated with more confidence than I should have given my knowledge level. That said, it'll work. The Illusive Man fucked around with this message at 03:30 on May 24, 2013 |
# ¿ May 24, 2013 02:57 |
|
Klyith posted:I don't see how his setup can possibly go over 350 watts, the 780 doesn't draw that much power. Anandtech's system with an i7 overclocked to a nutso 4.3 ghz barely broke 400w, and TR's more normal setup was just over 300. Yeah, I backpedaled above from my initial enthusiasm, but this is pretty much what I was wondering as well. Even if the 780's TDP is 250W, I wasn't clear on how that setup was tapping out that PSU without any overclocking.
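Back-of-the-envelope, here's the way I'd been tallying it up - all the wattages below are nominal TDP-style figures and rough allowances, not measurements:

```python
# Rough peak-draw tally for the setup in question; every number is a
# nominal rating or a ballpark allowance, not a measured value.
components = {
    "i5-3570 (stock)": 77,    # Intel's rated TDP
    "GTX 780": 250,           # NVIDIA's board TDP
    "motherboard + RAM": 50,  # rough allowance
    "two hard drives": 20,    # ~10 W each under load
}

total = sum(components.values())
psu = 520
print(f"estimated peak: {total} W on a {psu} W PSU "
      f"({psu - total} W of headroom)")
```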
|
# ¿ May 24, 2013 04:14 |
|
Just curious, but how much does pure PhysX usage tax your 580? As in, how are temps, power usage, etc?
|
# ¿ Jun 1, 2013 02:42 |
|
Did they announce if the R9 Nano uses HBM?
|
# ¿ Jun 18, 2015 04:59 |
|
Only if they rename the Fury as Rage.
|
# ¿ Jun 20, 2015 08:47 |
|
So, for someone gaming at 1080p for at least the next couple of years who wants to crank all the settings and still hit (or get near) 60 fps - is the GTX 980 the sweet spot, ignoring price/performance for the moment? The 970 seems like it isn't quite there, while the 980 Ti seems to give diminishing returns at 1080p (although I wonder if that headroom would be useful in a few years as devs keep adding bling to games). My 7950 is mostly fine at the moment, but with The Witcher 3 currently and Fallout 4 coming up (despite, I'm guessing, not being as taxing as W3), I'm starting to get the upgrade itch. Still, holding out for Pascal seems a better idea assuming it's coming in early/mid 2016 and not late 2016.
|
# ¿ Jun 25, 2015 18:10 |
|
Twerk from Home posted:If you're willing to spend that much on GPU, why not consider moving to a higher resolution monitor too? 2560x1440 is really awesome and gives tons of screen real estate for general use. It's nice for more than just gaming and there's good options around $300 now. The thing about monitors is my current setup is connected to a 32" 1080p TV that I do 95% of my gaming on (from an armchair). I figured I'd just keep my setup as-is until 4k is more affordable in a few years, at which point I'd probably need a full re-build as well.
|
# ¿ Jun 25, 2015 18:28 |
|
So, for someone who hasn't bought a Nvidia GPU since the GeForce 2 MX, what's the breakdown of vendors like? My friends usually get EVGAs but I was curious how the other brands fare. Also, entirely for curiosity's sake (as said earlier, sticking with 1080p a while longer), do most all games support 21:9 monitors now? What about older titles - can they render natively, or are you stuck with vertical black bars (or, ugh, stretched image)? E: Sorry, missed Don Lapre's post above, but I'd appreciate any further info on GPU vendors.
|
# ¿ Jul 1, 2015 20:07 |
|
How much PSU headroom do you typically want for a stable overclock? I have a 550w Seasonic G-series PSU and a (non-K) i7 3770, and I'm eyeing picking up a GTX 9xx in the next month. I'm guessing it'd be fine, given the efficiency of the 900 series, but if I were to hypothetically splurge on a 980 Ti it seems like it might be a little snug. Sorry for the newbie question - I've never dabbled in overclocking, but the 900 series cards seem ripe for it so, why not?
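For my own napkin math I've been assuming dynamic power scales roughly with frequency times voltage squared - a crude model, and every number in this sketch (clocks, voltages, even treating the full 250 W as scalable) is a made-up illustration, not a measured figure:

```python
# Crude overclock power model: dynamic power ~ frequency * voltage^2.
# All inputs below are illustrative assumptions, not measurements.
def oc_power(stock_watts, f_stock_mhz, f_oc_mhz, v_stock, v_oc):
    return stock_watts * (f_oc_mhz / f_stock_mhz) * (v_oc / v_stock) ** 2

# Hypothetical 980 Ti overclock: 250 W stock, 1000 -> 1200 MHz, 1.16 -> 1.23 V
print(round(oc_power(250, 1000, 1200, 1.16, 1.23)), "W")  # ~337 W
```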
|
# ¿ Jul 6, 2015 23:14 |
|
So, I was looking at cards on Newegg, and while Nvidia lists the reference 970 and 980 as having a HDMI 2.0 port, the MSI cards are listed with HDMI 1.4a spec. Do vendors normally change port specifications on non-reference cards?
|
# ¿ Jul 7, 2015 18:22 |
|
Winks posted:The 970 Gaming 4G has a 2.0 port for sure, and while MSI's website says HDMI x 1 (version 1.4a) for the 980 at the top if you look in the specs, next to HDMI it says 'HDMI-Output : 1 (version 1.4a/2.0)' Thanks, I was half wondering if that was a misprint - downgrading a port spec seemed like a weird thing to do.
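For anyone else wondering why the 1.4a-vs-2.0 distinction matters, here's a rough pixel-clock sanity check - the ~20% blanking overhead and the 340/600 MHz TMDS limits are my ballpark understanding of the specs, so treat it as a sketch:

```python
# Rough check of whether a display mode fits in HDMI 1.4's ~340 MHz
# TMDS clock (HDMI 2.0 raised the ceiling to ~600 MHz). The 20%
# blanking overhead is an assumed ballpark, not an exact timing.
def needs_hdmi2(width, height, refresh_hz, blanking=1.2):
    pixel_clock_mhz = width * height * refresh_hz * blanking / 1e6
    return pixel_clock_mhz > 340

print(needs_hdmi2(3840, 2160, 60))  # 4K60 needs HDMI 2.0
print(needs_hdmi2(1920, 1080, 60))  # 1080p60 is fine on 1.4
```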
|
# ¿ Jul 7, 2015 23:04 |
|
sout posted:Has there ever been a card which could max out every single game that came out when it came out because I keep expecting that to happen and it obviously won't. Things were kinda close in 2012-2013, before the new-gen consoles hit and 4k was still ultra-niche.
|
# ¿ Jul 10, 2015 18:18 |
|
After toying with the idea of picking up a GTX 980, I've talked myself down into 'settling' for a 970. Based on benchmarks it still nearly doubles the performance of my 7950, and to be frank the extra 10-15% performance just really didn't seem worth the extra $170. (Plus, my PC's speakers finally died after a 10-year run, so I suddenly have to budget for those as well.) Oculus was another reason I wanted to go higher than a 970, but if the 970 isn't quite pulling it off I can always sell it when Pascal drops next year. I mentioned this earlier in the thread, but this will also be my first time back with Team Green since the GeForce 2 MX. We had a good run.
|
# ¿ Jul 14, 2015 20:46 |
|
Bleh Maestro posted:Are you asking what to get? MSI Gaming 4G or 100ME. Nah, I'm about to pull the trigger on the 100ME. Just felt like posting about an actual GPU purchase instead of old-rear end CRTs.
|
# ¿ Jul 14, 2015 21:07 |