|
Factory Factory posted:Probably about the time the integrated sound, network, etc. on the motherboard got good enough that we didn't need a ton of expansion slots any more. And switching to broadband instead of dial-up modems.

Remember when you used to pay a premium for 6 PCI slots instead of 5 because you really needed that last one? Yeah, I remember being one of those people. And before that, fussing over available IRQs. Now? I have a non-video card plugged into my main system for the first time in years, and only because I was troubleshooting whether network problems were due to the onboard dual LAN dying.
|
# ? Apr 30, 2012 19:00 |
|
Star War Sex Parrot posted:It all sort of happened at once: multi-slot coolers, auxiliary power for GPUs, the rise of PCI Express (making GPU placement in the case a bit more flexible) and the better integration of more devices into the north bridge. Hell, we basically stopped calling it a north bridge around the same time because the south bridge disappeared. The relevancy of front-side bus died too.

Other way around on the northbridge/southbridge: AMD's processors absorbed the memory controller into the die, and then Nehalem followed suit with both the memory controller and the PCIe controller. AMD FX systems currently have a split northbridge (and BIOSes still refer to both the northbridge and the CPU-NB), and Intel systems call the "southbridge" the PCH because the northbridge is entirely on-die. Same functions, just not a separate chip. AMD APUs have die-integrated full northbridges too, so the southbridge is now the FCH, the Fusion Controller Hub.
|
# ? Apr 30, 2012 19:02 |
|
Civil posted:While that card is passively cooled, it still requires additional power, and has the case heating issues that go along with that. I was hoping AMD would solve the problem at the chipset level, rather than an OEM solution that takes 3 slots because the heatsink is so massive.

The reason midrange cards require additional power is that it takes all that additional power to reach midrange performance. You may as well complain about how new midrange CPUs require extra 12V headers on the motherboard. There's nothing you can currently change about the "chipset", whatever you mean by that, to fix it. Now, if you mean you want a card as fast as the midrange of X years ago, then you're in luck.
|
# ? Apr 30, 2012 19:24 |
|
HalloKitty posted:That's probably true. Once you had all that space and nothing but a graphics card to fill it, the idea became less ridiculous.

I just built a low-power rig dedicated to QuickBooks, and it's weird not even having to have a graphics card anymore. Does anyone even need a full-sized ATX motherboard these days?
|
# ? May 1, 2012 04:00 |
|
You can get a full overclocking experience and dual video cards on an mATX board, and mini-ITX boards are just starting to get full overclocking, using laptop parts like mini-PCIe and mSATA to deliver full feature sets. Managing heat can be crazy in such small boxes, but pretty much no, full ATX is not needed for most people.
|
# ? May 1, 2012 04:06 |
|
Chuu posted:I just built a low power rig for a computer dedicated to Quickbooks, and it's weird not even having to have a graphics card anymore. Does anyone even need a full sized ATX motherboard these days?

Workstations. Ridiculous hardware means you can tackle ridiculous problems.
|
# ? May 1, 2012 04:07 |
|
Chuu posted:I just built a low power rig for a computer dedicated to Quickbooks, and it's weird not even having to have a graphics card anymore. Does anyone even need a full sized ATX motherboard these days?

Nah, if I was going to build a machine for someone, and I was pretty sure their needs wouldn't require lots of slots, I'd definitely steer in the direction of a nice micro ATX board and case. Something like the Fractal Design Arc Mini takes a full size ATX PSU, microATX board, and still has 6x 3.5" and 2x 5.25" bays. Even if they had a monster card, you only have to take out the middle cage, culling 3x 3.5". Very few people would need more space.

HalloKitty fucked around with this message at 08:27 on May 1, 2012 |
# ? May 1, 2012 08:21 |
|
HalloKitty posted:Nah, if I was going to build a machine for someone, and I was pretty sure their needs wouldn't require lots of slots, I'd definitely steer in the direction of a nice micro ATX board and case.

And 7 fans, holy poo poo.
|
# ? May 1, 2012 14:58 |
|
You can never have enough space. I'm an edge case and I know it.
|
# ? May 1, 2012 15:02 |
|
I'm waiting on a 680 to be delivered; more than likely it'll be about a week. Is it even worth it to pop in an 8800 GTS 512 until then? It's a Z77/3770K setup, so it's got the HD 4000 IGP...
|
# ? May 1, 2012 15:20 |
|
Only if you want to play games until the 680 arrives.
|
# ? May 1, 2012 15:43 |
|
I've been trying to figure out how they fare against each other, but no one's exactly doing benchmarks comparing the two. It's like a two-minute install if that, and the case is already open and such, but if it's pointless I might as well not bother.
|
# ? May 1, 2012 15:48 |
|
zer0spunk posted:I'm waiting on a 680 to be delivered, more then likely it'll be about a week. Is it even worth it to pop in a 8800 512 GTS until then? it's a z77/3770k setup so it's got the 4000 IGP...

I got by with my 2500K graphics for a week waiting on my card. It worked fine for desktop use, but was worthless for any kind of game made in the last 5 years.
|
# ? May 1, 2012 15:57 |
|
mayodreams posted:I got by with my 2500K graphics for a week waiting on my card. It worked fine for desktop use, but worthless for any kind of games made in the last 5 years.

Yeah, the HD 4000 in IB is supposed to be 30-40% more powerful than the HD 3000 IGP in SB, taking it from total poo poo to somewhat usable. I'm just trying to see where it falls in line in terms of what dGPU it's comparable to. The little discussion I've seen has folks saying it's both more and less powerful than the 8800. If it's going to be the same thing, I can just use the IGP until I can pop in the new dGPU.
|
# ? May 1, 2012 16:24 |
|
I would think the 8800GT would be at least somewhat more powerful than the HD 4000. I know my 8800GTS could play Skyrim at 1080p pretty easily; I'm not so sure an HD 4000 could do that. e: on second thought...

ultrabay2000 fucked around with this message at 20:05 on May 1, 2012 |
# ? May 1, 2012 20:03 |
|
The 8800GT is faster than the ATI HD 5570, which itself beats the HD 4000 by a wide margin in pretty much all tests.
|
# ? May 1, 2012 20:22 |
|
ultrabay2000 posted:I would think the 8800GT would be at least somewhat more powerful than the HD4000. I know my 8800GTS could play Skyrim at 1080p pretty easily, I'm not so sure if a HD4000 could do that.

I was bored at work today, so I put a 9800GT (virtually identical to an 8800GT) in our 3720QM test stand and ran the Lost Planet 2 DX9 benchmark with default settings at 720p.

3720QM (HD 4000) - 21.1fps avg
9800GT - 51.5fps avg

Now, that's the 3720QM and not the 3770K, but they both have the HD 4000, and the only difference I could find between the two is that the 3720QM actually supports a higher max GPU frequency, 1250MHz vs 1150MHz. So yeah, still pretty terrible, but at least starting to get into the usable range at low resolutions.
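For a rough sense of the gap, here's the ratio of those two averages as a back-of-the-envelope sketch in Python; the fps figures are just the ones quoted above, nothing more rigorous:

```python
# Back-of-the-envelope comparison using the Lost Planet 2 numbers above.
hd4000_fps = 21.1  # 3720QM integrated HD 4000, 720p, DX9, default settings
gt9800_fps = 51.5  # discrete 9800GT in the same test stand

print(f"9800GT is ~{gt9800_fps / hd4000_fps:.1f}x the HD 4000 here")  # ~2.4x
```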
|
# ? May 2, 2012 06:15 |
|
JBark posted:Same here, 8800GTX for $519 was my last video card purchase over 4.5 years ago (wow, long time!). I've stopped playing games because it's so drat slow at 1920x1200 it's not even worth it. Waiting with barely contained anticipation for details on the 670, so I can decide between that or a 7870 in my much needed new PC.

My last video card purchase was a couple years ago: a GTX 465 off Newegg, $190 with free shipping after a $30 MIR. Best sub-$200 card I ever got. So far it's kept up with most games I have on the rig I built out of spare parts that a friend let me have after I built him a new system (his old one had a bad motherboard; he gave me the working parts in exchange for helping him set up a new one).

Can't complain, and now I have some leftover parts from a PC I worked on for my brother that also had a bad motherboard and hard drive - he bought a new PC and gave me the old one to salvage. The only parts I can see salvaging are the 4GB of DDR3-1333, the Phenom II X4-840 CPU and cooler, a wireless g/n card, and a combo DVD/CD optical drive; the rest of the parts are cheap junk (an old HP desktop with a flimsy 300W PSU and some off-brand motherboard with bulging/leaking capacitors).

Also, for anyone looking to get something decent that was last-gen, I've checked around and saw places like Amazon still sell HD5000-series AMD cards and 400-series Nvidia, and most aren't too bad price-wise. I was thinking about getting a 5850 or 5870 for when I set up the Phenom system eventually; I might keep my current rig and hold it as a backup or sell it later.

E: fixed CPU name to avoid confusion

BOOTY-ADE fucked around with this message at 17:25 on May 2, 2012 |
# ? May 2, 2012 13:34 |
|
Original Phenom? Wasn't that thing dire? Surely you'd be better off just replacing it.

Edit: ah, apparently there is no X4-840, so it must be a Phenom II. That's not bad. Disregard this.

Double edit: wait, what? The Phenom II X4-840 only came out in November 2011. The machine must still be in warranty... I don't know.

HalloKitty fucked around with this message at 13:57 on May 2, 2012 |
# ? May 2, 2012 13:52 |
|
^ No idea about warranty status - my brother had already contacted HP twice; the motherboard was replaced once and the hard drive replaced once, and both went bad again in less than 6 months, so I guess he thought "gently caress it, might as well get a new one". I don't blame him; he's got a bunch of Dell towers that are still running strong, but he had nothing but issues with 3 different HP towers he bought.
|
# ? May 2, 2012 17:28 |
|
Don't know if this is a good spot to post this, but I have an Intel Core i7 Sandy Bridge computer with a single AMD Radeon 6870 GPU. I was wondering if it would be more worthwhile/future-proof to just double down on the GPU and go CrossFire with another 6870, as they are about $170 on Newegg, or to abandon the Barts architecture and go with the Southern Islands chips at ~$350. Mind that Barts is a 40nm chip and overclocks like poo poo, whereas the GCN architecture in Southern Islands is 28nm, so the Southern Islands chips run cooler, use less power, and potentially have more room for overclocking. Just looking for an informed opinion from someone who has made the upgrade or would be familiar with the performance difference.
|
# ? May 2, 2012 22:31 |
|
What is your screen resolution? I upgraded from a single 6850 to two in CF for a 1920x1200 monitor before the 7000 series dropped. I don't think I'll ever do another dual-GPU setup as long as a single one can hit the kind of performance I want, but I like the performance and have had good luck with driver updates, so I'm happy with the pair. Buying new, the equivalent performance for me would be roughly a 7950, probably overclocked a bit, as 6850s overclock better than 6870s and mine are up pretty high.

GCN's major differences are a video encode engine (which is currently non-functional as there's no driver support, and you have QuickSync on the IGP anyway), better GPGPU performance if you do Folding@home or some other compute task with your card, better long-idle power draw (which a desktop user won't get a lick of use out of unless you're using i-mode Virtu), and simultaneous stereoscopic 3D and Eyefinity.

But that phrase you used, "future-proof"? Can't be done. Don't try to do it.
|
# ? May 2, 2012 23:00 |
|
Well, as far as overclocking goes, the 6870 is almost impossible to overclock. Either I got a bad chip or something, because a mere 5-10MHz bump on the core causes artifacts in 3DMark Vantage, and same deal for Unigine Heaven (I have since read reviews on several tech sites and they all come to the same conclusion: the 6870 just has zero overclock headroom).

Maybe future-proof was the wrong word to use, as I am aware I will have to upgrade to keep up, but I keep hearing bad poo poo about going with dual-card setups. I currently run a single 1920x1200 display and I get respectable FPS (between 45 and 60) in Skyrim/BF3/other new poo poo, but was hoping to go to Eyefinity at 5760x1200. Does anyone know how much GPU horsepower would be required to facilitate this? Is it even possible on a single card except the insanely expensive (ha) 7970?

E: It seems the limiting factor in Eyefinity is not the processing power of the GPU but the amount of RAM onboard. Looks like the 6870 is plenty powerful enough if it is tied to 2GB of RAM, so I just have to look for a replacement pair of those or a 7870 2GB edition. RAM is expensive on video cards, though.

orange juche fucked around with this message at 00:30 on May 3, 2012 |
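For a rough sense of why 3x1 Eyefinity demands so much more card, compare raw pixel counts; a quick Python sketch using the resolutions being discussed above, ignoring everything else that scales with resolution:

```python
# Raw pixel-count comparison: a single 1920x1200 panel vs 3x1 Eyefinity.
single = 1920 * 1200     # 2,304,000 pixels per frame
eyefinity = 5760 * 1200  # 6,912,000 pixels per frame

# Three times the pixels to shade every frame, before accounting for the
# extra VRAM needed by larger framebuffers and render targets.
print(eyefinity / single)  # 3.0
```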
# ? May 2, 2012 23:23 |
|
sh1gman posted:Well as far as overclocking, the 6870 is literally almost impossible to overclock, either I got a bad chip, or something, because a mere 5-10mhz bump on core causes artifacts in 3DMark Vantage, and same deal for UniEngine Heaven (I have since read reviews on several tech sites and they all come to the same conclusion that the 6870 just has 0 overclock headroom).

How badly do you want that extra performance? Upgrading after just one generation is usually not a worthwhile investment, especially since the 7800 series is so much more expensive than the 6800 series. If you really want to move up and 'future proof' then you pretty much might as well go all out and splurge on one of the 7900 series.

I happen to have a 6870, and I have no problem OCing it to the factory OC levels that manufacturers ship the more expensive cards with. I also have no problem getting 60fps at 1080p in Skyrim, but I don't play with the highest texture level that was patched in.
|
# ? May 3, 2012 04:30 |
|
Yeah, looking at what I have, I do fine at 1200p, but if I want to add 2 monitors to go 3x1 triple monitor, then I will either have to CrossFire another card in or go to the next generation up with a 2GB model. Can you CrossFire two 1GB cards and get decent FPS in 3x1 Eyefinity? If I can do that, then I can just stay with the 6870s.
orange juche fucked around with this message at 19:47 on May 3, 2012 |
# ? May 3, 2012 19:15 |
|
sh1gman posted:Yeah, looking at what I have, I do fine at 1200P, but if I want to add 2 monitors to go 3x1 triple monitor, then I will either have to crossfire another card in, or go to the next generation up with a 2GB model. Can you crossfire 2 1GB cards and get decent FPS in 3x1 EyeFinity? If I can do that, then I can just stay with the 6870's

If you're serious about playing at multi-monitor resolutions, you pretty much have to go top of the line unless you're willing to sacrifice image quality settings from what you might normally be used to at 1920x1200. The AnandTech review of the 690 has 5760x1080 benchmark numbers if you want to get an idea of what some cards are capable of.
|
# ? May 3, 2012 23:16 |
|
sh1gman posted:Can you crossfire 2 1GB cards and get decent FPS in 3x1 EyeFinity? If I can do that, then I can just stay with the 6870's
|
# ? May 4, 2012 00:32 |
|
The GTX 690 looks like a beast, but I'm happy that for once AnandTech had the 5970 on most comparison charts. It seems to still hold up fairly well, so I guess I'm not changing to the 680.
|
# ? May 4, 2012 09:12 |
|
Recently I've been looking at buying a 2560x1440 display to run in a two-display setup with a 1920x1080 monitor as my secondary. Would a single 1GB 6870 be enough to drive that and run games like BF3, STALKER, etc. on the 2560x1440 monitor? I might upgrade to a 680 if the 6870 isn't powerful enough.
|
# ? May 6, 2012 19:11 |
|
edit: nm
the fucked around with this message at 19:33 on May 6, 2012 |
# ? May 6, 2012 19:30 |
|
A 6870 definitely isn't powerful enough to run newer games at 2560x1440. Here's a BF3 chart; it gets 14fps average at that resolution.
|
# ? May 6, 2012 19:39 |
|
TweakTown has a performance preview of the GeForce GTX 670 2GB. Since the GTX 680 is such overkill, performance doesn't go down as much as you'd think, so depending on pricing this card looks like a great value.
|
# ? May 6, 2012 20:48 |
|
Lots of people on OCN are wishing they hadn't bought 680s right now - http://www.overclock.net/t/1253432/gigabyte-gtx-670-oc-version-hands-on
|
# ? May 7, 2012 00:31 |
|
That would be the smallest performance delta for the largest price difference of any of their cards going back many generations. Seems surprising that they would basically make their own high end redundant.
|
# ? May 7, 2012 01:38 |
|
I don't get why they're so mad about it, though. Obviously, they knew they weren't going to get a good price/performance card for this generation. They paid the flagship tax and the "gotta have it now" surcharge, and they should have seen it coming. Granted, they didn't know the 670 would be so close, but that doesn't diminish the performance of the cards they have. If the $$$ to FPS ratio was acceptable to them when they bought it, it should still be now.
|
# ? May 7, 2012 01:52 |
|
KillHour posted:I don't get why they're so mad about it, though. Obviously, they knew they weren't going to get a good price/performance card for this generation. They paid the flagship tax and the "gotta have it now" surcharge, and they should have seen it coming.

Normally I'd agree, but it really has to sting to know that they paid $100 more for essentially 1 frame per second in most games. I can't think of a time when the flagship has been this close to the next card down.
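To put a number on that sting, here's a hypothetical sketch in Python; the $499/$399 prices and the 1fps gap are assumptions based on the thread's speculation, not confirmed figures:

```python
# Hypothetical marginal cost of the flagship, using the thread's assumptions:
# $499 GTX 680 MSRP vs a speculated ~$399 GTX 670, about 1 fps apart.
gtx680_price, gtx670_price = 499, 399
fps_gap = 1.0  # "essentially 1 frame per second in most games" (quoted above)

print(f"${(gtx680_price - gtx670_price) / fps_gap:.0f} per extra fps")  # $100
```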
|
# ? May 7, 2012 02:58 |
|
Do we know that the MSRP is going to be $400 yet? With that showing, I wouldn't be surprised if it hit at $450.
|
# ? May 7, 2012 03:05 |
|
It's just speculation right now, yeah. Still, either way it's an odd choice for Nvidia. Either they've got practically similar cards at noticeably different price points, which makes the more expensive one a complete sucker's buy, or they have practically identical cards at practically identical price points, which seems pointless.
|
# ? May 7, 2012 03:16 |
|
Argh, where are their ~$200 cards? Answer: 6 months away.
|
# ? May 7, 2012 05:10 |
|
One thing to be cautious about: as new as GK104 is, we're still in the land of driver-level enhancements that could be significant. Though you'd think "well, it's just a GTX 680 with one disabled SM" would mean the overclocking result is totally legit, I find it extremely peculiar that when the 670 was overclocked to stock GTX 680 speeds, a stock GTX 680 lost to it on the 3DMark 11 Graphics subscore. To me that says potential driver-level issues. It's also significant that the GTX 670 was overclocked to stock 680 speeds, while the GTX 680 obviously can exceed those by a substantial amount.

Still, as the real launch date approaches I do look forward to seeing what kind of press we get regarding the real performance of the cards, and whether nVidia offers any clarification that might help contextualize the scores. Given that nobody who has an official unit is out of NDA yet, we're one part speculation, one part jaws-dropped. We just have to remember the speculation angle and keep our heads on straight 'til we find out more about the performance differences from reputable sources.

If it is indeed going to offer GTX 680 performance at a Radeon HD 7950 price, I'll be purchasing one, and it would take an absurdly good 660/660Ti to do anything but match it on the price:performance curve.
|
# ? May 7, 2012 05:17 |