|
I went to Microcenter today to grab a Kaveri A10 and mobo bundle, and the guy said that all boards need to be flashed to support them, even the A88X ones that were designed for Kaveri. He then said they could do it in store for $30 and about a 1-2 hour wait. I checked the Asus website, and the A88X board I was looking at supported it after a certain BIOS rev, but I don't know if that was on the box. drat it AMD, I am trying here.
|
# ? Jan 18, 2014 22:14 |
|
mayodreams posted:I went to Microcenter today to grab a Kaveri A10 and mobo bundle, and the guy said that all boards need to be flashed to support them, even the A88X ones that were designed for Kaveri. They were selling a bundle that didn't even work together? Edit: Nevermind, I think I understand now.
|
# ? Jan 18, 2014 22:38 |
|
When I put a Phenom II into an older motherboard that didn't recognize it, I was still able to boot into Windows and flash the motherboard to a newer BIOS. It just ran at 2GHz and only recognized two cores. I couldn't find any info on whether that's possible with Kaveri, but it wouldn't be surprising if you could do it at home.
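If you want to check what revision a board is already running before bothering with any of that, here's a quick sketch for a running Windows install (wmic and the SMBIOSBIOSVersion property are standard Windows; the parsing and the example output are just my illustration):

```python
# Read the current BIOS revision on Windows so you can compare it
# against the minimum version on the vendor's CPU support list.
import subprocess

out = subprocess.run(
    ["wmic", "bios", "get", "SMBIOSBIOSVersion"],
    capture_output=True, text=True, check=True,
).stdout

# Output looks like "SMBIOSBIOSVersion\n1402\n"; keep the value line.
lines = [line.strip() for line in out.splitlines() if line.strip()]
print(f"Current BIOS revision: {lines[1]}")
```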
|
# ? Jan 19, 2014 17:29 |
|
I ended up getting an A10-7850K and an Asus A88X board and I am really happy with it. This is my first AMD build since my Socket 939 4200+, but I would recommend this for HTPC/media applications in a heartbeat. Yay AMD!
|
# ? Jan 27, 2014 01:59 |
|
AMD announces its first ARM-based server SoC: http://anandtech.com/show/7724/it-begins-amd-announces-its-first-arm-based-server-soc-64bit8core-opteron-a1100
|
# ? Jan 29, 2014 13:32 |
|
Is the advice in the OP (Don't loving buy AMD) still the consensus? I read through the first page or two, but I don't really get it. I've only ever used AMD processors in my desktop builds, generally because they were so much cheaper than the comparable Intels. I've never experienced any overheating or bad performance (I currently have two Phenoms [X4?] in gaming computers and one of the integrated GPU/CPU ones in a file server in my house). Is the issue just that the cost-to-performance ratio is THAT much greater for Intels that I'm basically gimping myself by buying AMD? I've loved them ever since I bought my first Duron, and besides the goofy APU thing that I won't be using in any of my gaming computers, they seem to be working great for me. I even accidentally ripped one out of a CPU socket and got a replacement for free. TL;DR: What's so bad about AMD processors for a guy who doesn't upgrade every year? Eulogistics fucked around with this message at 19:18 on Jan 29, 2014 |
# ? Jan 29, 2014 19:13 |
|
Eulogistics posted:Is the issue just that the cost-to-performance ratio is THAT much greater for Intels that I'm basically gimping myself by buying AMD? For someone who upgrades infrequently, AMD processors are truly awful because they hold up so much more poorly over time. My roommate and I bought our Phenom II X4 and Core 2 Quad systems at around the same time; my C2Q is still mostly fine performance-wise, while her Phenom II X4 was retired because it was annoyingly slow.
|
# ? Jan 29, 2014 19:28 |
|
Eulogistics posted:TL;DR: What's so bad about AMD processors for a guy who doesn't upgrade every year? For the most part, desktop systems don't deliver the same performance per dollar you get from an Intel chip. On top of that, AMD's recent systems are massive power hogs, while Intel is heading in the exact opposite direction. Integrated systems are actually pretty nice, though. The AMD APUs swap positions with Intel on CPU/graphics performance. So if you're building something tiny that you also want to game on (an HTPC, maybe), AMD might have an edge. If AMD continues its brute-force performance increases, it might hold up against Intel's Broadwell, as Intel has lately shown only modest performance gains while focusing on reducing power consumption. But you'll have a system that pulls 200 more watts just to push the CPU.
|
# ? Jan 29, 2014 19:31 |
|
Eulogistics posted:Is the advice in the OP (Don't loving buy AMD) still the consensus? I read through the first page or two, but I don't really get it. Essentially, Intel's single-threaded performance is miles ahead of AMD's, and that matters in games, especially older ones. AMD's current lineup also consumes way, way, way more power than the equivalent Intel CPU. AMD is OK for certain situations, but if you're talking about general desktop use and gaming, Intel is the winner.
|
# ? Jan 29, 2014 19:47 |
|
Intel for any kind of desktop build. AMD if you're looking for a good, cheap laptop that can do light to medium gaming, or a nice all-in-one HTPC box.
|
# ? Jan 29, 2014 19:56 |
|
Alereon posted:her Phenom II X4 was retired because it was annoyingly slow. What was the clock speed on hers? I'm running mine at 3.9 and it's still plenty fast for desktop use and the games I play, even if a low-end i5 would be decidedly better. Or was she trying to run photo/video/audio editing? I can see that being really taxing.
|
# ? Jan 29, 2014 19:58 |
|
Detroit Q. Spider posted:What was the clock speed on hers? I'm running mine at 3.9 and it's still plenty fast for desktop use and the games I play, even if a low-end i5 would be decidedly better. Or was she trying to run photo/video/audio editing? I can see that being really taxing.
|
# ? Jan 29, 2014 20:26 |
|
Well, to be fair, Skyrim and Dark Souls are the most demanding things I play, so I'm probably not stretching the old bird too badly. It is telling that my wife's i3-3225 benchmarks about the same as my X4 with two fewer cores at roughly the same speed (stock is 3.3GHz for both) and draws 90W less to boot.
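Rough math on what that 90W gap actually costs, for anyone curious; the hours and electricity rate below are assumptions I made up for illustration, not anything measured:

```python
# Back-of-the-envelope cost of a 90 W difference under load.
watts_extra = 90           # the gap mentioned in the post above
hours_per_day = 4          # assumed daily gaming/load time
price_per_kwh = 0.12       # assumed electricity rate, USD/kWh

kwh_per_year = watts_extra * hours_per_day * 365 / 1000
cost = kwh_per_year * price_per_kwh
print(f"~{kwh_per_year:.0f} kWh/year, about ${cost:.0f}/year")
# ~131 kWh/year, about $16/year
```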
|
# ? Jan 29, 2014 20:46 |
|
Stanley Pain posted:Intel for any kind of desktop build. AMD if you're looking for a good, cheap laptop that can do light to medium gaming, or a nice all-in-one HTPC box. I'm not sure that I would agree with the laptop thing, as battery life is a huge priority in the laptop space and Intel parts use significantly less power at idle, peak, and partial usage. Also, with Ivy Bridge and Haswell, Intel's integrated graphics have gotten competitive.
|
# ? Jan 29, 2014 20:52 |
|
Weinertron posted:I'm not sure that I would agree with the laptop thing, as battery life is a huge priority in the laptop space and Intel parts use significantly less power at idle, peak, and partial usage. While Intel Iris Pro 5200 graphics is competitive, you essentially can't buy it, so it isn't much of a factor. You need a dedicated graphics card paired with an Intel CPU to get performance comparable to the integrated graphics of a current AMD APU, which incurs enough power, size, and cost penalties to make AMD APUs a better option.
|
# ? Jan 29, 2014 21:24 |
|
Alereon posted:While Intel Iris Pro 5200 graphics is competitive, you essentially can't buy it, so it isn't much of a factor. You need a dedicated graphics card paired with an Intel CPU to get performance comparable to the integrated graphics of a current AMD APU, which incurs enough power, size, and cost penalties to make AMD APUs a better option. I've read the last few "don't buy AMD" posts, but is this true for onboard graphics on desktop motherboards? I understand don't buy AMD + dedicated GPU, but I don't know anything about Intel's onboard graphics for desktops.
|
# ? Jan 29, 2014 22:12 |
|
They're exactly the same as the onboard graphics in laptops? There's no difference other than desktop chips having far higher power limits.
|
# ? Jan 29, 2014 22:32 |
|
Crotch Fruit posted:I've read the last few "don't buy AMD" posts, but is this true for onboard graphics on desktop motherboards? I understand don't buy AMD + dedicated GPU, but I don't know anything about Intel's onboard graphics for desktops. High-end AMD APUs can make sense as they offer compellingly better graphics performance than Intel's CPUs, which means they are better gaming processors (for systems without dedicated graphics cards). It's AMD's CPUs (FX-series) that are retarded buys because they have nothing to balance out their much worse CPU performance.
|
# ? Jan 30, 2014 00:20 |
|
I'm not unhappy with my AMD FX CPU purchase, over a year later. People comment on how much better Intel is, but even the cheapest Intel i3 was more expensive than my FX-4130 at the time I bought it. I have yet to find a game that I think suffers because of my processor (though I don't play all of the latest and greatest games).
|
# ? Jan 30, 2014 00:36 |
|
Alereon posted:High-end AMD APUs can make sense as they offer compellingly better graphics performance than Intel's CPUs, which means they are better gaming processors (for systems without dedicated graphics cards). It's AMD's CPUs (FX-series) that are retarded buys because they have nothing to balance out their much worse CPU performance. Anyone actually trying to game will have a card, however.
|
# ? Jan 30, 2014 03:12 |
|
PerrineClostermann posted:Anyone actually trying to game will have a card, however.
|
# ? Jan 30, 2014 03:30 |
|
Also, a future Catalyst driver release will improve frame pacing in dual graphics mode, so if you outgrow the performance of your APU you may be able to add an R7 250 and get a significant boost for less than $100.
|
# ? Jan 30, 2014 04:24 |
|
Rastor posted:Also, a future Catalyst driver release will improve frame pacing in dual graphics mode, so if you outgrow the performance of your APU you may be able to add an R7 250 and get a significant boost for less than $100. ...or just get a 260X for $30 more and get better performance (far better in the games that don't scale well with dual graphics) with less hassle.
|
# ? Jan 30, 2014 05:33 |
|
Happy_Misanthrope posted:...or just get a 260X for $30 more and get better performance (far better in the games that don't scale well with dual graphics) with less hassle. Yeah, but for low-power or prebuilt systems that don't have a 6-pin connector, this is a nice option.
|
# ? Jan 30, 2014 05:55 |
|
Mantle question: Benchmark tests using BF4 and that StarSwarm thingy seem to indicate that you get significant gains over DX11 if you have a relatively weak CPU and throw in a decent card that supports Mantle. My current setup is an ageing Phenom II X4 840 and a Radeon 5850. I'd like to play BF4 and potentially other upcoming games that support Mantle. The problem is that my motherboard has an AM3 socket, which means that I can't just swap in a newer FX or A-series CPU/APU without buying a new motherboard too. I would also like to upgrade my 5850, but buying a new video card, motherboard, and CPU is simply outside my current budget. Might it then make sense to just buy a nice R9 video card and keep my existing CPU/mobo setup, given Mantle's ability to squeeze good performance out of video cards while minimizing CPU bottlenecks?
|
# ? Feb 19, 2014 01:47 |
|
A weak CPU plus a powerful GPU does seem to be the use case where Mantle shows startling improvements, and since you're running a CPU with roughly Core 2 Quad performance, it should offer a far bigger percentage improvement for you than for someone running a high-end Intel CPU.
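A toy model of why the weak-CPU case wins big, with completely made-up numbers just to show the shape of it: a frame is paced by whichever of the CPU (game logic plus draw-call submission) or the GPU finishes last, so cutting API overhead only moves the needle when the CPU side is the bottleneck.

```python
# Toy frame-pacing model: a frame takes as long as its slowest stage.
# All timings are invented for illustration, not measurements.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower of CPU and GPU sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 10.0        # a fast card: 10 ms of render work per frame
OVERHEAD_CUT = 0.5   # assume Mantle halves CPU-side submission cost

for label, cpu_ms in [("slow CPU", 20.0), ("fast CPU", 8.0)]:
    before = fps(cpu_ms, GPU_MS)
    after = fps(cpu_ms * OVERHEAD_CUT, GPU_MS)
    print(f"{label}: {before:.0f} -> {after:.0f} fps ({after/before - 1:+.0%})")

# slow CPU: 50 -> 100 fps (+100%)   CPU-bound, huge win
# fast CPU: 100 -> 100 fps (+0%)    GPU-bound, Mantle can't help
```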
|
# ? Feb 19, 2014 02:29 |
|
e: Nevermind
slidebite fucked around with this message at 03:24 on Feb 19, 2014 |
# ? Feb 19, 2014 02:45 |
|
Agreed posted:A weak CPU plus a powerful GPU does seem to be the use case where Mantle shows startling improvements, and since you're running a CPU with roughly Core 2 Quad performance, it should offer a far bigger percentage improvement for you than for someone running a high-end Intel CPU. Wasn't the biggest improvement tested seen on a 6-core i7 with Crossfire 290X? Maybe there are new numbers out somewhere.
|
# ? Feb 19, 2014 08:56 |
|
Keep in mind that Mantle will only work in the couple of games that support it, and drivers will be buggy for a while. It's one thing if you're buying a card FOR BF4, but Mantle is definitely not a general solution. Honestly, your current card isn't THAT awful; a Radeon R9 270X is only about twice as fast. You have the equivalent of a lower mid-range card now, so it doesn't seem like you'd be well served spending a lot of money on a video card that's about twice as fast before bottlenecks. You might consider overclocking everything you can, especially your CPU and VRAM, and then sticking it out until the summer. You might even find that it makes more sense to buy an Intel CPU+mobo+RAM and keep the video card until you can afford better; at least you can turn down texture quality to avoid hitching, but it's much harder to avoid lag from a slow CPU.
|
# ? Feb 19, 2014 09:18 |
|
HalloKitty posted:Wasn't the biggest improvement tested seen on a 6-core i7 with Crossfire 290X? You might be right; thinking on it, I do seem to recall the other "very big numbers" improvement being anything Crossfire-related. Why is anyone's guess. Is Crossfire's overhead awful? Is Nvidia's SLI overhead equally awful? I dunno, they're pretty different in how they work. Plus, the new Crossfire over PCIe is an odd duck, and maybe it comes with some "special" CPU time penalties that prevent it from showing its best results. Or, hey, maybe there's just that much damned overhead when you're juggling 12 logical cores and two tandem GPUs.
|
# ? Feb 19, 2014 11:29 |
|
Agreed posted:You might be right; thinking on it, I do seem to recall the other "very big numbers" improvement being anything Crossfire-related. Why is anyone's guess. Is Crossfire's overhead awful? Is Nvidia's SLI overhead equally awful? I dunno, they're pretty different in how they work. Plus, the new Crossfire over PCIe is an odd duck, and maybe it comes with some "special" CPU time penalties that prevent it from showing its best results. Or, hey, maybe there's just that much damned overhead when you're juggling 12 logical cores and two tandem GPUs. Yeah, I think the Crossfire case is something that was improved specifically, rather than the CPU being the limit, but I can't be sure.
|
# ? Feb 19, 2014 12:45 |
|
I was reading a review of an FM2 motherboard and noticed they were comparing benchmarks from the A10 to Intel chips: http://anandtech.com/show/7865/asrock-fm2a88x-extreme6-review/7 Why aren't the FX-8xxx chips on the benchmark list? Aren't they way faster than the A10 and competitive with the i5/i7 in a few benchmarks?
|
# ? Mar 20, 2014 12:55 |
|
Bob Morales posted:I was reading a review of an FM2 motherboard and noticed they were comparing benchmarks from the A10 to Intel chips. FX chips don't have graphics. EDIT: Scratch that, they have E5s on that benchmark for some reason. I'd guess the reason is that AMD doesn't really have a chip that's meant to be competitive with current-generation Intel, and enthusiasts aren't interested in the chip. People drag out benchmarks vs. the i5-2500K and pretend the incremental improvements in power and frequency since then don't really matter. Chuu fucked around with this message at 02:54 on Mar 21, 2014 |
# ? Mar 21, 2014 02:49 |
|
I was gonna say, isn't the 2500K as old as the hills? No wonder AMD finally looks competitive.
|
# ? Mar 21, 2014 12:02 |
|
Panty Saluter posted:I was gonna say, isn't the 2500K as old as the hills? No wonder AMD finally looks competitive. I know you're joking, but two-thirds of Intel's performance isn't competitive. I wasn't paying much attention, but I thought the A10s were just FX-8xxx chips with integrated graphics, which apparently they aren't. They were pretty competitive in the really heavily-threaded stuff, but that's about it.
|
# ? Mar 21, 2014 12:38 |
|
Note that when the bleeding-edge AMD CPU looks capable against a 2500K, that's usually at stock. The 2500K can take a trivial 30% overclock that still keeps it well within any expected AMD power draw.
|
# ? Mar 21, 2014 12:40 |
|
Sir Unimaginative posted:Note that when the bleeding-edge AMD CPU looks capable against a 2500K, that's usually at stock. This reminds me of these benchmarks: https://www.hardwarecanucks.com/forum/hardware-canucks-reviews/62166-amd-fx-9590-review-piledriver-5ghz-13.html In Skyrim, a 4-module Piledriver @ 5GHz performs so badly it's barely funny.
|
# ? Mar 21, 2014 12:46 |
|
Panty Saluter posted:I was gonna say, isn't the 2500K as old as the hills? No wonder AMD finally looks competitive. The 2500K is still absolutely relevant, because Ivy Bridge and Haswell didn't make very big performance jumps. An OC'd 2500K compares well to a 4670K. New (true) 8-core Haswells will destroy everything, at least on the build-your-own consumer end. http://arstechnica.com/gadgets/2014/03/intel-returns-to-its-roots-with-slew-of-overclocker-friendly-desktop-cpus/
|
# ? Mar 21, 2014 15:21 |
|
Panty Saluter posted:I was gonna say, isn't the 2500K as old as the hills? No wonder AMD finally looks competitive. Yep, it came out in early 2011.
|
# ? Mar 21, 2014 15:27 |
|
Civil posted:The 2500K is still absolutely relevant, because Ivy Bridge and Haswell didn't make very big performance jumps. An OC'd 2500K compares well to a 4670K. New (true) 8-core Haswells will destroy everything. It'll destroy your wallet, too.
|
# ? Mar 21, 2014 16:52 |