|
Sinestro posted:I know this is pretty dumb of me, but is BD going on laptops, or is it just Llano? Last I heard the mobile equivalent of Bulldozer is Bobcat, but I might be mistaken there. And by equivalent I mean you would need discrete graphics to use Bobcat. Does anyone know more? EDIT: Don't listen to me, I'm stupid. Bobcat is just the re-designed CPU core for the upcoming "below Llano" notebooks.

I don't think they are going to do many "new" designs for notebooks that aren't APUs - so high-end notebook workstations will just continue to adapt the high-end desktop chips, maybe?

roadhead fucked around with this message at 14:12 on May 19, 2011
# ¿ May 19, 2011 14:08 |
|
Looking at the hardwareheaven.com benches it seems they just used insane settings so that all the games were GPU-bound - like it's a graphics test or something. After that it's just a matter of running each test enough times and throwing out the high scores for Intel and the low scores for AMD. Had an 8150+mobo in my cart on Newegg and plenty of time to finish the check-out, but could not bring myself to do it. $100 more than the Phenom II X6 1100T AND I need a new motherboard? Hmmmm. I still want one though - will wait for the FX-8170. Also I love the little A8-3850 in my HTPC - so what AMD lacks in the low-volume enthusiast gamer market perhaps can be made up in the "I want the cheapest computer you have that plays games" segment at Best Buy.
|
# ¿ Oct 12, 2011 21:07 |
|
I'm hoping against hope the thread scheduling is all whack and threads that should be sharing a module (and its L2) aren't, so the L3 is being used for things it shouldn't be (under normal circumstances). The only other explanation I can think up is that they really did alter the integer pipelines significantly (lengthened them, I fear) from Phenom II, and there will be no way a simple Windows 7 patch can save it. How could they not see that it wasn't reliably beating the X4 and X6 Phenom IIs a while back, unless they had some synthetic (simulated, most likely) situations where it was?
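Just to show what the fix looks like from the application side (the real fix would live in the OS scheduler): a minimal sketch, assuming Linux, and assuming the two cores of module 0 show up as logical CPUs 0 and 1 the way an FX-8150 usually enumerates - pin the two cooperating threads to the same module so they share that module's L2 instead of bouncing data through the L3. The worker function is just a placeholder.

#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>

/* Pin a thread to a single logical CPU. */
static void pin_to_cpu(pthread_t t, int cpu) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(cpu, &set);
    pthread_setaffinity_np(t, sizeof(set), &set);
}

/* Placeholder for work that shares data with its sibling thread. */
static void *worker(void *arg) { return NULL; }

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pin_to_cpu(a, 0);   /* assumption: CPUs 0 and 1 are the two cores of module 0 */
    pin_to_cpu(b, 1);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    return 0;
}

On real hardware you'd check the topology with /proc/cpuinfo or lstopo first rather than trusting the numbering.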
|
# ¿ Oct 13, 2011 13:23 |
|
Bob Morales posted:So Bulldozer is AMD's Merced?

It's not *that* much of a complete waste of money and effort - at least it's still x86/x64 and not some one-off instruction set that almost no one uses. If threads that should be co-operating are being assigned to separate modules, and the L2 and L3 are being wasted on inter-module communication as opposed to actually caching main memory like they should be, the ripple effect of a simple scheduling fix could be huge. Even 10-15% at this point would at least be enough of an improvement not to look completely foolish next to the 2500K or the 1100T (once they drop the price of the 8150 to $235 or so) - but this is just wild speculation. The seemingly abundant supply of the 6100 means that yields of fully working Bulldozers are probably not that good :/
|
# ¿ Oct 14, 2011 13:12 |
|
movax posted:I think those guys are floundering because they don't have enough money. They're having to cut corners somewhere, be it the architecture team, process, software support, packaging, etc. They can't fire on all cylinders. In an ideal world they'd have an army of software engineers preparing drivers and updates for the major operating systems while the hardware team gets the actual hardware ready.

The eternal optimist in me wants to say they automated Bulldozer while the hand-tuned transistor work was (is) being done for Piledriver. Perhaps these chips are more or less the same at the block level, and all the improvement in PD will be from tweaking the circuits down to as few gates as possible and other tuning. Otherwise I just don't know anymore - this is obviously not the product we needed to come out of AMD to actually keep Intel on their toes. Did they even have to price-drop the 2600K in response?
|
# ¿ Oct 18, 2011 13:12 |
|
The 1100T I bought as a consolation prize for myself installed smoothly - but I can no longer get any sort of program to read its temperature outside of the BIOS itself. HWMonitor, AMD Overdrive, and SpeedFan all say 0°C - preposterous! Gigabyte GA-MA790FXT-UD5P with the F8N BIOS revision, of course. Oh well, I needed to upgrade in order to build my brother's wedding gift, you see...
|
# ¿ Dec 3, 2011 01:49 |
|
Alereon posted:Use CoreTemp for monitoring AMD CPU temperatures.

That also reports 0°C while I am running Prime95 - obviously incorrect. Probably a result of the "beta" status of this BIOS, the only one on this motherboard that supports this CPU.
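For what it's worth, those tools are all reading the CPU's own reported-temperature sensor, not a board sensor, and on Linux the same value comes out of the k10temp driver through sysfs. A minimal sketch, assuming Linux with k10temp loaded - the hwmon0 index is an assumption, it varies per machine (ls /sys/class/hwmon or the sensors tool shows the right one):

#include <stdio.h>

int main(void) {
    /* k10temp exposes the die temperature in millidegrees C.
       hwmon0 is assumed here; the index differs from board to board. */
    FILE *f = fopen("/sys/class/hwmon/hwmon0/temp1_input", "r");
    if (!f) { perror("temp1_input"); return 1; }
    long mdeg;
    if (fscanf(f, "%ld", &mdeg) == 1)
        printf("CPU temp: %.1f C\n", mdeg / 1000.0);
    fclose(f);
    return 0;
}

If that also reads 0 under load, it would point at the platform/BIOS rather than the monitoring software, like you suspect.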
|
# ¿ Dec 3, 2011 05:20 |
|
HalloKitty posted:It makes it sound slightly less badly engineered. Still doesn't change the fact of the benchmarks/power & heat. I guess they're trying any kind of damage control right now.

Makes you think the recent house-cleaning let go a lot of pure marketing people and maybe a real engineer took their place? How else do you officially release a number that is nearly twice the actual transistor count?!?
|
# ¿ Dec 3, 2011 14:55 |
|
Newegg messed up and sent me an email shouting about ordering it RIGHT NOW. And I followed the links to nothing. So.... Soon? http://www.newegg.com/GTX680/?nm_mc=EMCCP-032212-NV&cm_mmc=EMCCP-032212-NV-_-GeForceGTX680-_-banner-_-SHOPNOW Goes to a promo page but the links don't work, yet.
|
# ¿ Mar 22, 2012 13:02 |
|
nmfree posted:For anyone looking for a late night laugh, AMD just sent out their 2011 Annual Report; I got mine in the mail yesterday so I haven't read through much of it yet but there's gotta be some good stuff in there. (If the direct link doesn't work go here instead to get it.)

I'm glad I got mine when I did back in November. They probably quit making them a while back and the channel finally sold them all. I bet if you look hard enough and are willing to deal with an unknown vendor you can find one.
|
# ¿ Apr 5, 2012 16:29 |
|
Christobevii3 posted:When did you see that it was nvidia? I've always seen ibm/amd?

Yea, I was pretty sure AMD/ATI had all the design wins for the upcoming generation of consoles.
|
# ¿ Jul 31, 2012 13:16 |
|
Factory Factory posted:No. Brazos might, maybe, to sort-of-compete with the new Atom SoCs (which have a huge advantage in power consumption and platform cost). But Trinity's lowest TDP will be 17W, destined for "ultrathin" laptops. Maybe possibly someone will make a fat-tab around it, like with IVB ULV CPUs, but that'd be a tough sell when tablet gaming largely does not need the type of GPU that Trinity prioritizes over CPU performance. According to AnandTech's testing, the load power consumption is kinda poo poo, too, with the 65W Trinity A8-5600K using 14W more power on Metro 2033 than a 77W i7-3770K with HD 4000. This is despite having an idle power 7W lower.

I'm pretty sure the average person can count on one hand the number of times they've completely pegged their CPU in the last week. Yes, it's the thing everyone thinks they need, but raw CPU performance is "good enough" on any modern processor in my opinion. It's the overall experience (the balance of CPU and GPU) that AMD is aiming for now (since they'll never catch up in raw CPU performance, obviously), and rightly so. You go where you think you can make a difference. What I want to know is the actual street date of this Lenovo IdeaPad S405! All the press releases indicate it should be out now but I can't find it anywhere!
|
# ¿ Oct 1, 2012 19:02 |
|
Bob Morales posted:You'll still notice an i5 vs an A6 or whatever. Pegging your CPU is like pegging your internet connection. '100%' only means you were using it at full speed for a whole second.

Not when the bottleneck is your HDD, as it is when most people complain of their machine being "slow" - people don't react to the extra seconds (minutes?) it takes to encode an album full of MP3s, they react to the overall responsiveness and usefulness of the machine. "Can I still browse the web without it herking and jerking while I encode these MP3s?"-type thinking.
|
# ¿ Oct 1, 2012 21:33 |
|
Professor Science posted:Er, dedicated? As far as I've seen, PS4 seems to have a GPU with (at least relatively) standard GCN units. The only interesting thing that HSA would provide is if the GPU and the CPU have some level of cache coherence.

Main memory is shared and I bet all the L2 cache (and L3 if there is any) is shared as well.
|
# ¿ Feb 21, 2013 15:25 |
|
Professor Science posted:Shared main memory != cache coherence. This kind of coherence (between CPU and integrated GPU) is one of the big promises of HSA.

If I had to make a bet, I'd say the GPU can snoop the CPU L1/L2 but not vice-versa. I doubt Sony would push this out without that feature, though. This is heavily customized and will have volume in the millions, most likely over the next 8-10 years, so I'm assuming all the stops were pulled out. Unless you have a PS4 dev-kit, and then I'd assume the NDA would keep you from posting anyway.
|
# ¿ Feb 22, 2013 15:16 |
|
JawnV6 posted:It's not a matter of volume ("millions over 8-10 years" is pitiful) or pulling stops. Cache coherence between heterogeneous compute cores is a Hard Problem. It's entirely possible the complexity of making the agents agree on protocol was far greater than either team could manage.

It's hard, sure, but when you have a stable hardware platform and a supporting compiler/SDK/best practices, a lot of the variables and unknowns that make it "Hard" go away and it becomes a lot more solvable. I'm betting we get cache coherence on BOTH the PS4 and the next Xbox (when using the AMD-supplied tool-chain, of course), and that is probably the thing that sold the AMD APU to Sony and Microsoft for this gen.
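To show what that coherence actually buys developers: this isn't the console toolchain (no idea what that looks like under NDA), just a rough sketch of the idea using OpenCL 2.0 fine-grained shared virtual memory as it later shipped on AMD's desktop APUs. CPU and GPU touch the same allocation directly, no explicit map/unmap or copies, because the hardware keeps the caches coherent. Error checking is skipped to keep it short.

#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, NULL, &err);

    const char *src =
        "__kernel void inc(__global int *p) { p[get_global_id(0)] += 1; }";
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, "-cl-std=CL2.0", NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "inc", &err);

    /* Fine-grained SVM: one allocation visible to both CPU and GPU,
       kept coherent by the hardware - no clEnqueueMap/Unmap needed. */
    size_t n = 1024;
    int *buf = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                          n * sizeof(int), 0);
    for (size_t i = 0; i < n; i++) buf[i] = (int)i;   /* CPU writes directly */

    clSetKernelArgSVMPointer(k, 0, buf);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
    clFinish(q);

    printf("buf[0] = %d\n", buf[0]);   /* CPU reads the GPU's result directly */
    clSVMFree(ctx, buf);
    return 0;
}

This only works on devices that report CL_DEVICE_SVM_FINE_GRAIN_BUFFER support; everywhere else you're back to coarse-grained SVM with explicit map/unmap.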
|
# ¿ Feb 22, 2013 16:46 |
|
PS4 and Next-Box news are the only things really - and anyone who yawns at the thought of HSA - wtf?
|
# ¿ May 17, 2013 16:05 |
|
keyvin posted:So, we know that the ps4 is using AMD, and I suspect the 360 is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMDs lovely single core performance. If developers are targeting AMD levels of per core performance, then it seems like it follows that AMD will be acceptable on the PC.

But it's not even the big-boy core that you MIGHT (don't) build a desktop around. It's the low-power Jaguar core that's meant to compete with Atom. Yea, there are 8 of them, but... A better bet might be a GCN-based AMD GPU, as the shaders they write for the consoles should also work on the PC side, but all the CPU code will just be C/C++ anyway and at the mercy of the compiler. If for some reason they did hand-tune assembly on the consoles' CPUs, the micro-architectures are different enough that it would have to be re-tuned for Atom, Haswell, Bulldozer, Phenom II, etc. - which is why they don't often hand-tune x86 that much anymore.
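To make the re-tuning point concrete: even something as small as a fused multiply-add has to be written three different ways depending on which of those chips you target, which is exactly why nobody wants to hand-tune across x86 micro-architectures anymore. A rough sketch with intrinsics - the fmadd helper is just an illustrative name:

#include <stdio.h>
#include <x86intrin.h>   /* GCC/Clang umbrella header for x86 intrinsics */

/* Hypothetical helper: same math, different instructions per target ISA. */
static inline __m128 fmadd(__m128 a, __m128 b, __m128 c) {
#if defined(__FMA__)          /* Haswell and later: FMA3 */
    return _mm_fmadd_ps(a, b, c);
#elif defined(__FMA4__)       /* Bulldozer: FMA4 (Piledriver added FMA3 too) */
    return _mm_macc_ps(a, b, c);
#else                         /* Jaguar, Phenom II, Atom: no FMA, fall back */
    return _mm_add_ps(_mm_mul_ps(a, b), c);
#endif
}

int main(void) {
    __m128 r = fmadd(_mm_set1_ps(2.0f), _mm_set1_ps(3.0f), _mm_set1_ps(1.0f));
    float out[4];
    _mm_storeu_ps(out, r);
    printf("%f\n", out[0]);   /* 7.0 on every path */
    return 0;
}

Built with GCC, -march=bdver1 (Bulldozer) takes the FMA4 branch, -march=core-avx2 (Haswell) takes the FMA3 one, and -march=btver2 (Jaguar) or -march=amdfam10 (Phenom II) fall through to the plain multiply-add.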
|
# ¿ May 24, 2013 14:00 |
|
If it's like the A8 in my HTPC build it will game fine at medium/high detail at low resolution (720p), but no amount of sacrificing fidelity seems to get decent 1920x1080 frame-rates in any games. Obviously for desktop/media tasks it's sufficient/overkill.
|
# ¿ Jul 17, 2013 19:39 |
|
Boiled Water posted:That is, to say the least, unlikely.

He didn't say anything about per-watt performance, though.
|
# ¿ Apr 30, 2015 17:46 |
|
Malcolm XML posted:AMD is going up uP UP!

I've been long on AMD so long I got my shares for $2.49. Yea
|
# ¿ Nov 18, 2016 14:34 |
|
They must not be using QuickSync on the 6700k that is streaming like poo poo right?
|
# ¿ Dec 13, 2016 21:33 |
|
Haquer posted:I'm still on a Phenom II X6, I feel you

A year ago I finally played musical chairs and pulled the 705e out of the server and moved my 1100T there, and went Intel on my personal desktop for the first time since a 733 MHz Coppermine mounted on a slocket adapter...
|
# ¿ Dec 24, 2016 16:52 |
|
K8.0 posted:The main time I find myself lacking frames is when I'm playing a game while watching something on my second monitor. That's a very common use that benchmarks aren't going to account for, and where going to 6 or 8 cores would probably have a real impact on performance.

What is it about advancing to the next episode on Netflix that totally kills my frame-rates for about 1 second? I've got 32 gigs of RAM!
|
# ¿ Feb 10, 2017 13:53 |