|
Cybernetic Vermin posted:I think you are vastly underestimating the number of extremely pessimistic global locks in software today. There is bound to be a lot of software that goes from being just "multi-threaded" to basically scaling linearly in some operations with cheap memory transactions.

edit: You're basically arguing that there are a lot of applications that are inherently parallel, CPU-bound, and extremely impacted by locking overhead on some shared object. Those are the only cases where transactional memory could theoretically make a performance difference. What applications would those be? Also, please note that Haswell TSX won't solve those problems, due to the inability of any app that relies on a monolithic global lock to fit all of its shared state in L1. The transactional memory people have been tooting this horn for ten years; if it were actually as amazing as all the academic papers claim, Rock probably wouldn't have killed Sun.

Professor Science fucked around with this message at 23:36 on Feb 23, 2013 |
# ? Feb 23, 2013 23:24 |
|
|
Professor Science posted:you're basically arguing that there are a lot of applications that are inherently parallel, CPU bound, and extremely impacted by locking overhead to some shared object. those are the only cases where transactional memory could theoretically make a performance difference. what applications would those be? also, please note that Haswell TSX won't solve those problems due to the inability for any app that relies on a monolithic global lock to fit all of its shared state in L1.

Right, that is the claim I am making (assuming that by "locking overhead" you mean actual lock contention rather than overhead). It is hard to have any real statistical foundation for this, but I have worked on a fair bit of software myself where a lot of time is spent waiting on a lock on some huge collection where each thread will touch only a tiny random subset of the data held. It is not that this situation occurs because finer-grained locking is that hard; it is just one of those things one knows is very hard to reason about with perfect certainty, so locks tend to be very pessimistic to be sure they cover enough. To some extent the way to view this is that a perfectly successful transactional memory implementation will give you the performance of the finest-grained locking possible with the level of bug-resistance (and effort) of coarse locking.

Your phrasing of my statement is a bit disingenuous, though: of course I am talking about CPU-bound software, since I have no idea what you expect Intel or AMD to put in their CPUs to improve the situation for software that is not CPU-bound. It should also be clear that any discussion is in terms of successfully eliminating false lock dependencies, since no parallel processing technology will help when there are real dependencies. If you find a huge step towards perfect extraction of parallelism to be amazing, then fine.

I have no idea how you imagine that transactional memory on Rock would have saved Sun, since they very notably didn't actually manage to make a Rock CPU, a recurring phenomenon when it came to Sun hardware promises (on the other hand, they very successfully wasted money on unprofitable stuff like Java and OpenOffice).
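The "pessimistic lock on a huge collection" situation can be sketched minimally. This is an illustrative Python sketch with hypothetical names: lock striping stands in for "finer-grained locking", and the argument above is that a successful transactional memory implementation would give you something like the striped version's contention profile while the code still reads like the coarse one.

```python
import threading

NUM_STRIPES = 64  # illustrative stripe count


class CoarseMap:
    """One pessimistic lock guarding the whole collection."""

    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def put(self, key, value):
        with self._lock:  # every thread serializes here, even on disjoint keys
            self._data[key] = value

    def get(self, key):
        with self._lock:
            return self._data.get(key)


class StripedMap:
    """Finer-grained locking: one lock per stripe of the key space."""

    def __init__(self):
        self._locks = [threading.Lock() for _ in range(NUM_STRIPES)]
        self._stripes = [{} for _ in range(NUM_STRIPES)]

    def _index(self, key):
        return hash(key) % NUM_STRIPES

    def put(self, key, value):
        i = self._index(key)
        with self._locks[i]:  # threads touching different stripes don't contend
            self._stripes[i][key] = value

    def get(self, key):
        i = self._index(key)
        with self._locks[i]:
            return self._stripes[i].get(key)
```

The striped version is more work to get right (and this sketch ignores operations spanning multiple stripes, which is exactly where the "hard to reason with perfect certainty" problem bites).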
|
# ? Feb 24, 2013 00:47 |
|
Crysis 3 does in fact favor AMD CPUs, according to this test from yesterday. This is kinda cool; perhaps it means I should go for hyperthreading when I upgrade to a Skylake/AMD-equivalent chip later, if these trends continue?
|
# ? Feb 25, 2013 19:15 |
|
Keep in mind that that's a rather odd mix of overclocked and non-overclocked processors. I wouldn't read too much into it until we get repeatable, correctly tested results from reputable English-language sites.
|
# ? Feb 25, 2013 19:33 |
|
So what's new at AMD?
|
# ? May 17, 2013 01:56 |
|
Waiting for Kabini and HSA to blow us away, I guess.
|
# ? May 17, 2013 03:03 |
|
The 8970M is out (7970M with boost clocking). Richland and Kaveri are starting to sneak out. Richland is just showing up now in gaming notebooks, a higher-clocked A10 in the same power envelope. Kaveri is coming next June and it'll be the next APU revision with HSA junk. Yawn.
|
# ? May 17, 2013 03:14 |
|
PS4 and Next-Box news are the only things really - and anyone who yawns at the thought of HSA - wtf?
|
# ? May 17, 2013 16:05 |
|
Richland is literally just a higher-clocked Trinity. Kaveri might be nice, what with GCN and HSA and an on-chip ARM core, but we know practically nothing about it compared to Haswell, which is coming out at about the same time. Except that it's Gen 3 Bulldozer. Big deal? Who knows?
|
# ? May 17, 2013 16:17 |
|
https://twitter.com/anandshimpi/status/335518239830454272 Apparently we may see NDA lift on Jaguar next week. Brazos was exciting and popular. Brazos 2.0 was not so much. Here comes Tiny APU For Baby Computers Gen 3.
|
# ? May 19, 2013 12:05 |
|
There are benchmarks of Temash out there already. http://www.notebookcheck.net/Review-AMD-A6-1450-APU-Temash.92264.0.html It compares pretty well to Atom. No sign of Kabini yet, though.
|
# ? May 19, 2013 12:40 |
|
Maxwell Adams posted:There are benchmarks of Temash out there already. This is a really interesting part for me: low power, low heat, etc., but not quite as useless as the current Atom or as expensive as an i3. It would make a nice traveling companion that could do work.
|
# ? May 20, 2013 09:00 |
|
Anandtech's AMD Jaguar architecture article and AMD A4-5000 "Kabini" review are live.
|
# ? May 23, 2013 04:50 |
|
So, we know that the PS4 is using AMD, and I suspect the next Xbox is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMD's lovely single-core performance. If developers are targeting AMD levels of per-core performance, then it seems to follow that AMD will be acceptable on the PC.
|
# ? May 24, 2013 12:18 |
|
keyvin posted:So, we know that the ps4 is using AMD, and I suspect the 360 is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMDs lovely single core performance. If developers are targeting AMD levels of per core performance, then it seems like it follows that AMD will be acceptable on the PC. But it's not even the big-boy core that you MIGHT (don't) build a desktop around; it's the low-power Jaguar core that's meant to compete with Atom. Yeah, there are 8 of them, but... A better bet might be a GCN-based AMD GPU, as the shaders they write for the consoles should also work on the PC side, but all the CPU code will just be C/C++ anyway and at the mercy of the compiler. If for some reason they did hand-tuned assembly on the consoles' CPUs, the microarchitectures are different enough that it would have to be re-tuned for Atom, Haswell, Bulldozer, Phenom II, etc., which is why they don't often hand-tune x86 that much anymore.
|
# ? May 24, 2013 14:00 |
|
keyvin posted:If developers are targeting AMD levels of per core performance, then it seems like it follows that AMD will be acceptable on the PC. The problem is, AMD already is acceptable - just wholly inferior at every price point. And it's not Bulldozer cores going into these consoles, so it's not even going to encourage developers to optimize for the strengths that could possibly make the Bulldozer architecture cost-competitive.
|
# ? May 24, 2013 15:41 |
|
keyvin posted:So, we know that the ps4 is using AMD, and I suspect the 360 is as well. Do you guys think this will lead to AMD returning as a viable option for PC gaming? I built my first Intel box ever this time around because of AMDs lovely single core performance. If developers are targeting AMD levels of per core performance, then it seems like it follows that AMD will be acceptable on the PC. The PS4 is using an AMD APU, and so is the Xbox One. However, the per-core performance of the CPUs they are using is below what AMD is offering now anyway - it's essentially a pumped-up netbook CPU; they just happen to have two modules with 4 cores back to back. So essentially, not amazing CPU performance, but 8 of them. It doesn't change the scene in terms of PC parts. It is of course a win for AMD.
|
# ? May 24, 2013 18:32 |
|
HalloKitty posted:It doesn't change the scene in terms of PC parts. Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. It could also mean that video cards with less than 6 gigs of RAM will be obsolete. The really interesting stuff is the hUMA architecture in the new consoles. The CPU and GPU can access the same memory, which could be amazing with the memory bandwidth in the PS4. Someone might, for example, make a hot new tessellation algorithm that relies on hUMA. Who knows what would happen next? Radeon cards with a few Jaguar cores tacked on?
|
# ? May 24, 2013 21:57 |
|
Maxwell Adams posted:Radeon cards with a few jaguar cores tacked on? drat, now that's a good idea.
|
# ? May 24, 2013 22:03 |
|
HalloKitty posted:drat, now that's a good idea. Just imagine how smug PC gamers could be. "Oh, your console runs on a sophisticated APU? Yeah, my PC has that. As an accessory."
|
# ? May 24, 2013 22:31 |
|
Maxwell Adams posted:Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. It could also mean that video cards with less than 6 gigs of ram will be obsolete. While the Xbone and PS4 both seem to have 8GB of RAM, I somehow doubt that developers will get access to more than 2-4GB.
|
# ? May 24, 2013 22:39 |
|
Rawrbomb posted:While the xbone and ps4 seem to both have 8gb of ram, I somehow doubt that developers will get access to more than 2-4gb. IIRC, Sony already mentioned that games will have access to 6GB of the RAM, with 2GB reserved for the OS. The Xbone runs DDR3 while the PS4 is running GDDR5; that can have a significant impact, as the APU will be utilizing system RAM for video processing. Hell, my APU got a nice 15fps boost in CS:GO when I went from 1333MHz CAS9 to 1600MHz CAS9 RAM. Anand did a nice writeup, kept me occupied on my lunch today. http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4 Dilbert As FUCK fucked around with this message at 22:55 on May 24, 2013 |
# ? May 24, 2013 22:52 |
|
It's DDR3-2133 on the XBO, so that's something, plus the eSRAM cache will help a lot with GPU transfers the way Haswell's GT3e is helped by the eDRAM cache.
|
# ? May 24, 2013 22:56 |
|
Anandtech has a good article about the Kabini value proposition, which mostly seems to be providing better value cheap/small laptops than Celeron/Pentium processors.
|
# ? May 25, 2013 03:04 |
|
Alereon posted:Anandtech has a good article about the Kabini value proposition, which mostly seems to be providing better value cheap/small laptops than Celeron/Pentium processors. Seems to be a decent little thing in the sector, but they're right, it needs to turbo aggressively when under lightly threaded loads. Then it would be a bit of a winner in the low end segment.
|
# ? May 25, 2013 14:16 |
|
Sony said it will be a 7/1 memory split for games/OS, MS is 5/3.
|
# ? May 25, 2013 14:25 |
|
Maxwell Adams posted:Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. It could also mean that video cards with less than 6 gigs of ram will be obsolete. I thought it was the lack of highly threaded applications that was holding AMD back in the PC market. IIRC, Battlefield 4 runs nearly as well on an eight-core A8479578439 or whatever the AMD branding is these days as it does on a Core i7 that costs over $100 more. So games being highly multi-threaded seems like a big win for AMD, especially considering that the Intel roadmap has them sticking to four cores for the next tick-tock cycle.
|
# ? May 25, 2013 16:26 |
|
Maybe, maybe not. As I said a couple pages ago, each Jaguar core does a pitiful amount of work by desktop CPU standards. An i5-3570K could run every instruction eight Jaguar cores could with clocks to spare.
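The "clocks to spare" claim can be sanity-checked with a back-of-envelope calculation. All the figures below, especially the IPC numbers, are rough assumptions for illustration, not measured values:

```python
# Back-of-envelope aggregate throughput comparison.
# Clock and IPC figures are rough assumptions, not measurements:
# Jaguar is a narrow low-power core; Ivy Bridge is a wide desktop core.
jaguar_cores, jaguar_clock_ghz, jaguar_ipc = 8, 1.6, 1.0
i5_cores, i5_clock_ghz, i5_ipc = 4, 3.4, 2.5

jaguar_total = jaguar_cores * jaguar_clock_ghz * jaguar_ipc  # ~12.8 G inst/s
i5_total = i5_cores * i5_clock_ghz * i5_ipc                  # ~34.0 G inst/s

# Under these assumptions the i5 has roughly 2.7x the aggregate throughput,
# despite having half the core count.
print(i5_total / jaguar_total)
```

The exact ratio moves around a lot with the assumed IPC, but the direction of the comparison holds: per-core, the desktop chip is far ahead.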
|
# ? May 25, 2013 17:06 |
|
Maxwell Adams posted:Not really, but it basically guarantees that new games will be highly multithreaded and 64-bit. This is something I keep going back and forth on when toying around with the idea of building a cheap-ish Steambox. Is it better to have an i3, or a similarly priced AMD part with 4-6 cores? It seems like the dual-core i3 (even with hyperthreading) will be a liability before long.
|
# ? May 25, 2013 17:17 |
|
A Haswell i3 will probably handle things just fine. Plus you can kick it up a bit with a 25% BCLK strap overclock. An Ivy i3? Yeah, probably will feel slow for next-gen ports.
|
# ? May 25, 2013 17:26 |
|
Riso posted:Sony said it will be a 7/1 memory split for games/OS, MS is 5/3. 3 gigs for dynamic ad space.
|
# ? May 25, 2013 19:14 |
|
incoherent posted:3 gigs for dynamic ad space. I wouldn't mind this if the ads actually reduced the cost of the console.
|
# ? May 25, 2013 19:19 |
|
Cockblocking Jerk posted:I wouldn't mind this if the ads actually reduced the cost of the console. Hell, give me a game with a good physics engine, a sandbox mode, and a bunch of cans of Coke you can really shake up, and I'd pay for that ad. Anyway, to finish answering the "i3 vs. FX-4 or FX-6 chip" question, if you're buying now, the FX-6 chip might look better when you get to a CPU-bound game made for PS4 and XBO. Still doesn't mean it's a single-thread powerhouse that'll run Starcraft 2 well, though.
|
# ? May 25, 2013 23:03 |
|
Riso posted:Sony said it will be a 7/1 memory split for games/OS, MS is 5/3. It's fairly meaningless though, that's something easy enough to change in software if it really starts constraining developers over the console lifespan, and the worst case of doing so is that it makes the switch from gaming to the Windows-based OS in the One a little less snappy. Microsoft was just conservative with how much memory to tell developers it's okay to use, since until Sony's announcement they were hoping the PS4 wasn't going to go with a full 8GB.
|
# ? May 25, 2013 23:46 |
|
Paging anyone with an FX-8 series CPU who would care to run this benchmark in the overclocking thread? http://forums.somethingawful.com/showthread.php?threadid=3465021&pagenumber=38#post415824410 Factory Factory asked an interesting question on how well the AMD arch would do given what we know about the algorithm.
|
# ? May 26, 2013 03:13 |
|
Killer robot posted:Microsoft was just conservative with how much memory to tell developers it's okay to use, since until Sony's announcement they were hoping the PS4 wasn't going to go with a full 8GB. Until their announcement, Sony didn't know they were doing 8GB either; they originally thought they could only get four.
|
# ? May 26, 2013 09:54 |
|
Shaocaholica posted:Paging anyone with an FX-8 series CPU who would care to run this benchmark in the overclocking thread? Wrong thread, no one actually owns an AMD processor here.
|
# ? May 26, 2013 17:52 |
|
Shaocaholica posted:Paging anyone with an FX-8 series CPU who would care to run this benchmark in the overclocking thread? keyvin posted:Wrong thread, no one actually owns an AMD processor here. We're all pretty much bitter ex-AMD owners, who all gathered up to bitch about how bad they are.
|
# ? May 27, 2013 07:39 |
|
Let me be the first to admit to still using an AMD Athlon II x4 640.
|
# ? May 27, 2013 11:43 |
|
|
Riso posted:Let me be the first to admit to still use an AMD Athlon II x4 640. Me too and I don't see much of a reason to upgrade at the moment. If I did upgrade though, I'd pretty much have to go Intel for the first time in over 10 years.
|
# ? May 27, 2013 12:36 |