|
Alereon posted:There's a POWER7 CPU (I guess that puts paid to the idea that POWER7 isn't power-efficient enough for consoles) Anyway, with a CPU die of only 33mm2 on a 45nm process and power usage so low that the whole system only draws around 33W in use, it's safe to say that even if the CPU isn't just a mildly updated Broadway it still isn't POWER7-based. For reference, on the same process POWER7 chewed up 567mm2 of die space and 100W at its slowest speed. It's not an apples-to-apples comparison, of course: the WiiU's CPU is supposedly a tri-core chip while POWER7 was quad-core at a minimum, there are lots of differences in cache and the bus too, and the WiiU's CPU supposedly runs at 1.29-1.6GHz while POWER7's slowest available clock was 2.4GHz. Joink posted:What i find surprising is its sold at a small loss given its hardware and price.
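The size/power gap is stark enough to sanity-check in a couple of lines. A rough sketch in Python, using only the figures quoted above (all approximate):

```python
# Figures from the post above, all approximate.
wiiu_system_w = 33      # total Wii U console draw under load
wiiu_cpu_die_mm2 = 33   # Wii U CPU die, 45nm
power7_min_w = 100      # POWER7 at its slowest bin, CPU alone
power7_die_mm2 = 567    # POWER7 die, same 45nm node

# The slowest POWER7 alone draws ~3x the entire console's power budget,
# on a die roughly 17x larger at the same node.
print(f"die ratio:   {power7_die_mm2 / wiiu_cpu_die_mm2:.1f}x")
print(f"power ratio: {power7_min_w / wiiu_system_w:.1f}x (vs. the whole console)")
```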
|
# ¿ Nov 25, 2012 12:16 |
|
Killer robot posted:The WiiU is going to be the weakest of its generation, but it should be closer to the GameCube vs. the PS2/Xbox than the Wii vs. PS3/360. I'm sure it'll still be able to put out games that look significantly better than the current consoles, but it might still end up looking significantly worse than what the PS4/X720 will be able to do. If their new controller approach doesn't take off in a big way a la the Wii, then the WiiU might not have much market staying power. They'll probably still make money, and definitely break even, so no need for anyone to freak out. I'm not suggesting a "they're doomed" scenario. Even if they did a mediocre/crappy job on the hardware, Nintendo has obviously thought things through financially here.
|
# ¿ Nov 25, 2012 22:28 |
|
Space Racist posted:A lot of PS4/Xbox 720 rumors have suggested them using an AMD HD 7000-series GPU, is there any realistic chance of this being a consumer version of that (presumably custom) part? For some perspective, the entire original "power hog" X360 used about 172W and the original PS3 used about 189W. edit: The X360's original Xenos GPU alone used around 90W. Bear in mind too that the Tahiti LE GPUs are made on a still very high-end, modern 28nm process, so even with a revamp and die shrink it'll probably still put out too much heat and use too much power for use in a console. That is half the reason why the expected/rumored GPUs for the PS4/X720 are mid-range 6xxx-class GPUs: they were very compact in terms of die space, had good performance per watt, and should be fairly cheap to produce now on a more modern process. A 6670-6870-class GPU is fairly realistic to expect in a late 2013/early 2014 PS4/X720 console IMO. PC LOAD LETTER fucked around with this message at 07:56 on Jan 7, 2013 |
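For a sense of how much of a console's power budget the GPU eats, here's the same arithmetic in Python (figures from the post above; launch-revision hardware, all approximate):

```python
# Approximate launch-hardware figures from the post above.
x360_total_w = 172   # whole original X360 under load
ps3_total_w = 189    # whole original PS3, for reference
xenos_gpu_w = 90     # original Xenos GPU alone

# The GPU alone ate roughly half the X360's total system power,
# which is why a hot high-end desktop GPU is a poor console fit.
gpu_share = xenos_gpu_w / x360_total_w
print(f"Xenos share of X360 system power: {gpu_share:.0%}")
```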
# ¿ Jan 7, 2013 07:26 |
|
Are they going to at least drop the prices significantly, or are they planning on milking everyone for as long as possible? I bet it's the latter, but gotta make sure, right? :/
|
# ¿ Jan 9, 2013 14:18 |
|
XFX has been having problems with their higher-end cards for quite a while. Not quite RRoD levels of failure, but apparently it was pretty bad for many. I've had to RMA my 7970 DD Edition twice. Underclocking did nothing and temps were OK in each case; there was just something wrong with the card, and I've gotten a new revision each time I got a card back. I can't adjust the voltages or clocks upwards at all on my current revision, not that I was doing that before. Their support has been good, but I probably won't buy from them next time.
|
# ¿ Mar 14, 2013 09:21 |
|
Agreed posted:Are we at a point where it's okay to actively discourage people from purchasing AMD cards because their software team is broke as poo poo and, as a result, so are their drivers? The biggest problem for me this time around has been XFX and their lovely QC really.
|
# ¿ Jun 5, 2013 08:34 |
|
Agreed posted:The disparity is enormous. Agreed posted:I don't know what you mean by hype. Agreed posted:You describe updating your drivers and it breaking regular, not even enterprise or especially GPU-dependent software like your internet browser. That's pretty, ah, unusual. Agreed posted:you remember when they finally unified their driver architecture? Compare that to nVidia who did it ages ago. Good luck on the surgery BTW.
|
# ¿ Jun 5, 2013 11:38 |
|
randyest posted:Is it normal to have to change stock settings to avoid destroying a video card?
|
# ¿ Jun 6, 2013 12:34 |
|
Agreed posted:Edit: If current leaks are true, performance of the 9000s being ~roughly at Titan level (a few FPS here or there depending on AA, games, etc.) is not exciting to me. VDay posted:What does one have to do with the other?
|
# ¿ Sep 23, 2013 23:51 |
|
Agreed posted:It's specious as all hell to talk about "Titan performance" - everyone who pays attention to this stuff knows that really it's the GTX 780 it's going up against Agreed posted:And yeah, making PC game development "more console-ish" would be pretty neat, if by that you mean addressing the problem of moving workloads around coherently. As to how to do it, let's wait and see, yeah?
|
# ¿ Sep 24, 2013 01:10 |
|
The Lord Bude posted:I also suspect that the 780ti will blow it out of the water, although I'm dissapointed Nvidia isn't going with 6gb of ram. For $549 the R9 290X looks like a solid win for AMD, but the R9 290 might end up being this gen's 7950 for best bang vs. buck. That ~$400 price point needs filling somehow... PC LOAD LETTER fucked around with this message at 14:28 on Oct 24, 2013 |
# ¿ Oct 24, 2013 14:25 |
|
zylche posted:The thermal limit of my CPU is ~63C, gently caress if I'm letting the R9 290X near it. The Lord Bude posted:A non reference overclocked gtx780 will comfortably outperform a titan though, I'm expecting the 780TI will outperform the titan as well. e: Blowers are better for moving air where you have high back pressure. In this case the heatsinks on these modern GPUs tend to have high fin density, with heat pipes running through the middle of them on top of that, and the space for fans tends to be pretty limited around the GPU in most cases too. Short of going to something wacky like a stacked contra-rotating axial fan assembly, a blower is the way to go. Though I guess if you're willing to put up with a 3-slot-wide video card and multiple big axial fans, that can work too. e2: I don't think the problem with the reference R9 290X cards is the fan; IMO it's probably the heatsink itself.\/\/\/\/\/\/\/ PC LOAD LETTER fucked around with this message at 15:53 on Oct 24, 2013 |
# ¿ Oct 24, 2013 15:19 |
|
Tab8715 posted:Welp, that was a fun launch day? No cards are to be found at all anywhere.
|
# ¿ Oct 25, 2013 01:34 |
|
Rahu X posted:The 290 seems like a better card outright, but if you factor in overclocking, the 780 seems like it would pull ahead. No benches at 4K, but the 290 will probably eke out a win at that resolution.
|
# ¿ Nov 5, 2013 16:47 |
|
edit: /\/\/\/\/\ Don't know of a site with those numbers up yet, but generally 2x 760s will be around 10% faster on average than a Titan @ 1600p. If you're willing to deal with SLI's quirks, and your PSU will be OK with it, a 2nd 760 would probably be the best bang-for-the-buck option for you. Rahu X posted:While I'm not too keen on having a 3 slot card, I can't argue with the pressure. I know it needs game support (so many games won't benefit at all), but the idea of getting positional audio that is the next best thing to true binaural audio in game, especially a game like Thief, has me excited. PC LOAD LETTER fucked around with this message at 17:13 on Nov 5, 2013 |
# ¿ Nov 5, 2013 17:02 |
|
Supposedly at stock (according to HardOCP and TechReport) the fan isn't that bad at all; it's when you raise the fan cap to 60%+ that it starts to get loud. Brent Justice hardocp reviewer posted:The 290 at 47% fan speed wasn't loud. The fans on these cards really don't become loud until around 60-65%. At 47% the card was not throttling, so the full performance potential was there. Sidesaddle Cavalry posted:I had wanted to ask this earlier, this was the driver that AMD extended the 290's NDA for a week for, right? I'm wondering if it's out for public release in case I missed it. hardocp posted:For the AMD Radeon R9 290 we are using AMD supplied driver Catalyst 13.11 Beta V8. This is the second driver provided to us by AMD for the R9 290 launch review. This new driver improved performance over the Beta V6 driver by tweaking PowerTune and raising the fan speed profile to a new default of 47% over the previous default of 40%. PC LOAD LETTER fucked around with this message at 17:29 on Nov 5, 2013 |
# ¿ Nov 5, 2013 17:25 |
|
GrizzlyCow posted:If you want to use Mantle, you'll have to have a GCN 1.x card. I don't think AMD has said officially how long they plan on supporting Mantle, but if history repeats itself a la their VLIW GPUs they probably have another 2-3 years of GCN iterations in the works at a minimum. Possibly longer when you consider how the foundries have slowed down releasing new processes, and how those new processes don't bring the same level of advantages as previous years' shrinks. AMD and nV will both be forced to put more work into designing their GPUs now that they can't rely on a die shrink in 6-8 months giving them another 30%+ performance on the same uarch like in the "old days".
|
# ¿ Nov 6, 2013 16:26 |
|
I'd be surprised if they used the reference cooler on a card like that. Anything less than a very good double-slot HSF a la the Accelero would be idiotic, especially after all the complaints on enthusiast forums about the reference cooler. IMO if they're going to do it right they should release a 3-slot HSF for a card like that. Done properly, they could get rid of 300W+ of heat with an HSF like that without much, if any, noise and still keep the clock speed up. Avg. Joe Sixpacks will laugh, but people actually interested in buying a CF-on-a-stick video card won't really care too much. PC LOAD LETTER fucked around with this message at 23:53 on Nov 15, 2013 |
# ¿ Nov 15, 2013 23:51 |
|
Ghostpilot posted:the thermal pads on the vrms leave an oily residue that is best removed with a gentle rubbing with a pencil eraser.
|
# ¿ Nov 20, 2013 00:10 |
|
Agreed posted:"why did they choose to engineer the Volcanic Islands architecture in this fashion even though it is going in the exact opposite direction of not only nVidia, but also literally everyone else, including their own CPU division?"...but that's not going to happen with the relatively much poorer performance per watt compared to Kepler The rumor mill is still saying mid-late 2014 for TSMC's 20nm chips to roll off the line, and that the improvements over their 28nm process won't be all that impressive, at least initially. They've improved their 28nm process over time and I'm sure they'll do the same with their 20nm tech; thing is, I don't see them doing much of it in a timely manner before their next process is supposed to be ready... unless they're going to delay that too. That would make a 28nm Maxwell reasonable for nvidia to do, but it'd probably also be a relatively hot and power-hungry chip vs. Kepler on that process. The HPC Hawaii cards are going to have ungimped DP compute performance, which is something GCN is pretty good at, so the performance/watt probably won't be "poor" at all vs. Kepler for those workloads. Power usage hasn't been the issue with AMD getting the HPC guys to use their hardware anyway; it's software and developer support.
|
# ¿ Nov 21, 2013 02:19 |
|
Agreed posted:I think we might be at risk of talking past each other a bit....I am interested in what makes you think that Maxwell will necessarily be hotter running than Kepler if it's launched on the 28nm process....why would explicitly engineering toward more efficiency and integrating a CPU on the card itself to operate more effectively in terms of overall system resource utilization make it hot at 28nm but not at 20nm?...Aren't we pretty much allowing that a node shrink alone, at this point, doesn't offer the sort of really impressive efficiency boosts that it used to, especially one that isn't introducing anything especially radical? Unless nvidia has managed some significant breakthrough(s) in transistor + uarch design (unlikely, I think the low-hanging fruit is pretty much gone, which is why you're seeing AMD/nvidia push stuff like Mantle, TrueAudio, memory virtualization, ARM cores, etc.), they're going to have to use lots more transistors to get close to the typical performance increases (i.e. 30%+) that people have come to expect from a new GPU. Lots more transistors on the same process at similar clocks = more heat/power usage. Heck, even if they keep a similar number of transistors but bump the clocks quite a bit and go for a "speed demon" design, power usage will shoot up. If it turns out Maxwell is just current Kepler + ARM CPU + memory virtualization, then I'd be wrong about the power shooting up by quite a bit, but you're also not going to see a large performance increase either. I don't think, given the rumors, that TSMC's 20nm will be anything special either, but I'd be surprised if that sort of die shrink didn't knock power usage down to somewhere closer to where Kepler is right now, which most people seem to consider "normal" for a high-end GPU. I'd be assuming, of course, that nvidia aims to leave transistors/clocks the same and just do a "simple" shrink.
They may not, in which case 20nm Maxwell may still end up with higher power usage and be "hot", but with more performance. That might not be a "bad" trade-off, and even if it was, nvidia might do it anyway so long as the GPU/card price is right and the card isn't too noisy. I'm sure they've been watching with interest what AMD has been able to pull off and sell with Hawaii.
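The transistors/clocks/power trade-off above is just the first-order CMOS dynamic-power relation, P ~ C * V^2 * f, where transistor count on a fixed process roughly scales the switched capacitance C. A toy sketch (illustrative numbers only, nothing here is a real chip figure):

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f.
# On the same process, transistor count roughly scales switched
# capacitance, so more transistors at similar clocks means roughly
# proportionally more dynamic power. Ignores leakage, clock gating, etc.

def dynamic_power(rel_transistors: float, rel_voltage: float, rel_clock: float) -> float:
    """Relative dynamic power vs. a baseline chip (baseline = 1.0)."""
    return rel_transistors * rel_voltage ** 2 * rel_clock

# 30% more transistors, same volts/clock: ~30% more dynamic power.
print(dynamic_power(1.3, 1.0, 1.0))

# "Speed demon": same transistor count, +20% clock needing +10% voltage.
print(round(dynamic_power(1.0, 1.1, 1.2), 3))
```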
|
# ¿ Nov 21, 2013 12:02 |
|
lethial posted:If you are trying to build a compact gaming PC, excess power and heat is a big issue.
|
# ¿ Nov 21, 2013 15:50 |
|
Agreed posted:what's your rumor mill source, by the way?...I genuinely doubt their word when they say that they were aiming at a 95ºC temperature target all along That is all guesswork though. While their reference cooler could've been lots better, I think the throttling issue is overblown too. There are lots of people who actually own the card, and some reviewers, who've reported little or no throttling with the default fan cap on the 290 or uber mode on the 290X. Most of the complaints about the heat/power/throttling seem to be coming from people who don't even own the card. lethial posted:My system has a GTX780, and I could actually fit a GTX 780 ti if I change my main HDD into a green drive PC LOAD LETTER fucked around with this message at 04:27 on Nov 22, 2013 |
# ¿ Nov 22, 2013 04:18 |
|
Agreed posted:I don't think anyone buys the "people are used to cards running cool, but really they should run hot! As hot as possible!" bullshit...nVidia is being hilariously prickish about it...kick 'em when they're up, kick 'em when they're down
|
# ¿ Nov 22, 2013 05:03 |
|
Yeah, a better HSF addresses all the complaints quite nicely. What I wonder about now is how AMD is going to deal with Maxwell, assuming nvidia does have a 28nm version out by March or so of 2014. Unless they do an amazing new HSF, an up-clocked 290/X is probably out of the question. A $300-350 R9 290 or $400-450 290X would be a pretty nice option though! Or nvidia could just decide to compete on brand again and sell an 880GTX, or whatever they'll call it, for $rape$, and AMD might decide that their prices/performance are fine where they currently sit, thank-you-very-much. That would be... boring. PC LOAD LETTER fucked around with this message at 05:26 on Nov 22, 2013 |
# ¿ Nov 22, 2013 05:23 |
|
KillHour posted:If you had shown me that screenshot a year ago, I would have called you a liar.
|
# ¿ Nov 22, 2013 05:37 |
|
Agreed posted:I don't think they even officially support the 5000-series cards anymore, do they? Their driver support has actually been pretty good for a while now. For single cards you've got nothing to worry about; it's just very different from how Nvidia does things. They do WHQLs "as necessary" for the office guys, and for the gamerz they do betas frequently, usually a new one every month. Doing the whole CCleaner/Driver Cleaner bit isn't necessary most of the time either, only if you have issues after installing a new driver. Personally I've just been installing over the old ones for quite a while now and haven't had a problem. CF is a different story, but even that has improved quite a bit too. PC LOAD LETTER fucked around with this message at 10:58 on Nov 22, 2013 |
# ¿ Nov 22, 2013 10:50 |
|
Litecoin miners are going absolutely apeshit. Supposedly that guy is bringing in upwards of $20K a month and has been for at least a few months, and will for as long as the boom lasts anyway. With that sort of income he'll have made his money back in 1-2 months easy. The AMD cards are better for mining than the nvidia cards due to better DP compute performance, and the cards based on the older 79xx GPUs have better bang for the buck than the new R9 290/Xs because AMD gimped DP performance with their new GPUs this time around*. Or at least they did until miners pumped the prices up into the $300-400 range for cards that would normally sell for $250 or so. *Which is irritating. Hawaii could've been a real DP monster if they hadn't done that. I don't care too much about mining, but I do like to participate in distributed computing projects like Milkyway. As things are now, an R9 290X is only about as good at DP compute as a 7970GE. edit: I wouldn't be so sure about that. Their inherent advantage at DP compute workloads means that 79xx cards may end up being kept and used on other mining projects for a long time.\/\/\/\/\/\/\/
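The payback claim is easy to sanity-check. A hedged sketch in Python; the rig cost here is a made-up example figure, and only the ~$20K/month income comes from the rumor above:

```python
# Hypothetical rig cost; only the monthly income figure is from the post.
monthly_income = 20_000   # USD/month, per the rumor above
rig_cost = 30_000         # assumed total hardware outlay (made-up example)

# Ignores electricity, coin-price swings, and difficulty increases,
# all of which stretch the real payback period.
payback_months = rig_cost / monthly_income
print(f"payback: {payback_months:.1f} months")
```

At those assumed numbers the payback lands inside the "1-2 months" window the post describes, which is why the boom pulled card prices up so fast.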
|
# ¿ Dec 5, 2013 05:56 |
|
Byolante posted:If they really care about DP compute stuff, why aren't they buying Teslas or something? spasticColon posted:Is this guy actually mining coins or is he buying the cards on the cheap and then reselling them to batshit insane coin miners?
|
# ¿ Dec 5, 2013 06:45 |
|
tjume posted:Hi, I'll be switching from a 6870 to a 280x, Should work fine after that.
|
# ¿ Dec 6, 2013 02:38 |
|
They're very different GPUs though. The 68xx was VLIW5 and the 280X is GCN 1.x.
|
# ¿ Dec 6, 2013 10:33 |
|
Unless you're using software that needs a Quadro or FirePro, I don't think you'd want a laptop with either for gaming. Those are workstation GPUs and their drivers are "tuned" for stuff like CAD, not gaming. AMD mobile GPUs have good performance and are fine as far as driver stability goes, but they tend to be power hogs compared to the Nvidia GPUs. AMD really needs to work the bugs out of Enduro, which have persisted for quite a while now. Until they do, a gaming-oriented Nvidia GPU is the best bet overall. Only get a laptop with an AMD GPU if it's an APU (power saving works fine with those) or if you can get a good deal on it and can live with much less battery time. Generally speaking though, any gaming laptop with a mid-to-high-end dGPU is going to have poor battery time compared to one with only an iGPU.
|
# ¿ Dec 6, 2013 15:37 |
|
Spug posted:"Good deal" doesn't really factor in, as my workplace is paying
|
# ¿ Dec 6, 2013 18:49 |
|
SourKraut posted:the prices of almost all of the 7970s are now $500-600 on Newegg. Newegg really isn't the best place to get deals anymore. They still have some of the best parts selection and descriptions (i.e. pics, specs) though, so I still browse the site often, but I'll buy elsewhere.
|
# ¿ Dec 8, 2013 07:00 |
|
Agreed posted:That is far from universal, and checking any of the HPC performance lists shows that the real contenders for perf:watt are nVidia with massively parallel Tesla systems and lately Intel with their Xeon Phi cards. I still cannot loving believe these miners: 108 goddamn R9 290s. That guy probably single-handedly emptied Newegg's or Amazon's stock of that card. Prices do seem to have leveled off though, so maybe the miner boom is dying down. PC LOAD LETTER fucked around with this message at 06:23 on Dec 14, 2013 |
# ¿ Dec 14, 2013 06:16 |
|
HT had some limited success outside of AMD CPUs though. There were some pretty badass FPGAs that fit right into an Opteron socket years ago. AMD also had an HT-based expansion slot for a while too, though I don't think too many products were made with it in mind. It might've ended up being used in some of the larger systems they teamed up with Cray to build.
|
# ¿ Jan 7, 2014 17:45 |
|
You'd have to ask XFX what the max temp rating for the VRM on their card is supposed to be, but generally, yeah, 100C is pushing the limit for most electronics.
|
# ¿ Jan 14, 2014 01:24 |
|
Agreed posted:AMD owners, how are they doing lately? Do they have a single driver out yet or are they still running multiple branches? If you're a gamer: run the latest beta. Some 290 owners had trouble with the 14.1s, but generally they've been working fine otherwise.
|
# ¿ Mar 7, 2014 07:06 |
|
spasticColon posted:But is crossfire still a wash on AMD cards?
|
# ¿ Apr 12, 2014 09:43 |
|
cisco privilege posted:but they're still (subjectively) very noticeable at 40%-50% and incredibly noisy at 100%.
|
# ¿ Jun 6, 2014 18:00 |