|
Driver software to be tweaked to reduce Radeon frame latencies in series of updates
|
# ? Jan 3, 2013 04:54 |
|
|
Has there been much talk about the Radeon 7870 LE? It's lovely that AMD would release this thing right after the holiday season when everyone's already done buying their crap (my 7870 GHz weeps gently in the corner). It's priced at the same point as the vanilla 7870 GHz edition but performs quite a bit better. AMD really should have just called this thing the 7890 or the 7930.
|
# ? Jan 4, 2013 01:11 |
|
So I haven't built a machine in years and I'm doing a build for gaming and Photoshop. I got a pretty awesome deal on a used 560 Ti (it's a Gigabyte card, which is not the best option according to the building thread, but at that price I'll take a risk). Since I'm not going to be overclocking it (and quite frankly hate most video card utility addons and their awful interfaces), can I just go right to the Nvidia site and get the proper drivers, or do I have to go to Gigabyte's site and install theirs?
|
# ? Jan 4, 2013 17:22 |
|
For desktop GPUs, the manufacturer always just points you to the generic drivers. Go ahead and get them from Nvidia's site.
|
# ? Jan 4, 2013 17:28 |
|
Factory Factory posted:For desktop GPUs, the manufacturer always just points you to the generic drivers. Go ahead and get them from Nvidia's site. Thanks, just double checking.
|
# ? Jan 4, 2013 17:56 |
|
I got a new aftermarket cooler for my 7970 because with my new case I realized just how loud the damned thing is. I don't think I've seen so many goddamned RAM heatsinks that I know are going to be a bitch to put on because they're so tiny.
|
# ? Jan 4, 2013 17:58 |
|
real_scud posted:I got a new aftermarket cooler for my 7970 because with my new case I realize just how loud the damned thing is. Make sure you also have heatsinks for the PWM/voltage chips for your card if it doesn't have a separate PWM heatsink already (RAM sinks aren't expressly necessary, but leaving the PWM chips bare could kill the card). For the VRM/PWM section I like to use Enzotech MOS-C1's since you can cut them to whatever height you need with nothing but wire cutters.
|
# ? Jan 4, 2013 19:39 |
|
Glen Goobersmooches posted:Asus-branded Nvidia GPUs have been top tier this cycle but I've never actually heard anything about their warranty practices. I can't speak about GPUs specifically, but I can report that when I needed the screen of my laptop replaced (under warranty) it was a painless and satisfactory process.
|
# ? Jan 5, 2013 14:17 |
|
grumperfish posted:You might want to pick up some sekisui thermal tape off of eBay to attach the heatsinks with. I buy a big roll of it occasionally, as the more-common 3M thermal tape you can buy is pure poo poo in comparison and I don't trust it to hold up heatsinks over the long-term. Of course there is no manual in the actual packaging and you have to download it, but I'm actually going to use this as a more visual guide on how to install it. Thankfully I have a 2nd 7970 that was sitting outside my system so I don't have to be without a computer while installing it.
|
# ? Jan 5, 2013 14:55 |
|
Yeah, the glue is designed to work with it, so it'll work. I generally use the tape for warranty purposes when working with aftermarket GPU coolers. Make sure to follow the instructions with the glue and maybe test it on something first so you get the hang of it. Like it says in the linked instructions, use a standard pencil eraser on the chips (carefully) to clean them before applying the glue/heatsinks. It'll help in securing the application, especially on the RAM chips. Be really cautious when cleaning the PWM row as there's a bunch of SMT parts close to them. Make sure to sink the PWM chips in the line next to the PCI-E power plugs, and also the PWM chips (1-2) next to the bracket, as these tie into the RAM. Basically, anything that was covered by the stock cooler needs a heatsink. After you get it installed, you're probably going to be stunned by the noise difference. Going from the stock blower on my 6970 to an AXP was a very noticeable change. future ghost fucked around with this message at 17:10 on Jan 5, 2013 |
# ? Jan 5, 2013 17:03 |
|
M_S_C posted:Has there been much talk about the Radeon 7870 LE? It's lovely that AMD would release this thing right after holiday season when everyone's already done buying their crap (my 7870 Ghz weeps gently in the corner). It's priced at the same point as the vanilla 7870 GHz edition but performs quite a bit better. AMD really should have just called this thing the 7890 or the 7930.
|
# ? Jan 5, 2013 18:13 |
|
PC Perspective is taking a different tack on latency-based GPU analysis. Specifically, they're using a capture card and video analysis to get the post-display-chain results of frame latency and tearing artifacts on CrossFire/SLI setups. They use an uncompressed DVI-DL capture card which writes to a PCIe SSD at over 400 MB/s sustained, and the output is EXACTLY the output sent to the monitor, as opposed to Tech Report's FRAPS-based method, which measures when the system starts rendering a new frame. This leads to interesting realizations about screen tearing and framerates: That image is one 16.67 ms (60 FPS) displayed frame, consisting of three rendered frames. RFrame 1 is red. RFrame 2 is green. RFrame 3 is blue. That green frame is pretty darn insignificant, isn't it?
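A quick sanity check on the numbers in that post. An uncompressed 1080p stream at 24-bit color and 60 Hz works out to roughly 373 MB/s, which lines up with the "over 400 MB/s sustained" the capture card needs; this is a sketch, and the capture resolution and bit depth are my assumptions, not figures stated in the post:

```python
# Back-of-the-envelope check on the capture method described above.
# Assumed (not from the post): 1920x1080 capture, 24-bit color, 60 Hz.

def capture_bandwidth_mb_s(width, height, bytes_per_pixel, fps):
    """Sustained write rate (MB/s) needed for uncompressed video capture."""
    return width * height * bytes_per_pixel * fps / 1e6

def frame_slice_percent(slice_ms, refresh_ms=1000 / 60):
    """Share of one 60 Hz displayed frame occupied by a rendered frame's slice."""
    return 100 * slice_ms / refresh_ms

rate = capture_bandwidth_mb_s(1920, 1080, 3, 60)
print(f"Uncompressed 1080p60 capture: {rate:.0f} MB/s")  # ~373 MB/s

# A rendered frame visible for only ~1 ms of a 16.67 ms refresh
# is the kind of "insignificant" sliver the analysis can spot:
print(f"1 ms slice = {frame_slice_percent(1.0):.0f}% of the displayed frame")
```

This is also why FRAPS-style measurement misses the problem: it counts that sliver as a full delivered frame.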
|
# ? Jan 5, 2013 19:03 |
|
DrSunshine posted:Perhaps this is the best place to ask this. If not, just yell at me and I'll happily edit this out. I bought a 5770 a little over 2 years ago, and it has to have the worst cooling package of all time. (This is the first-generation giant cooler, not the second-gen "egg" cooler.) How this never seemed to be mentioned in the "parts picking" thread when it was one of the quick-pick cards, I'm not sure. The thing was the loudest part of my computer at idle, and sounded like a vacuum cleaner under load. And it doesn't even seem to do anything: it idled at 50 and shot up to 100 under any load. Anyway, just a few weeks ago I started playing games without headphones and can't stand it any more. I grabbed an Accelero S1 Plus on clearance for $18 this morning and threw it on (did a bit of Dremel work on the stock RAM heatsink plate so I could use it instead of fiddling around with the glue and separate heatsinks included). Running passively, the card is currently plateaued at 96 after stressing it for 17 minutes. This isn't even a high-end heatsink. The stock cooler is just that terrible. The stock heatsink looks like it would cost way more to manufacture and assemble as well. Shutting the computer down now to add a 120mm fan to the Accelero with zip ties, will edit with results. edit: 60C after 20 minutes of Furmark. And this with a low-CFM quiet fan that I can't hear over the (relatively quiet) PSU fan, instead of a fan louder than a vacuum. Kreez fucked around with this message at 23:15 on Jan 5, 2013 |
# ? Jan 5, 2013 22:16 |
|
Alereon posted:This is what should have been called the Radeon HD 7930, a second shader cluster and 1/3 of the memory channels disabled. This is probably intended to help them clear inventory, they just delayed the Radeon HD 8000-series from late March to Q2 to clear out 7000-series inventory. Looks like a decent card if the pricing is right. A lot of PS4/Xbox 720 rumors have suggested them using an AMD HD 7000-series GPU, is there any realistic chance of this being a consumer version of that (presumably custom) part? The 7950 seems too expensive for them to realistically use that in a console costing ~$400, but I imagine they'd also want something beefier than a vanilla 7850/7870.
|
# ? Jan 5, 2013 22:31 |
|
There's basically no way the 7870 LE is a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch.
Alereon fucked around with this message at 00:00 on Jan 6, 2013 |
# ? Jan 5, 2013 23:20 |
|
Alereon posted:There's basically no way it's a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch. There's going to be some difference due to all the DMA stuff.
|
# ? Jan 5, 2013 23:32 |
|
Alereon posted:There's basically no way the 7870 LE is a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch. Yeah, I didn't think so, but after reading some new console rumor roundups and seeing that review all around the same time, my brain just kind of went wild. I remember the original Xbox basically used off-the-shelf parts, but that bit Microsoft in the rear end financially later in the Xbox's life.
|
# ? Jan 6, 2013 00:17 |
|
Space Racist posted:Yeah, I didn't think so, but after reading some new console rumor roundups and seeing that review all around the same time, my brain just kind of went wild. I remember the original Xbox basically used off-the-shelf parts, but that bit Microsoft in the rear end financially later in the Xbox's life. It wasn't inherent to the parts being off-the-shelf, as far as I know.
|
# ? Jan 6, 2013 01:33 |
|
I need help comparing the two graphics cards I own. For some reason, the Nvidia GeForce GT 230 doesn't appear on any benchmark or comparison sites. What is better for gaming?

quote:NVIDIA GeForce GT 230 graphics processing unit
quote:NVIDIA GeForce GT 620

The above sources of information:
http://www.geeks.com/details.asp?invtid=GT230-PCIE-15-PB
http://www.geforce.com/hardware/desktop-gpus/geforce-gt-620/specifications

The challenge is that, as noted above, the 230 is not found on any comparison website, and what information I can find on it is in a different format than other products usually are, making a clear comparison difficult. My major quandary: I assume, reading the above, that one has more memory on board, but the other has a much faster clock. Which is preferable? William Bear fucked around with this message at 03:29 on Jan 7, 2013 |
# ? Jan 7, 2013 03:26 |
|
Since you already own both of them, can't you just try them and see? If we're placing bets, I'd say the 230 would be faster based on theoretical performance, especially if that 620 is an OEM card; it's almost half of a retail 620. (I'm just guessing you have an OEM 620, considering the 230 was only available in OEM form.) They'll both be pretty terrible, but you could probably sell both of them to scrape together $50 to buy a used 260.
|
# ? Jan 7, 2013 03:43 |
|
William Bear posted:I need help comparing the two graphics cards I own. For some reason, the Nvidia Geforce GT 230 doesn't appear on any benchmark or comparison sites. Is this in an "all-in-one" PC or laptop? Cause I think that model is onboard or OEM video of some kind.
|
# ? Jan 7, 2013 03:59 |
|
Local Resident posted:Is this in an "all-in-one" PC or laptop? Cause I think that model is onboard or OEM video of some kind. No, it's a regular card. It came with my HP Pavilion Elite e9260f. Good idea comparing them in the computer itself, I haven't done it yet because I'm wary of confirmation bias by just eyeballing it. I've been looking at benchmarking programs, do you have a recommendation?
|
# ? Jan 7, 2013 04:05 |
|
http://unigine.com/products/heaven/ Max out everything you can, and once the demo is running the hotkey for running a benchmark is F9. You might have to set it to DX10 for both cards because I guess the 620 has some level of DX11 support. (It's not going to be powerful enough to ever see those features in a playable state, so don't worry about not having it on the 230.) I think the major difference is going to come from the 230's much faster memory. The 620 is killed by having only a 64-bit bus compared to the 230's 192-bit. I don't think the architectural improvements of the 620 will be enough for it to pull ahead.
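The bus-width argument can be put in rough numbers. Theoretical memory bandwidth is just bus width times effective transfer rate; the clock figures below are illustrative assumptions, not verified specs for either card, but they show why a 192-bit bus can bury a 64-bit one even with slower memory chips:

```python
# Rough theoretical memory bandwidth comparison (sketch, assumed clocks).

def mem_bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Bus width (bits) x effective transfer rate (MT/s) -> GB/s."""
    return bus_width_bits / 8 * effective_mt_s / 1000

# Hypothetical figures: a 192-bit bus at 1600 MT/s vs a 64-bit bus at 1800 MT/s.
wide = mem_bandwidth_gb_s(192, 1600)
narrow = mem_bandwidth_gb_s(64, 1800)
print(f"192-bit card: {wide:.1f} GB/s")    # 38.4 GB/s
print(f" 64-bit card: {narrow:.1f} GB/s")  # 14.4 GB/s
```

Even spotting the narrow card a faster memory clock, the wide bus ends up with well over twice the bandwidth.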
|
# ? Jan 7, 2013 04:16 |
|
In an effort to figure out why my ASUS GeForce 570's fans have started to make a lot of noise, I tried removing them but instead accidentally separated the cooler from the GPU itself. Do I need to re-apply the thermal compound? It runs fine and the temperature is the same as before (e.g., ~30 C while idling). Instrumedley fucked around with this message at 04:46 on Jan 7, 2013 |
# ? Jan 7, 2013 04:43 |
|
Instrumedley posted:In an effort to figure out why my ASUS GeForce 570's fans have started to make a lot of noise, I tried removing them but instead accidentally separated the cooler from the GPU itself. Put new thermal paste on. If you don't have any, the old stuff will suffice until you get some, as long as the temps seem fine. Just keep an eye on it; old thermal paste is hard, and if you pulled it apart and squished it back together again there are probably some air pockets.
|
# ? Jan 7, 2013 04:57 |
|
Rexxed posted:Put new thermal paste on. If you don't have any, the old stuff will suffice until you get some as long as the temps seem to be fine. Just keep an eye on it, old thermal paste is hard and if you pulled it apart and squished it back together again there's probably some air pockets. Is there a particular brand of thermal paste I should use?
|
# ? Jan 7, 2013 05:02 |
|
Instrumedley posted:Is there a particular brand of thermal paste I should use? The brand doesn't make a huge difference. I've used Arctic Silver 2, 5, and Ceramique (non-metallic), but there are plenty of graphs showing minor differences in how thermal pastes perform; it's just getting a new layer of it in there that's important. I'd shop for whatever's cheapest.
|
# ? Jan 7, 2013 05:31 |
|
Space Racist posted:A lot of PS4/Xbox 720 rumors have suggested them using an AMD HD 7000-series GPU, is there any realistic chance of this being a consumer version of that (presumably custom) part? For some perspective, the entire original "power hog" X360 used about 172W and the original PS3 used about 189W; the X360's original Xenos GPU alone used around 90W. Bear in mind too that the Tahiti LE GPUs are being made on a still very high-end modern 28nm process. So even with a revamp and die shrink it'll probably still put out too much heat and use too much power for use in a console. That is half the reason why the expected/rumored GPUs for the PS4/X720 are mid-range 6xxx-class GPUs. They were very compact in terms of die space, had good performance per watt as well, and should be fairly cheap now to produce on a more modern process. 6670-6870 GPUs are fairly realistic to expect in a late 2013/early 2014 PS4/X720 console IMO. PC LOAD LETTER fucked around with this message at 07:56 on Jan 7, 2013 |
# ? Jan 7, 2013 07:26 |
|
My 660 Ti has been giving me some weird problems lately. When my machine wakes from sleep, the display won't turn on. The rest of the system seems to power up as expected, but nothing is sent to the display. It's not an issue with the monitor; my laptop doesn't cause the problem. I figured it was probably a driver issue, so I upgraded to the most recent Nvidia drivers on their site (released 5 January 2013). Now when waking from sleep, I get a BSOD. Anyone else have this happen to them? Google shows me a lot of forum posts from people with similar issues, but no fixes other than 'get the latest drivers', or the especially helpful 'don't let your computer go to sleep'.
|
# ? Jan 8, 2013 08:30 |
|
NVM, found answer.
William Bear fucked around with this message at 07:16 on Jan 9, 2013 |
# ? Jan 9, 2013 07:08 |
|
I was gonna save all the CES stuff for a new thread, but I can't get to it until the end of the week, and I saw a couple stories that I just can't sit on. For you see, AMD and Nvidia are a couple of dogfuckers. Why? The OEM versions of the desktop Radeon HD 8000 series are almost all rebadges. Everything from the 7970 GHz edition down to the 5450 is getting an "HD 8000" branding.

7970 GHz --> 8970 OEM
7950 w/ Boost --> 8950 OEM
7870 --> 8870
7770 --> 8760
7750-900 --> 8740
6450 --> 8400
5450 --> 8350

The only new parts are the 8670 OEM and 8570 OEM, based on different binnings of the new 384-core GCN chip, Mars (but who cares about AMD codenames?). And those are only "new" in the sense of "only recently released"; they're still first-gen GCN, not the upcoming refresh. On the mobile side, the 7800M is being rebadged as well, as the 8870M, 8850M, and 8830M. This isn't quite as useless a change, though, as these parts enable Boost clocking. Though for some Goddamn reason the 8870M can be configured with DDR3 SDRAM instead of GDDR5. The rest of the mobile parts fill out the low end with binnings and harvestings of the 384-core GCN chip, which is something, at least. Meanwhile, Nvidia isn't skipping this opportunity to poo poo testicles. Meet the GeForce 710M and 730M. Or you may have already known them as the 640M (GK107) and 620M (GF117).
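The desktop rebadge mapping from that post, written out as a simple lookup table; the function name and fallback string are mine, purely for illustration:

```python
# AMD's desktop HD 8000 OEM rebadges (retail name -> OEM badge), per the post above.
OEM_REBADGES = {
    "7970 GHz": "8970 OEM",
    "7950 w/ Boost": "8950 OEM",
    "7870": "8870",
    "7750-900": "8740",
    "7770": "8760",
    "6450": "8400",
    "5450": "8350",
}

def oem_name(retail_name):
    """Return the HD 8000-series OEM badge for a retail card, if it was rebadged."""
    return OEM_REBADGES.get(retail_name, "no OEM rebadge listed")

print(oem_name("7970 GHz"))  # 8970 OEM
print(oem_name("7850"))      # no OEM rebadge listed
```

Spelled out like this, the pattern is hard to miss: every desktop "8000-series" OEM part except the two Mars-based SKUs maps straight back to existing silicon.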
|
# ? Jan 9, 2013 13:02 |
|
Are they going to at least drop the prices significantly or are they planning on milking everyone for as long as possible? I bet it's the latter, but gotta make sure, right? :/
|
# ? Jan 9, 2013 14:18 |
|
Another tidbit: Intel demo'd the Haswell GT3e IGP. It's roughly GeForce 650M-level.
|
# ? Jan 10, 2013 02:09 |
|
Factory Factory posted:Another tidbit: Jesus Christ
|
# ? Jan 10, 2013 05:15 |
|
I wonder if nVidia/ATI are scared of this, as they kind of should be. At this rate there won't really be a compelling reason to buy a graphics card for all but the most extreme high performance uses in a couple of generations, leaving GPUs to go the way of the sound card.
|
# ? Jan 10, 2013 05:21 |
|
Factory Factory posted:Another tidbit: If Intel gets their driver poo poo together, they stand a chance at owning way more than any one company should. Too much integration. Not... sure if they'll be allowed to keep it at that point, the bread and butter of the graphics companies are that performance range, much as enthusiasts would like to feel included. Do I want a laptop with great battery life and extremely good graphics performance compared to current options? Absolutely! Do I want Intel to own the world? Don't feel they've earned it. I don't think it's sound economics to allow monopoly on the grounds of contemporary success, regardless of how impressive it might look at the moment; the long run does not extrapolate cleanly from the short term, and it's bad decision-making to just trust a company to keep besting themselves when over time that self-interest folds over to a stronger motivation to be profitable (a motive that no longer requires extraordinary innovation, just moving things along now and again). P.S. for FF, I'll get in touch with you soon, man, just... a lot of poo poo going on. Thanks for reaching out, sorry this is my first word back.
|
# ? Jan 10, 2013 05:41 |
|
In essence, this is Intel fighting back against Nvidia's Tegra 3 and other ARM-based SoCs, extended to attack Nvidia's core business. Though Intel is very dominant in x86 laptop-through-server processors, in the broader category of "compute silicon" they've got a lot of competition: from ARM and its licensees on CPUs and SoCs (as well as IBM in consoles), from AMD and Nvidia on graphics and HPC (and, to a lesser extent, PowerVR et al.), from AMD on the increasingly popular good-enough multimedia CPUs/APUs/SoCs/whatevers, and from Samsung on NAND. Intel is big and dominant, but it won't stay that way through size alone; it's in the top spot of a competitive oligopoly, and they can be dethroned by someone else doing better. Meanwhile, Nvidia's Shield demo and its and AMD's cloud gaming initiatives are looking at challenging the console market by providing PC-level power in thin clients via infrastructure. That's gonna put them head to head with Microsoft, Sony, and Nintendo... who are supplied by ARM and AMD and oy. Bottom line, the compute market is definitely NOT stagnant, regardless of Intel's size. As long as the blatantly anti-competitive poo poo stays gone, there will be continual jockeying to be king of the hill.
|
# ? Jan 10, 2013 10:46 |
|
Factory Factory posted:Another tidbit: And then next up there’s Broadwell which will apparently bring another 40% IGP improvement on top of Haswell. Suddenly articles like this don’t look so stupid anymore. Why is Intel suddenly so successful in the IGP department after trying (and failing) so hard for so many years?
|
# ? Jan 10, 2013 11:05 |
|
Factory Factory posted:The OEM versions of the desktop Radeon HD 8000 series is almost all rebadges. Everything from the 7970 GHz edition down to the 5450 is getting an "HD 8000" branding.
|
# ? Jan 10, 2013 11:34 |
|
|
eames posted:
It's the same reason they're seeing success in the mobile (handheld) market: their manufacturing process is catching up to the talent of the design engineers. Intel's manufacturing process is their greatest strength, and I think that may drive their success in mobile and GPUs more than anything else. Nvidia, Apple, Qualcomm, and AMD are all competing for manufacturing space in third-party fabs like TSMC, as well as being constrained by what those fabs are capable of. Intel has no such problem. Their chip designers and fabrication engineers can work side by side to smooth out manufacturing, and Intel may be the excess capacity that the market needs to keep up with increasing demand.
|
# ? Jan 10, 2013 15:04 |