jkyuusai
Jun 26, 2008

homegrown man milk


Driver software to be tweaked to reduce Radeon frame latencies in series of updates

M_S_C
Mar 25, 2004
i like my chicken mechanically separated

Has there been much talk about the Radeon 7870 LE? It's lovely that AMD would release this thing right after holiday season when everyone's already done buying their crap (my 7870 GHz weeps gently in the corner). It's priced at the same point as the vanilla 7870 GHz edition but performs quite a bit better. AMD really should have just called this thing the 7890 or the 7930.

Matt Zerella
Oct 7, 2002


So I haven't built a machine in years and I'm doing a build for gaming and photoshop.

I got a pretty awesome deal on a used 560 Ti (it's a Gigabyte card, which is not the best option according to the building thread, but at that price I'll take the risk).

Since I'm not going to be overclocking it (and quite frankly hate most video card utility addons and their awful interfaces), can I just go right to the Nvidia site and get the proper drivers, or do I have to go to Gigabyte's site and install theirs?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


For desktop GPUs, the manufacturer always just points you to the generic drivers. Go ahead and get them from Nvidia's site.

Matt Zerella
Oct 7, 2002


Factory Factory posted:

For desktop GPUs, the manufacturer always just points you to the generic drivers. Go ahead and get them from Nvidia's site.

Thanks, just double checking.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you


I got a new aftermarket cooler for my 7970 because with my new case I realize just how loud the damned thing is.

I don't think I've seen so many goddamned RAM heatsinks that I know are going to be a bitch to put on because they're so tiny.

future ghost
Dec 5, 2005

det er noget at leve for

Gun Saliva

real_scud posted:

I got a new aftermarket cooler for my 7970 because with my new case I realize just how loud the damned thing is.

I don't think I've seen so many goddamned RAM heatsinks that I know are going to be a bitch to put on because they're so tiny.
You might want to pick up some sekisui thermal tape off of eBay to attach the heatsinks with. I buy a big roll of it occasionally, as the more-common 3M thermal tape you can buy is pure poo poo in comparison and I don't trust it to hold up heatsinks over the long-term.


Make sure you also have heatsinks for the PWM/voltage chips for your card if it doesn't have a separate PWM heatsink already (RAM sinks aren't expressly necessary, but leaving the PWM chips bare could kill the card). For the VRM/PWM section I like to use Enzotech MOS-C1's since you can cut them to whatever height you need with nothing but wire cutters.

Wistful of Dollars
Aug 25, 2009



Glen Goobersmooches posted:

Asus-branded Nvidia GPUs have been top tier this cycle but I've never actually heard anything about their warranty practices.

I can't speak about GPUs specifically, but I can report that when I needed the screen of my laptop replaced (under warranty) it was a painless and satisfactory process.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you


grumperfish posted:

You might want to pick up some sekisui thermal tape off of eBay to attach the heatsinks with. I buy a big roll of it occasionally, as the more-common 3M thermal tape you can buy is pure poo poo in comparison and I don't trust it to hold up heatsinks over the long-term.


Make sure you also have heatsinks for the PWM/voltage chips for your card if it doesn't have a separate PWM heatsink already (RAM sinks aren't expressly necessary, but leaving the PWM chips bare could kill the card). For the VRM/PWM section I like to use Enzotech MOS-C1's since you can cut them to whatever height you need with nothing but wire cutters.
The Accelero 7970 I picked up doesn't actually have 3M tape; instead they give you a two-part glue to attach the RAM/PWM heatsinks, so I think I'm going to try using that stuff since it's here and I don't want to wait even longer to get it attached.

Of course there is no manual in the actual packaging and you have to download it, but I'm actually going to use this as a more visual guide on how to install it.

Thankfully I have a 2nd 7970 that was sitting outside my system so I don't have to be without a computer while installing it.

future ghost
Dec 5, 2005

det er noget at leve for

Gun Saliva

Yeah, the glue is designed to work with it so it'll work. I generally use the tape for warranty purposes when working with aftermarket GPU coolers. Make sure to follow the instructions with the glue and maybe test it on something so you get the hang of it.

Like it says in the linked instructions, use a standard pencil eraser on the chips (carefully) to clean them before applying the glue/heatsinks. It'll help in securing the application, especially on the RAM chips. Be really cautious when cleaning the PWM row as there's a bunch of SMT parts close to them. Make sure to sink the PWM chips in the line next to the PCI-E power plugs, and also the PWM chips (1-2) next to the bracket, as these tie into the RAM. Basically, anything that was covered by the stock cooler needs a heatsink.


After you get it installed, you're probably going to be stunned with the noise difference. Going from the stock blower on my 6970 to an AXP was a very noticeable change.

future ghost fucked around with this message at 16:10 on Jan 5, 2013

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

M_S_C posted:

Has there been much talk about the Radeon 7870 LE? It's lovely that AMD would release this thing right after holiday season when everyone's already done buying their crap (my 7870 Ghz weeps gently in the corner). It's priced at the same point as the vanilla 7870 GHz edition but performs quite a bit better. AMD really should have just called this thing the 7890 or the 7930.
This is what should have been called the Radeon HD 7930, a second shader cluster and 1/3 of the memory channels disabled. This is probably intended to help them clear inventory, they just delayed the Radeon HD 8000-series from late March to Q2 to clear out 7000-series inventory. Looks like a decent card if the pricing is right.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


PC Perspective is taking a different tack on latency-based GPU analysis. Specifically, they're using a capture card and video analysis to get the post-display-chain results of frame latency and tearing artifacts on CrossFire/SLI setups. They use an uncompressed DVI-DL capture card which writes to a PCIe SSD at over 400 MB/s sustained, and the output is EXACTLY the output sent to the monitor, as opposed to Tech Report's FRAPS-based method, which measures when the system starts rendering a new frame.

This leads to interesting realizations about screen tearing and framerates:



That image is one 16.67 ms (60 FPS) displayed frame, consisting of three rendered frames. RFrame 1 is red. RFrame 2 is green. RFrame 3 is blue. That green frame is pretty darn insignificant, isn't it?
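
The arithmetic behind those numbers is straightforward; here's a quick sketch (my own, not PC Perspective's code — the 1080p resolution in the second calculation is an assumption to illustrate why the capture card needs 400+ MB/s sustained):

```python
# Frame-time and capture-bandwidth arithmetic behind the analysis.

def frame_time_ms(fps):
    """Display refresh period in milliseconds."""
    return 1000.0 / fps

def capture_rate_mb_s(width, height, fps, bytes_per_pixel=3):
    """Sustained write rate for uncompressed RGB capture, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

# One 60 Hz refresh is ~16.67 ms; three rendered frames squeezed into
# it means each rendered frame only occupies a slice of the scanout.
print(round(frame_time_ms(60), 2))            # 16.67

# Uncompressed 1080p60 capture already needs ~373 MB/s, which is why
# the capture card has to stream to a PCIe SSD at over 400 MB/s.
print(round(capture_rate_mb_s(1920, 1080, 60)))
```

The point of capturing the raw DVI signal is that tearing shows up directly: each displayed refresh can contain slices of several rendered frames, and the slice heights tell you how long each frame actually stayed on screen.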

Kreez
Oct 18, 2003



DrSunshine posted:

Perhaps this is the best place to ask this. If not, just yell at me and I'll happily edit this out.

My graphics card is an ATI Radeon HD 5770, which I bought from a used computer store.

A while ago, I started getting odd problems while playing certain games (Guild Wars 2) where the monitor went black all of a sudden. Checking the temperature, I found that it would peak at around 105 C at full loads. I shut the computer down and cleaned all the dust out. Now it runs at around 55 C idle, and peaks at around 90.

I ran Furmark to test it at its full load, and found that it would peak around 92 or 93 C.

Is this too hot? Should I be worried? And if so, what suggestions would you have for me?

(For reference, my computer is a Dell Dimension E510)

I bought a 5770 a little over 2 years ago, and it has to have the worst cooling package of all time. (This is the first generation giant cooler, not the second gen "egg" cooler.) How this never seemed to be mentioned in the "parts picking" thread when it was one of the quick pick cards, I'm not sure. The thing was the loudest part of my computer on idle, and sounded like a vacuum cleaner under load. And it doesn't even seem to do anything, it idled at 50 and shot up to 100 under any load.

Anyway, just a few weeks ago I started playing games without headphones and can't stand it any more. I grabbed an Accelero S1 Plus on clearance for $18 this morning, and threw it on (did a bit of dremel work on the stock RAM heatsink plate so I could use it instead of fiddling around with the glue and separate heatsinks included). Running passively, the card is currently plateaued at 96 after stressing it for 17 minutes. This isn't even a high end heatsink. The stock cooler is just that terrible. The stock heatsink looks like it would cost way more to manufacture and assemble as well.

Shutting the computer down now to add a 120mm fan to the Accelero with zip ties, will edit with results. edit: 60C after 20 minutes of Furmark. And this with a low CFM quiet fan that I can't hear over the (relatively quiet) PSU fan, instead of a fan louder than a vacuum.

Kreez fucked around with this message at 22:15 on Jan 5, 2013

Space Racist
Mar 27, 2008

~savior of yoomanity~


Alereon posted:

This is what should have been called the Radeon HD 7930, a second shader cluster and 1/3 of the memory channels disabled. This is probably intended to help them clear inventory, they just delayed the Radeon HD 8000-series from late March to Q2 to clear out 7000-series inventory. Looks like a decent card if the pricing is right.

A lot of PS4/Xbox 720 rumors have suggested them using an AMD HD 7000-series GPU, is there any realistic chance of this being a consumer version of that (presumably custom) part? The 7950 seems too expensive for them to realistically use that in a console costing ~$400, but I imagine they'd also want something beefier than a vanilla 7850/7870.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

There's basically no way the 7870 LE is a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch.

Alereon fucked around with this message at 23:00 on Jan 5, 2013

Wozbo
Jul 5, 2010


Alereon posted:

There's basically no way it's a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch.

There's going to be some difference due to all the DMA stuff.

Space Racist
Mar 27, 2008

~savior of yoomanity~


Alereon posted:

There's basically no way the 7870 LE is a custom part, they're just trying to find a way to sell GPUs that had one too many defects to make it as a 7950. The PS4's GPU is going to be a very customized part because they eventually want to integrate it with the CPU, if it isn't at launch.

Yeah, I didn't think so, but after reading some new console rumor roundups and seeing that review all around the same time, my brain just kind of went wild. I remember the original Xbox basically used off-the-shelf parts, but that bit Microsoft in the rear end financially later in the Xbox's life.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

Space Racist posted:

Yeah, I didn't think so, but after reading some new console rumor roundups and seeing that review all around the same time, my brain just kind of went wild. I remember the original Xbox basically used off-the-shelf parts, but that bit Microsoft in the rear end financially later in the Xbox's life.
That was because Nvidia didn't allow them to renegotiate prices for the videocards, even though they were much cheaper after a couple years had passed. It was either MS's failure to put that in the contract, or Nvidia's refusal to renegotiate when they made a new one. (been a while since I heard about it)
It wasn't inherent to the parts being off-the-shelf, as far as I know.

William Bear
Oct 25, 2012

"That's what they all say!"


I need help comparing the two graphics cards I own. For some reason, the Nvidia GeForce GT 230 doesn't appear on any benchmark or comparison sites.

What is better for gaming?

quote:

NVIDIA GeForce GT 230 graphics processing unit
96 stream processors
500 MHz core clock
1242 MHz shader clock
1.5 GB DDR2 memory
1000 MHz memory clock
192-bit memory interface

quote:

NVIDIA GeForce GT 620
96 CUDA Cores
700 MHz Graphics Clock
1400 MHz Processor Clock
11.2 billion/sec Texture Fill Rate
1.8 Gbps Memory Clock
1024 MB Standard Memory Config
DDR3 Memory Interface
64-bit Memory Interface Width
14.4 GB/sec Memory Bandwidth
OpenGL 4.2
PCI Express 2.0 Bus Support
Certified for Windows 7: Yes
Supported Technologies: DirectX 11, CUDA, PhysX
Multi Monitor: Yes
Maximum Digital Resolution: 2560x1600
Maximum VGA Resolution: 2048x1536
HDCP: Yes
HDMI: Yes
Standard Display Connectors: Dual Link DVI-I, HDMI, VGA
Internal

The above sources of information:
http://www.geeks.com/details.asp?in...T230-PCIE-15-PB
http://www.geforce.com/hardware/des.../specifications

The challenge is that, as noted above, the 230 is not found on any comparison website, and what information I can find on it is in a different format than for other products, making a direct comparison difficult.

My major quandary: I assume, reading the above, that one has more memory on board, but the other has a much faster clock. Which is preferable?

William Bear fucked around with this message at 02:29 on Jan 7, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo


Since you already own both of them can't you just try them and see? If we're placing bets I'd say the 230 would be faster based on theoretical performance. Especially if that 620 is an OEM card, it's almost half of a retail 620. (I'm just guessing you have an OEM 620 considering the 230 was only available in OEM form)

They'll both be pretty terrible, but you could probably sell both of them to scrape together $50 to buy a used 260.

bend it like baked ham
Feb 16, 2009

Fries.


William Bear posted:

I need help comparing the two graphics cards I own. For some reason, the Nvidia Geforce GT 230 doesn't appear on any benchmark or comparison sites.

Is this in an "all-in-one" PC or laptop? Cause I think that model is onboard or OEM video of some kind.

William Bear
Oct 25, 2012

"That's what they all say!"


Local Resident posted:

Is this in an "all-in-one" PC or laptop? Cause I think that model is onboard or OEM video of some kind.

No, it's a regular card. It came with my HP Pavilion Elite e9260f.

Good idea comparing them in the computer itself; I haven't done it yet because I'm wary of confirmation bias from just eyeballing it. I've been looking at benchmarking programs, do you have a recommendation?

craig588
Nov 19, 2005

by Nyc_Tattoo


http://unigine.com/products/heaven/

Max out everything you can, and once the demo is running the hotkey for running a benchmark is F9. You might have to set it to DX10 for both cards because I guess the 620 has some level of DX11 support. (It's not going to be powerful enough to ever see those features in a playable state, so don't worry about not having it on the 230.)

I think the major difference is going to come from the 230's much faster memory. The 620 is killed by having only a 64-bit bus compared to the 230's 192-bit. I don't think the architectural improvements of the 620 will be enough for it to pull ahead.
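
The bandwidth gap can be worked out from the quoted specs. A sketch (treating the quoted memory clocks as effective transfer rates — that's an assumption for the 230's DDR2 figure, though the 620's result matches its spec sheet exactly):

```python
def mem_bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth: bus width in bytes x effective transfer rate."""
    return bus_width_bits / 8 * data_rate_gtps

# GT 620: 64-bit bus at 1.8 Gbps -- matches the 14.4 GB/s on the spec sheet.
print(mem_bandwidth_gb_s(64, 1.8))   # 14.4

# GT 230: 192-bit bus; if the quoted 1000 MHz is the effective DDR2
# data rate (an assumption), that's 24 GB/s -- well ahead of the 620.
print(mem_bandwidth_gb_s(192, 1.0))  # 24.0
```

That the 620's 14.4 GB/s falls out of its own spec sheet (64-bit × 1.8 Gbps) is a decent sanity check on the formula; the wide bus is what keeps the older 230 competitive despite its slower DDR2.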

Instrumedley
Aug 13, 2009


In an effort to figure out why my ASUS GeForce 570's fans have started to make a lot of noise, I tried removing them but instead accidentally separated the cooler from the GPU itself.

Do I need to re-apply the thermal compound? It runs fine and the temperature is the same as before (e.g., ~30 C while idling).

Instrumedley fucked around with this message at 03:46 on Jan 7, 2013

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!



Instrumedley posted:

In an effort to figure out why my ASUS GeForce 570's fans have started to make a lot of noise, I tried removing them but instead accidentally separated the cooler from the GPU itself.

Do I need to re-apply the thermal compound? It runs fine and the temperature is the same as before (e.g., ~30 C while idling).

Put new thermal paste on. If you don't have any, the old stuff will suffice until you get some as long as the temps seem to be fine. Just keep an eye on it, old thermal paste is hard and if you pulled it apart and squished it back together again there's probably some air pockets.

Instrumedley
Aug 13, 2009


Rexxed posted:

Put new thermal paste on. If you don't have any, the old stuff will suffice until you get some as long as the temps seem to be fine. Just keep an eye on it, old thermal paste is hard and if you pulled it apart and squished it back together again there's probably some air pockets.

Is there a particular brand of thermal paste I should use?

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!



Instrumedley posted:

Is there a particular brand of thermal paste I should use?

The brand doesn't make a huge difference. I've used Arctic Silver 2, 5, and Ceramique (non-metallic), and there's plenty of graphs showing only minor differences in how thermal pastes perform; it's just getting a new layer of it in there that's important. I'd shop for whatever's cheapest.

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Space Racist posted:

A lot of PS4/Xbox 720 rumors have suggested them using an AMD HD 7000-series GPU, is there any realistic chance of this being a consumer version of that (presumably custom) part?
Nah. It's got a 185W TDP for just the GPU. That is waaaay too high to go into a console. Way, waaaaaay too high.

For some perspective, the entire original "power hog" X360 used about 172W and the original PS3 used about 189W. (edit:) The original Xenos GPU used around 90W.

Bear in mind too that the Tahiti LE GPUs are being made on a still very high-end modern 28nm process. So even with a revamp and die shrink it'll probably still put out too much heat and use too much power for use in a console.

That is half the reason why the expected/rumored GPUs for the PS4/X720 are mid-range 6xxx-class GPUs. They were very compact in terms of die space, had good performance per watt as well, and should be fairly cheap now to produce on a more modern process. A 6670-6870 class GPU is fairly realistic to expect in a late 2013/early 2014 PS4/X720 console IMO.
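
The mismatch is easy to see as a ratio, using just the wattage figures above (my own back-of-the-envelope sketch, not from any official power analysis):

```python
# Power-budget sanity check using the figures quoted above.
tahiti_le_tdp = 185   # W, Tahiti LE GPU alone
x360_total    = 172   # W, entire original X360
ps3_total     = 189   # W, entire original PS3
xenos_gpu     = 90    # W, rough figure for the original Xenos GPU

# A Tahiti LE by itself would exceed the whole original X360's draw,
# and is roughly double what the Xenos GPU was budgeted.
print(tahiti_le_tdp > x360_total)            # True
print(round(tahiti_le_tdp / xenos_gpu, 1))   # 2.1
```

In other words, the GPU alone would consume nearly the entire power envelope of a launch-era console, which is why a cooler-running mid-range part is the realistic pick.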

PC LOAD LETTER fucked around with this message at 06:56 on Jan 7, 2013

Portfolio
Dec 10, 2004
The Department of Redundancy Department


My 660 Ti has been giving me some weird problems lately. When my machine wakes from sleep, the display won't turn on. The rest of the system seems to power up as expected, but nothing is sent to the display. It's not an issue with the monitor, my laptop doesn't cause the issue.

I figured it was probably a driver issue, so I upgraded to the most recent nVidia drivers on their site (released 5 January 2013). Now when waking from sleep, I get a BSOD.

Anyone else have this happen to them? Google shows me a lot of forum posts of people with similar issues, but no fixes other than 'get the latest drivers', or the especially helpful 'don't let your computer go to sleep'.

William Bear
Oct 25, 2012

"That's what they all say!"


NVM, found answer.

William Bear fucked around with this message at 06:16 on Jan 9, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


I was gonna save all the CES stuff for a new thread, but I can't get to it until the end of the week, and I saw a couple stories that I just can't sit on.

For you see, AMD and Nvidia are a couple of dogfuckers.

Why?

The OEM versions of the desktop Radeon HD 8000 series are almost all rebadges. Everything from the 7970 GHz edition down to the 5450 is getting an "HD 8000" branding.

7970 GHz --> 8970 OEM
7950 w/ Boost --> 8950 OEM
7870 --> 8870
7770 --> 8760
7750-900 --> 8740
6450 --> 8400
5450 --> 8350

The only new parts are the 8670 OEM and 8570 OEM, based on different binnings of the new 384-core GCN chip, Mars (but who cares about AMD codenames?). And those are only "new" in the sense of "only recently released"; they're still first-gen GCN, not the upcoming refresh.

On the mobile side, the 7800M is being rebadged, as well, as the 8870M, 8850M, and 8830M. This isn't quite as useless a change, though, as these parts enable Boost clocking. Though for some Goddamn reason the 8870M can be configured with DDR3 SDRAM instead of GDDR5. The rest of the mobile parts fill out the low end with binnings and harvestings of the 384-core GCN chip, which is something, at least.

Meanwhile, Nvidia isn't skipping this opportunity to poo poo testicles. Meet the GeForce 710M and 730M. Or you may have already known them as the 640M (GK107) and 620M (GF117).

PC LOAD LETTER
May 23, 2005
WTF?!

Slippery Tilde

Are they going to at least drop the prices significantly, or are they planning on milking everyone for as long as possible? I bet it's the latter but gotta make sure right? :/

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


Another tidbit:

Intel demo'd the Haswell GT3e IGP. It's roughly GeForce 650M-level.

Proud Christian Mom
Dec 20, 2006


Factory Factory posted:

Another tidbit:

Intel demo'd the Haswell GT3e IGP. It's roughly GeForce 650M-level.

Jesus Christ

Boten Anna
Feb 22, 2010



I wonder if nVidia/ATI are scared of this, as they kind of should be. At this rate there won't really be a compelling reason to buy a graphics card for all but the most extreme high performance uses in a couple of generations, leaving GPUs to go the way of the sound card.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down



Factory Factory posted:

Another tidbit:

Intel demo'd the Haswell GT3e IGP. It's roughly GeForce 650M-level.

If Intel gets their driver poo poo together, they stand a chance at owning way more than any one company should. Too much integration. Not... sure if they'll be allowed to keep it at that point; the bread and butter of the graphics companies is that performance range, much as enthusiasts would like to feel included. Do I want a laptop with great battery life and extremely good graphics performance compared to current options? Absolutely! Do I want Intel to own the world? Don't feel they've earned it. I don't think it's sound economics to allow monopoly on the grounds of contemporary success, regardless of how impressive it might look at the moment; the long run does not extrapolate cleanly from the short term, and it's bad decision-making to just trust a company to keep besting themselves when over time that self-interest folds over to a stronger motivation to be profitable (a motive that no longer requires extraordinary innovation, just moving things along now and again).

P.S. for FF, I'll get in touch with you soon, man, just... a lot of poo poo going on. Thanks for reaching out, sorry this is my first word back.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


In essence, this is Intel fighting back against Nvidia's Tegra 3 and other ARM-based SoCs, extended to attack Nvidia's core business. Though Intel is very dominant in x86 laptop-through-server processors, in the broader category of "compute silicon" they've got a lot of competition: from ARM and its licensees on CPUs and SoCs (as well as IBM in consoles), from AMD and Nvidia on graphics and HPC (and, to a lesser extent, PowerVR et al.), from AMD on the increasingly popular good-enough multimedia CPUs/APUs/SoCs/whatevers, and from Samsung on NAND. Intel is big and dominant, but it won't stay that way through size alone; it's in the top spot of a competitive oligopoly, and it can be dethroned by someone else doing better.

Meanwhile, Nvidia's Shield demo and its and AMD's cloud gaming initiatives are looking at challenging the console market by providing PC level power in thin clients via infrastructure. That's gonna put them head to head with Microsoft, Sony and Nintendo... who are supplied by ARM and AMD and oy.

Bottom line, the compute market is definitely NOT stagnant, regardless of Intel's size. As long as the blatantly anti-competitive poo poo stays gone, there will be continual jockeying to be king of the hill.

eames
May 9, 2009



Factory Factory posted:

Another tidbit:

Intel demo'd the Haswell GT3e IGP. It's roughly GeForce 650M-level.

And then next up there's Broadwell which will apparently bring another 40% IGP improvement on top of Haswell.

Suddenly articles like this don't look so stupid anymore.

Why is Intel suddenly so successful in the IGP department after trying (and failing) so hard for so many years?

P-Funk
Jan 7, 2001



Factory Factory posted:

The OEM versions of the desktop Radeon HD 8000 series are almost all rebadges. Everything from the 7970 GHz edition down to the 5450 is getting an "HD 8000" branding.
That's some nice loving over the average Joe Retard getting a computer at Best Buy with last year's parts. I'm guessing they're actually going to update their line with new chips but when that happens what will they name the OEM parts, 8970 Real Edition?

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.




eames posted:


Why is Intel suddenly so successful in the IGP department after trying (and failing) so hard for so many years?

It's the same reason why they're seeing success in the mobile (handheld) market: their manufacturing process is catching up to the talent of the design engineers.

Intel's manufacturing process is their greatest strength, and I think that may drive their success in mobile and GPUs more than anything else. Nvidia, Apple, Qualcomm, and AMD are all competing for manufacturing space in third-party fabs like TSMC, as well as being limited by those fabs' capabilities. Intel has no such problem. Their chip designers and fabrication engineers can work side by side to smooth out manufacturing, and Intel may be the excess capacity that the market needs to keep up with increasing demand.
