spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

What I find confusing and frustrating is that other people with HD 7850 cards aren't having issues with the newest beta driver and Catalyst application profile, but I still am, on both of my cards in both systems. Could it be because they're both Sapphire cards?


Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

I would guess (a genuine guess; I could be totally wrong) that what determines which systems are affected is the motherboard, not the card manufacturer.

Boten Anna
Feb 22, 2010



One thing I found interesting about the HFR Hobbit is that it seemed to make it more obvious when they sped up the raw footage. The beginning of the movie went back and forth between sped-up and actual-speed footage fairly often, and I suspect that some people conflated the awkwardness of the sped-up footage with HFR.

hobbesmaster
Jan 28, 2008



Boten Anna posted:

One thing I found interesting about the HFR Hobbit is that it seemed to make it more obvious when they sped up the raw footage. The beginning of the movie went back and forth between sped-up and actual-speed footage fairly often, and I suspect that some people conflated the awkwardness of the sped-up footage with HFR.

If that's the case, then that's the real problem.

As another anecdote, a lot of headaches in 3D movies are caused by improper viewports and the impossible geometry that results. More recent 3D films don't have these issues as much, but early 3D films had them everywhere.

Dominoes
Sep 20, 2007



Longinus00 posted:

I knew someone who would get TDRs non-stop with his NVIDIA card, and it was driving him crazy until he finally figured out it was because he was streaming audio over HDMI. Once he turned that off and moved to using the sound card, the problem disappeared.
Wait, what? I'm about to upgrade from ATI to nvidia, and use HDMI for audio. Can anyone confirm/deny this problem?

Longinus00
Dec 29, 2005
Ur-Quan

Dominoes posted:

Wait, what? I'm about to upgrade from ATI to nvidia, and use HDMI for audio. Can anyone confirm/deny this problem?

This kind of stuff is all very driver/hardware dependent. Your best bet is to buy from somewhere with a nice return policy. For what it's worth, this problem was one to two years ago, when the 5xx-series NVIDIA cards were the hot stuff. Also, "all the time" in this respect means every several hours, and I have no idea what brand of card/monitor was involved.

Boten Anna
Feb 22, 2010



hobbesmaster posted:

If that's the case, then that's the real problem.

As another anecdote, a lot of headaches in 3D movies are caused by improper viewports and the impossible geometry that results. More recent 3D films don't have these issues as much, but early 3D films had them everywhere.

I saw it in HFR 3D and it was quite nice, though I agree it probably won't solve other 3D problems.

The sped-up footage made sense in the battle sequences and such because, well, the movie is already 3 goddamned hours long, and it can't really be that easy to maneuver in all the ridiculous armor and makeup they had on. I think they kind of missed by doing it at the beginning when Frodo was just, like, reading books and stuff; it was making me wonder if something was wrong with the projector.

To keep this GPU related I wasted a bunch of time at work today trying to get Aero to work again after it mysteriously stopped working on my--wait for it!--ATI video card, and I never did get it to work. I told y'all I hate those things.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

Well, I caved and ordered a 660 Ti for my main rig when it was on sale a few days ago, and I got it today. Guess what: no more stuttering, hitching, or graphical glitches. The games that weren't messing up run better as well. PhysX in Borderlands 2 and Arkham City is really loving cool too. And yes, I know I wasted my money on an incremental upgrade, but I don't care, because it was xmas money.

McCoy Pauley
Mar 2, 2006
Gonna eat so many goddamn crumpets.

spasticColon posted:

Well, I caved and ordered a 660 Ti for my main rig when it was on sale a few days ago, and I got it today. Guess what: no more stuttering, hitching, or graphical glitches. The games that weren't messing up run better as well. PhysX in Borderlands 2 and Arkham City is really loving cool too. And yes, I know I wasted my money on an incremental upgrade, but I don't care, because it was xmas money.

Which 660ti did you get? I'm looking into those myself for a PC I'm building, and between the various models offered by the manufacturers I'm looking at (EVGA, ASUS, and Zotac), my head is spinning with the various options.

Ham Sandwiches
Jul 7, 2000



movax posted:

Did you fully clean drivers before upgrading?

Do you have any OCing software (like EVGA Precision) installed?

I fully cleaned the drivers using driver sweeper before installing. I also chose the 'clean install' option on the Nvidia drivers, and tried both the 306 WHQL and the 310 beta drivers.

I subsequently tried installing the EVGA Precision software to minimize throttling, and also set max performance in the NVIDIA driver itself. I didn't overclock it at all, though; I just messed with the non-clock settings and stuff like the fan curve.

There are lots of threads that talk about this issue in the context of adaptive or regular V-sync and recommend turning it off; I found them by searching for 'geforce 680 stutter'. Most posters in those threads say that it's a game issue limited to V-sync. I had my V-sync fully disabled (driver override to off), and the issue presented both when watching videos and when playing games.

And here's the most useful thread I found on the topic:
http://www.overclock.net/t/1256774/...ation-thread/30

I bet it's some combo of motherboard + video card + other weirdness that makes it happen, but it really sucks when it bites you.

I may send a hail mary on a 7970 and see if that works better, but I was really hoping to get a 680 and be done with it.
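For what it's worth, the V-sync stutter those threads describe has a simple mechanical explanation: with plain (non-adaptive) V-sync, a finished frame waits for the next screen refresh, so frame intervals get rounded up to whole refresh periods. A toy illustration (the numbers are just examples, not measurements from the 680):

```python
import math

REFRESH_MS = 1000.0 / 60  # one refresh period at 60 Hz, ~16.7 ms

def displayed_interval(render_ms: float) -> float:
    # With plain V-sync, a frame that misses a refresh waits for the
    # next one, so its interval rounds up to whole refresh periods.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# A frame that takes just over one refresh (17 ms) is held for two,
# so the effective rate drops from ~59 fps straight down to 30 fps.
print(round(displayed_interval(16.0), 1))  # 16.7
print(round(displayed_interval(17.0), 1))  # 33.3
```

That hard jump between 60 and 30 fps is what reads as stutter, and it's why adaptive V-sync (which turns syncing off below the refresh rate) was pitched as a fix.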

fart simpson
Jul 2, 2005



Lipstick Apathy

Does anyone know if the Intel HD Graphics 4000 supports multichannel LPCM output over HDMI? If I want this feature, will I need to buy a different graphics card?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

fart simpson posted:

Does anyone know if the Intel HD Graphics 4000 supports multichannel LPCM output over HDMI? If I want this feature, will I need to buy a different graphics card?
Anandtech says this has been supported across all Intel graphics products since 2006, which is cool.

Edit: You are talking about streaming, e.g. movies, right? I don't know if you can play a game in 7.1, for example.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


AMD announced a few parts for the Radeon HD 8000M series today. AnandTech TechReport

The announcement covers four parts, all using the GCN "classic" uarch with one new GPU configuration. While this is the current HD 7000 uarch, in the mobile and low-end space the new GPU config will replace the current legacy VLIW4 parts making up the low end of the HD 7000 lineup, adding things like GCN's video encode, a 28nm die shrink, improved Enduro (like Optimus), Boost clocking, and compute. The (still non-specific) parts are:

Radeon HD 8500M (384 GCN shaders @ "up to" 650 MHz nominal, 2 GHz DDR3 or 4.5 GHz GDDR5)
Radeon HD 8600M (384 GCN shaders @ "up to" 775 MHz nominal, 2 GHz DDR3 or 4.5 GHz GDDR5)
Radeon HD 8700M (384 GCN shaders @ 620-850 MHz, 2 GHz DDR3 or 4.5 GHz GDDR5)
Radeon HD 8800M (640 GCN shaders @ 650-700 MHz, 4.5 GHz GDDR5)

AMD says that an 8690M will be 20-50% faster than a 7590M while being priced about the same, with a similar difference between the 7670M and 8770M. It breaks the model number pattern a bit, but the price:performance increase is nice.

TechReport points out that this is what HD 7000M should have been in the first place, and I can't disagree. And while it's nice to see more GCN parts come out, of course, the split between GCN and the GCN revision that the rumor mill expects to be in the desktop/high-end mobile HD 8000 series means that AMD will likely be continuing the GPU industry's annoying trend of slapping labels on like model years rather than useful identifiers.
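To get a rough sense of what those shader counts and clocks mean, peak single-precision throughput for a GCN part works out to shaders × 2 FLOPs (one fused multiply-add per clock) × clock. A back-of-envelope sketch (my own arithmetic, not figures from the announcement):

```python
def gcn_gflops(shaders: int, clock_mhz: float) -> float:
    # Peak single-precision GFLOPS for a GCN part: each shader
    # retires one fused multiply-add (2 FLOPs) per clock.
    return shaders * 2 * clock_mhz / 1000

print(round(gcn_gflops(384, 775), 1))  # 8600M at its "up to" clock: 595.2
print(round(gcn_gflops(640, 700), 1))  # 8800M at its top clock: 896.0
```

The 8800M's extra shaders matter far more than clock speed here, which fits the usual pattern of the x8xx tier being the real performance jump.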

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

McCoy Pauley posted:

Which 660ti did you get? I'm looking into those myself for a PC I'm building, and between the various models offered by the manufacturers I'm looking at (EVGA, ASUS, and Zotac), my head is spinning with the various options.

I got the ASUS one that was on sale for $275 with free shipping on Newegg, but they make versions that run at different clock speeds. The custom cooler on it is HUGE, so if you get one, make sure you have room for it in your case; it fits just fine in my Corsair Carbide 400R.

One thing I noticed about my card: even though it's supposed to be a stock-clocked card, the turbo boost according to GPU-Z is running at 1084MHz when it should only be running at 980MHz. GPU-Z even says the boost clock should be 980MHz, but when I put a load on the video card it shoots up to 1084MHz for a few minutes, then settles back to 1071MHz. I checked my order, and Newegg sent me the right card, so I wonder if ASUS put one of their cards with a more aggressive turbo clock in the wrong box. The GPU gets kind of warm too, up to 76-78C under load; should I be worried? When it does get that warm, the turbo goes down further, to 1051MHz. The voltage reads 1.175V under load as well, which is the normal boost voltage for a 660 Ti, so I'm kind of confused. They can throttle up that much on stock voltage?

Lord Dekks
Jan 24, 2005



Does anyone else miss the days when CPU and video card families got new names with each major revision?

I find I have to look up performance charts now to figure out upgrades. AMD seems especially bad for this: an 8550 or whatever will be a lower-powered 7750 but get outperformed by a 7950, etc.

I know, I know, I'm old, but I found TNT > TNT2 > Radeon much easier to follow.
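A rough way to see why those comparisons feel backwards: in AMD's four-digit scheme of this era, the thousands digit is the generation and the hundreds digit is the performance tier, and the tier digit dominates actual performance. A hypothetical decoder, purely for illustration:

```python
def decode_radeon(model: int) -> dict:
    # Rough decoder for AMD's 4-digit Radeon model numbers of this era:
    # thousands digit = generation, hundreds digit = performance tier.
    # The tier digit matters far more for speed than the generation digit.
    return {"generation": model // 1000, "tier": (model // 100) % 10}

# An 8550 is one generation newer but four tiers lower than a 7950,
# which is why the bigger number can still be the much slower card.
print(decode_radeon(8550))
print(decode_radeon(7950))
```

So the sorting key you actually want is (tier, generation), not the raw model number.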

Lowclock
Oct 26, 2005


Lord Dekks posted:

Does anyone else miss the days when CPU and video card families got new names with each major revision?
Yeah I swore I was going to hold off on upgrading video cards until they were called something besides GeForce or Radeon. 13 years later...

Squibbles
Aug 24, 2000

Mwaha ha HA ha!

Lord Dekks posted:

Does anyone else miss the days when CPU and video card families got new names with each major revision?

I find I have to look up performance charts now to figure out upgrades. AMD seems especially bad for this: an 8550 or whatever will be a lower-powered 7750 but get outperformed by a 7950, etc.

I know, I know, I'm old, but I found TNT > TNT2 > Radeon much easier to follow.

It was even worse when NVIDIA went from their four-digit to their three-digit numbering scheme, and now AMD is up to where NVIDIA was a few years ago (8800, 9800, etc.). Just for extra confusion if you've been out of the video card market for a while.

parasyte
Aug 13, 2003

Nobody wants to die except the suicides. They're no fun.


Squibbles posted:

It was even worse when NVIDIA went from their four-digit to their three-digit numbering scheme, and now AMD is up to where NVIDIA was a few years ago (8800, 9800, etc.). Just for extra confusion if you've been out of the video card market for a while.

Though if you were out even longer than that, ATI originally had 7000-9000 series cards (R100 and R200 chips back around the turn of the century) and now are back to that.

movax
Aug 30, 2008



parasyte posted:

Though if you were out even longer than that, ATI originally had 7000-9000 series cards (R100 and R200 chips back around the turn of the century) and now are back to that.

Radeon 8500 All-In-Wonder! 7500 and 8500s were getting their poo poo kicked in by GeForces until R300 (Radeon 9700) launched with Doom 3, and you'd get insane performance improvements over the GeForce 4 with AA and AF maxed.

And it was codenamed Khan

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.




I don't even try to keep up with GPUs until the 3 or so month run-up of my intended purchases because it's just so drat confusing.

I also only seem to buy a new graphics card every 3-6 years so it makes it even more apparent how the naming schemes make no sense.

Riva 128zx
Geforce 256
Geforce 2 MX400 (as part of an nForce chipset)
Geforce 6600GT
Geforce GTX 460

I can't even pretend to follow the Radeon line recently.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


bull3964 posted:

I can't even pretend to follow the Radeon line recently.

Easily more logical than the NVIDIA line, which is saying something, because neither naming system is great.

At least the Radeon line tends to have fewer rebrands, whereas NVIDIA often throws in a lot of old chips with new names. AMD also culled the extra letters and crap after the model number, for the most part.

Scalding Coffee
Jun 26, 2006

You're already dead


McCoy Pauley posted:

Which 660ti did you get? I'm looking into those myself for a PC I'm building, and between the various models offered by the manufacturers I'm looking at (EVGA, ASUS, and Zotac), my head is spinning with the various options.
The EVGA is all kinds of fun compared to the 560 Ti. Just about any game runs in the triple digits at full settings at 1920x1200. The fan looks ridiculous.

Space Racist
Mar 27, 2008

~savior of yoomanity~


HalloKitty posted:

Easily more logical than the NVIDIA line, which is saying something, because neither naming system is great.

I'd agree with that. Both companies are guilty of excessive rebadging at the low end of their lines, but AMD has at least had a consistent numbering scheme for the past five years. AMD has offered a range of cards from the 1xxx series (and the xxx series before that) on up to the current 7xxx series, while NVIDIA only offered the 1xx and 3xx series as lovely OEM parts; the 2xx and 4xx lines were full-fledged offerings.

Anyway, as a longtime team red member, it is a little funny that we've come full circle on GPU naming schemes. I was half tempted to skip the 7xxx series and hold out for the 8xxx series since the GPU in my last gaming rig was a Radeon 8500LE.

Endymion FRS MK1
Oct 28, 2011



What do we do once we get past HD 9000? Will they go to a five-digit number, or start over again?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.


Endymion FRS MK1 posted:

What do we do once we get past HD 9000? Will they go to a five-digit number, or start over again?

Well, we went from the GeForce 9800 to the GeForce 280, so...

Radeon HD 200? Featuring the Radeon HD 297? It's not like they ever make the last digit anything but "0" anyway.

Radeon 7000 through 9800, X300 through X1950, HD 2000 to HD 8000... Maybe they'll pull an Apple, call every card Radeon, and you gotta check the model year and spec sheet to figure out anything specific.

Grim Up North
Dec 12, 2011



Factory Factory posted:

Radeon 7000 through 9800, X300 through X1950, HD 2000 to HD 8000...

Radeon 4K 2870.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


Grim Up North posted:

Radeon 4K 2870.

I guess at the moment the cards could also be interpreted as the 27xxx series, but instead of 2 they put HD. Radeon 27970. Ah yes, the numbers, the numbers!

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

My 660 Ti gets up to 76-78C under load, and the boost clock is still running at 1051MHz when it should only be running at 980MHz. And these are the card's factory settings; I didn't tweak them in any way. Should I be worried?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


78C isn't a temperature to worry about on a graphics card. They are just power-hungry, hot beasts. Enjoy the performance.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.


spasticColon posted:

My 660 Ti gets up to 76-78C under load, and the boost clock is still running at 1051MHz when it should only be running at 980MHz. And these are the card's factory settings; I didn't tweak them in any way. Should I be worried?
The boost clock reading in GPU-Z isn't dynamic (or even accurate) and ultimately doesn't mean anything; stop sweating it. The 660 Ti is 915MHz at stock, and many perfunctory "OC" models are adjusted up to 980MHz because it's an easy, safe target.
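The reason the card runs past the number on the box is that Kepler's GPU Boost steps the clock up in small fixed bins above the rated boost figure for as long as power and thermal headroom remain. A toy model of that behavior (the bin size, temperature thresholds, and bin counts here are illustrative stand-ins, not NVIDIA's actual tables):

```python
def boost_clock(rated_boost_mhz: int, temp_c: float,
                bin_mhz: int = 13, max_bins: int = 8) -> int:
    # Toy Kepler-style GPU Boost: more thermal headroom -> more bins
    # of extra clock on top of the rated boost. Thresholds made up.
    if temp_c < 70:
        bins = max_bins
    elif temp_c < 80:
        bins = max_bins // 2
    else:
        bins = 0
    return rated_boost_mhz + bins * bin_mhz

print(boost_clock(980, 65))  # 1084 - cool card, all bins applied
print(boost_clock(980, 76))  # 1032 - warm card, fewer bins
print(boost_clock(980, 85))  # 980  - no headroom, no extra bins
```

So a "980MHz boost" card sitting at 1084MHz when cool, then shedding bins as it warms past the mid-70s, is exactly the expected behavior rather than a mislabeled part.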

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

TheRationalRedditor posted:

The boost clock reading in GPU-Z isn't dynamic (or even accurate) and ultimately doesn't mean anything; stop sweating it. The 660 Ti is 915MHz at stock, and many perfunctory "OC" models are adjusted up to 980MHz because it's an easy, safe target.

That's what's making me scratch my head: the first tab in GPU-Z shows everything running at stock speeds, including the boost clock, but when I run a graphics-intensive game like Metro 2033, the second tab that displays the clock speeds shows it shooting up to 1051MHz or even 1084MHz, and the GPU gets up to 76C according to GPU-Z. And I got the ASUS 660 Ti with the HUGE direct-heatpipe heatsink. The fans on it do speed up, but goddamn, this is probably one of the biggest factory-installed custom coolers I have ever seen on a video card. The fans rev to about 60 percent according to GPU-Z, but I don't know if that's accurate, because I'm not hearing them rev up while playing a game or rev down after I exit.

coffeetable
Feb 5, 2006

TELL ME AGAIN HOW GREAT BRITAIN WOULD BE IF IT WAS RULED BY THE MERCILESS JACKBOOT OF PRINCE CHARLES

YES I DO TALK TO PLANTS ACTUALLY


spasticColon posted:

My 660Ti gets up to 76-78C under load and the boost clock is still running at 1051MHz when it should only running at 980MHz. And this is the card's factory settings I didn't tweak them in any way. Should I be worried?

Nope. The max temp given in the 660Ti's specs is 97C.

Dominoes
Sep 20, 2007



Any reason to buy two 680s over a 690? I'm leaning toward the 680s because the 4GB models have more effective RAM, which I've heard is useful for multi-monitor resolutions, or will be in the near future.

craig588
Nov 19, 2005

by Nyc_Tattoo


Buy a single card now and add more later? With only a single 670 at 2560x1440, I can max out everything and still get a solid 60FPS.

craig588 fucked around with this message at 14:37 on Dec 20, 2012

Space Gopher
Jul 31, 2006
BLITHERING IDIOT

Dominoes posted:

Any reason to buy two 680s over a 690? I'm leaning toward the 680s because the 4GB models have more effective RAM, which I've heard is useful for multi-monitor resolutions, or will be in the near future.

Why are you buying a thousand dollars worth of consumer GPUs in the first place?

Dominoes
Sep 20, 2007



Space Gopher posted:

Why are you buying a thousand dollars worth of consumer GPUs in the first place?
High-res gaming's awesome; the way I have it set up, the in-game FOV's similar to my real FOV of the monitors, making games more immersive. My current setup gets mediocre performance in newer games, so I'm upgrading to one that will give me twice the framerate.

craig588 posted:

Buy a single card now and add more later? With only a single 670 at 2560x1440, I can max out everything and still get a solid 60FPS.
That's one reason I'd like the 690.

Dominoes fucked around with this message at 21:15 on Dec 20, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo


Are you running 7680x1600 or something? I'm sure a dual-680 setup would be more than enough for all but the most ridiculous monitor setups. By the time you'll need to add a second 690, there will be better single-card options available.

Dominoes
Sep 20, 2007



craig588 posted:

Are you running 7680x1600 or something? I'm sure a dual-680 setup would be more than enough for all but the most ridiculous monitor setups. By the time you'll need to add a second 690, there will be better single-card options available.
3820x1900 (bezel compensated). I get 20-40 fps in most newer games on my current setup, and the dips below 30 are jarring. The 680s or the 690 should double my fps, getting me a steady 60 with some acceptable dips. Also, the tearing should be gone; V-sync refuses to work on my current setup.

Your point about double 690s being an unlikely path is solid.

Dominoes fucked around with this message at 00:23 on Dec 21, 2012
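Rendering load scales roughly with pixel count, which is why that span demands so much card. A quick back-of-envelope, using the resolutions from the posts above:

```python
def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

mp_1080p = megapixels(1920, 1080)     # a single common monitor
mp_surround = megapixels(3820, 1900)  # the bezel-compensated span above

# Roughly 3.5x the pixels of 1080p - hence the appetite for dual GPUs.
print(round(mp_surround / mp_1080p, 2))  # 3.5
```

By the same measure, a single 2560x1440 panel is only about 1.8x 1080p, which is why one 670 holds 60FPS there while the surround setup doesn't.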

craig588
Nov 19, 2005

by Nyc_Tattoo


From my experience with SLI, I still got weird dips but much higher average frame rates. Just sometimes the cards would get confused and kick out a few seconds of single-digit frame rates for seemingly no reason. This was with 8800 GTXes, though, so it's probably a more mature technology now.


Guni
Mar 11, 2010


What sort of overclock could I do that would be safe/easy on my Sapphire 7870 GHz OC edition? It's at 1050MHz core and 5000MHz memory. I've got (well, will have at Christmas) a Seasonic M12 520W modular PSU, so there will be no worries there, right?

Edit: I forgot there was an overclocking thread... this question should probably go in there, though I'd still appreciate any answers in this thread.
