Rawrbomb
Mar 11, 2011

rawrrrrr

Dominoes posted:

Is that where the cursor will turn into staticky vertical lines? I didn't realize that was a video card glitch.

It happens with multiple monitors, though I haven't had any corruption for a few months; at least nothing long-lasting. You can fix it by wiggling the mouse back and forth between two of the screens. Otherwise a restart will restore it, as will opening Windows Magnifier at 1x.

hobbesmaster
Jan 28, 2008

Dominoes posted:

I'm switching to nvidia because ATI can't get their poo poo together on vsync with multiple monitors.

vv What do you mean?

Are you using analog connectors?

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

kuddles posted:

I was going to post that. Of course both the AMD and Nvidia fanboys are going nuts, but the real point of the article is that current benchmark comparisons that just list average or maximum FPS don't tell the whole story about how well a card works in particular games.

I'm not an Nvidia or AMD fanboy, and both of my HD 7850 cards are doing this in both of my systems (single card in each machine; no CrossFire) in the same games. Even with the newest beta driver and Catalyst application profile I still get stuttering in Far Cry 3 if I enable AA. With Skyrim I get stuttering and graphical glitches. With Arkham City I get stuttering even if I disable the DX11 features and AA. And it's not like these games use OpenGL the way Rage does; they all use DirectX, although I have read reports online that the new drivers break some DX9 games (or games running in DX9 mode) for some reason, Skyrim being one of them. Is it the hardware itself that is causing this issue, or did they just botch the drivers?

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

spasticColon posted:

Is it the hardware itself that is causing this issue, or did they just botch the drivers?

Sadly, that's not even known at this point. I've read through the Guru3D post that was linked earlier, and it does seem to vary wildly between games and drivers. I didn't really have any issues when I first got the 7850s, and it's been getting noticeably worse over time, so it could be drivers or just hardware degrading.

I'll be running some more tests, although it's hard to test anything without having a precise repro case.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

Jan posted:

Sadly, that's not even known at this point. I've read through the Guru3D post that was linked earlier, and it does seem to vary wildly between games and drivers. I didn't really have any issues when I first got the 7850s, and it's been getting noticeably worse over time, so it could be drivers or just hardware degrading.

I'll be running some more tests, although it's hard to test anything without having a precise repro case.

The problem started for me with the 12.10 drivers, and the subsequent beta drivers really haven't improved anything other than Far Cry 3 running smoother as long as I keep AA disabled. If they issue a recall of these cards I'm going to cry and, depending on how much xmas money I get, order a few Nvidia cards for my systems.

Boten Anna
Feb 22, 2010

Rawrbomb posted:

For anecdotal evidence the other way: I had 6 Nvidia cards blow out in 2007-2009 between me (2) and my friends (4). I also had non-stop crashing issues. I moved to ATI/AMD and never looked back

edit: I didn't a word.

Yeah, see, and I'd take this over software issues any day. If there are software issues, it costs me time to gently caress with it to find a workaround or fix. If it's hardware, I can return it, RMA it, or use it as an excuse to upgrade. This is further mitigated by buying from manufacturers with better reputations and warranties.

I can see where others might feel differently, but my experience with Nvidia cards has been great overall. The one problem I ever had was a DOA card that I sent back to Amazon and used the refund toward a better card.

Rawrbomb
Mar 11, 2011

rawrrrrr

I had constant driver crashing issues as well. I've not had major video driver issues for ~2 years now with my AMD cards, aside from cursor corruption every so often.

It's really personal preference / price point now.

I've stuck with AMD to date due to better DisplayPort cards (which didn't happen until this generation on Nvidia). But I also drive 3 monitors, and have been for 2+ years now, since the DisplayPort option came about on the 5000 series.

Ham Sandwiches
Jul 7, 2000

After having great luck with two Nvidia cards in a row (an 8800 GTS, then a GTX 570) I decided to try out a GTX 680. I got an EVGA model from Amazon and plugged it right in. Almost right away I noticed a weird hitching / stuttering that was present both during gaming and video playback. If I watched YouTube videos or regular videos, about every 30 seconds there would be a noticeable stutter, and then it would resume smooth playback.

In games my FPS was much higher than with the 570, but again, same issue. About every 30-60 seconds it would seem like I was getting 4 FPS - but Fraps did not bear this out; it claimed my framerate was steady. The effect was super noticeable and super annoying. Even my roommate commented on it when we were watching some TV shows together. I tried the whole works - clean reinstalls, registry sweepers, driver cleaners, changing settings (disabling vsync) - and none of it helped.

I went back to the 570 and the problem completely went away. I have a 600 watt PSU and an i7-2600K and I don't believe either is the issue - it happened most often when the card was not under load or almost completely idle.

When I started googling '680 stutter' I found that this is a very common issue with 670/680-series cards, and (my speculation here) from what I read it seems likely related to the power management / voltage throttling that Nvidia introduced with this series of cards.

I would like to upgrade my GPU but am kinda lost here. I could try a 670, but that's less of an upgrade than the 680 and still seems to suffer from the same problem - although less often, it seems. I was looking at a 7970, but I have concerns about ATI's drivers and overall stability, with issues like the frame latency being top of mind.

Does anyone have any more info about the stuttering issue with the 670/680 cards, or suggestions on how the 7970 performs based on their experience with it?

Navaash
Aug 15, 2001

FEED ME


I've surprisingly never encountered the cursor corruption issue, and I've been using ATI/AMD cards for about 10 years now (9800 -> X1950XT -> 6950).

I'm not biased against nVidia per se; I remember I couldn't get a Riva TNT to work with an old VIA computer of mine, so I ended up going with a Voodoo4 4500 back in the day (good timing, that :v: - the 9800 followed when I could afford it). I also put a GeForce in my brother's build 5 years ago, and it works fine - though it went into a computer with an nForce chipset, which has its own issues.

That series of articles has really brought out the worst in the red/green partisans. The article's advice pertained to the newest games coming out during the holiday season at the highest resolutions and eye-candy settings - if it turns out to be a driver issue, the 7950 will be the better card hands down once it's fixed. Instead you have people jumping all over AMD like hyenas (with a particularly vicious subset clamoring for the company's death), people jumping all over TR (pretty sure the article was only intended as a "uh, AMD, you need to look into this," but people have been savaging them because they recently blasted AMD's PR department over the Trinity review schedule), people ripping into each other, etc.

The one thing nobody wants (well, almost nobody) is an nVidia monopoly.

td4guy
Jun 13, 2005

I always hated that guy.

McCoy Pauley posted:

Can anyone help me figure out the difference between two different EVGA 660 ti cards -- the 660 ti FTW Signature2 and the regular 660 ti FTW?

I'm looking at them in a comparison chart on Newegg, and the only relevant difference I detect (with my admittedly untrained eye) is that the Signature2 has a slightly faster clock and 2 fans instead of 1. I assume this is slightly better. But also, it's $20 cheaper. Am I missing something, or is the Signature2 both slightly better and cheaper?

I agree. It looks like the Signature 2 is both slightly better and cheaper. However, your case has to have really good airflow to deal with the wacky dual-fan setup. That's the drawback.

kuddles
Jul 16, 2006

Like a fist wrapped in blood...

Navaash posted:

people jumping all over TR (pretty sure the article was only intended as a "uh, AMD, you need to look into this," but people have been savaging them because they recently blasted AMD's PR department over the Trinity review schedule)

That's the kind of poo poo that really bothers me. It reminds me of back in the day when TR was also dragged through the mud for genuinely pointing out issues with some SSD brands having read times degrade drastically over time, or Consumer Reports getting a swath of negativity - even from other tech sites - just for pointing out that the next gen of iPads ran significantly warmer. It's kind of sad that people throw a vocal hissy-fit if anyone dares question a brand they like, and then turn around and bemoan that tech journalism is a bunch of shills.

Dominoes
Sep 20, 2007

hobbesmaster posted:

Are you using analog connectors?

No, all monitors are connected via DisplayPort.

Boten Anna
Feb 22, 2010

As much as I personally go for Nvidia, I certainly don't want to see them have a monopoly. In fact, I hope Intel's work on iGPUs lights fires under asses to keep things innovative and competitive.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

So, on the polygon warping issue: I managed to get a frame capture of the problem with Intel GPA by setting it to capture every 10 seconds and hoping it would eventually trigger at just the right moment. I can look at the frame screenshot in the capture browser and see the issue, like so:


But then when I load the capture to try to see which draw call is freaking out:


I should have known better; of course it's not going to show. It's playing back the frame and getting the expected results, not those of the hardware being lovely. :suicide:

Oh well, looks like my awesome debugging process ends here.

movax
Aug 30, 2008

Rakthar posted:

After having great luck with two Nvidia cards in a row (an 8800 GTS, then a GTX 570) I decided to try out a GTX 680. I got an EVGA model from Amazon and plugged it right in. Almost right away I noticed a weird hitching / stuttering that was present both during gaming and video playback. If I watched YouTube videos or regular videos, about every 30 seconds there would be a noticeable stutter, and then it would resume smooth playback.

In games my FPS was much higher than with the 570, but again, same issue. About every 30-60 seconds it would seem like I was getting 4 FPS - but Fraps did not bear this out; it claimed my framerate was steady. The effect was super noticeable and super annoying. Even my roommate commented on it when we were watching some TV shows together. I tried the whole works - clean reinstalls, registry sweepers, driver cleaners, changing settings (disabling vsync) - and none of it helped.

I went back to the 570 and the problem completely went away. I have a 600 watt PSU and an i7-2600K and I don't believe either is the issue - it happened most often when the card was not under load or almost completely idle.

When I started googling '680 stutter' I found that this is a very common issue with 670/680-series cards, and (my speculation here) from what I read it seems likely related to the power management / voltage throttling that Nvidia introduced with this series of cards.

I would like to upgrade my GPU but am kinda lost here. I could try a 670, but that's less of an upgrade than the 680 and still seems to suffer from the same problem - although less often, it seems. I was looking at a 7970, but I have concerns about ATI's drivers and overall stability, with issues like the frame latency being top of mind.

Does anyone have any more info about the stuttering issue with the 670/680 cards, or suggestions on how the 7970 performs based on their experience with it?

Did you fully clean drivers before upgrading?

Do you have any OCing software (like EVGA Precision) installed?

Mega Comrade
Apr 22, 2004

Listen buddy, we all got problems!

This might be common knowledge to many of you, but I foolishly didn't do enough research beforehand.

Avoid any 7950 with the boost BIOS like the plague.

It's a total nightmare to use. Because the boost state goes to 1.25 V, you hit the card's power threshold instantly, so the card flips between 850 and 925, causing huge fluctuations in GPU load. You can mitigate this by setting the power slider to 16% or above, but that doesn't completely alleviate the problem, as you now have a card that gets incredibly hot and eats power. Changing the voltage doesn't work either, as every piece of software only alters the base voltage; the boost voltage stays at 1.25 V.

In the end I flashed a 7970 BIOS (I couldn't find a 7950 BIOS for my card that would stick), and I've now got a card with lower voltage that outperforms the boost BIOS in every way. Now that I've got the card performing the way I want, it's excellent, and I'd recommend it to anyone; just avoid the boost versions.

I have no idea what AMD were thinking when they put that BIOS out.

Charles Martel
Mar 7, 2007

"The Hero of the Age..."

The hero of all ages

Been following this thread since my last post in it and, good lord, the last few pages basically read like "save a headache and buy Nvidia".

I want to build a computer by the end of next year, and I really wish I could go for the underdog and build a full AMD system, since I don't want a company monopoly either, but the research I've been reading for the past month or so makes a system like that seem less than ideal.

Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Charles Martel posted:

Been following this thread since my last post in it and, good lord, the last few pages basically read like "save a headache and buy Nvidia".

I want to build a computer by the end of next year, and I really wish I could go for the underdog and build a full AMD system, since I don't want a company monopoly either, but the research I've been reading for the past month or so makes a system like that seem less than ideal.

I've had a lot of AMD systems and they've been great, but currently it's not worthwhile to buy one of their new CPUs, as they're outperformed by similarly priced Intel CPUs. The latest generation of video cards has a few issues, as you've seen, but they'll likely be sorted out. I tend to buy video cards based on the best price vs. performance I can get among the mid-level cards I usually buy. Which side that favors - ATI/AMD or Nvidia - has changed over the years, but I think it's a sounder way to make a purchase than buying on brand name. Both companies have had hits and misses in their releases, and at any point one is usually a little bit ahead on price vs. performance. Reliability often comes down to the third-party manufacturer that actually assembles the card.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Well, to be fair, my 7850s are doing the job very nicely, not counting the polygon corruption issues. And on that particular note, I went ahead and tried the newest 12.11 beta11 drivers (I was at beta4), and they pretty much entirely get rid of the problem. I think I noticed a flicker or two in Skyrim, but it could easily have been my imagination -- nothing like the psychedelic fest from my video capture.

The periodic high-latency issue (or whatever you'd call it) mentioned in that TechReport article has been acknowledged by AMD, and hopefully it won't be too long before they address it.

The only other issue I've had is the occasional dud driver release that actually worsens performance in some games, but that happened when I had nVidias as well.

The negative issues just happen to be more apparent in these discussions. :v:

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Yeah, AMD is still a winner for me. My 7950 is doing nicely, handling anything I throw at it. My only complaint is a lack of voltage control, which is more Gigabyte's fault than anything. I'll just flash to a 7970 BIOS and go from there. If it's still locked, at least I'll have more voltage to play with (1.175 V compared to 1.090 V) and avoid the boost BIOS insanity.

GRINDCORE MEGGIDO
Feb 28, 1985


I never did find out why the fans on my MSI 7850 spin up when the card idles (a PITA), but I fitted an Accelero S1 Rev. 2 with a ~600 rpm fan and it's inaudible, with better temps too.

Just in case anyone was wondering whether those old coolers fit 7850s: YMMV, but it fit a Twin Frozr IV 7850. I had to replace one of the included RAM sinks with a low-profile one from the bits box; all fine.

I'm impressed by this old cooler - I used it because it was cheap and more or less the biggest that would fit in this ITX box. All crammed in nicely.

GRINDCORE MEGGIDO fucked around with this message at 00:56 on Dec 16, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

So I was just reading this article on high-framerate (HFR) filmmaking and The Hobbit; there's hubbub that 48 FPS showings of the movie are triggering Uncanny Valley effects in a good number of people by being effectively faster than the brain's Conscious Moments Per Second rate - about 40 at rest, 80 to 100 during intense activity. While the eye can distinguish motion at a faster rate (again varying by person, but on average 66 Hz), apparently that 40 Moments per Second rate is killing the suspension of disbelief, because 48 FPS film no longer looks sufficiently different from reality for a lot of folks.

It makes me wonder: Why doesn't this happen with video games? Or will it, in the future? For many people, the difference between 30 FPS and a 60 FPS in gaming is not only incredibly noticeable, but it's incredibly desirable, even if we have no problem with 30/25/24 FPS video. What distinguishes these situations? Is it the stylized aesthetics or other cues for non-reality, so that the uncanny valley isn't approached for other reasons? Or does the interactivity make a game sufficiently "real" in terms of our brain function that the imagery itself has different standards for uncanniness? Is it simply a remaining enormous gulf between real-time generated imagery and photoreality? Or does the higher framerate reduce latency between action and reaction in a way that reduces unreality more than the increased framerate increases it by itself? Or are all video game players hyperspergs who don't find Japanese robotics incredibly weird?

A lot of TVs with temporal interpolation - like, upscaling the framerate to 60 or 120 FPS from 30p or 60i content - look great to some people and awful to others. Yet I'm not sure those categories map or correlate with video gaming - I know I love a solid 60 FPS in games, but motion interpolated video has a kind of hyperreal queasiness to me, like the difference between a 24p film and a 60i football game taken five steps too far.

Brains man. What gives with brains?

Factory Factory fucked around with this message at 01:22 on Dec 16, 2012

Rawrbomb
Mar 11, 2011

rawrrrrr

I think when we talk about video games there's active input: when we do something, we expect an instant reaction. When framerates are lower, there's a perceived slowness. I think we tolerate lower frame rates in movies/TV/etc. because they're not interactive; everything syncs well with what we're watching.

At the very least, that's my perspective on it. My GF wants to go watch The Hobbit at the high-FPS theater; I'll deal and have fun with it regardless.

Frozenfries
Nov 1, 2012

Charles Martel posted:

Been following this thread since my last post in it and, good lord, the last few pages basically read like "save a headache and buy Nvidia".

I want to build a computer by the end of next year, and I really wish I could go for the underdog and build a full AMD system, since I don't want a company monopoly either, but the research I've been reading for the past month or so makes a system like that seem less than ideal.

I've had only very small issues with AMD cards over the past 3-4 years, and they were mostly resolved very quickly. I'm currently running an HD 6990 and it's perfectly fine, performing exceptionally well in almost every game I own. While AMD does get a lot of flak, their cards are still quite good. The frame latency issue is a bit of a poo poo problem with the 7000 series, though, which will hopefully get resolved soon.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Alright, so I got past the voltage lock on my Gigabyte 7950 by flashing an older BIOS. However, changing the voltage in Afterburner produces different readings in HWiNFO64 and GPU-Z. I have Afterburner set to 1100/1350 at 1.2 V, but HWiNFO64 still reports the card's stock 1.090 V, while GPU-Z reports 1.175 V. It seems that wherever I set the voltage in Afterburner, GPU-Z reports 0.025 V less. I know the voltage change has to be working, because I just played Call of Pripyat for 20 minutes on those settings, whereas before I could only raise the core clock to 1000, and the game would either not launch at all or hang at the main menu with a driver crash upon exiting. How do I know what my real voltage is?

Edit: And now Afterburner refuses to let my GPU go into idle clock mode. Nothing is running, but it's still at 1100/1350/1.2 V.

Endymion FRS MK1 fucked around with this message at 07:39 on Dec 16, 2012

sethsez
Jul 14, 2006

He's soooo dreamy...

Factory Factory posted:

Is it simply a remaining enormous gulf between real-time generated imagery and photoreality?

This is a large part of it, mostly because videogames still don't have realistic motion blur, which is a huge part of why 24 FPS works as well as it does in film but looks like a jerky mess in games.

KillHour
Oct 28, 2007


Factory Factory posted:

So I was just reading this article on high-framerate (HFR) filmmaking and The Hobbit; there's hubbub that 48 FPS showings of the movie are triggering Uncanny Valley effects in a good number of people by being effectively faster than the brain's Conscious Moments Per Second rate - about 40 at rest, 80 to 100 during intense activity. While the eye can distinguish motion at a faster rate (again varying by person, but on average 66 Hz), apparently that 40 Moments per Second rate is killing the suspension of disbelief, because 48 FPS film no longer looks sufficiently different from reality for a lot of folks.

It makes me wonder: Why doesn't this happen with video games? Or will it, in the future? For many people, the difference between 30 FPS and a 60 FPS in gaming is not only incredibly noticeable, but it's incredibly desirable, even if we have no problem with 30/25/24 FPS video. What distinguishes these situations? Is it the stylized aesthetics or other cues for non-reality, so that the uncanny valley isn't approached for other reasons? Or does the interactivity make a game sufficiently "real" in terms of our brain function that the imagery itself has different standards for uncanniness? Is it simply a remaining enormous gulf between real-time generated imagery and photoreality? Or does the higher framerate reduce latency between action and reaction in a way that reduces unreality more than the increased framerate increases it by itself? Or are all video game players hyperspergs who don't find Japanese robotics incredibly weird?

A lot of TVs with temporal interpolation - like, upscaling the framerate to 60 or 120 FPS from 30p or 60i content - look great to some people and awful to others. Yet I'm not sure those categories map or correlate with video gaming - I know I love a solid 60 FPS in games, but motion interpolated video has a kind of hyperreal queasiness to me, like the difference between a 24p film and a 60i football game taken five steps too far.

Brains man. What gives with brains?

Maybe this will finally stop the "But the human eye can't see faster than 24 FPS anyways!" :downs: crap that I hear on a daily basis. Probably not, but one can hope.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

As somebody who has actively tried to keep an even-handed approach to recommendations, I have to admit it's getting harder to say that ATI's driver issues are just bad press taken too far. In the past year or year and a half, nVidia has had one showstopper issue: the 560 Ti and Battlefield 3. I don't even want to list the showstoppers ATI cards have experienced in the same time frame. Apart from that, nVidia's cards have had one or two substantial driver-based improvements in game rendering smoothness (e.g. Skyrim, where resolving the drivers' initial difficulty with the game wrought over 50% greater overall performance from Fermi and Kepler hardware), but for the most part they don't even have to fluff a narrative; their story has been uneventful and good.

Whereas now, if I want to convince someone who is skeptical of ATI's driver quality, there's too much stuff to explain away. I personally think it has to be part of the overall financial situation AMD unluckily finds itself in, which forces an already somewhat reactive driver update process (due to reduced access to in-development titles) to play catch-up even more publicly and to ship with significant launch issues.

Part of what has made nVidia's high-performance lineup so apparently solid is that it's all derived from one chip with extremely similar layouts. Any time you can get that degree of homogeneity in hardware, it's going to be smoother sailing than the alternative. I conjecture that ATI's big problem is the combination of limited resources with a much greater diversity of products - unique hardware configurations in broad usage across price categories and in diverse setups - requiring significantly more attention and work on the driver side of things... and that capability is suffering from the sickness of the company of which they are just one part, so it's one thing after another at the worst times for 'em.

nVidia has had its fair share of driver problems in the past, but as someone who doesn't want to see one of the still profitable (if surprisingly narrowly so) parts of AMD suffer, it really sucks that the prevalent idea - that their drivers are problematic or just outright broken with many new games - has been pretty accurate lately.

Longinus00
Dec 29, 2005
Ur-Quan

sethsez posted:

This is a large part of it, mostly because videogames still don't have realistic motion blur, which is a huge part of why 24 FPS works as well as it does in film but looks like a jerky mess in games.

Ding ding ding, we have a winner. Motion blur is either completely missing or looks nothing like the real thing in every game I've ever seen. Lower frame rates just make games look jerky, since there's nothing smoothing them out. If you want uncanny valley in games, then Heavy Rain might be a contender, along with those Nvidia Dawn demos (not games, but I'm going to count all real-time rendering).

Dominoes
Sep 20, 2007

Additionally, camera panning in games can be rapid, while it's generally slow in film. It's much easier to notice low framerates with a fast camera pan, or fast-moving objects.

bend it like baked ham
Feb 16, 2009

Fries.

Longinus00 posted:

Ding ding ding, we have a winner. Motion blur is either completely missing or looks nothing like the real thing in every game I've ever seen. Lower frame rates just make games look jerky, since there's nothing smoothing them out.

I don't know much about film/graphics/human perception, but it seems like these are all pretty loving basic and fundamental things to be discovering this late. I recall reading in the spring, I think, about the first clips of the Hobbit movie being shown at some film festival at the higher FPS rate. It seemed to cause quite a stir.

Why? Hasn't stuff like "FPS v latency" for games and "48fps v 24fps" for film been known/researched for years?

Dominoes
Sep 20, 2007

I'm running 2x 6950s at 3820x1920 resolution. Performance is marginal in some games, and vsync will not work; I get tons of tearing. I finally convinced myself that waiting for the next gen of cards is folly.

Should I buy two 680s or a 690? Performance and price appear similar. What other things should I look for? Even the AnandTech benchmarks for sound and temperature are similar. AnandTech doesn't have CrossFire 6950s in their 2012 benchmarks, but comparing a 6950 to a 680 shows a doubling in FPS.

Which specific model/brand should I buy?

I use HDMI audio. My current card has 4 DPs, and 2 DVIs that can also be HDMI with an adapter. I use the DPs for the monitors and HDMI for audio. The Nvidia cards have 3x DVI and 1x mini-DP. The only way that would work is 2x DVI monitors, 1x DP monitor, and 1x DVI-to-HDMI for audio. Will this work?

Dominoes fucked around with this message at 18:46 on Dec 16, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Local Resident posted:

Why? Hasn't stuff like "FPS v latency" for games and "48fps v 24fps" for film been known/researched for years?

I don't think any of this is new or unknown; it just hasn't been encountered by most people because the technology wasn't available at a low cost. Motion blur in games is a challenge because any real motion blur adds at least one frame of latency, which generally isn't acceptable. You can fake it well for motion of the camera, like Valve does in Source engine games, but this doesn't work for moving objects.

Spatial
Nov 15, 2007

Factory Factory posted:

It makes me wonder: Why doesn't this happen with video games? Or will it, in the future? For many people, the difference between 30 FPS and a 60 FPS in gaming is not only incredibly noticeable, but it's incredibly desirable, even if we have no problem with 30/25/24 FPS video. What distinguishes these situations?

Short answer: games don't have real motion blur.

There's a lot more movement information per frame in a 24 Hz movie than in a 60 Hz game. The movie has been sampled from light at 24 Hz intervals, and each one of those samples has accumulated light for the duration of the exposure time, during which movement has been happening continuously. Subtle movements faster than 1/24th of a second still show because of this.

A frame from a game is a rendering of an instant in time, without motion. Games don't have motion; they produce the illusion of it by teleporting objects over distances small enough that it's not too distracting. Unlike a movie, they aren't a sampling of a continuum - they're truly discrete - and hiding this through brute force would mean rendering at frequencies in the high hundreds.

Some games approximate motion blur. Most only do it for camera movement, but some do it for everything, and those do look better at lower framerates. Crysis, for example, does motion blur on both objects and relative camera movement at max settings. It's still only an approximation; some real-life visual effects happen precisely because you see a sampling of a continuum, and these can't be simulated merely by blending discrete frames.

If we had super-duper computational power it wouldn't be an issue. You could produce motion characteristics nearly identical to a movie by updating the screen at 24 Hz while rendering the game internally at >1000 Hz and then averaging groups of frames together as appropriate for the exposure time. This is, in effect, how CG effects are able to blend with live-action scenes so convincingly.
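
To make that concrete, here's a minimal sketch of that brute-force averaging - Python with numpy, where render() is a hypothetical function that draws the scene exactly as it stands at time t, with no blur of its own:

code:
    import numpy as np

    def exposed_frame(render, t, exposure=1.0 / 48, samples=32):
        # Approximate a film camera's exposure: accumulate many
        # instantaneous renders spread across the shutter-open window,
        # then average them. 24 fps film with a 180-degree shutter
        # exposes each frame for roughly 1/48 s.
        acc = np.zeros_like(render(t))
        for i in range(samples):
            acc += render(t + exposure * i / samples)
        return acc / samples

    # At 24 output fps with 32 samples per frame, this is effectively
    # rendering at 768 fps - hence "super-duper computational power".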

There's a bunch of other issues too. But that's the big one as far as smoothness of motion goes, and I've gone on long enough. :v:

Maxwell Adams
Oct 21, 2000

T E E F S

I've seen Quake 3 videos where they got the engine to crank out a 300 fps video file, which was then downsampled to a 60 fps video with motion blur. It looks better than normal motion blur, but it still doesn't look natural.

Lord Dekks
Jan 24, 2005

My wife's machine has a 7850, and while it runs games absolutely fine, for some reason Chrome's built-in Flash always causes a TDR; Firefox is fine with hardware acceleration turned off, though. Everything else works perfectly: no corruption of any kind in games, and nice performance.

We're past the point where we could RMA it now, as we originally thought it was just a Chrome/Flash issue, but later versions (and drivers) didn't fix the problem, and no one else with a similar card seems to have this issue.

Makes me wish we'd gone with an Nvidia, but it could just be that we were hit with the one faulty card out of thousands. That said, it runs everything perfectly other than the bizarre Flash issue. Next time I upgrade I'll probably go with Nvidia, while someone who had a bad 680 will probably swear they're a Radeon person for life from now on.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

What the hell is a TDR?

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

Local Resident posted:

I don't know much about film/graphics/human perception, but it seems like these are all pretty loving basic and fundamental things to be discovering this late. I recall reading in the spring, I think, about the first clips of the Hobbit movie being shown at some film festival at the higher FPS rate. It seemed to cause quite a stir.

Why? Hasn't stuff like "FPS v latency" for games and "48fps v 24fps" for film been known/researched for years?

Most of the people complaining are idiots, and all I can determine is that they hate change. In the past, any fast panning in 24 fps films has been rather noticeable to me. I watched The Hobbit in 3D at 48 fps with Dolby Atmos at the weekend. It was noticeable as a higher frame rate than I'm used to in film, but I got used to it after about 5 minutes. Perhaps the preview clips were actually too short for people to adjust, because most of the moaning was about the frame rate.

By the end of the film, my feeling was that I'd rather see more films at 48 fps or higher.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Jan posted:

What the hell is a TDR?

Timeout Detection and Recovery, in Vista and later - aka "your GPU or the GPU driver has done something bad, so we restarted everything," and a message pops up.
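
If you want to see what the watchdog on a given machine is set to, the TDR settings live in the registry; the key and value names below are from Microsoft's TDR documentation, and this is just a read-only sketch in Python:

code:
    import winreg

    # TdrDelay is the number of seconds Windows lets a GPU job run
    # before resetting the driver; if the value is absent, the
    # default of 2 seconds applies.
    KEY = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
        try:
            delay, _ = winreg.QueryValueEx(key, "TdrDelay")
            print("TdrDelay = %d s" % delay)
        except FileNotFoundError:
            print("TdrDelay not set; default is 2 s")

Raising TdrDelay is a common workaround for TDRs caused by long-running GPU work, but it only papers over driver or hardware problems like the ones above rather than fixing them.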

Longinus00
Dec 29, 2005
Ur-Quan

Lord Dekks posted:

My wife's machine has a 7850, and while it runs games absolutely fine, for some reason Chrome's built-in Flash always causes a TDR; Firefox is fine with hardware acceleration turned off, though. Everything else works perfectly: no corruption of any kind in games, and nice performance.

We're past the point where we could RMA it now, as we originally thought it was just a Chrome/Flash issue, but later versions (and drivers) didn't fix the problem, and no one else with a similar card seems to have this issue.

Makes me wish we'd gone with an Nvidia, but it could just be that we were hit with the one faulty card out of thousands. That said, it runs everything perfectly other than the bizarre Flash issue. Next time I upgrade I'll probably go with Nvidia, while someone who had a bad 680 will probably swear they're a Radeon person for life from now on.

I knew someone who got TDRs non-stop with his Nvidia card; it was driving him crazy until he finally figured out it was because he was streaming audio over HDMI. Once he turned that off and moved to using the sound card, the problem disappeared.
