Subjunctive
Sep 12, 2006

✨sparkle and shine✨

So why does G-Sync (and I think FreeSync/Adaptive Sync) require exclusive full-screen? Can the Windows compositor not scan out at variable rates?


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Malcolm XML posted:

I would not go there

?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Malcolm XML posted:

The way that Windows interfaces with graphics and DX is complicated, and the answer is: maybe, but it would break applications, so I am not surprised that dynamic refresh can only be done in single-application fullscreen mode.

Yeah, I work with some ex-Windows-graphics people, but we weren't sure, and I couldn't find an explanation anywhere, nor any indication of whether it's something they could fix in a driver update (like they did when they added windowed-app support to Shadowplay, for example). I guess you're saying it could violate applications' expectations of how often DWM scans out, but DWM already composites, so apps update whenever they want; I don't really follow.

I'll keep looking on the web. Does AMD's freesync stuff have the same limitation?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

If multiple windows are visible, the compositor is already handling this case: apps don't all paint at 60Hz, and often none do on a given desktop. You can have an app that's completely unresponsive and therefore not updating at all while other apps still render at their own rates. When you have a game running in windowed mode, does it get locked down to the 24Hz of a video that's playing?

The compositor takes all the visible windows' swap chains as they get painted and composites them together, then scans out the result at the monitor's refresh rate. Why can't it scan out whenever one of the windows finishes updating and the compositing is complete? The desktop is itself a full-screen exclusive app, really, and it gets regions from other programs the way Chrome does from its rendering processes.

(I think that's how it works; corrections welcome.)

All I can think of is that DWM itself has assumptions about a fixed frame rate, as an implementation artifact.
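
To make the distinction concrete, here's a toy sketch in Python of what I mean. All names are mine and this is emphatically not how DWM is actually implemented; it just contrasts a compositor that scans out on a fixed timer with one that scans out whenever a window presents:

```python
import threading
import time

REFRESH_HZ = 60.0

def composite(swap_chains):
    # Hypothetical stand-in for the composition pass: grab the most
    # recently presented frame from each visible window's swap chain.
    return [chain[-1] for chain in swap_chains]

def fixed_rate_compositor(swap_chains, scan_out):
    # DWM-style (as I understand it): composite whatever is current and
    # scan out on a fixed vblank cadence, no matter when apps presented.
    while True:
        scan_out(composite(swap_chains))
        time.sleep(1.0 / REFRESH_HZ)

def variable_rate_compositor(frame_ready, swap_chains, scan_out):
    # The *sync-friendly version: wait until some window presents a new
    # frame, composite, and scan out immediately; the panel would
    # refresh now, clamped to its min/max refresh window.
    while True:
        frame_ready.wait()    # a threading.Event set by any Present()
        frame_ready.clear()
        scan_out(composite(swap_chains))
```

If the fixed cadence really is baked into DWM's present/scan-out path, that would explain why variable refresh only works when a single full-screen app owns the display and the compositor is out of the loop entirely.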

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

re: *sync and full-screen necessity, I just remembered that I have driver people from AMD visiting on Tuesday, so I'll see what they say about it!

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

AVeryLargeRadish posted:

From what I understand, it's not about the fact that the card normally only uses 3.5-3.8GB of its memory; it's that when it does use the memory it normally avoids, it causes massive stuttering and frame rate drops.

Are people able to trigger this in games? I hadn't seen anyone link it to content behavior yet.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Is it exceeding that much in use, or exceeding that much in transfer size? I may have misunderstood the NVIDIA statement.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨


:eyepop:

I have two 970s in my new build, and they sit at 60C and 43C on the Windows desktop. The hotter one (the primary) has its fan at about 15%. I've OC'd them to +150 on the core with an +87mV voltage increase, but I don't think that should matter when they're effectively idle. Is there anything I should look into before just digging into airflow in the case?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Desuwa posted:

Which 970s do you have? Those aren't dangerous temperatures, and I wouldn't be worried about them, but they are higher than I'd expect for the MSI/ASUS custom coolers. One running hotter than the other is expected though; the bottom card is going to be blocking the airflow of the top card and there's not much you can do about it.

If you have room you could try moving the bottom 970 to a lower slot, provided it's going to offer the same number of PCIe lanes.

I have the MSI Gaming ones. I removed some unused drive cages to improve flow from the front-panel intake fans, but it didn't really help: the cooler card is even cooler, but the hot one didn't change.

The motherboard only has two x16 slots, but I'd need a longer SLI bridge anyway.

I can add a side fan to my case and see if that helps.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Yeah, I'll probably just learn to be comfortable with it. Under FurMark/Heaven load it doesn't crack 80C.

Now if only I'd been smarter and bought a semi-fanless PSU, I'd be totally happy with my setup...

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Rastor posted:

FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so.

Where did they say that? Drop a link?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

Can't tell if this is sarcastic or not

Very.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"?

Pretty much.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Mr.PayDay posted:

In other words, the 970 now has far worse tech specs than an AMD 290x

Could you elaborate on this? Which specs are far worse?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

veedubfreak posted:

I look forward to the day that AMD goes under and Nvidia starts selling GPUs for no less than 500 bucks. Lets all root for the failure of AMD, monopolies always work out alright for the consumer.

Intel seems to be pushing ahead pretty hard, and I wouldn't say that there is robust competition in the non-mobile CPU market.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Intel's problem is that a delay means that their current market-leading part stays in the market longer, and that the new part will be the market leader for less time before they overtake it themselves with another winner. They are tripping over their own best-in-class products.

This is not the situation AMD is in.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Kazinsal posted:

AMD's not in that situation because they don't *have* any best-in-class products in the CPU market.

Yes, that is precisely my point.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Annath posted:

What ever came out of the whole 970 VRAM snafu? Anything at all?

:words::qq::ssj::munch::ohdear:

:shrug:

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I wouldn't be surprised if NVIDIA's OEM contracts for laptops required them to disable overclocking to keep everything within thermal limits.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I don't recall a popup when OC'ing my 970s, but I might be forgetting. It's not just the GPU's own thermals; overclocking disrupts the thermal environment of everything else in the laptop.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

There is a version of the 970 that fixes the memory issue, and it's called the 980.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Yay picture time! One of our boxes of test GPUs at the office. I'd lay them out on a table but I'm lazy.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I drive the heck out of a 2560x1440 with a 970.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Panty Saluter posted:

What detail levels? I find if I want maxed details and 60 fps I have to keep it at 1920 x 1080 on my 970. Haven't really fooled with overclocking though.

Usually very high; it depends on the game. I've OC'd a reasonable amount, nothing unusual.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

veedubfreak posted:

Nvidia has already said that the 980 is the full chip. The Ti would have to be a different chip. Looks like another made-up videocardz picture.

That table shows the 980Ti being the GM200, though, right?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

The future of VR rendering is still uncertain. Even if I could tell you everything I know, I couldn't give you a confident prediction of how SLI/etc. will benefit VR applications in specific cases or in general, at the point we ship CV1. I don't recommend making assumptions about what configurations will work best for the consumer Rift until we release more information; I don't have a timeline to share on that, unfortunately.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨


Is releasing specifications to the press considered marketing? This will be interesting. I don't recall NVIDIA marketing the number of ROPs on their site when I was comparing cards, and I'm pretty sure it wasn't listed anywhere in MSI's marketing or packaging materials.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

HalloKitty posted:

Yeah, but you don't let off a cake manufacturer because they put tons of crap in that wasn't on the ingredients list, just because some websites said the cake was tasty even before they knew the recipe.

Your analogy is pretty bad, if not also your reasoning. Most food reviews have no idea of the recipe and only address the "performance". You can't see inside food, or drivers, or many important aspects of GPU hardware.

Both people who bought the 970 because their careful analysis of a workload (the equivalent of a very rare food allergy) matched up exactly with the reported ROP count should return their cards (and apply for jobs, because they must have intuited a lot about the inner workings of the card). Everyone else should stop pretending that the misreporting, intentional or otherwise, was in any way material to their purchase, just as those not affected by a stray allergen would be unlikely to qualify as members of a class-action suit against General Mills. There's a reason all those late-night-TV ads say "if you've taken Superpill and developed Condition".

Many products have different performance characteristics in different configurations and workloads, and are marketed on best-case performance measurements. SSDs are a common example: performance often varies widely according to how full the drive is. If NVIDIA is going to be punished for misadvertising the ingredients while people are satisfied with the taste, a lot of companies should be going down on similar complaints.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I should amend that: anyone who is dissatisfied with their purchase should return it, whether for technical or emotional reasons. NVIDIA and its manufacturers should gracefully accept them and apologize.

But the pitchforks and torches and lawsuits are silly.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

SwissCM posted:

You realize what you just wrote right?

Yeah, I should have left the relativism out of my post. I'm OK with it as the basis for an argument for proportionate popular reaction, but anchoring isn't a good way to make law.

That said, I'm pretty sure there's case law about reporting maximal performance being OK, but I'm thinking of something from like the 70s related to gas consumption, and I can't find it now. Maybe there's a law-knower here who can say.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Mr SoupTeeth posted:

poo poo, Microsoft still needs to make an OS with 4K in mind. It blows my mind how the market is becoming saturated with these displays yet the scaling is still trash.

Is Windows 10 not better on that score?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

FaustianQ posted:

The 300 series isn't promising anything good, and since that's an easier and quicker-to-resolve toxx, for now: :toxx: I'll buy an avatar/username combo of the mods' picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct?

Comparing worst 300 to best 200? Mean of the whole lines?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Nah, it'll still be a vector for AMD-specific improvements, just not for optimizing away draw-call overhead since that's solved in new D3D/GL.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Wiggly Wayne DDS posted:

Do you have some sort of blog where I can read more of these... creative interpretations of technology?

Yeah, that's some good stuff. (NVIDIA and AMD both announced support for DX12 the same day that MSFT revealed it, right?)

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

sauer kraut posted:

Did we find out yet which cards have full feature hardware DX12 support?

IIRC NVIDIA said that all their DX11 cards would also support DX12 (which says interesting things about their architecture in terms of the pre-emption stuff), and AMD as well.

E: AMD says all GCN parts, and NVIDIA says Fermi, Kepler, Maxwell

Subjunctive fucked around with this message at 01:55 on Mar 4, 2015

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Sir Unimaginative posted:

nVidia explicitly refused to update the DisplayPort spec on their Maxwell I cards to the level that includes Adaptive Sync so they could flog G-Sync some more.

Whoa. Where were they explicit about that?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Tearing can be an issue at any frame rate, no?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

veedubfreak posted:

I thought tearing was mostly when the video card is pushing frames faster than the monitor can keep up.

It happens when the frame buffer is modified during scan-out, so it's more common at high frame rates, but I don't think it's inherent to them. I don't want to draw a picture and think harder about it, though.
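
Actually, here's a toy model in Python (purely illustrative, all names mine) of why it's rate-independent: if the front buffer gets swapped partway through scan-out, the displayed image is part old frame and part new frame, regardless of how fast frames are arriving:

```python
SCANLINES = 8
front_buffer = ["frame_A"] * SCANLINES

def scan_out(swap_at_line):
    # Scan-out reads the front buffer top to bottom; partway through,
    # the renderer flips to a new frame without waiting for vblank.
    shown = []
    for y in range(SCANLINES):
        if y == swap_at_line:
            front_buffer[:] = ["frame_B"] * SCANLINES  # unsynced flip
        shown.append(front_buffer[y])
    return shown

print(scan_out(swap_at_line=5))
# ['frame_A', 'frame_A', 'frame_A', 'frame_A', 'frame_A',
#  'frame_B', 'frame_B', 'frame_B']  <- the A/B boundary is the tear
```

The flip can land mid-scan whether the renderer is producing 30fps or 300fps; higher frame rates just give it more chances per refresh.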


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I always worry that I don't put enough on to fill the contact surface, but that's probably not a reasonable concern.
