|
So why does gsync (and I think freesync/adaptive sync) require exclusive full-screen? Can the Windows compositor not scan out at variable rates?
|
# ¿ Jan 15, 2015 15:32 |
|
Malcolm XML posted:I would not go there ?
|
# ¿ Jan 15, 2015 16:57 |
|
Malcolm XML posted:The way that windows interfaces with graphics and DX is complicated and the answer is, maybe but it would break applications so I am not surprised that dynamic refresh can only be done in single application fullscreen mode. Yeah, I work with some ex-Windows-graphics people, but we weren't sure and I couldn't find an explanation anywhere, nor any indication of whether it's something they could fix in a driver update (like they did when adding windowed-app support to ShadowPlay, for example). I guess you're saying it could violate applications' expectations of how often DWM scans out, but DWM composites already, so apps update whenever they want; I don't really follow. I'll keep looking on the web. Does AMD's FreeSync stuff have the same limitation?
|
# ¿ Jan 15, 2015 19:06 |
|
If multiple windows are visible, then the compositor is already handling this case; apps don't all paint at 60Hz, and often none do on a given desktop. You can have an app that's completely unresponsive and therefore not updating at all, while other apps still render at their own rates. When you have a game running in windowed mode, does it get locked down to the 24Hz of a video that's playing? The compositor takes all the visible windows' swap chains as they get painted and composites them together, then scans out the result at the monitor's refresh rate. Why can't it scan out whenever one of the windows has finished updating and the compositing is complete? The desktop is itself a full-screen exclusive app, really, and it gets regions from other programs the way Chrome does from its rendering processes. (I think that's how it works; corrections welcome.) All I can think of is that DWM itself has assumptions about a fixed frame rate, as an implementation artifact.
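To make the question concrete, here's a toy sketch (Python; every name and number is hypothetical, this is not how DWM is actually written) of the difference between a compositor that scans out on a fixed clock and one that scans out whenever a composite is ready, clamped to what the panel allows:

```python
# Toy model: when could a compositor scan out, given windows that finish
# frames at arbitrary times? All names and timing numbers are made up.

MIN_INTERVAL_MS = 1000 / 144  # fastest interval the panel allows (~144 Hz)
MAX_INTERVAL_MS = 1000 / 30   # slowest before the panel must refresh anyway

def fixed_rate_scanouts(frame_ready_times_ms, refresh_hz=60):
    """Classic compositor: scan out on a fixed clock, regardless of apps."""
    period = 1000 / refresh_hz
    last = max(frame_ready_times_ms)
    # Scan-outs happen at every tick up to the last frame, ready or not.
    return [i * period for i in range(int(last // period) + 2)]

def variable_rate_scanouts(frame_ready_times_ms):
    """Adaptive-sync-style: scan out when a composite is ready, clamped
    to the panel's minimum frame interval."""
    scanouts = []
    last = -MAX_INTERVAL_MS
    for t in sorted(frame_ready_times_ms):
        # Can't start a new scan-out sooner than the panel permits.
        start = max(t, last + MIN_INTERVAL_MS)
        scanouts.append(start)
        last = start
    return scanouts

ready = [5.0, 12.0, 40.0]             # three windows finishing frames
print(variable_rate_scanouts(ready))  # → [5.0, 12.0, 40.0]
```

In the variable case each composite reaches the glass as soon as the panel permits, instead of waiting for the next fixed tick; the open question in the post is whether anything in DWM's design actually precludes that.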
|
# ¿ Jan 15, 2015 19:59 |
|
re: *sync and full-screen necessity, I just remembered that I have driver people from AMD visiting on Tuesday, so I'll see what they say about it!
|
# ¿ Jan 15, 2015 21:17 |
|
AVeryLargeRadish posted:From what I understand it's not about the fact that the card normally only uses 3.5-3.8gb of its memory, it's that when it does use the memory it normally avoids, it causes massive stuttering and frame rate drops. Are people able to trigger this in games? I haven't seen anyone link it to content behavior yet.
|
# ¿ Jan 24, 2015 19:00 |
|
Is it exceeding that much in use, or exceeding that much in transfer size? I may have misunderstood the NVIDIA statement.
|
# ¿ Jan 24, 2015 19:54 |
|
I have 2 970s in my new build, and they sit at 60C and 43C on the Windows desktop. The hotter one (primary) has its fan at about 15%. I've OC'd them to +150 core with a +87mV voltage increase, but I don't think that should really matter when they're effectively idle. Is there anything I should look into before just digging into airflow in the case?
|
# ¿ Feb 1, 2015 00:39 |
|
Desuwa posted:Which 970s do you have? Those aren't dangerous temperatures, and I wouldn't be worried about them, but they are higher than I'd expect for the MSI/ASUS custom coolers. One running hotter than the other is expected though; the bottom card is going to be blocking the airflow of the top card and there's not much you can do about it. I have the MSI Gaming ones. I removed some unused drive cages to improve flow from the front-panel intake fans, but it didn't really help: the cooler card is even cooler, but the hot one didn't change. The motherboard only has two x16 slots, but I'd need a longer SLI bridge anyway. I can add a side fan to my case and see if that helps.
|
# ¿ Feb 1, 2015 01:53 |
|
Yeah, I'll probably just learn to be comfortable with it. Under FurMark/Heaven load it doesn't crack 80C. Now if only I'd been smarter and bought a semi-fanless PSU, I'd be totally happy with my setup...
|
# ¿ Feb 1, 2015 02:09 |
|
Rastor posted:FreeSync is an AMD term for Adaptive Sync. Mobile GSync (at least the leaked Alpha version) is Adaptive Sync. Therefore, Mobile GSync is FreeSync. And nVidia are very much shady assholes for saying they can't possibly support an Adaptive Sync solution, when it has been proven that not only can they, they've developed drivers that do so. Where did they say that? Drop a link?
|
# ¿ Feb 4, 2015 03:14 |
|
I'm sure they'll learn their lesson when this fracas causes people to flock away from their cards into the market-snuggling embrace of AMD.
|
# ¿ Feb 4, 2015 04:46 |
|
FaustianQ posted:Can't tell if this is sarcastic or not Very.
|
# ¿ Feb 4, 2015 05:26 |
|
FaustianQ posted:Nvidia learning a lesson, anyone buying AMD cards, or just "Yes"? Pretty much.
|
# ¿ Feb 4, 2015 05:29 |
|
Mr.PayDay posted:In other words, the 970 now has far worse tech specs than an AMD 290x Could you elaborate on this? Which specs are far worse?
|
# ¿ Feb 5, 2015 15:08 |
|
veedubfreak posted:I look forward to the day that AMD goes under and Nvidia starts selling GPUs for no less than 500 bucks. Let's all root for the failure of AMD; monopolies always work out alright for the consumer. Intel seems to be pushing ahead pretty hard, and I wouldn't say that there is robust competition in the non-mobile CPU market.
|
# ¿ Feb 5, 2015 17:14 |
|
Intel's problem is that a delay means that their current market-leading part stays in the market longer, and that the new part will be the market leader for less time before they overtake it themselves with another winner. They are tripping over their own best-in-class products. This is not the situation AMD is in.
|
# ¿ Feb 5, 2015 22:36 |
|
Kazinsal posted:AMD's not in that situation because they don't *have* any best-in-class products in the CPU market. Yes, that is precisely my point.
|
# ¿ Feb 5, 2015 22:53 |
|
Annath posted:What ever came out of the whole 970 VRAM snafu? Anything at all?
|
# ¿ Feb 7, 2015 22:51 |
|
I wouldn't be surprised if NVIDIA's OEM contracts for laptops required them to disable overclocking to keep everything within thermal limits.
|
# ¿ Feb 14, 2015 03:29 |
|
I don't recall a popup when OCing my 970s, but I might be forgetting. It's not just the thermals on the GPU; overclocking disrupts the thermal environment of everything else in the laptop.
|
# ¿ Feb 14, 2015 10:08 |
|
There is a version of the 970 that fixes the memory issue, and it's called the 980.
|
# ¿ Feb 18, 2015 15:30 |
|
Yay picture time! One of our boxes of test GPUs at the office. I'd lay them out on a table but I'm lazy.
|
# ¿ Feb 18, 2015 18:36 |
|
I drive the heck out of a 2560x1440 with a 970.
|
# ¿ Feb 18, 2015 21:50 |
|
Panty Saluter posted:What detail levels? I find if I want maxed details and 60 fps I have to keep it at 1920 x 1080 on my 970. Haven't really fooled with overclocking though. Usually very high; it depends on the game. I've OC'd a reasonable amount, nothing unusual.
|
# ¿ Feb 18, 2015 22:10 |
|
veedubfreak posted:Nvidia has already said that the 980 is the full chip. The Ti would have to be a different chip. Looks like another videocardz made up picture. That table shows the 980Ti being the GM200, though, right?
|
# ¿ Feb 19, 2015 22:24 |
|
The future of VR rendering is still uncertain. Even if I could tell you everything I know, I couldn't give you a confident prediction of how SLI/etc. will benefit VR applications in specific cases or in general, at the point we ship CV1. I don't recommend making assumptions about what configurations will work best for the consumer Rift until we release more information; I don't have a timeline to share on that, unfortunately.
|
# ¿ Feb 21, 2015 23:14 |
|
Zero VGS posted:Edit: It begins: http://wccftech.com/nvidia-face-lawsuit-gtx-970-false-advertising/ Is release of specifications to press considered marketing? This will be interesting. I don't recall NVIDIA marketing the number of ROPs on their site when I was comparing cards, and I'm pretty sure it wasn't listed anywhere in MSI's marketing or packaging materials.
|
# ¿ Feb 22, 2015 00:34 |
|
HalloKitty posted:Yeah, but you don't let off a cake manufacturer because they put tons of crap in that wasn't on the ingredients list, just because some websites said the cake was tasty even before they knew the recipe. Your analogy is pretty bad, if not your reasoning too. Most food reviews have no idea of the recipe, and only address the "performance". You can't see inside food, or drivers, or many important aspects of GPU hardware. Both people who bought the 970 because their careful analysis of a workload (the equivalent of a very rare food allergy) matched up exactly with the reported ROP count should return their cards (and apply for jobs, because they must have intuited a lot about the inner workings of the card). Everyone else should stop pretending that the misreporting, whether intentional or otherwise, was in any way material to their purchase, just as those not affected by a stray allergen would be unlikely to qualify as members of a class-action suit against General Mills. There's a reason all those late-night-TV ads say "if you've taken Superpill and developed Condition". Many products have differential performance characteristics in different configurations and workloads, and are marketed on best-case performance measurements. SSDs are a common example, where performance often varies widely according to how full the drive is. If NVIDIA is going to be punished for misadvertising the ingredients while people are satisfied with the taste, a lot of companies should be going down on similar complaints.
|
# ¿ Feb 24, 2015 16:07 |
|
I should amend that: anyone who is dissatisfied with their purchase should return it, whether for technical or emotional reasons. NVIDIA and its manufacturers should gracefully accept them and apologize. But the pitchforks and torches and lawsuits are silly.
|
# ¿ Feb 24, 2015 16:53 |
|
SwissCM posted:You realize what you just wrote right? Yeah, I should have left the relativism out of my post. I'm OK with it as the basis for an argument for proportionate popular reaction, but anchoring isn't a good way to make law. That said, I'm pretty sure there's case law about reporting maximal performance being OK, but I'm thinking of something from like the 70s related to gas consumption, and I can't find it now. Maybe there's a law-knower here who can say.
|
# ¿ Feb 24, 2015 16:56 |
|
Mr SoupTeeth posted:poo poo, Microsoft still needs to make an OS with 4K in mind. It blows my mind how the market is becoming saturated with these displays yet the scaling is still trash. Is Windows 10 not better on that score?
|
# ¿ Feb 25, 2015 23:32 |
|
FaustianQ posted:The 300 series isn't promising anything good, and since that's an easier and quicker to resolve toxx for now, I'll buy an avatar/username combo of the mods picking if the 300 series beats the 200 series in performance per watt while having a better price/performance ratio than comparable Nvidia cards. This isn't an impossible standard to set, correct? Comparing worst 300 to best 200? Mean of the whole lines?
|
# ¿ Feb 26, 2015 01:29 |
|
Nah, it'll still be a vector for AMD-specific improvements, just not for optimizing away draw-call overhead since that's solved in new D3D/GL.
|
# ¿ Mar 3, 2015 22:01 |
|
Wiggly Wayne DDS posted:Do you have some sort of blog where I can read more of these... creative interpretations of technology? Yeah, that's some good stuff. (NVIDIA and AMD both announced support for DX12 the same day that MSFT revealed it, right?)
|
# ¿ Mar 4, 2015 00:30 |
|
sauer kraut posted:Did we find out yet which cards have full feature hardware DX12 support? IIRC NVIDIA said that all their DX11 cards would also support DX12 (which says interesting things about their architecture in terms of the pre-emption stuff), and AMD as well. E: AMD says all GCN parts, and NVIDIA says Fermi, Kepler, and Maxwell.
|
Subjunctive fucked around with this message at 01:55 on Mar 4, 2015
|
# ¿ Mar 4, 2015 01:53 |
|
Sir Unimaginative posted:nVidia explicitly refused to update the DisplayPort spec on their Maxwell I cards to the level that includes Adaptive Sync so they could flog G-Sync some more. Whoa. Where were they explicit about that?
|
# ¿ Mar 4, 2015 23:33 |
|
Tearing can be an issue at any frame rate, no?
|
# ¿ Mar 7, 2015 22:33 |
|
veedubfreak posted:I thought tearing was mostly when the video card is pushing frames faster than the monitor can keep up. Happens when the frame buffer is modified during scan-out, so it's more common at high frame rates, but I don't think it's inherent to them. I don't want to draw a picture and think harder about it, though.
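A rough way to see it without drawing the picture (a toy Python model, numbers illustrative only): tearing depends on whether a buffer swap lands inside a scan-out window, not on whether the GPU is outrunning the monitor.

```python
# Toy model of tearing without vsync. The panel reads the front buffer
# top to bottom over one refresh interval; if the buffer is swapped
# mid-read, the top of the screen shows the old frame and the bottom
# shows the new one: a tear. Numbers here are illustrative only.

SCANOUT_MS = 16  # ~one 60 Hz refresh interval, rounded for clarity

def tears(swap_times_ms):
    """Return the swap times that land inside a scan-out window.

    Scan-outs start back to back at t = 0, 16, 32, ... A swap exactly
    on a boundary is safe (that's what vsync enforces); anything else
    lands mid-read and tears, however slowly the frames arrive.
    """
    return [t for t in swap_times_ms if t % SCANOUT_MS != 0]

print(tears([20, 55]))  # slow ~30 fps frames, unaligned: both tear → [20, 55]
print(tears([16, 32]))  # frames landing exactly on boundaries: no tears → []
```

So even a game rendering well below the refresh rate tears if its swaps are unsynchronized; high frame rates just give you more swaps per refresh and therefore more visible tear lines.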
|
# ¿ Mar 7, 2015 22:36 |
|
I always worry that I don't put enough on to fill the contact surface, but that's probably not a reasonable concern.
|
# ¿ Mar 9, 2015 18:45 |