repiv
Aug 13, 2009

https://www.micron.com/about/blogs/2016/may/nvidia-launches-gtx1080-with-micron-gddr5x

"I first talked about Micron’s GDDR5X in September 2015 and in February of this year, provided an update that the technology was on track for mass production by summer.
Today, I am happy to announce that GDDR5X, the fastest discrete memory component in the world, has already entered mass production."


Maybe supply won't be as bad as we feared :)

SwissArmyDruid
Feb 14, 2014

by sebmojo
I am concerned that AMD may not have access to GDDR5X or enough of it, though.

froody guy
Jun 25, 2013

If you want to get moist reading nerdy papers about how fraggin awesome GDDR5X is, try this one

The TL;DR version being.....



Arzachel
May 12, 2012

SwissArmyDruid posted:

I am concerned that AMD may not have access to GDDR5X or enough of it, though.

They will probably skip it for now; the extra price, power draw, and supply issues make it unsuited for Polaris, and HBM2 is straight-up better and the advertised feature for Vega.

fozzy fosbourne
Apr 21, 2010

froody guy posted:

If you want to get moist reading nerdy papers about how fraggin awesome GDDR5X is, try this one

The TL;DR version being.....





The current stuff is 10gbps though, right? Last I saw, the 12 and 14 were in "Sampling" while the 10 is still in production. I wonder how much the 1080 is bottlenecked by memory bandwidth and when we'll see the first 12/14 gbps chips available to consumers. Will the fast stuff be in 3rd party 1080s? A whole new sku? Next year? September?
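
For a sense of scale on that bandwidth question, here's a quick back-of-the-envelope sketch. The 256-bit bus is the 1080's published spec; everything else is just arithmetic and ignores real-world efficiency:

code:
/* Peak memory bandwidth = per-pin data rate * bus width / 8 bits per byte. */
#include <stdio.h>

int main(void) {
    const int bus_bits = 256;                         /* GTX 1080 memory bus width    */
    const double rates_gbps[] = { 10.0, 12.0, 14.0 }; /* GDDR5X per-pin data rates    */

    for (int i = 0; i < 3; i++)
        printf("%4.0f Gbps GDDR5X on a %d-bit bus -> %5.0f GB/s\n",
               rates_gbps[i], bus_bits, rates_gbps[i] * bus_bits / 8.0);
    return 0;
}
So moving from 10 to 12 or 14 Gbps on the same bus is a straight 20-40% bump in raw bandwidth, which is why the sampling parts matter.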

Hubis
May 18, 2003

Boy, I wish we had one of those doomsday machines...

xthetenth posted:

You're still getting a wider variation between when the frame starts rendering and when it's displayed with *sync, but it's going to be lessened.

If you are consistently operating above Vsync there will be no variation, because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent ~16.7ms update rate.

Hubis fucked around with this message at 14:27 on May 11, 2016
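
If the "stall on present" part sounds abstract, here's a toy model of what Hubis is describing. The 8 ms render time and the two-deep swap chain are invented numbers, not any real engine or driver:

code:
/* Toy model: GPU renders faster than the display refreshes, the swap chain
 * fills up, and presenting blocks until a scanout frees a slot. */
#include <stdio.h>

int main(void) {
    const double refresh = 1000.0 / 60.0; /* display takes one frame every ~16.7 ms */
    const double render  = 8.0;           /* pretend the GPU needs 8 ms per frame   */
    const int    depth   = 2;             /* frames the swap chain can hold         */

    double t = 0.0, next_scanout = refresh, last = 0.0;
    int queued = 0;

    for (int frame = 0; frame < 8; frame++) {
        t += render;                                   /* finish rendering */

        /* scanouts that already happened consume queued frames */
        while (queued > 0 && next_scanout <= t) {
            queued--;
            next_scanout += refresh;
        }
        /* queue full: present blocks until the next scanout frees a slot */
        if (queued >= depth) {
            t = next_scanout;
            next_scanout += refresh;
            queued--;
        }
        queued++;                                      /* hand the frame to the swap chain */
        printf("frame %d presented at %6.2f ms (+%5.2f ms)\n", frame, t, t - last);
        last = t;
    }
    return 0;
}
Once the queue fills, every loop iteration ends up waiting on a scanout, so the whole engine settles at the ~16.7 ms refresh interval even though the GPU only needs 8 ms per frame.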

froody guy
Jun 25, 2013

fozzy fosbourne posted:

The current stuff is 10gbps though, right?
I suppose so.

fozzy fosbourne posted:

Last I saw, the 12 and 14 were in "Sampling" while the 10 is still in production. I wonder how much the 1080 is bottlenecked by memory bandwidth and when we'll see the first 12/14 gbps chips available to consumers. Will the fast stuff be in 3rd party 1080s? A whole new sku? Next year? September?
I think just from 3rd parties, unless Nvidia goes with a rev2 of the 1080, which would be quite surprising and I'd say "irresponsible". For sure the 10nm parts offer a massive improvement in clock/overclock capability and thus bandwidth, but we don't know exactly what kind of monster the full chip paired with HBM2 will be, so..... standard partners in June with 20nm/10Gbps and speshul ones in September at a premium (Lightning, Kingpin and that kind of thing)? :shrug:

For sure, if they start putting the speshul ones on all the 1080s sold from a certain point on, I'd be massively pissed.

froody guy
Jun 25, 2013

Scammers Edition's PCB unveiled

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Hubis posted:

If you are consistently operating above Vsync there will be no variation, because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent ~16.7ms update rate.

Right, sorry, it'd be less consistent between end of render and display but more consistent between start of frame and display, which I think would actually be better barring some of the VR tech like transforms based on game state changes between start and end of rendering a frame.

That all changes if you dip below max FPS though, but we're specifically discounting that case.

HMS Boromir
Jul 16, 2011

by Lowtax

For a second my brain didn't parse what you meant by Scammers Edition and thought some more extremely good wood screws poo poo had come to light.

repiv
Aug 13, 2009


Come on Nvidia, if you're going to charge a premium you could at least populate all the VRM phases :sigh:

penus penus penus
Nov 9, 2014

by piss__donald

Hubis posted:

If you are consistently operating above Vsync there will be no variation, because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent ~16.7ms update rate.

It works, in theory and mostly in practice, but here is my main issue with vsync. This is a graph without vsync with an average fps of 99. You can clearly see the framerate is actually all over the place constantly. This is what games look like if you plot frame-by-frame times. This looks terrible, but it's actually a very smooth example of gameplay; it looks much, much worse when there is serious tearing. This is an average fps of 99 over 5 minutes, which, on a side note, is what the typical benchmark website would report as the result.



Now vsync prevents the back buffer frames from drawing on the screen until the front buffer frame finishes, which in theory should make that a straight line across 60 fps (and it's pretty close!). The problem with that is, if you look at the graph above, there is no way this mathematically works out time-wise. All those frames with wildly varying times are being forced into locked frame times, which isn't what the GPU was truly outputting at the time it rendered the frame. This results in a constant speeding up and slowing down of "time" in the game itself, which is what we perceive as lag. If it didn't happen all the time it would be unnoticeable, but as you can see above there is no way for it not to be happening almost the entire time vsync is on, whether or not the GPU averages above 60 fps. This is where the input lag comes from, among other visual issues. Now it can never be more or less lag than the variation within 1 frame, but it's simply too noticeable.
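
A rough sketch of that quantization effect: the frame times below are made up, and this ignores the present-stall feedback from the earlier sketch, but it shows how varying amounts of game time get flattened onto a fixed 16.7 ms display cadence:

code:
/* Varying render times, vsync display: each frame is shown at the next 60 Hz
 * tick, at most one new frame per refresh. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double tick = 1000.0 / 60.0;   /* 60 Hz refresh interval */
    const double frame_ms[] = { 9.0, 14.0, 11.0, 22.0, 8.0, 17.0, 10.0, 19.0 };
    const int n = sizeof frame_ms / sizeof frame_ms[0];

    double ready = 0.0, shown_prev = 0.0;
    for (int i = 0; i < n; i++) {
        ready += frame_ms[i];                          /* when the GPU finishes the frame   */
        double shown = ceil(ready / tick) * tick;      /* next refresh after it's ready     */
        if (shown < shown_prev + tick)                 /* at most one new frame per refresh */
            shown = shown_prev + tick;
        printf("frame %d: game advanced %4.1f ms, shown %5.2f ms after the previous one\n",
               i, frame_ms[i], shown - shown_prev);
        shown_prev = shown;
    }
    return 0;
}
Every on-screen update lands 16.7 ms apart, but the amount of game time each one represents swings between 8 and 22 ms, which is the speeding up and slowing down of "time" described above.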

On top of that it's not actually perfect anyway and there are still pretty consistent dropped frames, albeit far reduced. But combine the varying lag and input lag with the fact that you're making your GPU work harder to do this at all, reducing your settings potential... I'm just not a fan. Some games are worse than others, and in some games it doesn't matter if it lags a little bit, but it was obnoxious enough for me to just accept tearing instead. And I really, really, really tried to like it for a long time and tried every vsync setting combination and mode there was.

I'm well aware I'm splitting hairs here, but having the monitor and GPU work together to sync the refresh rate with the fps is a really elegant solution that we all deserve to have.

penus penus penus fucked around with this message at 15:47 on May 11, 2016

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

THE DOG HOUSE posted:

I'm well aware im splitting hairs here but having the monitor and GPU work together to sync refresh rate with fps is a really elegant solution that we all deserve to have.
I agree and ubiquitous *sync will be a grand day, indeed. In the meantime, though, I'm still solidly in the camp that, for those of us with limited funds, the $200-$300 extra it'd cost you to get a GSync monitor is probably better spent on a better GPU--that's the difference between a 970 and a 980Ti with change left over, after all. And yes, resale, price over time, multiple generations, blah blah blah. $300 can still get you at least one, if not two brackets up on the GPU ladder for at least two, possibly three generations if you're not dumb, and in 5 years if we're all still talking about a GSync tax something has gone terribly wrong.

Blackfyre
Jul 8, 2012

I want wings.

DrDork posted:

I agree and ubiquitous *sync will be a grand day, indeed. In the meantime, though, I'm still solidly in the camp that, for those of us with limited funds, the $200-$300 extra it'd cost you to get a GSync monitor is probably better spent on a better GPU--that's the difference between a 970 and a 980Ti with change left over, after all. And yes, resale, price over time, multiple generations, blah blah blah. $300 can still get you at least one, if not two brackets up on the GPU ladder for at least two, possibly three generations if you're not dumb, and in 5 years if we're all still talking about a GSync tax something has gone terribly wrong.

Yeah, the cost of GSync really has to drop; it's silly how much it costs.

The_Franz
Aug 8, 2003

Hubis posted:

If you are consistently operating above Vsync there will be no variation, because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent ~16.7ms update rate.

Unless the presentation queue is using "mailbox" mode in which case the renderer can run independently of vsync without tearing. Judder is the only problem then.
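
In Vulkan, for example, that's the difference between VK_PRESENT_MODE_MAILBOX_KHR and VK_PRESENT_MODE_FIFO_KHR. Here's a minimal sketch of preferring mailbox and falling back to FIFO; the hardcoded list in main() stands in for what vkGetPhysicalDeviceSurfacePresentModesKHR would actually report, so this isn't a complete swap chain setup:

code:
/* A sketch, assuming Vulkan: prefer MAILBOX, fall back to FIFO. */
#include <stdio.h>
#include <vulkan/vulkan.h>

static VkPresentModeKHR pick_present_mode(const VkPresentModeKHR *modes, uint32_t count)
{
    for (uint32_t i = 0; i < count; i++)
        if (modes[i] == VK_PRESENT_MODE_MAILBOX_KHR)  /* newest frame replaces the queued one */
            return VK_PRESENT_MODE_MAILBOX_KHR;
    return VK_PRESENT_MODE_FIFO_KHR;                  /* vsync-style queue; always available */
}

int main(void)
{
    VkPresentModeKHR available[] = {
        VK_PRESENT_MODE_FIFO_KHR,
        VK_PRESENT_MODE_MAILBOX_KHR,
        VK_PRESENT_MODE_IMMEDIATE_KHR,
    };
    VkPresentModeKHR chosen = pick_present_mode(available, 3);
    printf("chosen present mode: %s\n",
           chosen == VK_PRESENT_MODE_MAILBOX_KHR ? "MAILBOX" : "FIFO");
    return 0;
}
FIFO is the only mode the spec guarantees to exist, which is why it's the fallback. Mailbox keeps only the newest pending frame, so the renderer isn't throttled to the refresh rate and nothing tears, at the cost of the judder The_Franz mentions.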

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Blackfyre posted:

Yeah, the cost of GSync really has to drop; it's silly how much it costs.
NVidia: All you early adopters can just shut up and pay us.

Holyshoot
May 6, 2010
Would gsync help at all if I want to upgrade my center monitor to a 144Hz one and still use my two side monitors that are 60Hz? Right now I'm running a 120Hz monitor for the center and two 60Hz on the sides (all 1920x1080, 27 inch). I get screen tearing on the side monitors but the center one is fine.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Cost wasn't even that much of a concern for me, I could stomach $200 over normal to get the tech into a monitor. The problem is what's available. If I could get a 4k Dell IPS monitor in the upper 20" range I would easily pay upwards of $750 for it. It could even be only 60hz.

But, something like that doesn't exist.

In theory, the Asus PG27AQ is the right monitor, but it's about $150 too expensive and has atrocious QC.

Seriously, just slap a gsync module into the P2715Q and I would be willing to pay a $200 premium on it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Holyshoot posted:

Would gsync help at all if I want to upgrade my center monitor to a 144Hz one and still use my two side monitors that are 60Hz? Right now I'm running a 120Hz monitor for the center and two 60Hz on the sides (all 1920x1080, 27 inch). I get screen tearing on the side monitors but the center one is fine.
GSync requires a compatible monitor, so unless those side monitors are already GSync compatible (you'd know it if they were) then no, it will do nothing for them.

Blackfyre
Jul 8, 2012

I want wings.

Holyshoot posted:

Would gsync help at all if I want to upgrade my center monitor to a 144Hz one and still use my two side monitors that are 60Hz? Right now I'm running a 120Hz monitor for the center and two 60Hz on the sides (all 1920x1080, 27 inch). I get screen tearing on the side monitors but the center one is fine.

Wouldn't help the side monitors

Sir Kodiak
May 14, 2007


bull3964 posted:

In theory, the Asus PG27AQ is the right monitor, but it's about $150 too expensive and has atrocious QC.

Seriously, I was looking into a monitor upgrade, and gently caress being expected to do my own quality control rather than them checking whether they built it properly before shipping it to me. Just not worth the hassle, I'll stick with what I've got.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Holyshoot posted:

Would gsync help at all if I want to upgrade my center monitor to a 144Hz one and still use my two side monitors that are 60Hz? Right now I'm running a 120Hz monitor for the center and two 60Hz on the sides (all 1920x1080, 27 inch). I get screen tearing on the side monitors but the center one is fine.

In addition to the side monitors not having gsync support, it only works with a single display at a time right now. I don't think that's inherent, and might even be software-fixable, but it's the situation right now.

EdEddnEddy
Apr 5, 2012



Ok, currently I just seem to be a slight bit confused.

First, Gsync/Freesync, from what I believe I understand, both sort of work the same way: they either enable V-Sync above the monitor's refresh rate and disable it when the framerate goes below, and/or adjust the refresh rate to sync the frames when it is below as well?

When you have a >60Hz screen (120/144Hz range), unless you are going above 120FPS, what benefit does Freesync/G-Sync provide again?

Holyshoot
May 6, 2010
If I pick up a gsync monitor where I have to use DisplayPort, will I be OK going from DisplayPort out of the monitor to mini DisplayPort into my graphics card? My 690 has an HDMI, 3 DVIs, and a mini DisplayPort. No full-size DisplayPort.

Truga
May 4, 2014
Lipstick Apathy

EdEddnEddy posted:

Ok, currently I just seem to be a slight bit confused.

First, Gsync/Freesync, from what I believe I understand, both sort of work the same way: they either enable V-Sync above the monitor's refresh rate and disable it when the framerate goes below, and/or adjust the refresh rate to sync the frames when it is below as well?

When you have a >60Hz screen (120/144Hz range), unless you are going above 120FPS, what benefit does Freesync/G-Sync provide again?

*sync makes it so the monitor outputs the next picture when the GPU finishes rendering it (unless the frame finishes faster than the monitor's max refresh rate allows). Basically, it eliminates tearing and vsync stutter, because vsync always has to wait for the next refresh.

penus penus penus
Nov 9, 2014

by piss__donald

EdEddnEddy posted:

Ok, currently I just seem to be a slight bit confused.

First, Gsync/Freesync, from what I believe I understand, both sort of work the same way: they either enable V-Sync above the monitor's refresh rate and disable it when the framerate goes below, and/or adjust the refresh rate to sync the frames when it is below as well?

When you have a >60Hz screen (120/144Hz range), unless you are going above 120FPS, what benefit does Freesync/G-Sync provide again?

They match the refresh rate of the monitor with the output of the GPU, eliminating the need for resource-intensive things like vsync. It makes for smooth gameplay across a large range of framerates, greatly reducing stutters and, as far as I know, completely eliminating all tearing. Best of all, it does this without introducing lag and other lame things. Since all those partial frames don't exist anymore, it's just a very smooth experience even if your GPU is totally maxed out struggling to render across a very wide range (which allows you to increase settings too).

edit: I guess I should note I've never actually owned one so I might be omitting some quirks they might experience. I've just played on some in stores, events, etc

penus penus penus fucked around with this message at 17:58 on May 11, 2016
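
A toy comparison of the two scheduling rules being described here, with invented frame times: plain vsync shows a frame at the next fixed 60 Hz tick, while a variable-refresh display shows it as soon as it's ready, no sooner than the panel's minimum refresh interval after the previous frame:

code:
/* Same stream of frames, two display rules: fixed-tick vsync vs. *sync. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double tick = 1000.0 / 60.0;        /* fixed 60 Hz refresh            */
    const double min_gap = 1000.0 / 144.0;    /* e.g. a 144 Hz *sync panel      */
    const double frame_ms[] = { 12.0, 21.0, 9.0, 18.0, 25.0, 11.0 };
    const int n = sizeof frame_ms / sizeof frame_ms[0];

    double ready = 0.0, vs_prev = 0.0, vrr_prev = 0.0;
    for (int i = 0; i < n; i++) {
        ready += frame_ms[i];
        double vs  = fmax(ceil(ready / tick) * tick, vs_prev + tick); /* wait for the tick */
        double vrr = fmax(ready, vrr_prev + min_gap);                 /* show when ready   */
        printf("frame %d ready at %5.1f ms -> vsync %6.2f ms, *sync %6.2f ms\n",
               i, ready, vs, vrr);
        vs_prev = vs; vrr_prev = vrr;
    }
    return 0;
}
The *sync column tracks the GPU's actual output, which is why the benefit is biggest exactly when framerates are uneven.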

MrYenko
Jun 18, 2012

#2 isn't ALWAYS bad...

Holyshoot posted:

If I pick up a gsync monitor where I have to use DisplayPort, will I be OK going from DisplayPort out of the monitor to mini DisplayPort into my graphics card? My 690 has an HDMI, 3 DVIs, and a mini DisplayPort. No full-size DisplayPort.

AFAIK mini-DisplayPort is just a different physical connector, but it's completely compatible, kind of like mini-USB. Note that I'm often wrong, and might be an idiot.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

THE DOG HOUSE posted:

They match the refresh rate of the monitor with the output of the GPU, eliminating the need for resource-intensive things like vsync. It makes for smooth gameplay across a large range of framerates, greatly reducing stutters and, as far as I know, completely eliminating all tearing. Best of all, it does this without introducing lag and other lame things. Since all those partial frames don't exist anymore, it's just a very smooth experience even if your GPU is totally maxed out struggling to render across a very wide range (which allows you to increase settings too).

edit: I guess I should note I've never actually owned one so I might be omitting some quirks they might experience. I've just played on some in stores, events, etc

It's pretty much that. It's way smoother, and actually has a greater benefit the lower the frame rate.

EdEddnEddy
Apr 5, 2012



Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up.

Now for the COD/CS players that game at 200FPS+, I can understand the sync/input lag/stutter benefit, as that is back above the Hz range and tearing would most definitely happen with that high of an FPS/Hz spread.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

MrYenko posted:

AFAIK mini-DisplayPort is just a different physical connector, but it's completely compatible, kind of like mini-USB. Note that I'm often wrong, and might be an idiot.

Yeah, that's right. The adapters are simple, cheap and passive.

EdEddnEddy posted:

Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up.

That's not quite correct, because you're not guaranteeing that the frame is delivered when the monitor refreshes. They'd need to sync up exactly.

HalloKitty fucked around with this message at 19:37 on May 11, 2016

froody guy
Jun 25, 2013

EdEddnEddy posted:

Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up.

Now for the COD/CS players that game at 200FPS+, I can understand the sync/input lag/stutter benefit, as that is back above the Hz range and tearing would most definitely happen with that high of an FPS/Hz spread.

With gsync, fps = Hz, always. That's what it does: it dynamically changes the monitor's refresh rate so that it's always paired (in sync) with the frames coming from the GPU, at whatever rate they arrive.

penus penus penus
Nov 9, 2014

by piss__donald

EdEddnEddy posted:

Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up.

Now for the COD/CS players that game at 200FPS+, I can understand the sync/input lag/stutter benefit, as that is back above the Hz range and tearing would most definitely happen with that high of an FPS/Hz spread.

It definitely tears below the refresh rate. Often the worst tearing is when the GPU is struggling with something below the refresh rate. The problem is the monitor refreshes at a more or less exact time and the frames themselves definitely do not, regardless of what the average fps is. It depends a lot on the game of course. Also, how bad it is determines whether you see actual tears going up or down the screen in a pattern vs a faint impression of something being off. If you slowed it down enough, you would see tears though.

Gwaihir
Dec 8, 2009
Hair Elf

EdEddnEddy posted:

Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up.

Now for the COD/CS players that game at 200FPS+, I can understand the sync/input lag/stutter benefit, as that is back above the Hz range and tearing would most definitely happen with that high of an FPS/Hz spread.

It's not that simple; even if a GPU is maintaining an average FPS lower than your monitor's refresh rate, it's not necessarily delivering frames to the monitor in step with when the screen gets redrawn.

THE DOG HOUSE posted:

It definitely tears below the refresh rate. Often the worst tearing is when the GPU is struggling with something below the refresh rate. The problem is the monitor refreshes at a more or less exact time and the frames themselves definitely do not, regardless of what the average fps is. It depends a lot on the game of course. Also, how bad it is determines whether you see actual tears going up or down the screen in a pattern vs a faint impression of something being off. If you slowed it down enough, you would see tears though.

^ This

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
GSync is actually best for people who don't upgrade their GPU very often, which makes it unfortunate that it carries such a price premium, because it's a decent way to attempt to future-proof. I am running a GTX 770 right now, so if my FPS drops from 60 to 45 it's really not that big of a deal and far less noticeable.

You can get an Acer XB270HU for $450 from their refurb site, which isn't too bad for a 27" 1440p monitor. The Acers tend to have some backlight bleed though, which is unfortunate.

Bleh Maestro
Aug 30, 2003
I assume there will still be a titan / ti card this generation?

SlayVus
Jul 10, 2009
Grimey Drawer
Like I said earlier, waiting for prosumer GP100. I want to run my X34 at 3440x1440@100Hz constant in all games.

The GTX 1080 is probably good for 2560x1440 and 2560x1080, but I doubt it can do 3440x1440 at max settings and a desirable FPS. However, there aren't any reviews out yet.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
The primary benefit of gsync and freesync is when your GPU is rendering below the maximum framerate.

With a 60Hz monitor, if you miss the 16.67ms cutoff you have to wait for the next sync, at 33.33ms. With either adaptive sync technology you can instead present the frame at 20ms.

It's best when you're not dipping too far below the refresh rate. A 35fps average is still going to look awful and you'd likely be better off activating half-rate vsync to lock to 30fps, but 50-60 fps ends up much smoother.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
I'm trying to crank up my GTX 560 using Asus GPU Tweak. I've turned the core clock up to 910MHz and it's perfectly stable, but it won't let me go any higher. I've tried turning the voltage up to 1.035V but still nothing. Is there any way to go higher?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Lockback posted:

You can get an Acer XB270HU for $450 from their refurb site, which isn't too bad for a 27" 1440p monitor. The Acers tend to have some backlight bleed though, which is unfortunate.

They pack their refurbs like poo poo though; I think it's deliberate, to try and get FedEx/UPS to pay out insurance on crappy panels. Their retail units are packed much, much better, which is the most obvious reason to think it's intentional.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Lockback posted:

GSync is actually best for people who don't upgrade their GPU very often, which makes it unfortunate that it carries such a price premium, because it's a decent way to attempt to future-proof. I am running a GTX 770 right now, so if my FPS drops from 60 to 45 it's really not that big of a deal and far less noticeable.

You can get an Acer XB270HU for $450 from their refurb site, which isn't too bad for a 27" 1440p monitor. The Acers tend to have some backlight bleed though, which is unfortunate.

As long as nothing drastic happens, people who are trying to keep their total cost of ownership low, especially ones who don't buy/flip cards frequently, should probably go AMD. Freesync is going to make the biggest difference to people trying to get a good experience out of a marginal card, while Gsync is functionally irrelevant because of the price tag. And if historical evidence is anything to go by, getting at least two good years out of a mid-range/old top-range card (7970 once the 290 was out, 290 once the 970 was out) and probably another two marginal years after that (the 7970 is still a pretty reasonable cheap 1080p card) is totally doable. I expect my 290 to still be putting in good service in a friend's computer in 2018 and still running in 2020 (okay, it's replacing a GTX 430, it's going to last till it causes the heat death of the universe, but still).
