Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

froody guy posted:

I totally agree, but it must be said that the TN panels mounted on these high refresh rate monitors are really the best you can squeeze out of TN, including in terms of color fidelity and reproduction, so they're not "just TN" at all. Also, the pricing of the Acer is loving insane. It costs 650€ in euroland, and we generally have much higher prices due to higher VAT, so screw it. For that price I'd totally go 21:9 with the Dell or the AOC.

Putting my S2716DG side by side with an IPS monitor (either my P2715Q or my XB270HU), its picture quality definitely isn't quite as good. It's not bad, but the IPS monitors are notably contrastier.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Also: I was reading my Microcenter ad this morning and I noticed that Asus is now producing a 1440p/144hz 34" ultrawide with GSync. So the X34 is no longer the only option in that segment.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

Also: I was reading my Microcenter ad this morning and I noticed that Asus is now producing a 1440p/144hz 34" ultrawide with GSync. So the X34 is no longer the only option in that segment.
Tell me it's not part of their obnoxious ROG branding segment.

Calidus
Oct 31, 2011

Stand back I'm going to try science!

DrDork posted:

Tell me it's not part of their obnoxious ROG branding segment.

Do you like being lied to?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Calidus posted:

Do you like being lied to?

Sometimes it's all I've got.

fozzy fosbourne
Apr 21, 2010

These look insanely epic:
2500x1600 20-inch 150ppi e-ink displays
https://www.youtube.com/watch?v=N2V9iuTW3sA

Not sure what gaming or yospos applications there are, but it just looks bad rear end

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Their low refresh rate (several seconds to change the display) limits their use to signage at the moment. If they can lower that and get the pixel density up, it would be a great ereader screen.

Get the cost down and increase the density and they would be great for digital picture frames. Since they only consume power when the screen changes, you could power them for a long time off a single battery.

fozzy fosbourne
Apr 21, 2010

bull3964 posted:

Their low refresh rate (several seconds to change the display) limits their use to signage at the moment. If they can lower that and get the pixel density up, it would be a great ereader screen.

Get the cost down and increase the density and they would be great for digital picture frames. Since they only consume power when the screen changes, you could power them for a long time off a single battery.

Give me a whole wall of the things.

Tony Montana
Aug 6, 2005

by FactsAreUseless

DrDork posted:

Tell me it's not part of their obnoxious ROG branding segment.

We talked about it a few pages back, when I was tossing up between the Acer and the ASUS. The price difference is significant, something like $400 AUD for the same panel. The Acer looks better too: "Acer" isn't displayed anywhere on the front, the stand is nice and normal looking, and the bezel is tiny. The ASUS has a Transformers-inspired stand and is most certainly part of the Republic of Gamers, which pretty much turned me off straight away. I like ASUS stuff; my last monitor was a 120Hz 27" ASUS 3D TN panel that I still love, and I've built computers with their motherboards and GPUs for my last few iterations. I also dislike Acer, because I remember when Acer meant a cheap, shithouse computer or laptop in the 90s and 00s. But after extended poking around, it really is the same panel in both monitors, with the same specs and features, and you're just paying hundreds more for the ASUS brand, which in this case is quite obnoxious.

So I bought a freaking Acer. It's great. It OCed to 100Hz without a hiccup (when I used a short enough cable; anything over 2m is a HELL NO) and everyone who walks into the room says 'whoa'. The smart money seems to be on the Acer, so as the Americans would say... go figure.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Anti-Hero posted:

Really the bullshit gamer stuff on the Acer is easily solved. I swapped the stand out and put a piece of black electrical tape over the badge. The rest of the bezel is actually pretty professional looking. Behold!



The Acer is the best 144Hz 1440p IPS on the market right now. It has QC problems, but not nearly as many as the Asus. Yeah, its build leaves much to be desired compared to the Asus, but it is indeed the better monitor and the one folks should be considering.

That said, if you've always had TN panels then get the Dell.

What stand did you swap the stock one out for? The only thing holding me back on the Acer is the non-adjustable stand.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Tony Montana posted:

We talked about it a few pages back

I presume it's a new monitor, though, since the current ASUS counter to the X34 (the PG348Q) is a 100Hz panel, whereas Paul quoted a 144Hz one.

But yeah, between the X34 and the ASUS PG348Q I'd probably go with the Acer, as the price difference does not seem to translate into much of a better chance of avoiding QA issues. I've seen the ASUS in person, and the whole transformer bullshit actually isn't very obnoxious. I mean, they shouldn't have done it in the first place, but it looks a lot worse in their ads and such than in real life.

BurritoJustice
Oct 9, 2012

DrDork posted:

I presume it's a new monitor, though, since the current ASUS counter to the X34 (the PG348Q) is a 100Hz panel, whereas Paul quoted a 144Hz one.

But yeah, between the X34 and the ASUS PG348Q I'd probably go with the Acer, as the price difference does not seem to translate into much of a better chance of avoiding QA issues. I've seen the ASUS in person, and the whole transformer bullshit actually isn't very obnoxious. I mean, they shouldn't have done it in the first place, but it looks a lot worse in their ads and such than in real life.

Probably a typo. I'm a bit dubious about monitor manufacturers releasing a monitor that requires a 10x0 GPU to push its quoted resolution before the 1080 even launches.

Anti-Hero
Feb 26, 2004

B-Mac posted:

What stand did you swap the stock one out for? The only thing holding me back on the Acer is the non-adjustable stand.

That's actually the stand from my 27" Eizo EV2736W. I don't have room on my desk for both monitors right now, but after I get some home projects finished up I'll be getting a larger desk and monitor arms for both monitors.

Speaking of, what kind of potential headaches am I looking at running two screens with (a) differing refresh rates (144 and 60 Hz) and (b) NVIDIA GPUs? I would like to use the Acer for gaming and the Eizo for browsing, YouTube, et al.

froody guy
Jun 25, 2013

BurritoJustice posted:

Probably a typo. I'm a bit dubious about monitor manufacturers releasing a monitor that requires a 10x0 GPU to push its quoted resolution before the 1080 even launches.

Surely a typo. They both use the Dell's panel, which is a native 60Hz panel overclocked by G-Sync to its very limits. In fact many Acers can push it up to 95Hz, but they struggle to reach 100Hz, so there's no chance that panel can be clocked to 144Hz.

That's also why I was waiting for Computex on the remote chance of seeing some natural-born 100/144Hz panels in the 3440x1440 family, but I'm not exactly hopeful.

Tony Montana
Aug 6, 2005

by FactsAreUseless
There is no ultrawide I could find running 3440x1440 at 144Hz. As we said a few pages ago, 100Hz at that res is the bleeding edge.

Etrips
Nov 9, 2004

Having Teemo Problems?
I Feel Bad For You, Son.
I Got 99 Shrooms
And You Just Hit One.

Tony Montana posted:

There is no ultrawide I could find running 3440x1440 at 144Hz. As we said a few pages ago, 100Hz at that res is the bleeding edge.

Because current technology does not allow it. We're stuck using DP 1.2, which doesn't support the bandwidth needed. However, the GTX 1080 is capable of using DP 1.3/1.4, so I would not totally count out someone showcasing something at Computex that can run at that rate.
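Quick back-of-envelope math on that, for the curious (a sketch: the blanking overhead and effective link rates here are rough assumptions on my part, not quotes from the DP spec):

code:

# Rough DisplayPort bandwidth check: does 3440x1440 fit at a given refresh?
# Assumes 8 bpc RGB (24 bits/pixel) and ~12% blanking overhead (CVT-R2-ish).
DP12_GBPS = 17.28  # DP 1.2 HBR2, effective payload after 8b/10b encoding
DP13_GBPS = 25.92  # DP 1.3 HBR3, effective payload

def needed_gbps(w, h, hz, bpp=24, blanking=1.12):
    return w * h * hz * bpp * blanking / 1e9

for hz in (100, 144):
    need = needed_gbps(3440, 1440, hz)
    print(f"3440x1440@{hz}Hz: ~{need:.1f} Gbps "
          f"(fits DP 1.2: {need <= DP12_GBPS}, fits DP 1.3: {need <= DP13_GBPS})")

That comes out to roughly 13 Gbps at 100Hz (fits in DP 1.2) and roughly 19 Gbps at 144Hz (doesn't), so the 100Hz ceiling everyone keeps bumping into is right about where you'd expect.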

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Tony Montana posted:

There is no ultrawide I could find running 3440x1440 at 144Hz. As we said a few pages ago, 100Hz at that res is the bleeding edge.

Yeah, had a brain fart there, I meant to say the new Asus is the same as the X34.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

froody guy posted:

Surely a typo. They both use the Dell's panel, which is a native 60Hz panel overclocked by G-Sync to its very limits.

How does gsync overclock a panel?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

Yeah, had a brain fart there, I meant to say the new Asus is the same as the X34.

Well, color me disappointed. Especially since the PG348Q has been out for a while now (albeit hard to find in stock for actual purchase).

froody guy
Jun 25, 2013

Subjunctive posted:

How does gsync overclock a panel?

I don't know the technical details, and I don't know if anyone outside NVIDIA knows since it's proprietary tech, but basically G-Sync is a regulator of the refresh frequency, so it does this by design. Beyond that, it's probably the whole onboard PCB/electronics that allows it, not the G-Sync chip itself, which works as a sort of controller that manages the buffer coming from the GPU (and that's why G-Sync introduces a bit of lag), pulsing the monitor with a refresh for each frame received... in sync. There are other aspects to overclocking a panel, but as regards the chip called "G-Sync", that's pretty much it, as far as I know at least.

froody guy fucked around with this message at 16:01 on May 26, 2016

ThatOneGuy
Jul 26, 2001

Revolutionary Taste.

Anti-Hero posted:

That's actually the stand from my 27" Eizo EV2736W. I don't have room on my desk for both monitors right now, but after I get some home projects finished up I'll be getting a larger desk and monitor arms for both monitors.

Speaking of, what kind of potential headaches am I looking at running two screens with (a) differing refresh rates (144 and 60 Hz) and (b) NVIDIA GPUs? I would like to use the Acer for gaming and the Eizo for browsing, YouTube, et al.

I am also interested in this as I want to do something similar. And my planned setup will have 144Hz w/Gsync (XB270HU) and 60Hz without (Crossfire of some sort) as well. One on a DP cable and the other on DVI-D for extra flavor.

Any recommendations on monitor arms?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

froody guy posted:

I don't know the technical details, and I don't know if anyone outside NVIDIA knows since it's proprietary tech, but basically G-Sync is a regulator of the refresh frequency, so it does this by design. Beyond that, it's probably the whole onboard PCB/electronics that allows it, not the G-Sync chip itself, which works as a sort of controller that manages the buffer coming from the GPU (and that's why G-Sync introduces a bit of lag), pulsing the monitor with a refresh for each frame received... in sync. There are other aspects to overclocking a panel, but as regards the chip called "G-Sync", that's pretty much it, as far as I know at least.

No, gsync reduces the effective refresh rate when the game doesn't present in time for the vsync interval. It doesn't in any way increase the refresh rate beyond that which the panel would have without gsync.

NVIDIA has published a fair bit of information about this, if you want to learn more about it. I haven't seen measurements of lag, so I'd be interested in those. I'd be especially interested in understanding why the lag is worse than simply having vsync on; it should reduce lag in cases where the game exceeds its time budget for a frame, because the frame will be scanned out, e.g., 3ms late instead of 16ms.

fozzy fosbourne
Apr 21, 2010

Subjunctive posted:

No, gsync reduces the effective refresh rate when the game doesn't present in time for the vsync interval. It doesn't in any way increase the refresh rate beyond that which the panel would have without gsync.

NVIDIA has published a fair bit of information about this, if you want to learn more about it. I haven't seen measurements of lag, so I'd be interested in those. I'd be especially interested in understanding why the lag is worse than simply having vsync on; it should reduce lag in cases where the game exceeds its time budget for a frame, because the frame will be scanned out, e.g., 3ms late instead of 16ms.

I'm certain that gsync has less lag than vsync. I haven't actually seen any evidence of significant lag at all, unless it was in that strange scenario where FPS = MAX_PANEL_HZ and vsync was turned on by default (there's now an option to disable it). Measurements taken here back that up. I know TFTCentral and NCX at wecravegamestoo test for this sort of thing, and I haven't seen them report any input lag from gsync. They explain their testing methodology pretty thoroughly, too (http://www.tftcentral.co.uk/articles/input_lag.htm). That NCX fellow says that freesync has a frame of lag relative to gsync, but I haven't looked into it too deeply. He's responded to my messages before, so maybe he could suggest how he measured that.

I know early on there was a 1ms polling time, but Nvidia mentioned they had plans to eliminate that years ago, when the first actual monitors shipped. (Midway through here: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Tech-Preview-and-First-Impressions)

Regarding refresh rates and gsync, what I've been curious about is why we see the same panel from the same manufacturer with different refresh rates depending on whether it has the gsync hardware or not (edit: different potential refresh rates after overclocking). Take for example the Acer X34 vs XR341CK, or the Asus PG279Q vs the Asus MG279Q. The gsync hardware seems to replace the scaler, but I don't think it replaces the timing controller. However, gsync monitors (especially the same panel across different manufacturers) tend to have very similar adaptive overdrive behavior and pixel response. I've read that Nvidia requires certain standards with regard to overdrive tuning for gsync monitors, and cites that as an advantage of gsync over freesync, so maybe it's not a capability of the gsync technology itself but rather a standard that has to be met by other internal bits, which manufacturers might not implement in their non-gsync versions of the monitor due to cost? Cost is the only explanation I can imagine for why they simply wouldn't implement higher refresh rates and better overdrive tuning in the freesync versions of those displays I mentioned above.

fozzy fosbourne fucked around with this message at 22:36 on May 26, 2016

froody guy
Jun 25, 2013

Subjunctive posted:

No, gsync reduces the effective refresh rate when the game doesn't present in time for the vsync interval. It doesn't in any way increase the refresh rate beyond that which the panel would have without gsync.

NVIDIA has published a fair bit of information about this, if you want to learn more about it. I haven't see measurements of lag, so I'd be interested in those. I'd be especially interested in understanding why the lag is worse than simply having vsync on; it should reduce lag for cases where the game exceeds its time budget for a frame, because it will be scanned out, f.e., 3ms late instead of 16ms.

Well then it must be magic, I guess?

The lag thing is just because it has a buffer. You tell me how it's possible that introducing a buffer between two devices reduces lag and I'm done with science.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

froody guy posted:

You tell me how it's possible that introducing a buffer between two devices reduces lag and I'm done with science.

Statistics. Duh.

e; with Newegg selling a 27" 1440p IPS monitor for $250, I'm really trying to figure out just HOW big a desk I would need to fit two of those astride an X(R)34. And what sort of VESA stand to get...
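For anyone pondering the same thing, the napkin math (a sketch: panel widths derived from the diagonal and aspect ratio, ignoring bezels and stands):

code:

# Screen width from diagonal and aspect ratio: width = diag * w / sqrt(w^2 + h^2)
import math

def panel_width_in(diagonal, aspect_w, aspect_h):
    return diagonal * aspect_w / math.hypot(aspect_w, aspect_h)

total = panel_width_in(34, 21, 9) + 2 * panel_width_in(27, 16, 9)
print(f"~{total:.0f} inches of glass across")  # comes out around 78 inches

So call it 78-plus inches of screen alone before bezels. That's not a desk, that's a dining table.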

DrDork fucked around with this message at 20:18 on May 26, 2016

fozzy fosbourne
Apr 21, 2010

froody guy posted:

Well so it must be magic. I guess?

The lag thing is just because it has a buffer. You tell me how's possible that introducing a buffer in between two devices reduces lag and I'm done with science.

I don't think there is a new buffer being introduced with gsync (relative to gsync=OFF)? It's using the front buffer

froody guy
Jun 25, 2013

Taken from some old discussion

quote:

the gsync module has direct control of the panel and can alter the voltage being supplied - that is why most panels that have a freesync and a gsync version have a wider range on the gsync version... normal non-sync monitors have an overdrive feature to reduce ghosting, but it is tuned for the fixed refresh rates the monitor supports; the gsync module is tuned for the full range they can get the panel to support, whereas freesync is stuck with tuning for a fixed refresh, which ends up with monitors having a narrower range, because drift too far from where the overdrive is tuned and you get bad ghosting

the 75/100hz monitors/panels are still operating off the same controller as the 60hz ones, so it is limited in how it can control the panel... the gsync module is programmable, so the benefit of that is stuff like this

also, normal panel controllers are ASICs, so developing a new one just for a 3440x1440 @100hz monitor would be cost prohibitive... as high res/high refresh monitors become more common, they will make one eventually, but as of right now I doubt there is a controller that can do 3440@100hz, hence why the programmable gsync module is needed (and has obviously had extra development time spent on it) to get it to work

quote:

on a "normal" monitor with a fixed refresh of 60hz with vsync on, re-reading the same frame from the buffer twice or more is exactly what happens, so the fact that gsync intelligently handles that instead of what freesync does is still a benefit, because it reverts the monitor to its maximum refresh, meaning that as soon as the next GPU generated frame is ready it gets displayed on the next cycle, so on a 100hz monitor the maximum lag will be 9.999ms, but if you have a 100hz monitor, to really get the benefit you probably want to target 50hz as a minimum, so you will never get in to one of these low refresh double read situations anyway

and this is from the guy at TFTCentral writing the reviews (the link is from a discussion before the X34 was out, so he was guessing too)

quote:

keep in mind we are talking in theory at the moment, as so far no one has actually released a 100Hz capable screen of this type. The panel is almost certainly the same as that used in the XR341CK and Dell U3415W models as the OP has said, but it is the G-sync module added afterwards which seems to be the key here.

Most panels will have a recommended refresh rate of 60Hz and a maximum of 75Hz, so it's not that unusual for a screen to support up to 75Hz. It just means you need to push it to the maximum possible and have a reliable controller which can handle that properly and not drop frames. You will see a fair few screens support up to 75Hz if pushed.

edit:

fozzy fosbourne posted:

I don't think there is a new buffer being introduced with gsync (relative to gsync=OFF)? It's using the front buffer
I'm pretty sure I've read about gsync using its own buffer but I can't remember where so I don't know if it can do all the magic using simply the front buffer. I think it's not just the FB though.

froody guy fucked around with this message at 20:32 on May 26, 2016

fozzy fosbourne
Apr 21, 2010

froody guy posted:

Taken from some old discussion



and this is from the guy at TFTCentral writing the reviews (the link is from a discussion before the X34 was out, so he was guessing too)


edit:

I'm pretty sure I've read about gsync using its own buffer but I can't remember where so I don't know if it can do all the magic using simply the front buffer. I think it's not just the FB though.

I think the lag referenced in that quote refers to situations where your refresh rate would be so low as to induce flickering. In that scenario, gsync redraws the frame at a fixed rate of 40hz (edit: actually, I think the rate is higher, see article below) rather than waiting until the front buffer is ready. In this article they mentioned that threshold was below 29fps

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Edit: Actually, it looks like it redraws 1 frame at below 37fps and 2 at below 20fps, at least on their measurements of this XB270HU:

quote:

Below that 40 FPS mark though things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, etc. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync.

So maybe the moral of the story is don't go below 40fps, even with gsync

edit: another article explaining the flicker and why they redraw frames at lower framerates: http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering
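edit 2: to make the doubling behavior concrete, here's a toy model (a sketch: the thresholds are the ones PCPer measured on that XB270HU, not anything NVIDIA has published):

code:

# Toy model of G-Sync's frame repetition at low framerates, per PCPer's
# measurements: below ~37fps each frame is scanned out twice, below ~20fps
# three times, keeping the panel's refresh rate out of the flicker zone.
def gsync_refresh(fps):
    repeats = 1
    if fps < 37:
        repeats = 2
    if fps < 20:
        repeats = 3
    return fps * repeats, repeats

for fps in (60, 35, 18):
    hz, n = gsync_refresh(fps)
    print(f"{fps} fps -> panel refreshes at ~{hz} Hz ({n} scanout(s) per frame)")

So at 35 fps the panel is actually refreshing at ~70 Hz; it's just showing each frame twice.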

fozzy fosbourne fucked around with this message at 20:57 on May 26, 2016

froody guy
Jun 25, 2013

fozzy fosbourne posted:

I think the lag referenced in that quote refers to situations where your refresh rate would be so low as to induce flickering. In that scenario, gsync redraws the frame at a fixed rate of 40hz (edit: actually, I think the rate is higher, see article below) rather than waiting until the front buffer is ready. In this article they mentioned that threshold was below 29fps

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Edit: Actually, it looks like it redraws 1 frame at below 37fps and 2 at below 20fps, at least on their measurements of this XB270HU:


So maybe the moral of the story is don't go below 40fps, even with gsync

edit: another article explaining the flicker and why they redraw frames at lower framerates: http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering

Yeah, I'm referring to the same link, or actually this vid here:

https://www.youtube.com/watch?v=VkrJU5d2RfA

It shows the differences between gsync and freesync when the framerate drops below the minimum working specs, but it doesn't say a lot about how gsync extends the refresh rate of a panel that should top out at 75hz, like these 3440x1440 ones here, which is in fact where freesync stops (there's no 100hz freesync monitor). So in regards to that, I think the answer is "voltage regulator" and "programmable controller". That's what the gsync does. How _exactly_ it does it, I honestly don't know.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

froody guy posted:

Taken from some old discussion



and this is from the guy at TFTCentral writing the reviews (the link is from a discussion before the X34 was out, so he was guessing too)


edit:

I'm pretty sure I've read about gsync using its own buffer but I can't remember where so I don't know if it can do all the magic using simply the front buffer. I think it's not just the FB though.

gsync is a real loving expensive FPGA

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

froody guy posted:

The lag thing is just because it has a buffer. You tell me how it's possible that introducing a buffer between two devices reduces lag and I'm done with science.

Here's what I wrote (emphasis added):

quote:

it should reduce lag in cases where the game exceeds its time budget for a frame, because the frame will be scanned out, e.g., 3ms late instead of 16ms

Consider, assuming 60Hz/16ms per frame:

Frame generation begins at time 0; input is sampled here.
Frame generation completes at 19ms, missing the time budget by 3ms.
Without gsync, the frame is scanned out at the end of the next vsync interval, at 32ms. It is therefore 32ms latent, or 16ms more latent than if it had hit the time target.
With gsync, the frame is scanned out immediately at 19ms, making it 3ms more latent than if the target were hit, and 13ms less latent than the non-gsync case.

Don't give up on science just yet.

E: this latency penalty reduction in the case of missed frame timing is why gsync-like stuff is interesting for VR applications, FWIW. We didn't have time to get it into CV1, but I wouldn't be surprised to see equivalent things in future headsets.
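E2: or, the same arithmetic as a toy script (a sketch with the assumed numbers from the example above; I'm treating scanout itself as instantaneous):

code:

# Latency of a frame that starts rendering at t=0 on a 60Hz panel (16ms budget).
import math

BUDGET_MS = 16  # one vsync interval

def scanout_time_ms(render_ms, gsync):
    if gsync:
        # scanned out as soon as the frame is done, but no faster than max refresh
        return max(render_ms, BUDGET_MS)
    # without gsync, the frame waits for the next vsync boundary
    return math.ceil(render_ms / BUDGET_MS) * BUDGET_MS

for render in (10, 19):
    print(f"render {render}ms -> vsync: {scanout_time_ms(render, False)}ms, "
          f"gsync: {scanout_time_ms(render, True)}ms")

A frame that takes 19ms goes out at 32ms with plain vsync but at 19ms with gsync: the 13ms difference from the walkthrough above.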

Subjunctive fucked around with this message at 06:55 on May 27, 2016

Tony Montana
Aug 6, 2005

by FactsAreUseless
I scanned your mum and apparently her rear end is latent

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Tony Montana posted:

I scanned your mum and apparently her rear end is latent

Rude.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

fozzy fosbourne posted:

I think the lag referenced in that quote refers to situations where your refresh rate would be so low as to induce flickering. In that scenario, gsync redraws the frame at a fixed rate of 40hz (edit: actually, I think the rate is higher, see article below) rather than waiting until the front buffer is ready. In this article they mentioned that threshold was below 29fps

http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

Edit: Actually, it looks like it redraws 1 frame at below 37fps and 2 at below 20fps, at least on their measurements of this XB270HU:


So maybe the moral of the story is don't go below 40fps, even with gsync

edit: another article explaining the flicker and why they redraw frames at lower framerates: http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering

That article is woefully out of date. Freesync works differently now.

Anti-Hero
Feb 26, 2004

ThatOneGuy posted:

I am also interested in this as I want to do something similar. And my planned setup will have 144Hz w/Gsync (XB270HU) and 60Hz without (Crossfire of some sort) as well. One on a DP cable and the other on DVI-D for extra flavor.

Any recommendations on monitor arms?

I was going to get Ergotron arms as I'm familiar with them through work.

My XB271HU is still humming along nicely, but I'm thinking of being foolish. I came from a 1440p IPS 60Hz panel, so 144Hz and G-Sync are very welcome, but I want... more. I really have no complaints with the monitor, but I feel like if I'm going to spend this much money on a display, I might as well go hog wild and consider the G-Sync ultrawides.

How much of a pain in the rear end is it to get games working at 21:9? I know that Blizzard intentionally limits their games to 16:9 for "esport" concerns. Am I going to be in for a bunch of ini fuckery to get other AAA titles to display properly?

fozzy fosbourne
Apr 21, 2010

I would set a 21:9 custom resolution in your nvidia control panel and experiment a bit. I plan on trying that out at some point.

E: instructions http://www.overclock.net/t/1564522/i-teach-you-how-to-widescreen-21-9-native-16-9-monitor

Be sure to scoot your current display forward a couple inches
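The arithmetic for picking the custom resolution is simple enough (a sketch, assuming you keep the native width and letterbox the height):

code:

# Height for a 21:9 letterboxed custom resolution on a 16:9 panel.
def ultrawide_height(native_width, ratio_w=21, ratio_h=9):
    return round(native_width * ratio_h / ratio_w)

print(ultrawide_height(2560))  # ~1097; in practice you'd use the standard 2560x1080

So on a 2560x1440 panel you'd run 2560x1080 and eat the black bars.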

fozzy fosbourne fucked around with this message at 18:21 on May 28, 2016

Holyshoot
May 6, 2010
Is 4K really worth it? I just picked up an Acer 1440p 144Hz IPS monitor with G-Sync a week ago, and the quality is nice, but I'm not really noticing a crazy difference over the 1080p 120Hz I was using. I will be picking up a 1080, so will 4K be a far better display even though it's only 60Hz versus the 144Hz 1440p I have? I'm just not sure if I will miss anything by not having 144Hz.

Right now my games are WoW, Path of Exile, Overwatch, and Marvel Heroes, and none of those really hit 144fps with my current card. When I get the 1080 they all will hit that FPS mark. Will having 144fps at 144Hz be better than getting a 4K monitor running at 60Hz with 60+ fps?

RiperSnifel
Jul 13, 2007

Pull the handle, let it go, now you know you're ready to roll.

Holyshoot posted:

Is 4K really worth it? I just picked up an Acer 1440p 144Hz IPS monitor with G-Sync a week ago, and the quality is nice, but I'm not really noticing a crazy difference over the 1080p 120Hz I was using.

Dude, are you using the right resolution with that new 1440p monitor? I just did a similar upgrade and I've been floored by the increase in quality from a TN 1080p panel to a 1440p IPS. Have you configured the color profile and everything on the new monitor?
I'm always chasing the dragon tech-wise, and I'm contemplating a 4K monitor next, but I can't deny that the 1440p monitor was one of the biggest game changers in my PC experience in a while.

Holyshoot
May 6, 2010

RiperSnifel posted:

Dude, are you using the right resolution with that new 1440p monitor? I just did a similar upgrade and I've been floored by the increase in quality from a TN 1080p panel to a 1440p IPS. Have you configured the color profile and everything on the new monitor?
I'm always chasing the dragon tech-wise, and I'm contemplating a 4K monitor next, but I can't deny that the 1440p monitor was one of the biggest game changers in my PC experience in a while.

The color compared to my Asus 120Hz is noticeably different for sure, but maybe I'm just the color equivalent of tone deaf. Like, it's very nice, just not sure if it's 700 dollars very nice. And a 4K monitor would be about 100 dollars less than the 1440p.

RiperSnifel
Jul 13, 2007

Pull the handle, let it go, now you know you're ready to roll.
When you step up to a beefier GPU you may change your stance on the monitor. With a 980 Ti, my Asus PG279Q has totally blown me away.
