Subjunctive
Sep 12, 2006

✨sparkle and shine✨

So I just had a bad experience and I'm not quite sure what happened.

I have 3 displays connected to 2x970s (all on the primary; 1xDP, 2xDVI). The main display is a QNIX 2710, connected by DP, and I was playing Dragon Age on it when everything rebooted. When it came back up, Windows only recognized one of the DVI displays and had reverted to Microsoft Basic Display Driver. The picture was messed up, and I could only make it clear by reducing res to 1024x768. Reinstalling the NVIDIA drivers manually got all the displays working again at the right resolutions.

Except that the QNIX one now flashes on/off and has static-looking stuff on it when it's on; about a 4 second loop. I tried the DP connection into another display, and that works fine. I haven't yet tried DVI into the QNIX, but I'll give that a shot in a bit.

Can anyone construct a plausible narrative as to what happened that both seems to have damaged the monitor and screwed up the driver situation?

(I also suspect I'm in the market for a new display somewhat urgently, so I'll be taking a peek through the OP!)


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Gwyrgyn Blood posted:

The OP hasn't been updated since 2013 so I wouldn't trust anything in it in terms of specific recommended buys, especially not for gaming related monitors. Unfortunately I have no idea what could even be the cause of what happened to your monitor.

OK, thanks. I'd been looking at the ROG 1440p 144Hz thing lately with a covetous eye. Any caveats I should be aware of?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Gwyrgyn Blood posted:

Well, it's very expensive for a TN panel!

There are some quality-control issues, which appear to be panel-lottery related: some like the ones above, and some with bad backlight bleeding.

That panel's blur-reduction tech is Ultra Low Motion Blur (ULMB), a newer version of LightBoost. The main thing to know is that you can't use it at 144Hz or with G-Sync enabled. So if you're looking to absolutely minimize motion blur, you'll have to run at 120Hz without G-Sync, with all the other usual caveats that strobing brings. I'm also not clear on whether any of the extra tricks for improving blur reduction (like adjusting the Vertical Total) are possible over DisplayPort, or on those Asus monitors at all.

Input lag is extremely low (~4ms total time including motion blur) and pixel response time is extremely good in general. Tons of details here if you want them: http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg278q.htm

So yeah, it's very very good for gaming but very expensive. Sounds like you have the horsepower to make that res+refresh worth it though.

I'm not especially price sensitive, and I think the main drawback of TN (viewing angle) isn't super relevant to me given my desk layout. (And I game alone. :saddowns:) Running at 120 would be fine I think, and G-Sync doesn't really matter to me until it works outside full-screen mode.

BurritoJustice posted:

I wouldn't bother getting the Swift when there's an IPS equivalent coming from Acer soon (1440p, 144Hz, G-Sync, IPS).

How soon? I'm reduced to this old Dell thing right now, so I'm not feeling especially patient. I could maybe go a week, if I decided to not spend much time on the machine, but there's a new FFXIV patch out tomorrow. :ohdear:

Edit: Oh, it's not until March. Yeah, I can't wait that long. Maybe when it comes time to replace the secondary monitors...

I think I'll go get a Swift from a local shop for ease of return if it turns out to be defective, since in the non-defective case it seems like what I want.

Subjunctive fucked around with this message at 23:46 on Jan 19, 2015

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

BurritoJustice posted:

Really, I think it's a bad time to buy a Swift right now.

Yeah, it wasn't ideal, but I needed a new monitor ASAP and the Acer isn't available yet. When the Acer starts shipping I'll probably get one as a new primary and make the Swift my secondary display.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

So I'm having some buyer's remorse about the Swift, I think mostly because it's a TN panel and with a panel that size it's hard to avoid having a bad angle on part of the screen.

I'm thinking about switching to a 4K IPS panel, which I've wanted for programming and such for some time, and then waiting for Acer to get my G-sync fix. What are the crowd-favorite 4K displays? I might be able to get a deal on a friend's ASUS PQ321Q, which looks good, but I'm not sure what other good options are.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

BurritoJustice posted:

The BenQ BL3210PH is a high-quality 32" 4K IPS that comes with a really high-quality stand and an SST design (a HUGE benefit over the original MST 4K screens like the Asus, which I think puts them out of contention) for only $1000 flat.

(Assuming you mean BL3201.)

That display looks pretty great -- how does it manage to be 1/3 cheaper than the ASUS?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Don Lapre posted:

Cause it says BENQ on the label.

Best possible reason. Thanks!

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I have a 2-week-old Swift, and just started to notice ghosting when I scroll black text on white background. That's bad, right?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

repiv posted:

Try experimenting with the Overdrive (OD) setting in the OSD. It's supposed to reduce ghosting, but too much can cause inverse ghosting.

Switching from ULMB to OD seems to have fixed it, actually.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Instant Grat posted:

Even then, there's a feature-identical monitor to the ROG Swift coming from Acer that's IPS rather than TN, so...

Rather, there will be in a couple of months.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Chompers posted:

I'm looking to buy a non-Korean 1440p 27" screen; I've narrowed it down to the following models:

Samsung S27D850T £320 (reduced from £400)

Hate to break it to you, but Samsung is Korean.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

Except for the fact that, quite frankly, the industry has a lot more plans in the pipeline for upping resolution than they do for Hz. VR isn't going to change much because it's going to be pushed via headsets running two independent screens, so you can still get away with 60Hz display hardware (though obviously it'll take quite a bit more to actually drive them, GPU-wise).

Nobody doing PC-attached VR headsets is doing 60Hz displays.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

The Galaxy Gear VR uses a 1440p@60Hz display (which admittedly isn't PC-attached), and the Razer one uses a 1080p@60Hz display. As to the others, they're not really breaking new ground either. Most of them look to be in the 1080p-ish range at 90-120Hz. No one's getting too crazy because, again, the computational expense of 240+Hz displays gets pretty hefty really fast.

Is Razer really doing low-persistence at 60Hz in their consumer version? That's going to be pretty bad compared to the others. I guess it's really just a dev kit, though.

I think 90Hz is going to be the standard for PC-attached HMDs; I don't think anyone has been able to produce evidence that LP 120Hz is distinguishably better than 90, and the content performance requirements go up dramatically. (Faster response in terms of change from one pixel value to another will be a good area of improvement, but also harder AIUI than raising the refresh rate.)

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

Yeah, though as you noted, it's just a dev kit right now. There's hope that it'll get bumped up--Oculus started with a 60Hz dev kit, too.

Yeah, but we never considered it to be good enough for high-quality VR. Even DK2's 75Hz was known all along to be a stepping stone, we knew we had to hit at least 90Hz for it to be comfortable.

(120Hz at HMD resolutions also exceeds HDMI bandwidth, and some GPUs can't even push 90Hz at Crescent Bay resolutions because they don't have the pixel clock for it. Better to spend bandwidth on increased resolution than on refresh rate above 90Hz, I think.)
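That pixel-clock claim is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming a combined ~2160x1200 panel (two 1080x1200 eyes side by side), a ~25% blanking overhead, and HDMI 1.4's 340 MHz pixel-clock ceiling; these are illustrative figures, not official headset or spec numbers:

```python
# HDMI 1.4's maximum TMDS pixel clock (MHz); an assumption for this sketch.
HDMI14_MAX_PIXEL_CLOCK_MHZ = 340

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    """Approximate pixel clock needed to scan out a mode, in MHz.

    Blanking overhead is a rough 25% fudge factor; real video timings vary.
    """
    return width * height * refresh_hz * blanking_overhead / 1e6

for hz in (90, 120):
    clock = pixel_clock_mhz(2160, 1200, hz)
    verdict = "fits" if clock <= HDMI14_MAX_PIXEL_CLOCK_MHZ else "exceeds"
    print(f"{hz} Hz needs ~{clock:.0f} MHz -> {verdict} HDMI 1.4")
```

With these assumed numbers, 90Hz squeaks in under the 340 MHz ceiling while 120Hz blows past it, which matches the "exceeds HDMI bandwidth" point above.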

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

My understanding from the folks working on the display side of the hardware is that it's harder to accommodate DP with the low-persistence panels that are available in quantity. Might not be inherent, just how they were built when the current gen of HMDs was being designed.

I'll ask on Monday if I remember.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Foveated rendering is something we've looked at; you still have to push just as many pixels (they all have to light up) unless you invent new signaling, but you could make those pixels much cheaper to generate. (The human eye only perceives full resolution over something like 3° of arc.) There are definitely a lot of tricks left to play on the human brain, but I don't think anyone is going to get foveated rendering working in this headset generation. I'm not sure fast-enough eye tracking is possible with current sensing tech, though eye and head motion both top out at about 1000°/sec, so the general rendering-responsiveness requirement might be about the same as for tracking head rotation.
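The potential saving is easy to ballpark. A sketch with illustrative numbers only (a ~3° foveal region inside a ~110° field of view; neither figure is taken from a real headset spec):

```python
# Rough foveated-rendering saving: the full-resolution region's share of the
# frame scales roughly with the square of its angular size.
fovea_deg, fov_deg = 3.0, 110.0  # illustrative, not real hardware figures
full_res_fraction = (fovea_deg / fov_deg) ** 2
print(f"full-resolution region ~ {full_res_fraction:.2%} of the frame")
```

Under these assumptions, well under 1% of the frame needs full resolution, which is why the technique is so tempting despite the eye-tracking problem.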

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

DrDork posted:

Agreed. Either way, high-quality 3D is really an HDMI 2.0-or-bust kind of deal, which at least gets you stereo 1080p@60Hz.

Oh, one thing here: nobody has announced doing 1920x1080 per eye, which is why 90Hz is attainable with the current connections in use. The Vive, for example, is 1080x1200 per eye, I think.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

There's supposed to be an ROG 4K IPS G-sync monitor coming out, I think it was demoed at CES. Anyone know more about the timing of that?

Also wtf Acer gimme the IPS Swift killer already.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Zorilla posted:

Maybe EDID is poorly implemented on your monitor. It's supposed to provide information about the display even when the monitor is turned off. If you turn off the affected monitor, do you see any devices disappear/reappear in Device Manager on the remaining monitor, or hear the disconnect event sound play?

I have this happen on my Swift when my computer sleeps, drives me a little nuts.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

mr. nobody posted:

How this translates in-game is: when I'm turning/looking around quickly enough at 60Hz, things get 'stuttery'; not FPS lag, but you're seeing a limited number of static images/afterimages. The difference that's immediate at 144Hz compared to 60Hz, even with the same framerate, is that when turning, the visuals are a LOT smoother.

I don't understand this. Where do the intermediate images come from, if the game is only producing a new frame every 16ms? In the mouse-pointer case, Windows is generating images more frequently, so you see the intermediate steps. But if instead of images A, C, E you get A, B, C, D, E with smaller increments of turning, something has to generate B and D, which means something is generating more frames per second.
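The point about repeated frames can be made concrete. A small sketch (assuming 60 fps content on a 144Hz display; the tick-to-frame mapping only repeats source frames, it never invents a B or D):

```python
def frames_shown(content_fps=60, refresh_hz=144, ticks=12):
    """Which content frame is on screen at each refresh tick.

    With 60 fps content on a 144 Hz display, each source frame is simply
    held for two or three refreshes; no new images appear in between.
    """
    return [int(tick * content_fps / refresh_hz) for tick in range(ticks)]

print(frames_shown())  # -> [0, 0, 0, 1, 1, 2, 2, 2, 3, 3, 4, 4]
```

Twelve refresh ticks show only five distinct frames: higher refresh rate changes *when* each frame hits the screen, not how many frames exist.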

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

isndl posted:

Generally speaking, your GPU generates an image, places it into a buffer, and the buffer is pushed to the display at specific intervals. If the buffer is updated without having pushed the old frame, that frame is effectively dropped. Worst case scenario is the buffer is pushed mid-update which results in screen tearing, which is why you use V-sync to lock your framerate to your refresh rate.

Yes, but he's saying that given the same 60Hz frame rate, he sees differences with a higher monitor refresh rate. If the game isn't calling Present() or whatever more frequently than 60Hz, how is the monitor able to display intermediate frames?

Are you saying that the game is really producing, e.g., frames at 90Hz and some of them are just getting obsoleted before scan-out? That doesn't sound like what the OP was describing, since he was replying to someone asking about the benefits of 144Hz where the game could only keep up with 60. And he says that in the 144Hz mode he's still seeing the same ~60Hz frame rate, which I would expect to become 90 once the refresh rate is no longer the bottleneck.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

isndl posted:

Assuming a fixed 60 frames per second, you could have up to a 16ms delay before you see the new frame on a 60Hz monitor, while it's closer to 7ms on a 144Hz monitor. That helps account for the 'smoothness'. There shouldn't be any intermediate-image difference between those, though; his example of moving the mouse cursor on the desktop would have FPS in the hundreds if not thousands, because it's relatively simple computation for even integrated GPUs these days, and I doubt they lock the framerate for that.

That would reduce latency, but I don't know why it would be smoother. You're still getting a snapshot every 16ms, regardless of whether that frame is 1ms stale or 15ms stale. He explicitly said that he wasn't talking about lag, which would be that sort of latency. I can see it feeling more responsive, but I don't see how he could get the visual smoothness he was talking about.

(I think the modern Windows desktop does composite and scan out at monitor refresh rate, including the mouse pointer.)
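The latency half of the argument is easy to quantify. A quick simulation under one assumption (the finished frame lands at a uniformly random point inside a refresh interval; real games are rarely that tidy):

```python
import random

def mean_scanout_wait_ms(refresh_hz, trials=100_000, seed=1):
    """Mean wait from 'frame ready' to the next refresh tick, in ms.

    Assumes the frame completes at a uniformly random phase within the
    refresh interval, so the expected wait is half a refresh period.
    """
    rng = random.Random(seed)
    period_ms = 1000.0 / refresh_hz
    waits = [period_ms - rng.random() * period_ms for _ in range(trials)]
    return sum(waits) / trials

# ~8.3 ms mean wait at 60 Hz vs ~3.5 ms at 144 Hz; worst case is a full
# period (16.7 ms vs 6.9 ms). Lower latency, but the same 60 snapshots/sec.
print(mean_scanout_wait_ms(60), mean_scanout_wait_ms(144))
```

Which is the crux of the disagreement: the higher refresh rate delivers each snapshot sooner, but it cannot add intermediate images the game never rendered.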

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Yeah, it's really more of an image quality thing than a real-estate thing at most sizes.

How does 1440-scaled-to-4K look? Is it generally good, or do you really notice that something is off?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

My 2710 died after about a year, not sure what others' longevity experiences have been like. Was sweet-rear end while it lasted though.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Yeah, my plan is a 4K and 1440@144 dual setup, with a keystroke to switch primary.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

pr0zac posted:

Stop giving me ideas, my wallet is unhappy enough as is.

We can go shopping together!

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Tunga posted:

I think the answer is no but I'll ask anyway: it's not possible to use the DP-out on my motherboard as an additional port, right? It's an Asus Maximus VI Impact.

You can drive a monitor off the integrated graphics in addition to those attached to your discrete GPU, yes.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

wolrah posted:

Minor warning that nVidia's drivers disable GPU accelerated PhysX if they detect any non-nVidia GPUs in the system. Obviously not the biggest loss in the world but worth noting.

That can't be true for IGPs, or nobody would ever have it enabled.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

JFairfax posted:

In my last gig we used some awful Windows XP-based thin client which had some loving horrific lag, so I could probably cope with a couple extra milliseconds to be able to make use of the real estate of a 4K monitor.

You deserve better.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Truga posted:

For gaming this is a good thing, though. My old 2008 30" IPS has better response time than most recent IPS screens due to the lack of OSD/scaler.

Does OSD affect that? The scaler I understand, but I didn't know that having OSD capability impacted latency.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Zorilla posted:

Yes, on the NVIDIA side of things, you need at least a Kepler GPU for HDMI 1.4a, and 2nd-gen Broadwell for 2.0.

Maxwell?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I've got a Swift, which I like well enough, but the viewing angle is becoming a problem when my kid wants to play with me or even just watch. Are there any 144Hz G-Sync IPS displays other than the Acer whatever-HU that I should be considering?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Ah, thanks; missed that.

So I guess with the QC issues on the Acer maybe I order two so I can easily return the one that sucks? Hmm.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

It would be destiny.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

pr0zac posted:

Should probably get three just in case.

I'll ask someone in data science how many I need for a statistically significant sample.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

I like the ergotron ones.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Sidesaddle Cavalry posted:

the 4K 60Hz IPS G-Sync screens are probably making some nerds happy right now

This nerd would be a lot happier if they were 30"+; 4K on a 28" screen isn't really going to be noticeably better than 1440p I think.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨


Seriously.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Parker Lewis posted:

Someday we'll look back on IPS glow and laugh that monitors weren't displaying black areas as black... right?

In the grim future of contrast, there is only OLED.


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Papercut posted:

Why not just get the Crossover 404K discussed above?

I can't find a review of it via Google -- what am I doing wrong?
