|
froody guy posted:I totally agree, but one must say that the TN panels mounted on these high refresh rate monitors are really the best you can squeeze out of a TN panel, also in terms of color fidelity and reproduction, so they're not "just TN" at all. Also, the pricing of the Acer is loving insane. I mean, it costs 650€ in euroland, and we generally have much higher prices due to higher VAT, so screw it. For that price I'd totally go 21:9 with the Dell or the AOC. Putting my S2716DG side by side with an IPS monitor (either my P2715Q or my XB270HU), its picture quality definitely isn't quite as good. It's not bad, but the IPS monitors are notably contrastier.
|
# ? May 25, 2016 15:26 |
|
Also: I was reading my Microcenter ad this morning and I noticed that Asus is now producing a 1440p/144hz 34" ultrawide with GSync. So the X34 is no longer the only option in that segment.
|
# ? May 25, 2016 15:30 |
|
Paul MaudDib posted:Also: I was reading my Microcenter ad this morning and I noticed that Asus is now producing a 1440p/144hz 34" ultrawide with GSync. So the X34 is no longer the only option in that segment.
|
# ? May 25, 2016 15:43 |
|
DrDork posted:Tell me it's not part of their obnoxious ROG branding segment. Do you like being lied to?
|
# ? May 25, 2016 15:51 |
|
Calidus posted:Do you like being lied to? Sometimes it's all I've got.
|
# ? May 25, 2016 16:47 |
|
These look insanely epic: 2500x1600 20 inch 150ppi e ink displays https://www.youtube.com/watch?v=N2V9iuTW3sA Not sure what gaming or yospos applications there are, but it just looks bad rear end
|
# ? May 25, 2016 16:47 |
|
Their low refresh rate (several seconds to change the display) limits their use to signage at the moment. If they can lower that and get the pixel density up, it would be a great ereader screen. Get the cost down and increase the density and they would be great for digital picture frames. Since they only consume power when the screen changes, you could power them for a long time off a single battery.
|
# ? May 25, 2016 16:55 |
|
bull3964 posted:Their low refresh rate (several seconds to change the display) limits their use to signage at the moment. If they can lower that and get the pixel density up, it would be a great ereader screen. Give me a whole wall of the things.
|
# ? May 25, 2016 17:11 |
|
DrDork posted:Tell me it's not part of their obnoxious ROG branding segment. We talked about it a few pages back, when I was tossing up between the Acer and the ASUS. The price difference is significant, something like $400 AUD for the same panel. The Acer looks better too: Acer isn't displayed anywhere on the front, the stand is nice and normal looking, and the bezel is tiny. The ASUS has a transformer-inspired stand and is most certainly part of the Republic of Gamers, which pretty much turned me off straight away. I like ASUS stuff; my last monitor was a 120Hz 27" ASUS 3D TN panel that I still love, and I've been building computers with their motherboards and GPUs for a long time, including my last few builds. I also dislike Acer because I remember when Acer meant a cheap, shithouse computer or laptop in the 90s and 00s. But after extended poking around, it really is the same panel in both monitors, with the same specs and features, and you're just paying hundreds more for the ASUS brand, which in this case is quite obnoxious. So I bought a freaking Acer. It's great. It OCed to 100Hz without a hiccup (when I used a short enough cable; anything over 2m is a HELL NO) and everyone that walks into the room says 'whoa'. The smart money seems to be on the Acer, so, as the Americans would say... go figure.
|
# ? May 26, 2016 04:02 |
|
Anti-Hero posted:Really the bullshit gamer stuff on the Acer is easily solved. I swapped the stand out and put a piece of black electrical tape over the badge. The rest of the bezel is actually pretty professional looking. Behold! What stand did you swap the stock one out for? Only thing holding me back on the Acer is the non-adjustable stand.
|
# ? May 26, 2016 04:05 |
|
Tony Montana posted:We talked about it a few pages back I presume it's a new monitor, though, since the current ASUS counter to the X34 (the PG348Q) is a 100Hz panel, whereas Paul quoted a 144Hz one. But yeah, between the X34 and the ASUS PG348Q I'd probably go with the Acer, as the price difference does not seem to translate into much of a better chance at avoiding QA issues. I've seen the ASUS in person, and the whole transformer bullshit actually isn't very obnoxious. I mean, they shouldn't have done it in the first place, but it looks a lot worse in their ads and such than in real life.
|
# ? May 26, 2016 04:09 |
|
DrDork posted:I presume it's a new monitor, though, since the current ASUS counter to the X34 (the PG348Q) is a 100Hz panel, whereas Paul quoted a 144Hz one. Probably a typo. I'm a bit dubious on monitor manufacturers releasing a monitor that requires a 10x0 GPU to push its quoted resolution before the 1080 even launches.
|
# ? May 26, 2016 06:53 |
|
B-Mac posted:What stand did you swap the stock one out for? Only thing holding me back on the Acer is the non-adjustable stand. That's actually the stand from my 27" Eizo EV2736W. I don't have room on my desk for both monitors right now, but after I get some home projects finished up I'll be getting a larger desk and monitor arms for both monitors. Speaking of, what kind of potential headaches am I looking at for having two screens with (a) differing refresh rates, 144 and 60 Hz, and (b) nvidia GPUs? I would like to use the Acer for gaming and the Eizo for browsing, youtube, et al.
|
# ? May 26, 2016 07:07 |
|
BurritoJustice posted:Probably a typo. I'm a bit dubious on monitor manufacturers releasing a monitor that requires a 10x0 GPU to push its quoted resolution before the 1080 even launches. Surely a typo. They both use the Dell's panel, which is a 60Hz-born panel overclocked by gsync to its very limits. In fact many Acers can push it up to 95 but they struggle to reach 100, so not a chance that panel can be clocked to 144hz. That's also why I was waiting for Computex in the remote chance of seeing some natural-born 100/144hz panels in the 3440x1440 family, but I'm not exactly hopeful on that.
|
# ? May 26, 2016 07:08 |
|
There is no ultrawide I could find running at 3440x1440 at 144Hz. As we said a few pages ago, 100Hz at that res is the bleeding edge.
|
# ? May 26, 2016 07:28 |
|
Tony Montana posted:There is no ultrawide I could find running at 3440x1440 at 144Hz. As we said a few pages ago, 100Hz at that res is the bleeding edge. Because current technology does not allow it. We are stuck using DP 1.2, which doesn't support the bandwidth needed. However, the GTX 1080 is capable of using DP 1.3/1.4, so I would not totally count out someone showcasing something at Computex that can run at that rate.
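Quick napkin math on that bandwidth claim, as a sketch: raw pixel rate only, ignoring blanking overhead (which adds roughly 10-20% on top), the DP numbers are the standard HBR2/HBR3 payload rates, and the function name is my own.

```python
def data_rate_gbps(width, height, hz, bpp=24):
    # Raw pixel data rate: pixels per second times bits per pixel.
    # Ignores blanking overhead, so real-world needs are somewhat higher.
    return width * height * hz * bpp / 1e9

DP12_GBPS = 17.28  # DisplayPort 1.2 (HBR2) payload data rate
DP13_GBPS = 25.92  # DisplayPort 1.3 (HBR3) payload data rate

need_100 = data_rate_gbps(3440, 1440, 100)  # ~11.9 Gbps, fits in DP 1.2
need_144 = data_rate_gbps(3440, 1440, 144)  # ~17.1 Gbps, at DP 1.2's limit before overhead
print(round(need_100, 1), round(need_144, 1))
```

So 3440x1440@100 squeaks through DP 1.2, but 144Hz at that res really does want DP 1.3 once you add blanking overhead on top.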
|
# ? May 26, 2016 08:17 |
|
Tony Montana posted:There is no ultrawide I could find running at 3440x1440 at 144Hz. As we said a few pages ago, 100Hz at that res is the bleeding edge. Yeah, had a brain fart there, I meant to say the new Asus is the same as the X34.
|
# ? May 26, 2016 12:57 |
|
froody guy posted:Surely a typo. They both use the Dell's panel which is a 60hz-born and overclocked by gsync to its very limits. How does gsync overclock a panel?
|
# ? May 26, 2016 13:41 |
|
Paul MaudDib posted:Yeah, had a brain fart there, I meant to say the new Asus is the same as the X34. Well color me disappointed. Especially since the PG348Q has been out for a while now (albeit hard to find in stock for actual purchase).
|
# ? May 26, 2016 15:17 |
|
Subjunctive posted:How does gsync overclock a panel? I don't know technically, I don't even know if anyone knows in detail since it's proprietary tech, but basically gsync is a regulator of the frequency, so it does it by design. But other than that it's probably the whole pcb/electronics onboard that allows it, not the gsync itself, which works just as a sort of controller that manages the buffer coming from the gpu (and that's why gsync introduces a bit of lag), pulsing the monitor with a refresh for each frame received..... in sync. There are other aspects related to overclocking a panel, but as regards the chip called "gsync" that's pretty much it, as far as I know at least. froody guy fucked around with this message at 16:01 on May 26, 2016 |
# ? May 26, 2016 15:59 |
|
Anti-Hero posted:That's actually the stand from my 27" Eizo EV2736W. I don't have room on my desk for both monitors right now, but after I get some home projects finished up I'll be getting a larger desk and monitor arms for both monitors. I am also interested in this as I want to do something similar. And my planned setup will have 144Hz w/Gsync (XB270HU) and 60Hz without (Crossfire of some sort) as well. One on a DP cable and the other on DVI-D for extra flavor. Any recommendations on monitor arms?
|
# ? May 26, 2016 18:43 |
|
froody guy posted:I don't know technically, I don't even know if anyone knows in detail since it's proprietary tech, but basically the gsync is a regulator of the frequency so it does it by design. But other than that it's probably the whole pcb/electronic onboard that allows it, not the gsync itself which works just as a sort of controller that manage the buffer coming from the gpu (and that's why gsync introduces a bit of lag) pulsing the monitor with a refresh each frame received..... in sync. There are other aspects related to overclocking a panel but as regards the chip called "gsync" that's pretty much it as far as I know at least. No, gsync reduces the effective refresh rate when the game doesn't present in time for the vsync interval. It doesn't in any way increase the refresh rate beyond that which the panel would have without gsync. NVIDIA has published a fair bit of information about this, if you want to learn more about it. I haven't seen measurements of lag, so I'd be interested in those. I'd be especially interested in understanding why the lag is worse than simply having vsync on; it should reduce lag for cases where the game exceeds its time budget for a frame, because it will be scanned out, f.e., 3ms late instead of 16ms.
|
# ? May 26, 2016 19:21 |
|
Subjunctive posted:No, gsync reduces the effective refresh rate when the game doesn't present in time for the vsync interval. It doesn't in any way increase the refresh rate beyond that which the panel would have without gsync. I'm certain that gsync has less lag than vsync. I haven't actually seen any evidence of significant lag at all, unless it was in that strange scenario where FPS = MAX_PANEL_HZ and vsync turned on by default (which is now an option to disable). Measurements taken here back that up. I know TFTCentral and NCX at wecravegamestoo test for this sort of thing and I haven't seen them report any input lag from gsync. They explain their testing methodology pretty thoroughly, too. (http://www.tftcentral.co.uk/articles/input_lag.htm). That NCX fellow says that freesync has a frame of lag relative to gsync, but I haven't looked into it too deeply. He's responded to my messages before so maybe he could suggest how he measured that. I know early on there was a 1ms polling time, but Nvidia mentioned they had plans to eliminate that years ago when the first actual monitors shipped. (midway through here http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Tech-Preview-and-First-Impressions) Regarding refresh rates and gsync, what I've been curious about is why we see the same panel, same manufacturer, but different refresh rates depending on whether it has the gsync hardware or not (edit: different potential refresh rates after overclocking). Take for example the Acer X34 vs XR341CK, the Asus PG279Q vs MG279Q, etc. The gsync hardware seems to replace the scaler. But I don't think it replaces the timing controller. However, gsync monitors (especially the same panel across different manufacturers) tend to have very similar behavior regarding adaptive overdrive and pixel response. 
I've read that Nvidia requires certain standards with regards to overdrive tuning for gsync monitors, and cites that as an advantage of gsync over freesync, so maybe it's not a capability of the gsync technology but rather a standard that needs to be met by other internal bits that they might not implement in their non-gsync versions of the monitor, due to cost? Cost is the only explanation I can imagine for why they simply wouldn't implement higher refresh rates and better overdrive tuning in the freesync versions of those displays I mentioned above. fozzy fosbourne fucked around with this message at 22:36 on May 26, 2016 |
# ? May 26, 2016 19:58 |
|
Subjunctive posted:No, gsync reduces the effective refresh rate when the game doesn't present in time for the vsync interval. It doesn't in any way increase the refresh rate beyond that which the panel would have without gsync. Well so it must be magic, I guess? The lag thing is just because it has a buffer. You tell me how it's possible that introducing a buffer between two devices reduces lag and I'm done with science.
|
# ? May 26, 2016 20:10 |
|
froody guy posted:You tell me how's possible that introducing a buffer in between two devices reduces lag and I'm done with science. Statistics. Duh. e; with Newegg selling a 27" 1440p IPS monitor for $250, I'm really trying to figure out just HOW big a desk I would need to fit two of those astride an X(R)34. And what sort of VESA stand to get... DrDork fucked around with this message at 20:18 on May 26, 2016 |
# ? May 26, 2016 20:11 |
|
froody guy posted:Well so it must be magic. I guess? I don't think there is a new buffer being introduced with gsync (relative to gsync=OFF)? It's using the front buffer
|
# ? May 26, 2016 20:22 |
|
Taken from some old discussion quote:the gsync module has direct control of the panel and can alter the voltage being supplied - that is why most panels that have a freesync and gsync version have a wider range on the gsync version... normal non-sync monitors have an overdrive feature to reduce ghosting, but it is tuned for the fixed refresh rates the monitor supports; the gsync module is tuned for the full range they can get the panel to support, whereas freesync is stuck with tuning for a fixed refresh, which ends up with monitors having a narrower range, because drift too far from where the overdrive is tuned and you get bad ghosting quote:on a "normal" monitor with a fixed refresh of 60hz with vsync on, re-reading the same frame from the buffer twice or more is exactly what happens, so the fact that gsync intelligently handles that instead of what freesync does is still a benefit, because it reverts the monitor to its maximum refresh, meaning that as soon as the next GPU generated frame is ready it gets displayed on the next cycle, so on a 100hz monitor the maximum lag will be 9.999ms, but if you have a 100hz monitor, to really get the benefit you probably want to target 50hz as a minimum, so you will never get into one of these low refresh double read situations anyway and this is from the guy at TFTCentral writing the reviews (the link is from a discussion before the X34 was out, so he was guessing too) quote:keep in mind we are talking in theory at the moment as so far no one has actually released a 100Hz capable screen of this type. The panel is almost certainly the same as that used in the XR341CK and Dell U3415W models as the OP has said, but it is the G-sync module added afterwards which seems to be the key here. edit: fozzy fosbourne posted:I don't think there is a new buffer being introduced with gsync (relative to gsync=OFF)? It's using the front buffer froody guy fucked around with this message at 20:32 on May 26, 2016 |
# ? May 26, 2016 20:27 |
|
froody guy posted:Taken from some old discussion I think the lag referenced in that quote is referring to situations where your refresh rate would be so low as to induce flickering. In that scenario, gsync redraws the frame at a fixed rate of 40hz (edit: actually, I think the rate is higher, see article below) rather than waiting until the front buffer is ready. In this article they mentioned that that threshold was below 29fps http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ Edit: Actually, it looks like it redraws 1 frame at below 37fps and 2 at below 20fps, at least on their measurements of this XB270HU: quote:Below that 40 FPS mark though things shift. The red line shows how AMD's FreeSync and Adaptive Sync work: the refresh rate stays static at 40 Hz even as the frame rate dips below 40 FPS, to 35 FPS, 30 FPS, etc. G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync. So maybe the moral of the story is don't go below 40fps, even with gsync edit: another article explaining the flicker and why they redraw frames at lower framerates: http://www.pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering fozzy fosbourne fucked around with this message at 20:57 on May 26, 2016 |
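For what it's worth, that doubling behavior can be sketched as a few lines of toy logic. The ~37/~20 fps thresholds are PCPer's measured figures for that XB270HU, not anything Nvidia publishes, and the function is just my own illustration:

```python
def gsync_scanout(fps, panel_max_hz=144):
    """Toy model of G-Sync's low-framerate frame multiplication.

    Returns (panel_refresh_hz, times_each_frame_is_scanned_out).
    Thresholds are PCPer's measured ~37/~20 fps points, not official specs.
    """
    if fps < 20:
        multiplier = 3      # each frame scanned out three times
    elif fps < 37:
        multiplier = 2      # each frame scanned out twice
    else:
        multiplier = 1      # panel refresh simply tracks the game
    return min(fps * multiplier, panel_max_hz), multiplier

print(gsync_scanout(60))  # (60, 1)
print(gsync_scanout(30))  # (60, 2) -- flicker avoided by redrawing the frame
print(gsync_scanout(15))  # (45, 3)
```

The point being the panel never physically refreshes down near the flicker zone; the module just scans the same front buffer out multiple times.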
# ? May 26, 2016 20:36 |
|
fozzy fosbourne posted:I think that lag referenced in that quote is referring to situations where your refresh rate would be so low so as to induce flickering. In that scenario, gsync redraws the frame at a fixed rate 40hz (edit: actually, I think the rate is higher, see article below) rather than waiting until the front buffer is ready. In this article they mentioned that that threshold was below 29fps Yeah I'm referring to the same link, or actually this vid here https://www.youtube.com/watch?v=VkrJU5d2RfA which shows the differences between gsync and freesync when the framerate drops below the minimum working specs, but it doesn't say a lot about how the gsync extends the refresh rate of a panel that should go up to 75hz max, like these 3440x1440 guys here, which is in fact where freesync stops (there's no 100hz freesync monitor). So, in regards to that I think the answer is "voltage regulator" and "programmable controller". That's what the gsync does. How _exactly_ it does it I don't know honestly.
|
# ? May 26, 2016 21:09 |
|
froody guy posted:Taken from some old discussion gsync is a real loving expensive fpga
|
# ? May 26, 2016 21:09 |
|
froody guy posted:The lag thing is just because it has a buffer. You tell me how's possible that introducing a buffer in between two devices reduces lag and I'm done with science. Here's what I wrote (emphasis added): quote:it should reduce lag for cases where the game exceeds its time budget for a frame, because it will be scanned out, f.e., 3ms late instead of 16ms Consider, assuming 60Hz/16ms per frame: Frame generation begins at time 0; input is sampled here. Frame generation completes at 19ms, missing the time budget by 3ms. Without gsync, the frame is scanned out at the end of the next vsync interval, being 32ms. It is therefore 32ms latent, or 16ms more latent than if it had hit the time target. With gsync, the frame is scanned out immediately at 19ms, making it 3 ms more latent than if the target were hit, and 13ms less latent than the non-gsync case. Don't give up on science just yet. E: this latency penalty reduction in the case of missed frame timing is why gsync-like stuff is interesting for VR applications, FWIW. We didn't have time to get it into CV1, but I wouldn't be surprised to see equivalent things in future headsets. Subjunctive fucked around with this message at 06:55 on May 27, 2016 |
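His arithmetic, as a toy sketch (the function name is mine, and 16ms stands in for the 60Hz interval exactly as rounded in the post):

```python
import math

VSYNC_MS = 16  # 60 Hz refresh interval, rounded as in the post

def scanout_time(render_ms, gsync=False):
    """Time from input sample (t=0) until the frame starts scanning out."""
    if gsync:
        # G-Sync: scan the frame out as soon as it's ready.
        return render_ms
    # Fixed refresh: wait for the next vsync boundary at or after completion.
    return math.ceil(render_ms / VSYNC_MS) * VSYNC_MS

print(scanout_time(19))              # 32 -- missed the 16ms budget, wait a full interval
print(scanout_time(19, gsync=True))  # 19 -- only 3ms late instead of 16ms more
```

Same numbers as the post: missing the budget by 3ms costs you 16ms without gsync and 3ms with it.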
# ? May 27, 2016 06:53 |
|
I scanned your mum and apparently her rear end is latent
|
# ? May 27, 2016 06:57 |
|
Tony Montana posted:I scanned your mum and apparently her rear end is latent Rude.
|
# ? May 27, 2016 06:59 |
|
fozzy fosbourne posted:I think that lag referenced in that quote is referring to situations where your refresh rate would be so low so as to induce flickering. In that scenario, gsync redraws the frame at a fixed rate 40hz (edit: actually, I think the rate is higher, see article below) rather than waiting until the front buffer is ready. In this article they mentioned that that threshold was below 29fps That article is woefully out of date. Freesync works differently now.
|
# ? May 27, 2016 10:06 |
|
ThatOneGuy posted:I am also interested in this as I want to do something similar. And my planned setup will have 144Hz w/Gsync (XB270HU) and 60Hz without (Crossfire of some sort) as well. One on a DP cable and the other on DVI-D for extra flavor. I was going to get Ergotron arms as I'm familiar with them through work. My XB271HU is still humming along nicely, but I'm thinking of being foolish. I came from a 1440p IPS 60Hz panel, so 144 Hz and Gsync is very welcome, but I want....more. I really have no complaints with the monitor, but feel like if I'm going to spend this much money on a display, I might as well go hog wild and consider the G-sync ultra-wides. How much of a pain in the rear end is it to get games working with 21:9's? I know that Blizzard intentionally limits their games to 16:9 for "esport" concerns. Am I going to be stuck with a bunch of ini fuckery to get other AAA titles to display properly?
|
# ? May 28, 2016 17:58 |
|
I would set a 21:9 custom resolution in your nvidia control panel and experiment a bit. I plan on trying that out at some point. E: instructions http://www.overclock.net/t/1564522/i-teach-you-how-to-widescreen-21-9-native-16-9-monitor Be sure to scoot forward your current display a couple inches fozzy fosbourne fucked around with this message at 18:21 on May 28, 2016 |
# ? May 28, 2016 18:13 |
|
Is 4k really worth it? I just picked up an Acer 1440p 144hz IPS monitor with gsync a week ago and the quality is nice, but I'm not really noticing a crazy difference over the 1080p 120hz I was using. I will be picking up a 1080, so will the 4k be a far better display even though it's only 60hz, versus the 144hz 1440p I have? I'm just not sure if I will miss anything not having 144hz. Right now my games are wow, path of exile, overwatch, marvel heroes, and none of those really hit 144fps with my current card. When I get the 1080 they all will hit that FPS mark. Will having 144fps @ 144hz be better than getting a 4k monitor running at 60hz with 60+ fps?
|
# ? May 28, 2016 19:04 |
|
Holyshoot posted:Is 4k really worth it? I just picked up an Acer 1440p 144hz IPS monitor with gsync a week ago and the quality is nice but not really noticing a crazy difference over my 1080 120hz I was using. Dude, are you using the right resolution with that new 1440p monitor? I just did a similar upgrade and I've been floored with the increase in quality from a TN 1080p panel to a 1440p IPS. Have you configured the color profile and everything on the new monitor? I'm always chasing the dragon tech wise, and I'm contemplating a 4k monitor next, but I can't deny that the 1440p monitor was one of the biggest game changers in my pc experience in a while.
|
# ? May 28, 2016 21:34 |
|
RiperSnifel posted:Dude, are you using the right resolution with that new 1440 monitor? I just did a similar upgrade and I've been floored with the increase in quality from a TN 1080p panel to a 1440ips. Have you configured the color profile and everything on the new monitor? The color compared to my asus 120hz is noticeably different for sure. But maybe I'm just tone color deaf. Like it's very nice, just not sure if it's 700 dollars very nice. And a 4k monitor would be about 100 dollars less than the 1440.
|
# ? May 28, 2016 22:09 |
|
When you step up to a beefier GPU you may change your stance on the monitor. With a 980ti my Asus pg279q has totally blown me away.
|
# ? May 28, 2016 23:16 |