jownzy
Apr 20, 2012

I love Rainbow Moon.

It is the deepest game ever. Nothing compares to its epic story.
Is this a steal at $139? ASUS VN248H-P

http://www.newegg.com/Product/Produ..._-24236335-L02A

Doyzer
Jul 28, 2013

Looking at PCPartPicker, it's cheaper than normal. The price usually drops every couple of weeks.

http://pcpartpicker.com/part/asus-monitor-vn248h

jownzy
Apr 20, 2012

I love Rainbow Moon.

It is the deepest game ever. Nothing compares to its epic story.
I guess more importantly is this a steal compared to the price points for the Dell and other Asus models?

Brut
Aug 21, 2007

Are monitors with VGA and HDMI in but no DVI becoming a standard now?

Thanks Ants
May 21, 2004

#essereFerrari


It totally depends on the monitor. I'd say HDMI is becoming more popular in monitors aimed at the consumer market (either that or VGA only, still :barf:). 'Professional' monitors tend to have DVI, DisplayPort and possibly still VGA.

It doesn't really matter, though; DVI and HDMI are electrically compatible, so you just need a cheap adaptor to change where the pins go.

Gonkish
May 19, 2004

So do we have any kind of estimate as to when G-Sync monitors are due out? Last I heard it was ASUS only, and only for their 144Hz monitors, right?

Ak Gara
Jul 29, 2005

That's just the way he rolls.

Gonkish posted:

So do we have any kind of estimate as to when G-Sync monitors are due out? Last I heard it was ASUS only, and only for their 144Hz monitors, right?

I think G-Sync is going to be game-changing, but WHY put it on their fastest monitors? Surely the 1600p screens running at low FPS would benefit from G-Sync the most?

Wistful of Dollars
Aug 25, 2009

Probably sales volume (not from something niche like the 144Hz models, but from 1080p panels).

1440/1600p G-Sync monitors will arrive; it's just a question of when at this point. You're quite right though: the higher the resolution you go (hello, 4K), the more useful G-Sync will actually be.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Gonkish posted:

So do we have any kind of estimate as to when G-Sync monitors are due out? Last I heard it was ASUS only, and only for their 144Hz monitors, right?

Apparently when the technology was announced, BenQ, Phillips, and ViewSonic were already on board too, but those three have announced zip, zilch, and nada, respectively, regarding their own offerings with the tech.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Sidesaddle Cavalry posted:

Apparently when the technology was announced, BenQ, Phillips, and ViewSonic were already on board too, but those three have announced zip, zilch, and nada, respectively, regarding their own offerings with the tech.

That's because Asus bought an exclusive for all of 2014.

Wasabi the J
Jan 23, 2008

MOM WAS RIGHT

Doctor rear end in a top hat posted:

Horry poo poo you're not kidding.


e: oh drat that one's due to an Nvidia SLI bug. The others are still impressive.

Also, :stare: at that stutter from the AMD cards. I can tell you dips and spikes that large and sudden are very jarring on the eyes; it feels sort of like a slipping clutch or something.

EDIT: Looked again, and you're completely right.

Wasabi the J fucked around with this message at 23:43 on Nov 10, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Look again: that's 4K resolution, and everything above the green line is faster than the screen refresh. The dips below 60 FPS are there, yeah, but they line up with where the 780 and Titan solutions drop as well. It might not look like the GeForce cards are dropping as much, but the R9s go from a solid 60 down to ~46 FPS (usually staying above 50), whereas the GeForces are dropping from ~40 to below 30, sometimes to 23.

The R9s are doing the far better job there. If you're doing Vsync, the Radeons are giving you mostly 60 with some drops to 30, which is stuttery but it's what you get. The GeForces are giving you mostly 30 with drops to 20, which looks pretty crappy in many titles.
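
If it helps, here's a rough sketch of why vsync snaps the framerate to those values (a minimal illustration in Python, assuming a 60 Hz panel and plain double-buffered vsync with no triple buffering or adaptive vsync; the numbers are illustrative, not measured):

code:
import math

# With double-buffered vsync, a frame that misses a refresh waits for the
# next one, so the effective rate snaps to 60, 30, 20, 15, ... on a 60 Hz panel.
def vsync_rate(raw_fps, refresh_hz=60):
    if raw_fps >= refresh_hz:
        return float(refresh_hz)
    intervals = math.ceil(refresh_hz / raw_fps)  # refresh ticks each frame occupies
    return refresh_hz / intervals

for fps in (60, 46, 40, 23):
    print(fps, "->", vsync_rate(fps))
# 60 -> 60.0, 46 -> 30.0, 40 -> 30.0, 23 -> 20.0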

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Factory Factory posted:

That's because Asus bought an exclusive for all of 2014.

Is it normally a good business strategy to focus only on one partner in the beginning for things like this? I'm not fully following why this decision was made. Are they really hedging their bets on ASUS's 144Hz TN that heavily?

Wistful of Dollars
Aug 25, 2009

Sidesaddle Cavalry posted:

Are they really hedging their bets on ASUS's 144Hz TN that heavily?

I haven't seen anything stating that it'll be exclusive to the 144. If someone can correct me on that I'd be happy to know.

GrizzlyCow
May 30, 2011

Sidesaddle Cavalry posted:

Is it normally a good business strategy to focus only on one partner in the beginning for things like this? I'm not fully following why this decision was made. Are they really hedging their bets on ASUS's 144Hz TN that heavily?

ASUS literally paid NVIDIA money for exclusivity. The decision was made because no other company made it worth NVIDIA's while to turn ASUS's offer down. ASUS gets a couple of months of exclusivity with a brand new, unproven tech that may or may not revolutionize (gaming) displays, and NVIDIA gets a fat sack of cash. Even if this move causes G-Sync's initial sales and adoption to be anemic, NVIDIA has already been paid for it.

Shaocaholica
Oct 29, 2002

Fig. 5E
Display self calibration, is this a 'thing' yet?

circa 2011
http://www.eizo.com/na/press/releases/htmls/cg275w.html

Because 3rd party apps, different OSs, not-computers, color profile files and loss of tonal range are all lovely things. Even for businesses with big bucks and staff to manage it.

edit:

Haha, awesome. The colorimeter is on a servo:

https://www.youtube.com/watch?v=ym0tKDbdd0E

Shaocaholica fucked around with this message at 03:21 on Nov 11, 2013

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

GrizzlyCow posted:

ASUS literally paid NVIDIA money for exclusivity. The decision was made because no other company made it worth NVIDIA's while to turn ASUS's offer down. ASUS gets a couple of months of exclusivity with a brand new, unproven tech that may or may not revolutionize (gaming) displays, and NVIDIA gets a fat sack of cash. Even if this move causes G-Sync's initial sales and adoption to be anemic, NVIDIA has already been paid for it.

:stare: Okay. I must have had the misconception that competition was actually the key to making technologies work, and work better, but I guess I'm wrong.

BurritoJustice
Oct 9, 2012

Wasn't the ASUS exclusivity thing an unsubstantiated rumour?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

BurritoJustice posted:

Wasn't the ASUS exclusivity thing an unsubstantiated rumour?

If it is, it's one that was published as truth on AnandTech. We are assuming that Asus paid for the privilege, though, since the alternative is that Nvidia created gold and gave it to Ron Paul to hoard.

BurritoJustice
Oct 9, 2012

Factory Factory posted:

If it is, it's one that was published as truth on AnandTech. We are assuming that Asus paid for the privilege, though, since the alternative is that Nvidia created gold and gave it to Ron Paul to hoard.

Do you have a link for that? I can't really find anything solid on it. If so, that is a drat shame. I have been looking at high-refresh-rate monitors lately, and the BenQ XL2420TE seems like an all-round more attractive option than the Asus alternative, so it's a shame Asus will be the only option unless I wait a long time.

Teriyaki Koinku
Nov 25, 2008

Bread! Bread! Bread!

Bread! BREAD! BREAD!
What are some subjective opinions on upgrading to a monitor beyond 1920x1080 for gaming and productivity purposes?

After seeing some 4K HDTVs on display at Best Buy, I really want to upgrade my computer monitor to a higher-resolution one. That being said, I see monitors at 2560x1440 resolution (e.g.: ASUS PB278Q 27-Inch WQHD LED-lit PLS Professional Graphics Monitor) going for a lot cheaper relative to true 4K monitors (e.g.: ASUS PQ321Q 31.5-Inch 4K Monitor) AKA 3840x2160 native resolution.

(This is my current monitor for reference: Asus VS238H-P 23-Inch Full-HD LED-Lit LCD Monitor at native 1920x1080 resolution.)

So, if I wanted to get improved "eye-popping" visuals upgrading from 1920x1080, would it be okay to just stick with 1440p or is it better to pinch pennies until I can afford a 4k monitor?

Teriyaki Koinku fucked around with this message at 16:40 on Nov 11, 2013

fookolt
Mar 13, 2012

Where there is power
There is resistance

TheRamblingSoul posted:

What are some subjective opinions on upgrading to a monitor beyond 1920x1080 for gaming and productivity purposes?

After seeing some 4K HDTVs on display at Best Buy, I really want to upgrade my computer monitor to a higher-resolution one. That being said, I see monitors at 2560x1440 resolution (e.g.: ASUS PB278Q 27-Inch WQHD LED-lit PLS Professional Graphics Monitor) going for a lot cheaper relative to true 4K monitors (e.g.: ASUS PQ321Q 31.5-Inch 4K Monitor) AKA 3840x2160 native resolution.

(This is my current monitor for reference: Asus VS238H-P 23-Inch Full-HD LED-Lit LCD Monitor at native 1920x1080 resolution.)

So, if I wanted to get improved "eye-popping" visuals upgrading from 1920x1080, would it be okay to just stick with 1440p or is it better to pinch pennies until I can afford a 4k monitor?

It's going to be a long drat time before prices for 4K monitors approach anything close to what you pay for 2560x1440. And that's not even getting into the video card hardware you'd need to actually play at 4K.

In the meantime, I think upgrading to 1440p is still a huge step up from 1080p, for both gaming and productivity (especially productivity).

Wistful of Dollars
Aug 25, 2009

TheRamblingSoul posted:

What are some subjective opinions on upgrading to a monitor beyond 1920x1080 for gaming and productivity purposes?

After seeing some 4K HDTVs on display at Best Buy, I really want to upgrade my computer monitor to a higher-resolution one. That being said, I see monitors at 2560x1440 resolution (e.g.: ASUS PB278Q 27-Inch WQHD LED-lit PLS Professional Graphics Monitor) going for a lot cheaper relative to true 4K monitors (e.g.: ASUS PQ321Q 31.5-Inch 4K Monitor) AKA 3840x2160 native resolution.

(This is my current monitor for reference: Asus VS238H-P 23-Inch Full-HD LED-Lit LCD Monitor at native 1920x1080 resolution.)

So, if I wanted to get improved "eye-popping" visuals upgrading from 1920x1080, would it be okay to just stick with 1440p or is it better to pinch pennies until I can afford a 4k monitor?

If you want 4k you'll have to pony up 4k, and probably another 800-1000 for a graphics setup that can actually drive it. Wait a couple of years.

1440 or 1600 is a huge jump over 1080 and much more affordable. I moved up to 1440 a few months ago and now I'm plotting to replace my remaining 1080s with a second 1440. You'll have trouble ever going back to 1080.

Teriyaki Koinku
Nov 25, 2008

Bread! Bread! Bread!

Bread! BREAD! BREAD!

El Scotch posted:

If you want 4k you'll have to pony up 4k, and probably another 800-1000 for a graphics setup that can actually drive it. Wait a couple of years.

1440 or 1600 is a huge jump over 1080 and much more affordable. I moved up to 1440 a few months ago and now I'm plotting to replace my remaining 1080s with a second 1440. You'll have trouble ever going back to 1080.

I'm still kind of confused about 1600p: Is that different from 4K? If so, then is there an ASUS monitor or whatever you'd recommend looking at for 1600p native resolution?

[e]: Also, does it make a difference using HDMI over DisplayPort or vice versa with a monitor?

[e2]: And is there a real difference between 2560x1440 and 2560x1600? If not, I'd probably still go with the ASUS 27-inch monitor I linked above for $550 shipped.

Teriyaki Koinku fucked around with this message at 18:30 on Nov 11, 2013

butt dickus
Jul 7, 2007

top ten juiced up coaches
and the top ten juiced up players

TheRamblingSoul posted:

I'm still kind of confused about 1600p: Is that different from 4K? If so, then is there an ASUS monitor or whatever you'd recommend looking at for 1600p native resolution?

[e]: Also, does it make a difference using HDMI over DisplayPort or vice versa with a monitor?

[e2]: And is there a real difference between 2560x1440 and 2560x1600? If not, I'd probably still go with the ASUS 27-inch monitor I linked above for $550 shipped.

1600p is 2560x1600. 4K can mean a lot of things, usually 3840x2160 when referring to TVs or monitors, but "real" 4K is 4096x2160.

DisplayPort has higher bandwidth than HDMI versions under 2.0. If you're using a display with a resolution higher than 1920x1200, it's recommended you use DisplayPort or DVI. I'm not sure about the newer models, but my Dell U3011s can only do 1920x1200 over HDMI, yet can do 2560x1600 over DisplayPort and DVI. I think that's mostly a limitation of the monitor.

The difference between 2560x1440 and 2560x1600 is an extra 2560x160 strip of pixels that makes the display about 11% taller. For just movies and games, this probably doesn't matter. For other kinds of computing the extra real estate can be nice.
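
If you want to sanity-check those numbers, here's a rough sketch (Python, purely illustrative; it assumes 60 Hz, 24-bit colour and roughly 25% blanking overhead, so the pixel clocks are ballpark figures, and the 165/330/340 MHz ceilings in the comments are the commonly cited single-link DVI, dual-link DVI and HDMI 1.3/1.4 limits):

code:
# Ballpark pixel counts and pixel clocks for the resolutions discussed above.
RESOLUTIONS = {
    "1080p (1920x1080)": (1920, 1080),
    "1200p (1920x1200)": (1920, 1200),
    "1440p (2560x1440)": (2560, 1440),
    "1600p (2560x1600)": (2560, 1600),
    "UHD 4K (3840x2160)": (3840, 2160),
    "DCI 4K (4096x2160)": (4096, 2160),
}

def pixel_clock_mhz(w, h, hz=60, blanking=1.25):
    """Very rough pixel clock estimate, including ~25% blanking overhead."""
    return w * h * hz * blanking / 1e6

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h / 1e6:.2f} MPx, ~{pixel_clock_mhz(w, h):.0f} MHz at 60 Hz")

# Rough link ceilings: single-link DVI ~165 MHz, dual-link DVI ~330 MHz,
# HDMI 1.3/1.4 ~340 MHz, DisplayPort 1.2 roughly double that again.
# And the 1440-vs-1600 height difference:
print("extra pixels:", 2560 * 160, f"({160 / 1440:.0%} taller)")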

GrizzlyCow
May 30, 2011

Sidesaddle Cavalry posted:

:stare: Okay. I must have had the misconception that competition was actually the key to making technologies work, and work better, but I guess I'm wrong.

Holy poo poo was my post poorly written.

But yeah, ASUS gets a few months of exclusivity because they, presumably, paid for that right. And from what I can guess, no other company made an offer as sweet as ASUS's. This may hurt G-Sync's initial adoption/sales, but NVIDIA has presumably already been paid by ASUS for the exclusive use of G-Sync, so they probably don't care. It's only for a few months at most, considering AnandTech indicated that we'll be seeing other companies release their own G-Sync'd monitors.

NVIDIA seems confident that AMD won't release their (probably open source) version of G-Sync and give it to the competition.

Wait, this is the monitor/display megathread . . .

TFT Central has released their review of Eizo's new FG2421 monitor. Their particular unit seemed to have a problem with color accuracy, at least in comparison to similarly priced monitors. The folks at HardForum seem to be having issues with crosshatching, though.

Teriyaki Koinku
Nov 25, 2008

Bread! Bread! Bread!

Bread! BREAD! BREAD!

Doctor rear end in a top hat posted:

1600p is 2560x1600. 4K can mean a lot of things, usually 3840x2160 when referring to TVs or monitors, but "real" 4K is 4096x2160.

DisplayPort has higher bandwidth than HDMI versions under 2.0. If you're using a display with a resolution higher than 1920x1200, it's recommended you use DisplayPort or DVI. I'm not sure about the newer models, but my Dell U3011s can only do 1920x1200 over HDMI, yet can do 2560x1600 over DisplayPort and DVI. I think that's mostly a limitation of the monitor.

The difference between 2560x1440 and 2560x1600 is an extra 2560x160 strip of pixels that makes the display about 11% taller. For just movies and games, this probably doesn't matter. For other kinds of computing the extra real estate can be nice.

Thanks, I appreciate the explanation.

Thinking over the gaming and productivity (i.e. copywriting work) gains, I will definitely hotlist a new monitor and mechanical keyboard for when I can allocate the money for them. Looking forward to ordering that ASUS monitor and eventually another one for a 1440p dual-monitor setup. :)

Coredump
Dec 1, 2002


Seiki 39" TV is less than $600 at Amazon right now. http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/dp/B00DOPGO2G/ref=sr_1_1?ie=UTF8&qid=1384203821&sr=8-1&keywords=seiki+4k+39

Teriyaki Koinku
Nov 25, 2008

Bread! Bread! Bread!

Bread! BREAD! BREAD!

Are there advantages or disadvantages to using an HDTV as a computer monitor versus using an actual high-definition resolution computer monitor?

Magic Underwear
May 14, 2003


Young Orc

TheRamblingSoul posted:

Are there advantages or disadvantages to using an HDTV as a computer monitor versus using an actual high-definition resolution computer monitor?

Well, that TV can only run 4K at 30Hz, so it would be very annoying to use as a monitor. You would constantly lose track of the mouse and things would be kinda jerky.

Brut
Aug 21, 2007

TheRamblingSoul posted:

Are there advantages or disadvantages to using an HDTV as a computer monitor versus using an actual high-definition resolution computer monitor?

PCMag posted:

http://www.pcmag.com/article2/0,2817,2421100,00.asp

The back of the panel holds a USB port, an HDMI port, component video inputs, and a 3.5mm headphone jack facing left. Two more HDMI ports, another USB port, a VGA video input, 3.5mm, coaxial, and RCA stereo audio inputs, and an F connector for an antenna face downward.

See a few posts up about using DVI/DisplayPort vs. HDMI; this TV, being a TV, only has HDMI (and VGA). Also see the review itself for that particular model.

Ak Gara
Jul 29, 2005

That's just the way he rolls.

Also, I know for a fact I'm not the only one that bought a new PC monitor because GTAV looked absolutely poo poo on our 1080p big-screen TVs. Going from my 50-inch 1080p "slow" TV to a 24-inch 1ms g2g ASUS VG248QE made the game look a ton better.

[edit] Wait, that's not what you're asking I think.

Ak Gara fucked around with this message at 23:40 on Nov 11, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
That's mainly because GTA5 is locked at 720p and has to upscale for any output exceeding that. GIANT pixels!

Wistful of Dollars
Aug 25, 2009


On the general scale of 'one' to 'bad idea', I consider that one a rather bad idea.

Anti-Hero
Feb 26, 2004

Ak Gara posted:

Also, I know for a fact I'm not the only one that bought a new PC monitor because GTAV looked absolutely poo poo on our 1080p big-screen TVs. Going from my 50-inch 1080p "slow" TV to a 24-inch 1ms g2g ASUS VG248QE made the game look a ton better.

[edit] Wait, that's not what you're asking I think.

I hooked it up to my Eizo FS2333 and I think it looks worse than on my 46" TV. Most likely because I sit pretty close to the computer monitor, so the aliasing is REALLY apparent.

Coredump
Dec 1, 2002

TheRamblingSoul posted:

Are there advantages or disadvantages to using an HDTV as a computer monitor versus using an actual high-definition resolution computer monitor?

Well, there's no other way you're going to get that much resolution that cheap. Think about it: you'll be getting 4K at a usable size for the price the Dell U2713HM goes for. Think about how much more screen and how many more pixels that is for the same dollars.

Yes, you will be stuck at 30Hz, but for general desktop multitasking it doesn't really make a drat bit of difference. I know this from personal experience because I dropped a TV down to 30Hz to check. Window snapping makes it a non-issue.

Then when it's time to play some games you can drop it down to 1080p and it will do 60Hz. There's supposed to be a firmware update coming that will let it do 120Hz at 1080p. That way you won't need a monstrous video card to get decent frame rates, since you'll be running games at 1080p rather than 4K.
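
For anyone wondering why the same HDMI input manages 4K only at 30Hz but has plenty of headroom at 1080p, here's a back-of-the-envelope sketch (Python, illustrative only; it assumes an HDMI 1.4-class ~340 MHz pixel clock ceiling and ~25% blanking overhead, both approximations):

code:
HDMI14_PIXEL_CLOCK_MHZ = 340  # rough ceiling for an HDMI 1.3/1.4 link

def max_refresh_hz(w, h, clock_mhz=HDMI14_PIXEL_CLOCK_MHZ, blanking=1.25):
    """Approximate highest refresh rate the link can carry for a given mode."""
    return clock_mhz * 1e6 / (w * h * blanking)

for name, (w, h) in [("3840x2160", (3840, 2160)), ("1920x1080", (1920, 1080))]:
    print(f"{name}: ~{max_refresh_hz(w, h):.0f} Hz max")
# 3840x2160: ~33 Hz max  -> hence the 30 Hz cap at 4K
# 1920x1080: ~131 Hz max -> 60 Hz is easy, 120 Hz is plausible via firmware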

Izam
Jun 6, 2005

Shaocaholica posted:

Display self calibration, is this a 'thing' yet?

circa 2011
http://www.eizo.com/na/press/releases/htmls/cg275w.html

Because 3rd party apps, different OSs, not-computers, color profile files and loss of tonal range are all lovely things. Even for businesses with big bucks and staff to manage it.

edit:

Haha, awesome. The colorimeter is on a servo:

Still not really a thing. Eizo has long been a producer of premium high-end monitors (some going as high as 12- and 14-bit according to their marketing), so built-in self-calibration might remain a luxury rather than a viable standard. Then again, the current generation of colorimeters has feature sets that go above and beyond the built-in models, so users may simply want to rely on better devices.

Wasabi the J
Jan 23, 2008

MOM WAS RIGHT
Regarding Seiki chat:

This video reviews the larger model's (50") performance as a gaming monitor.

https://www.youtube.com/watch?v=uXBu9nxLN78

Edit: Just found the 39" version video from the same guys.

https://www.youtube.com/watch?v=sFkdWHfb1kM

Wasabi the J fucked around with this message at 07:42 on Nov 12, 2013

Rakins
Apr 6, 2009

I just got my ASUS PB278Q and can't get it to display at 1440p. I am using the DVI cable supplied in the box and have an ATI 6950; I'm assuming both of these should be dual-link. I can't use the DisplayPort cord in the box because my card has a mini port and the cable is full-sized. Am I missing some setting or something, or do I really have to gently caress with different cables? I have the latest ATI Catalyst installed and all that.

edit: nvm, switched DVI ports on my card and that fixed it

Rakins fucked around with this message at 23:22 on Nov 12, 2013

Rawrbomb
Mar 11, 2011

rawrrrrr

Rakins posted:

I just got my ASUS PB278Q and can't get it to display at 1440p. I am using the DVI cable supplied in the box and have an ATI 6950; I'm assuming both of these should be dual-link. I can't use the DisplayPort cord in the box because my card has a mini port and the cable is full-sized. Am I missing some setting or something, or do I really have to gently caress with different cables? I have the latest ATI Catalyst installed and all that.

edit: nvm, switched DVI ports on my card and that fixed it

One of the DVI outs is single-link, IIRC. You can get a Mini DisplayPort to DisplayPort dongle and use Mini DisplayPort, or the other way around.
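
For the curious, here's a quick sketch of why the single-link port couldn't drive it (again just an approximation in Python; 165 MHz and 330 MHz are the nominal single-link and dual-link DVI pixel clock limits, and the ~25% blanking overhead is a rough assumption):

code:
def pixel_clock_mhz(w, h, hz=60, blanking=1.25):
    """Rough pixel clock estimate, including ~25% blanking overhead."""
    return w * h * hz * blanking / 1e6

needed = pixel_clock_mhz(2560, 1440)
print(f"2560x1440 @ 60 Hz needs ~{needed:.0f} MHz")          # ~277 MHz
print("single-link DVI (165 MHz):", "ok" if needed <= 165 else "too slow")
print("dual-link DVI   (330 MHz):", "ok" if needed <= 330 else "too slow")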
