Mr SoupTeeth
Jan 16, 2015

Fuzzy Mammal posted:

I just realized there are people who have never had a CRT monitor and never known anything above 60 fps (rip 100hz 16x12 Trinitron) and it is making me feel :corsair:

It's a bit of a trip to think that flat panel displays of the future have been the norm for nearly 15 years now, and were unequivocally godawful until about 5 years ago. They were such a massive step down for so long I'm amazed they caught on at all.


Parker Lewis
Jan 4, 2006

Can't Lose


Fuzzy Mammal posted:

I just realized there are people who have never had a CRT monitor and never known anything above 60 fps (rip 100hz 16x12 Trinitron) and it is making me feel :corsair:

I was on CRTs from 1989-2004 and don't think any of them were capable of doing over 60Hz.

Volguus
Mar 3, 2009

Parker Lewis posted:

I was on CRTs from 1989-2004 and don't think any of them were capable of doing over 60Hz.

Oh yes they did. I had a Sony Trinitron back in 2002 and I remember it could do 100Hz. Just found some web page: http://www.karbosguide.com/hardware/module7a4.htm . The lovely CRTs couldn't, that's true. I remember my eyes hurting when I finally switched to LCD.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

shrike82 posted:

Yup, I have a Nvidia 970 but am happy with running games at 1080...
Would getting a g-sync monitor help then?

GSync's main selling point is that it's much more playable when you can't keep your framerate faster than the refresh rate. I think 1440p on a 970 is reasonable if you have GSync - I play at 1440p on a 780 Ti which performs about the same as the 970.

It can also help if you're pushing a super high framerate: it fixes tearing and judder entirely.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Mr SoupTeeth posted:

It's a bit of a trip to think that flat panel displays of the future have been the norm for nearly 15 years now, and were unequivocally godawful until about 5 years ago. They were such a massive step down for so long I'm amazed they caught on at all.

If LCDs never caught on, the planet would be two degrees warmer right now from all the CRTs burning up double the wattage for an extra decade.

pigdog
Apr 23, 2004

by Smythe

xthetenth posted:

Never got more than 60 Hz out of my crt. And a headache.

That's murder. 60 Hz and interlaced modes existed basically for "overclocking"/marketing reasons and weren't really usable. 72 Hz was the bare usable minimum, 75 Hz was okay, 100+ was good.

quote:

I was on CRTs from 1989-2004 and don't think any of them were capable of doing over 60Hz.
You probably didn't know how to set the refresh rate back then. If we're talking SVGA era, then if they supported higher resolutions they almost certainly supported higher refresh rates at lower resolutions.

pigdog fucked around with this message at 07:00 on Jun 5, 2016

pigdog
Apr 23, 2004

by Smythe
doublepost

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I always used to prefer the real estate of 1280x1024 even if it meant 60Hz. Never had any problems with headaches or anything :frogbon:

ThatOneGuy
Jul 26, 2001

Revolutionary Taste.

Mr SoupTeeth posted:

It's a bit of a trip to think that flat panel displays of the future have been the norm for nearly 15 years now, and were unequivocally godawful until about 5 years ago. They were such a massive step down for so long I'm amazed they caught on at all.

You could get a bigger screen, higher resolution, 16:9 or 16:10, and not have it weigh 75 lbs. They also draw way less power than CRTs, etc. No real mystery.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Also, they aged much more gracefully.

I sure loved having my $500 Trinitron start going blurry after 3 years.

My last CRT was a KDS AV-195TF, one of the top-of-the-line models at the time: a 19" Trinitron. It looked great when I bought it, but it really started going downhill after two years, and I was happy to replace it with a TN LCD back in 2002. I've never looked back.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
My old 1280x1024 CRT went up in blue smoke.

My current ultra wide is literally two of those stuck together, resolution-wise.

What a weird reality

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
If upgrading both monitor and gpu, is it better to pick gsync vs freesync first and then get the card that fits, or pick the gpu and then the monitor that fits? Or should xsync not really be a deciding factor?

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Lungboy posted:

If upgrading both monitor and gpu, is it better to pick gsync vs freesync first and then get the card that fits, or pick the gpu and then the monitor that fits? Or should xsync not really be a deciding factor?

At this point you should decide on your budget first. Nvidia cards and Gsync are the better product and technology, but they are certainly more expensive.

shrike82
Jun 11, 2005

I ended up picking up an Acer XB271HU at a local store based on their recommendations. Pretty happy with it, and I guess 1440p isn't bad at all!
I'll pick up a GTX 1080 if I see any performance issues, but I play games like Stellaris and Total Warhammer, so I should be fine for the time being.

krampster2
Jun 26, 2014

Ordered an XB271HU 2 days ago, but yesterday my GPU died, so I'm having to run things with the onboard graphics on an Intel 6700K until I can get my hands on a 1080. Is it alright to use onboard graphics with this monitor? Should I use it at 1080p until I get a new GPU?

Phlegmish
Jul 2, 2011



I'm looking for a ~24" 1920 x 1080 monitor to go with my reasonably high-end gaming PC (i7-6700K + GTX 1070). I want something fitting but not necessarily insane. Which manufacturer would you guys recommend? What is the maximum reaction time and minimum refresh rate I should be going for? I'm a layman when it comes to this stuff.

e: read the OP and apparently reaction time doesn't really matter.

e2: I'm guessing a 120 Hz minimum refresh rate, from what I'm reading.

Does anyone have an input lag-free, >120 Hz refresh rate, 1920 x 1080, 23-24" monitor at home that they would recommend?

Phlegmish fucked around with this message at 11:49 on Jun 5, 2016

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




krampster2 posted:

Ordered an XB271HU 2 days ago, but yesterday my GPU died, so I'm having to run things with the onboard graphics on an Intel 6700K until I can get my hands on a 1080. Is it alright to use onboard graphics with this monitor? Should I use it at 1080p until I get a new GPU?

As long as your mobo has DisplayPort, it can push 1440p for desktop use. Obviously, gaming performance will be bad.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

shrike82 posted:

I'm looking to move from a 24" 1080 monitor to a 27" or 32" UHD monitor. Would driving them at 1080 mean poorer image quality?

If you mean running the game at a resolution lower than the native resolution of the monitor, then yes, it will look varying grades of poo poo unless you can get your game running in integer scaling mode.
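The integer scaling caveat is just arithmetic: a non-native resolution only stays sharp when the panel's native resolution is an exact whole-number multiple, so every game pixel maps to a clean NxN block of panel pixels. A quick sketch of the check (hypothetical helper, not any real driver API):

```python
def scale_factor(native, target):
    """Return the integer scale factor if `target` tiles `native` exactly,
    else None (meaning the scaler must blend pixels and the image goes soft)."""
    w_scale, w_rem = divmod(native[0], target[0])
    h_scale, h_rem = divmod(native[1], target[1])
    if w_rem == 0 and h_rem == 0 and w_scale == h_scale:
        return w_scale
    return None

# 1080p on a UHD panel: each game pixel becomes an exact 2x2 block -> sharp
print(scale_factor((3840, 2160), (1920, 1080)))  # -> 2

# 1080p on a 1440p panel: a 1.33x stretch, pixels smear across the grid -> blurry
print(scale_factor((2560, 1440), (1920, 1080)))  # -> None
```

Which is why 1080p can look fine on a UHD screen but soft on a 1440p one.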

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Parker Lewis posted:

I was on CRTs from 1989-2004 and don't think any of them were capable of doing over 60Hz.

What the hell? You must have had literally the world's worst CRTs, or ran them at such high resolutions that they couldn't go any higher than 60Hz.

I honestly find 60Hz CRTs unusable - they flicker way too much, very unpleasant on the eyes. That effect is much diminished at 72/75, but only becomes solid for me at 85.

SCheeseman
Apr 23, 2003

I had an old Dell 21" Trinitron that managed to push 1600x1200@100hz using good ol' Powerstrip.

Djarum
Apr 1, 2004

by vyelkin

SwissCM posted:

I had an old Dell 21" Trinitron that managed to push 1600x1200@100hz using good ol' Powerstrip.

I had a 21" Trinitron that did 1600x1200@100Hz. It was glorious until a fuse blew in it and I could never get it to start without blowing a fuse again. I spent months trying to fix it, and the day I had to send it on its way was sad, both for my back and for losing that screen.

Evil Fluffy
Jul 13, 2009

Scholars are some of the most pompous and pedantic people I've ever had the joy of meeting.

PerrineClostermann posted:

All I remember about my CRT is that ungodly, eye-murdering flickering.

Also CRTs were bricks in a literal sense. I think the last CRT I had was 17" or 19" and I'm pretty sure it weighed more than my current PC plus both monitors.

bull3964 posted:

Also, they aged much more gracefully.

I sure loved having my $500 Trinitron start going blurry after 3 years.

My last CRT was a KDS AV-195TF, it was one of the top of the line at the time. 19" Trinitron. Looked great when I bought it, it really started going downhill after 2 years and I was happy to replace it with a TN LCD back in 2002 and I've never looked back.

Don't forget the demagnetizing. That monitor I had even had a button for it, and sometimes the screen would go pink and you'd have to hit the top/side of the monitor a few times to get it to go back to normal. The only upside was that since CRTs had glass covering the screen, you could just clean them and not worry about dust or smears or needing the right kind of cloth/cleaner to avoid destroying it.

El Grillo
Jan 3, 2008
Fun Shoe
I'm looking at getting a U2410 on eBay for £120 (used but in good condition) as I need a second monitor - my current one is a 2410 and it's still great. Is there something newer/better I should be getting for that amount of money, or are 2410s still a good bet?
Uses will be games, film/TV, and work (office, very occasional graphics). I'm a fan of the tall 1200 vertical res, but it's not essential if there's something good out there I don't know about. Not kept up with things in the monitor world...
e: come to think of it, if there's anything out there of this size but with better black levels, that would be pretty great. That's my one issue with my trusty 2410.
Also, U2415s look to be only £30/£40 more on eBay than 2410s. As far as I understand it they don't have the same pro-grade colour gamut, but are there any other major differences?

El Grillo fucked around with this message at 18:30 on Jun 5, 2016

Mr SoupTeeth
Jan 16, 2015

ThatOneGuy posted:

You could get a bigger screen, higher resolution, 16:9 or 16:10, and not have it weigh 75 lbs. They also draw way less power than CRTs, etc. No real mystery.

It took years to get to that point; the only advantage they offered in the bad old 4:3 days was a much smaller form factor. They were hilariously inferior to a decent CRT regardless of how much you spent; nothing but novelty drove the initial adoption, and I'm just surprised it took them as far as it did. I think people forget how borderline unusable those washed-out, ghosting piles of poo poo were for anything but static images - any kind of media use was straight up out of the question.

Ynglaur
Oct 9, 2013

The Malta Conference, anyone?
It also brought about a decade or so of pastel, washed-out corporate palettes for branding. Now that phones, laptops, and tablets have decent color accuracy, everyone is rushing into rich, almost oversaturated palettes.

You can even see the limitations of color accuracy of the times in games: compare Morrowind with Oblivion, for example.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Ynglaur posted:

You can even see the limitations of color accuracy of the times in games: compare Morrowind with Oblivion, for example.

That's a compelling case to go back to bad monitors.

Knot My President!
Jan 10, 2005

How far off are we from having 144Hz G-Sync IPS displays made by Dell and other companies that don't have hilariously bad quality control issues? 1-2 years?

Parker Lewis
Jan 4, 2006

Can't Lose


HalloKitty posted:

What the hell? You must have had literally the world's worst CRTs, or ran them at such high resolutions that they couldn't go any higher than 60Hz.

I honestly find 60Hz CRTs unusable - they flicker way too much, very unpleasant on the eyes. That effect is much diminished at 72/75, but only becomes solid for me at 85.

Seems I just have a bad memory/forgot how CRTs worked.

My last one was a late-90s Hitachi SuperScan Elite 751 19" which apparently did 640x480@160Hz up to 1600x1200@75Hz. I think I mostly ran it at 1280x1024@85Hz.
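Those spec pairs line up with how CRTs actually worked: the tube is limited by its horizontal scan frequency (scanlines drawn per second), so fewer vertical lines buy you more refreshes. A rough back-of-the-envelope sketch (the 5% vertical-blanking overhead is an approximation, not the monitor's real timing):

```python
def h_scan_khz(vertical_lines, refresh_hz, blanking=1.05):
    """Approximate horizontal scan frequency a video mode demands, in kHz.
    `blanking` is a rough fudge factor for vertical blanking overhead."""
    return vertical_lines * refresh_hz * blanking / 1000

# The Hitachi's rated modes all crowd the same scan-rate ceiling (~96 kHz):
for lines, hz in [(480, 160), (1024, 85), (1200, 75)]:
    print(f"{lines} lines @ {hz}Hz -> ~{h_scan_khz(lines, hz):.1f} kHz")
```

That's how 640x480@160Hz and 1600x1200@75Hz come out of the same tube: roughly constant scanlines per second, traded between resolution and refresh.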

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Armchair Calvinist posted:

How far off are we from having 144Hz G-Sync IPS displays made by Dell and other companies that don't have hilariously bad quality control issues? 1-2 years?

The QC issues come from the manufacturer of the panels themselves; Dell could make one today and it would still have the same QC issues. Asus normally has very few QC issues with their products, so the fact that they have largely the same problems as the Acer versions means it's the panel manufacturer that matters here. The company making the panels holds patents on the tech that makes them possible, so the only chance of better panels would be some other company paying license fees for the tech needed to make them - and if anyone were interested, I think it would have happened already.

Rookoo
Jul 24, 2007
Got my new Acer Predator XB271HU and it's pretty great.

Should the integrated graphics in my 6600K allow the monitor to run at 144hz? Not planning to run any games til I get my card, but the monitor appears to be sat at 60Hz. I'm using Displayport.

fozzy fosbourne
Apr 21, 2010

Yes. You need to go to your display adapter properties, then the Monitor tab, and change the refresh rate to 144Hz (assuming you're on Windows).
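For the curious, the reason DisplayPort is the right cable here is simple link math: DP 1.2 (HBR2 on four lanes) carries about 17.28 Gbit/s after 8b/10b encoding, and 1440p at 144Hz with 24-bit color fits underneath. A ballpark sketch (the 1.2 blanking factor is a rough assumption, not the exact CVT timing):

```python
def required_gbps(w, h, refresh_hz, bpp=24, blanking=1.2):
    """Approximate link bandwidth a video mode needs, in Gbit/s.
    `blanking` is a rough allowance for blanking-interval overhead."""
    return w * h * refresh_hz * bpp * blanking / 1e9

DP12_EFFECTIVE_GBPS = 17.28  # HBR2 x4 lanes, after 8b/10b line-code overhead

need = required_gbps(2560, 1440, 144)
print(round(need, 1), need < DP12_EFFECTIVE_GBPS)  # -> 15.3 True
```

HDMI 1.4-era ports carry far less than that, which is why monitors like these typically only reach 144Hz over DisplayPort.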

Rookoo
Jul 24, 2007
Yep, that's done it, cheers.

Green Gloves
Mar 3, 2008
How awesome would it be to go from my Dell U2515H 1440 monitor to one of those 34" ultrawide LG 1440 monitors? I bought my Dell for $200 recently, and today I see a used LG available for $550. I'm really wondering whether it's worth the extra cash. Can anyone who owns one tell me if there are any difficulties? I mostly game.

I will own either an RX 480 or a 1070 to drive those pixels.

Green Gloves fucked around with this message at 18:34 on Jun 6, 2016

DeaconBlues
Nov 9, 2011
I bought a second-hand monitor at the weekend. It's the Acer G277HU, and I'm having quirks driving it at 2560x1440 via the iGPU. In Windows it keeps reverting back to 1080, and Linux (which is my daily driver) is a no-go.

I bought a 750ti to drive it and I'm sure I'll get it playing nice in Linux and Windows once the GPU arrives.

The model of GPU I've bought (evga FTW) allows connection via DisplayPort. I don't own a DisplayPort cable. Should I buy one, or just use HDMI? What are the advantages?

Vinlaen
Feb 19, 2008

I've purchased an Acer X34 (ultrawide 21:9 monitor) and I absolutely love it, except for the problems (the monitor keeps showing connected/disconnected even with different cables, a large green line on the right-hand side of the screen but only sometimes, scanlines on the left of the screen, etc.).

Anyways, I'm going to try to get a replacement but if it also has problems I'm going to go back to 16:9 or 16:10.

With that said, what's the best gaming monitor with the following criteria:

  • 24 - 27" (preferably 27")
  • IPS
  • 100 Hz or higher
  • G-Sync

Does anything fit that bill? ...and is anything with those specifications considered "extremely good" with little to no backlight bleed, etc.?

Thanks for any advice!

dbcooper
Mar 21, 2008
Yams Fan
Greetings you mavens of monitors, you connoisseurs of colors, I seek guidance. Anyone have experience with 10-bit color monitors, GPUs and/or related software issues?

Background:

My nephew is doing a new PC build (PC Building thread post). He's not planning on gaming. He is practicing and studying photography and plans to do film, 3D rendering, editing as well with this PC.

He wants "both a wide color gamut and a wide 10-bit depth (?? may be using wrong term, I mean the range from darkest to lightest pixel)"

He likes the Dell UltraSharp 24 Monitor with PremierColor - U2413 monitor for photo editing

Based on his software preferences (see below), it appears that only certain adapter cards support 10-bit color either as an output signal or as part of the software editing process (uncertainty is mine).

nephew posted:

Software:

Adobe Photoshop, Premiere [Supported GPUs], and (maybe) After Effects [Supported GPUs] for editing large, 16-32bit, 500MB - 3GB image files

Autodesk Maya [Maya 2016 Extension 2 certified graphics hardware, PDF], Mudbox [Mudbox 2016 recommended graphics hardware] (probably) for 3D modeling, sculpting, and rendering stills and movie sequences

Other: Agisoft Photoscan (for photogrammetry), Skanect, meshlab

I don't plan to use this workstation for gaming. I do want a dual screen, HD display setup if my budget allows!

The ideal budget is $1600 [$1200 after monitor] but I'd like to build something that will last for over 5 years, with upgrades/replacement parts.

Strockrow
Jan 30, 2005
Greatest of all the Rows!

dbcooper posted:

Greetings you mavens of monitors, you connoisseurs of colors, I seek guidance. Anyone have experience with 10-bit color monitors, GPUs and/or related software issues?

Background:

My nephew is doing a new PC build (PC Building thread post). He's not planning on gaming. He is practicing and studying photography and plans to do film, 3D rendering, editing as well with this PC.

He wants "both a wide color gamut and a wide 10-bit depth (?? may be using wrong term, I mean the range from darkest to lightest pixel)"

He likes the Dell UltraSharp 24 Monitor with PremierColor - U2413 monitor for photo editing

Based on his software preferences (see below), it appears that only certain adapter cards support 10-bit color either as an output signal or as part of the software editing process (uncertainty is mine).

The newest GPUs (those being released or about to be released now) are supposedly going to offer 10-bit color as part of the newest DisplayPort standard. The older Nvidias, outside of the professional Quadro line, very much do not and never have. Much is made of 10-bit this and that, but it matters way less than color gamut, black levels, and contrast ratio for the quality of a given monitor. His OS won't be in 10-bit and only a handful of programs support it. If he's doing photography there's a good chance his work will be 12-bit anyway, so he could still get banding. That's a pretty decent monitor and should serve him alright if he's a student. He'll need some sort of colorimeter, and the X-Rite i1Display Pro is more or less the entry-level standard for a good color profiler.

If he plans to use this as a reference monitor for video, he should know that he will not be able to get it up to professional standards without a significantly larger expenditure of time, effort, and money. You need some sort of breakout box or card (like a Blackmagic Intensity or DeckLink or something), either an external LUT box or an internal hardware calibrator in the monitor, and a working knowledge of DisplayCAL to get it looking the way you want it to. This http://www.bhphotovideo.com/c/product/1246461-REG/eizo_cs2420_bk_cs2420_24_16_10_ips.html is an entry-level monitor for that sort of thing. Getting it to look juuuuuust right takes some time and specialized software to create the internal LUT that you use for the display. If you want all of that done for you, get a 19-inch Flanders Scientific. It costs about 2500 dollars.

He does not need these fancy things. He should be aspiring to good enough, which is a decent monitor that covers Adobe RGB at 6500K and a color profiler. He should edit video in an sRGB color profile when he needs to do that (it's almost the same as the standard for color on broadcast TV). Do not worry so much about the graphics card.

Strockrow fucked around with this message at 22:22 on Jun 6, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I'd strongly consider a Dell P2715Q. The calibration is pretty good from the factory but you'll also want a colorimeter for long-term corrections.

If he wants to save his budget for more important stuff, Dell has a refurb outlet store and they run like 35-40% off coupons very frequently.

Paul MaudDib fucked around with this message at 23:46 on Jun 6, 2016

Elentor
Dec 14, 2004

by Jeffrey of YOSPOS

dbcooper posted:

Greetings you mavens of monitors, you connoisseurs of colors, I seek guidance. Anyone have experience with 10-bit color monitors, GPUs and/or related software issues?

Background:

My nephew is doing a new PC build (PC Building thread post). He's not planning on gaming. He is practicing and studying photography and plans to do film, 3D rendering, editing as well with this PC.

He wants "both a wide color gamut and a wide 10-bit depth (?? may be using wrong term, I mean the range from darkest to lightest pixel)"

He likes the Dell UltraSharp 24 Monitor with PremierColor - U2413 monitor for photo editing

Based on his software preferences (see below), it appears that only certain adapter cards support 10-bit color either as an output signal or as part of the software editing process (uncertainty is mine).

If he's going super serious on this stuff, why Mudbox and not ZBrush? Unless I'm missing something here (like Mudbox becoming important for architecture and I didn't know), ZBrush as far as I know is the industry standard.

If he wants a video card that outputs 10-bit then he'll need a 1070 or 1080, or a Quadro (super expensive). Even if he doesn't get one, a 10-bit monitor is going to be super important so that he won't get awful banding after calibrating it, which is something you can't have when doing photography. Banding is bad enough as it is in wide gamut.
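The banding point is just quantization arithmetic: a wide-gamut panel stretches the same number of per-channel steps across a larger range of colors, so each step gets bigger and more visible, and moving from 8-bit to 10-bit shrinks the steps by 4x. A toy illustration (the 1.4x gamut span is a made-up round number for the example, not a measured Adobe RGB figure):

```python
def step_size(gamut_span, bits):
    """Size of one quantization step when a color range of `gamut_span`
    (arbitrary units) is divided among 2**bits levels per channel."""
    return gamut_span / (2 ** bits - 1)

# Treat sRGB as span 1.0 and a wide (Adobe RGB-ish) gamut as roughly 1.4x:
print(round(step_size(1.0, 8), 5))   # -> 0.00392  sRGB, 8-bit
print(round(step_size(1.4, 8), 5))   # -> 0.00549  wide gamut, 8-bit: coarser steps, more banding
print(round(step_size(1.4, 10), 5))  # -> 0.00137  wide gamut, 10-bit: steps shrink ~4x
```

So 8-bit wide gamut has visibly coarser gradients than 8-bit sRGB, and 10-bit is what buys the headroom back.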


kimcicle
Feb 23, 2003

kimcicle posted:

I've been perfectly happy with my Dell 2408, but I have a feeling it might be on its last legs. Sometimes it fails to power up on the first or second try. Are the Dell UltraSharps still the top dog in terms of 24" monitors? Has monitor technology advanced far enough that my almost 8-year-old monitor is now outdone by cheaper models?

I'm assuming that if I ran out and got whatever new ultrasharp model in 24" is out now that I'll still be happy, but if there's a consensus as to a better / cheaper model I'm all ears.

So now my 2408 doesn't turn on anymore. I made the mistake of playing on a 144Hz monitor, and now I think I'm willing to forgo the IPS panel for the buttery smoothness.

Poking around, it seems that Amazon has the Dell 27" monitor for a decent discount, making it around $540 or so after taxes. Is there any reason to bump up to the ASUS 27" gaming monitor?

http://amzn.com/B0149QBOF0
