Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I have to build a video wall for an expo. I have nine 1080p HDTVs (3x3 configuration) and a rolling frame meant to hold them.

I took a look at video wall processor units, but they're like $10,000 each and complicated/proprietary.

Can I build a PC with two or three midrange GPUs so I have a total of nine HDMI out, and use some kind of trickery so Windows treats it as a single display for full-screen video purposes? Will a bunch of graphics cards be able to drive a 1080p or 4k video across all the displays like that?

penus penus penus
Nov 9, 2014

by piss__donald

Zero VGS posted:

I have to build a video wall for an expo. I have nine 1080p HDTVs (3x3 configuration) and a rolling frame meant to hold them.

I took a look at video wall processor units, but they're like $10,000 each and complicated/proprietary.

Can I build a PC with two or three midrange GPUs so I have a total of nine HDMI out, and use some kind of trickery so Windows treats it as a single display for full-screen video purposes? Will a bunch of graphics cards be able to drive a 1080p or 4k video across all the displays like that?

Ha, this is something I was "lead technician" on for a company. Unfortunately, all I know is how to set up the real deal (NEC seamless in particular), so I won't be much help for alternatives. However (!), a friend of mine who still works there was telling me how some plain-Jane (new) 46-inch Sony TVs could inherently daisy-chain into a pseudo video wall. That's what I'd look for if you have any kind of budget for new monitors and there's no software alternative. The problem I see is that most multi-monitor solutions try to drive all those pixels rather than display a single 1080p image across 9 monitors, but that seems like it should be fairly simple.

You could get really lucky and the monitors you have might already have that feature. Luckily, 3x3 is a reasonable number too. Keywords to search for are tile/matrix monitors and digital signage.
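For canned video, you can also just pre-split the clip and give each TV its own file. A rough sketch with Python driving ffmpeg's crop filter (the source filename is made up; assumes ffmpeg is installed and on your PATH):

```python
# Cut one 1920x1080 clip into nine 640x360 tiles for a 3x3 wall.
# Each TV scales its tile back up to fill the panel.
import subprocess

SRC = "wall_loop.mp4"  # hypothetical source clip
COLS, ROWS = 3, 3
W, H = 1920, 1080
TW, TH = W // COLS, H // ROWS  # 640x360 per tile

for row in range(ROWS):
    for col in range(COLS):
        x, y = col * TW, row * TH
        # crop=w:h:x:y is a standard ffmpeg video filter
        subprocess.run(
            ["ffmpeg", "-y", "-i", SRC,
             "-filter:v", f"crop={TW}:{TH}:{x}:{y}",
             f"tile_r{row}_c{col}.mp4"],
            check=True,
        )
```

Keeping nine players frame-synced over a whole expo day is the part I'd actually test before trusting it.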

penus penus penus fucked around with this message at 19:28 on Jul 20, 2015

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Zero VGS posted:

I have to build a video wall for an expo. I have nine 1080p HDTVs (3x3 configuration) and a rolling frame meant to hold them.

I took a look at video wall processor units, but they're like $10,000 each and complicated/proprietary.

Can I build a PC with two or three midrange GPUs so I have a total of nine HDMI out, and use some kind of trickery so Windows treats it as a single display for full-screen video purposes? Will a bunch of graphics cards be able to drive a 1080p or 4k video across all the displays like that?

This guy did something similar:

http://www.wsgf.org/forums/viewtopic.php?t=26515

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Zero VGS posted:

I have to build a video wall for an expo. I have nine 1080p HDTVs (3x3 configuration) and a rolling frame meant to hold them.

I took a look at video wall processor units, but they're like $10,000 each and complicated/proprietary.

Can I build a PC with two or three midrange GPUs so I have a total of nine HDMI out, and use some kind of trickery so Windows treats it as a single display for full-screen video purposes? Will a bunch of graphics cards be able to drive a 1080p or 4k video across all the displays like that?

Do the displays have DisplayPort I/O? If so, I think you can daisy-chain them together.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Huh, some guy figured it out with Eyefinity in 2010:

https://www.youtube.com/watch?v=GTzTKBKRAwg

lol, here's a 1x9 config that puts our guys to shame:

https://www.youtube.com/watch?v=QiYDbdHB548

Zero VGS fucked around with this message at 20:56 on Jul 20, 2015

mpyro
Feb 9, 2003

'Cause I live and breathe this Fillydelphia freedom
Would I be better off with a 970 or a 390 w/8GB RAM? They cost about the same. I'll be doing VR with them at some point.

Anime Schoolgirl
Nov 28, 2002

I'd take the 970 unless you are also doing workstation things.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

mpyro posted:

Would I be better off with a 970 or a 390 w/8GB RAM? They cost about the same. I'll be doing VR with them at some point.

Get the 970. Since the Rift and Vive both don't have a particularly high resolution, they'll have a very hard time filling up 4GB (or 3.5GB) of VRAM. You want all the raw FPS you can get, since both headsets are supposed to run at 90Hz. Nvidia was also the first of the two to claim VR-specific latency improvements, and they tend to do higher-quality and more timely driver work, so I'd invest there.

mpyro
Feb 9, 2003

'Cause I live and breathe this Fillydelphia freedom
Thank you.
How good is Gigabyte for the 970? I see an open-box one at a local Micro Center for $288.
Edit: never mind, that Gigabyte is crap.

Where might I find a fairly inexpensive 970? eBay, or...?

mpyro fucked around with this message at 00:39 on Jul 21, 2015

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD
EVGA B-stock is probably the cheapest when they have stock.

sauer kraut
Oct 2, 2004
EVGA B-Stock. Make sure the part number includes -397X-; the old 2000s are not good.
http://www.evga.com/Products/ProductList.aspx?type=8&family=GeForce+900+Series+Family

Or SA Mart.

Nullsmack
Dec 7, 2001
Digital apocalypse

goodness posted:

I vaguely recall seeing an external GPU at a friend's awhile ago, but I could be wrong? And I didn't know all Macs were unable to change their GPU. That is pretty dumb. It is his PC though, so I was just trying to help him a bit.

You have two options:
http://gizmodo.com/a-wonderful-lunatic-turned-a-macbook-air-into-a-badass-967800593
or
http://lg.io/2015/07/05/revised-and-much-faster-run-your-own-highend-cloud-gaming-service-on-ec2.html

From the earlier discussion it sounds like setting up the equipment to do the external GPU is glitchy.
Running something via Amazon EC2 might work.
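If anyone tries the EC2 route from that second link, launching the GPU box is the easy part. A sketch with boto3 (the AMI ID and key name below are placeholders, not real values):

```python
# Spin up one GPU instance for remote-gaming experiments.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.run_instances(
    ImageId="ami-00000000",     # hypothetical Windows AMI with GPU drivers baked in
    InstanceType="g2.2xlarge",  # GRID K520-backed GPU instance type
    KeyName="my-keypair",       # hypothetical key pair name
    MinCount=1,
    MaxCount=1,
)
print(resp["Instances"][0]["InstanceId"])
```

The article's real work is in the streaming and driver setup; the instance itself is basically a one-liner like this.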

Bleh Maestro
Aug 30, 2003
They don't have other 970s at your Micro Center? You might want to hold off for a bit anyway; they'll be bundled with a new game promo for MGS5 soon.

mpyro
Feb 9, 2003

'Cause I live and breathe this Fillydelphia freedom

Bleh Maestro posted:

They don't have other 970s at your Micro Center? You might want to hold off for a bit anyway; they'll be bundled with a new game promo for MGS5 soon.

Would it be better to buy it from there or from Newegg or Amazon?

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
So, I rearranged my case a bit, and my 970 is now idling at 42 degrees. Thankfully it didn't even require any real change to my cable management.

SlayVus
Jul 10, 2009
Grimey Drawer
Ambient temperature of 23.3C. GPU idle temp with idle clocks: 29C. At full 3D clocks while idle: 35-36C. Fan running at 50% at idle.

Fan at 0%, idle clocks: 31C.

Case is a C70 with a side panel that does not have the fan holes drilled into it (thanks, MNPCTech).

SlayVus fucked around with this message at 02:38 on Jul 21, 2015

Google Butt
Oct 4, 2005

Xenology is an unnatural mixture of science fiction and formal logic. At its core is a flawed assumption...

that an alien race would be psychologically human.

Which 980ti is the one to get?

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
I don't know, but EVGA will let you pay more than a new Titan X for one if you want:

http://www.evga.com/articles/00944/EVGA-GeForce-GTX-980-Ti-KINGPIN/

Kazinsal
Dec 13, 2011



...The gently caress is "ASIC Quality"? GPU-Z says my 290 is 76.1%.

I'm assuming since 76+% is something EVGA wants people to pay more to guarantee for the 970, it means I could theoretically push some pretty nice clocks out of this thing.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Google Butt posted:

Which 980ti is the one to get?

http://pcpartpicker.com/part/msi-video-card-gtx980tigaming6g

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Kazinsal posted:

...The gently caress is "ASIC Quality"? GPU-Z says my 290 is 76.1%.

I'm assuming since 76+% is something EVGA wants people to pay more to guarantee for the 970, it means I could theoretically push some pretty nice clocks out of this thing.

Some people say it means the higher the ASIC quality, the less voltage the card needs and the lower the temps, but that's not always the case, and honestly it's like voodoo magic. It's not an EVGA thing; they're just the first company I've seen market it.

Kazinsal
Dec 13, 2011



Don Lapre posted:

Some people say it means the higher the ASIC quality, the less voltage the card needs and the lower the temps, but that's not always the case, and honestly it's like voodoo magic. It's not an EVGA thing; they're just the first company I've seen market it.

I figured it was a general GPU thing since my 290 is an XFX, but... I mean, good to know it's mostly voodoo, but I wonder if a 76% ASIC quality helps this thing run stable at 1150 MHz (e: granted, with VRM temps that make me want to scream when the core's at 93C, but I cap my framerate to 60 in games anyway).

Kazinsal fucked around with this message at 04:53 on Jul 21, 2015

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
What the heck does ASIC mean in this context? I might be dumb, but I don't see how "application-specific IC" fits into that context.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Watermelon Daiquiri posted:

What the heck does ASIC mean in this context? I might be dumb, but I don't see how "application-specific IC" fits into that context.

They test the ASIC on the cards and then sell them for more money if it's a higher number.

A higher number COULD mean better overclocks.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Anime Schoolgirl posted:

The R7 250/GT 740 would be the perfect cards at that price point. I'd lean towards the GT 740 for the better GPU oomph, because the 250's compute advantage isn't something I see coming up very often. Shame that for each of them you have one awful brand, since they're the only ones bothering with low profile: XFX and Zotac.

Not true; I found an EVGA low-profile 740 w/2GB GDDR5. But I guess the card would be rock-bottom price without useless memory inflating the cost.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.

Don Lapre posted:

They test the ASIC on the cards and then sell them for more money if it's a higher number.

A higher number COULD mean better overclocks.

What are they measuring? Efficiency?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Watermelon Daiquiri posted:

What are they measuring? Efficiency?

Apparently it measures voltage leakage.

This is from GPU-Z:

[GPU-Z screenshot of the ASIC Quality readout]

It's not necessarily true for all cards, though.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
:psyduck: that's a weird as gently caress way of using it, then. Why not just refer to the jitter or something?

Lord Windy
Mar 26, 2010
When can I expect my Zotac 560 Ti to stop being good enough? I thought it was a while ago, since I'd stopped playing video games, but I bought that Alien game on sale and at 1080p it runs very well on ultra graphics. Now I've found out that Vulkan and DX12 are going to be patched in soonish.

I look at the gaps between the 560 Ti and the newer graphics cards, but it all seems to work better than I would ever have hoped.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
A 560 Ti is pretty close to obsolescence now; there just isn't enough memory or performance for newer titles. If you don't plan on playing anything newer, a 560 Ti is fine at 1080p for games from 2013 and earlier. If you plan to get a higher-resolution monitor or want to play games newer than 2013, you'll need to look at an R9 280, R9 280X, R9 290, or GTX 970. I don't really want to suggest a GTX 960 due to poor price/perf, but it's your money. All of this depends on your PSU though: age, 12V rail, efficiency, and brand are important factors in whether it might need replacing.

If you're perfectly happy with current performance, I'd recommend a 750 Ti: slightly better performance than the 560 Ti at stock, a good overclocker if you get one with a 6-pin, and it consumes much less power, making it a safer bet without knowing your PSU.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Don Lapre posted:

Apparently it measures voltage leakage.

This is from GPU-Z:

[GPU-Z screenshot of the ASIC Quality readout]

It's not necessarily true for all cards, though.
Uhhh, as a person who knows a lot about binning and voltages and things like that, I'm pretty sure this is total nonsense. There's no way to determine this outside of hugely complicated test suites that have lower-level access to the GPU than you get in Windows (and the results aren't burned into the BIOS or anything, afaict).

BurritoJustice
Oct 9, 2012

All this talk of ASIC quality reminds me of my 570 that had a 97% ASIC quality and still needed additional voltage to be stable at stock clocks later in its life.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Professor Science posted:

Uhhh, as a person who knows a lot about binning and voltages and things like that, I'm pretty sure this is total nonsense. There's no way to determine this outside of hugely complicated test suites that have lower-level access to the GPU than you get in Windows (and the results aren't burned into the BIOS or anything, afaict).
Binning results are definitely burned into the ASIC; they have to be, since they determine the DVFS (Dynamic Voltage and Frequency Scaling, I know you know this but for other peeps) tables/curves that get applied. I think they would also need to be available for applications to query on systems that do software-interactive frequency/voltage control, so they'd be required on Boost-enabled GPUs and smartphone SoCs*, but not on CPUs. GPU-Z is just querying this value and mapping every manufacturer's internal values to a % using magic bullshit, but the underlying data should be real. I wonder if this is still available on Fury, since AMD is doing Steamroller-style adaptive clocking? I think so, because I believe they still do both software-interactive and hardware-controlled scaling to maximize frequency, with the adaptive clocking basically exploiting their built-in overclocking margin.

*You can read the bin of your smartphone SoC via CPU-Z; your kernel contains voltage-frequency tables for each bin, and the appropriate one is applied based on the value returned. In general, higher-priced phones have better bins; the OnePlus One, for example, tends to have lower-tier bins for its reduced price.
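On Linux you can at least peek at the software-visible half of this, the DVFS operating points the governor picks from; the fused bin value itself generally isn't exposed to userspace. A sketch using the standard cpufreq sysfs files (availability varies by driver):

```python
# Print the CPU frequency table the kernel's DVFS governor works from.
# This shows operating points only; the per-chip bin/fuse data that
# selects the voltage curve is not readable from here.
from pathlib import Path

cpufreq = Path("/sys/devices/system/cpu/cpu0/cpufreq")
for name in ("cpuinfo_min_freq", "cpuinfo_max_freq",
             "scaling_available_frequencies"):
    f = cpufreq / name
    if f.exists():  # scaling_available_frequencies needs a supporting driver
        print(name, "=", f.read_text().strip())
```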

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Now that I have stopped laughing at the prices on the Kingpin 980 Ti, I've remembered that there's a small business in the US that does a similar thing, binning Intel 4790Ks by how high the CPUs overclock in practice, which is a little less voodoo than ASIC quality.

The guy's prices scale in very much the same proportions as EVGA is using here, which leads me to believe that scarcity and perceived value are calculated with a formula I'm not privy to, because I am a money-hating layman.

Lord Windy
Mar 26, 2010

FaustianQ posted:

A 560 Ti is pretty close to obsolescence now; there just isn't enough memory or performance for newer titles. If you don't plan on playing anything newer, a 560 Ti is fine at 1080p for games from 2013 and earlier. If you plan to get a higher-resolution monitor or want to play games newer than 2013, you'll need to look at an R9 280, R9 280X, R9 290, or GTX 970. I don't really want to suggest a GTX 960 due to poor price/perf, but it's your money. All of this depends on your PSU though: age, 12V rail, efficiency, and brand are important factors in whether it might need replacing.

If you're perfectly happy with current performance, I'd recommend a 750 Ti: slightly better performance than the 560 Ti at stock, a good overclocker if you get one with a 6-pin, and it consumes much less power, making it a safer bet without knowing your PSU.

I basically don't play video games anymore, but I am a programmer and really interested in learning more OpenCL. Would the 270X/280 be a good fit? I can only fit one GPU on my mobo. I'd also get a 280X, but they're only up for stupid prices on eBay and I can't see them on Umart/MSY.

One Eye Open
Sep 19, 2006
Am I awake?

Lord Windy posted:

I basically don't play video games anymore, but I am a programmer and really interested in learning more OpenCL. Would the 270X/280 be a good fit? I can only fit one GPU on my mobo. I'd also get a 280X, but they're only up for stupid prices on eBay and I can't see them on Umart/MSY.

Here are some benchmarking results.

Also, something to bear in mind is that nVidia have only just started supporting OpenCL 1.2, whereas AMD cards support OpenCL 2.0 at the moment. There are rumours that nVidia may support 2.0 later this year, but I don't know how reliable that is.
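If you just want to start poking at OpenCL, PyOpenCL is probably the fastest on-ramp on any of those cards. A minimal vector-add sketch (assumes pyopencl and numpy are installed):

```python
# Minimal OpenCL "hello world": add two float vectors on the GPU.
import numpy as np
import pyopencl as cl

a = np.random.rand(50000).astype(np.float32)
b = np.random.rand(50000).astype(np.float32)

ctx = cl.create_some_context()  # prompts for a device if there are several
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
```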

Josh Lyman
May 24, 2009


I have a 290 so I don't expect to upgrade anytime soon, but I also have a Korean dual-DVI 27" monitor. Should I assume that any card I'll upgrade to in the next few years will only have 1x DVI, and that any monitors I buy going forward must have DisplayPort as opposed to DVI/HDMI?

I've been slowly catching up on this thread since page 375 or so. I love this thread so much :allears:.

penus penus penus
Nov 9, 2014

by piss__donald

Josh Lyman posted:

I have a 290 so I don't expect to upgrade anytime soon, but I also have a Korean dual-DVI 27" monitor. Should I assume that any card I'll upgrade to in the next few years will only have 1x DVI, and that any monitors I buy going forward must have DisplayPort as opposed to DVI/HDMI?

I've been slowly catching up on this thread since page 375 or so. I love this thread so much :allears:.

AMD is moving away from DVI; Nvidia doesn't show signs of it yet as far as I know, but in a few years? I wouldn't be shocked if it dropped off at least reference cards. However, I'd expect aftermarket companies to hang onto it for longer than the reference designs do.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Alereon posted:

Binning results are definitely burned into the ASIC; they have to be, since they determine the DVFS (Dynamic Voltage and Frequency Scaling, I know you know this but for other peeps) tables/curves that get applied. I think they would also need to be available for applications to query on systems that do software-interactive frequency/voltage control, so they'd be required on Boost-enabled GPUs and smartphone SoCs*, but not on CPUs. GPU-Z is just querying this value and mapping every manufacturer's internal values to a % using magic bullshit, but the underlying data should be real. I wonder if this is still available on Fury, since AMD is doing Steamroller-style adaptive clocking? I think so, because I believe they still do both software-interactive and hardware-controlled scaling to maximize frequency, with the adaptive clocking basically exploiting their built-in overclocking margin.

*You can read the bin of your smartphone SoC via CPU-Z; your kernel contains voltage-frequency tables for each bin, and the appropriate one is applied based on the value returned. In general, higher-priced phones have better bins; the OnePlus One, for example, tends to have lower-tier bins for its reduced price.

Pretty sure that, if anything, Fury voltages are going to be even tighter to the chip's requirements. That adaptive clocking is just so they can take the voltage a bit lower, because then the voltage only has to be above the requirement, rather than above it by a safety margin to cover any voltage dips.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

xthetenth posted:

Pretty sure that, if anything, Fury voltages are going to be even tighter to the chip's requirements. That adaptive clocking is just so they can take the voltage a bit lower, because then the voltage only has to be above the requirement, rather than above it by a safety margin to cover any voltage dips.
Another way of looking at it, though, is that adaptive clocking lets them clock the processor beyond what would normally be stable at a given voltage.
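Toy numbers to make the margin argument concrete (made up for illustration, not AMD's actual figures):

```python
# Why adaptive clocking saves voltage: hypothetical numbers only.
v_req   = 1.000  # volts actually needed at the target clock
v_droop = 0.050  # worst-case transient dip on the rail

fixed    = v_req + v_droop  # fixed margin: always pad for the worst dip
adaptive = v_req + 0.010    # adaptive: small pad, stretch clocks during a dip

saved_mv = (fixed - adaptive) * 1000
print(f"fixed: {fixed:.3f} V, adaptive: {adaptive:.3f} V, saved ~{saved_mv:.0f} mV")
```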
