Ramadu
Aug 25, 2004

2015 NFL MVP


Would a GTX 570 be able to drive two of those Korean 27" monitors? I would probably only game on one of them because I like to have a separate monitor for all my other stuff. I'd just like to know before I pull the trigger on two of them and discover my video card can't keep up. I'm currently gaming at 1680x1050, so it would be a huge upgrade.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
In terms of frames per second, it'd be a compromise. Lowered details, no AA, etc. You might even consider just setting everything to 1920x1080 with high details and letting the video card scale it up.

In terms of making the monitors show images, as long as the card has two dual-link DVI ports, it'll be fine.

Berk Berkly
Apr 9, 2009

by zen death robot
GeForce 306.02 BETA Drivers are out:

http://www.geforce.com/drivers

Edit: Fixed so it wouldn't point you directly to the XP drivers. Thanks for the heads-up.

Berk Berkly fucked around with this message at 07:47 on Aug 28, 2012

Ramadu
Aug 25, 2004

2015 NFL MVP


Factory Factory posted:

In terms of frames per second, it'd be a compromise. Lowered details, no AA, etc. You might even consider just setting everything to 1920x1080 with high details and letting the video card scale it up.

In terms of making the monitors show images, as long as the card has two dual-link DVI ports, it'll be fine.

Yeah, that's what I was planning on doing. Since I already game at such a low resolution (comparatively) I was just going to go to 1080p and leave it there. I just wanted to make sure I had enough juice to make the monitors go. I'm perfectly willing to go without AA and turn some details down. Thanks for the help buddy.

Aquila
Jan 24, 2003

Ramadu posted:

Yeah, that's what I was planning on doing. Since I already game at such a low resolution (comparatively) I was just going to go to 1080p and leave it there. I just wanted to make sure I had enough juice to make the monitors go. I'm perfectly willing to go without AA and turn some details down. Thanks for the help buddy.

I have a 560ti 448 and it drives games at 2560x1440 acceptably. I don't think I ever have to turn AA completely off, but I rarely max all my settings either. In general, life with that graphics card and two of the Korean 27" monitors is great, so a 570 should be good as well.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down


Just a friendly note, that links to the XP drivers - here's a link to the page where you can enter your appropriate OS and whether x86 or x64. Good drivers, basically just what we had already anticipated given the dev drivers 305.67/305.68: rolling improved functionality and 660Ti support into one driver, with additional profiles for big releases.

Notably, the driver team was concerned that the 305.67/305.68 drivers didn't have support for Sleeping Dogs, even went on their biggest partner's forum and stated that they should have been included, and that they will be in the next driver release - and then gave manually enterable compatibility bits for SLI etc, which is a little weird for the dev drivers but not totally unheard of. If you know about the dev drivers then presumably you know how to manually edit the driver bits and can take care of that stuff yourself, but apparently they were supposed to make it into the 305.67/.68 release candidates and didn't, oops. They're here now.

... which makes me want to play it, if nVidia are specifically concerned and want to be sure their cards run it correctly it's probably a game worth playing. Although that's also confirmation bias since it looks pretty awesome and I'd like to pick it up. :v:

Agreed fucked around with this message at 06:30 on Aug 28, 2012

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
Amusingly, Sleeping Dogs is an AMD-branded joint. The game is a surprise sleeper hit (and legitimately fun). If Nvidia is going that far to make it work, then they must have dropped the ball in marketing and are attempting to save face.

It really is the kind of game you want to show off a video card with.

fake edit: it fixed some noticeable stuttering issues when locking in half-rate vsync.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Factory Factory posted:

And before you say "But Australia!" I priced it out on Mwave and it's still ballpark.

Your pricing would be reasonable if I were interested in buying midrange, value-for-money stuff the way everyone on SA is obsessed with. I'm not. Value for money is irrelevant to me. I simply don't care about spending twice as much for a 10% performance improvement. When it comes to buying things, I have a much simpler philosophy: I identify the best-performing products, then I buy them. If I can't afford to do that, I hold off my purchase until I can.

Either way, I'm not likely to do anything for a year, and I feel like I'm beginning to derail the thread so I think it's time to drop it and return to talking about graphics cards.

Speaking of which, any word on when AMD is going to return fire with the 8xxx series? I remember reading that they wanted to get them out by the end of the year.

Berk Berkly
Apr 9, 2009

by zen death robot

The Lord Bude posted:

Your pricing would be reasonable if I were interested in buying midrange, value-for-money stuff the way everyone on SA is obsessed with. I'm not. Value for money is irrelevant to me. I simply don't care about spending twice as much for a 10% performance improvement. When it comes to buying things, I have a much simpler philosophy: I identify the best-performing products, then I buy them. If I can't afford to do that, I hold off my purchase until I can.

This is an especially terrible philosophy for buying tech. If you can't afford things now, it's at least partially because you've already spent twice as much for that 10% performance improvement multiple times before.

If you had nigh-unlimited funds your purchasing strategy might make sense.

There is a good reason why goons put such an emphasis on price/performance, even as high-end gamers. We know our poo poo is eventually going to be outdated, outperformed, and obsolete, and paying a premium now barely slows that inevitability down.

quote:

Speaking of which, any word on when AMD is going to return fire with the 8xxx series? I remember reading that they wanted to get them out by the end of the year.

Optimistically in time for Christmas shopping, but more likely in the first quarter of 2013, to take advantage of the time before Nvidia can bring a 700-series Kepler refresh online.

Berk Berkly fucked around with this message at 08:24 on Aug 28, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

The Lord Bude posted:

Your pricing would be reasonable if I were interested in buying midrange, value-for-money stuff the way everyone on SA is obsessed with.

What? Wanting to spend your money sensibly is an obsession? gently caress it, you're right, why bother giving good advice.

Go spec out an Alienware to the highest extent, with 32GB RAM and install XP 32 bit!

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Berk Berkly posted:

Optimistically in time for Christmas shopping, but more likely in the first quarter of 2013, to take advantage of the time before Nvidia can bring a 700-series Kepler refresh online.

Do you think Nvidia will just come out with a GTX 685 or something? It always seemed as though, despite being a performance powerhouse, the 680 was a very conservative GPU... I bet Nvidia could have made a single GPU that takes performance to near-690 levels if they wanted to.

Every week I set aside $25. Every 3 years I use that set aside money to build a new PC from scratch (obviously not including monitor/peripherals which I replace whenever the mood hits me). Broken down like that, it isn't a large amount of money week by week, I'd spend at least twice that each week just on coffee.

The only reason I ever considered upgrading something outside that timeframe is because my monitor died earlier this year, which necessitated a premature jump to a 2560x1440 screen, which in turn put excessive pressure on what was otherwise a very powerful graphics card. While I could use some of the money I have saved up to do a full refresh now, I'd rather wait till next year so I can do it properly.

The Lord Bude fucked around with this message at 09:25 on Aug 28, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

The Lord Bude posted:

Do you think Nvidia will just come out with a GTX 685 or something? It always seemed as though, despite being a performance powerhouse, the 680 was a very conservative GPU... I bet Nvidia could have made a single GPU that takes performance to near-690 levels if they wanted to.

GK110 is basically that - 2880 Kepler shaders at 900 MHz with a 384-bit * 6 GHz memory bus @ 250W TDP. But it's also literally double the size of the 680's GK104 GPU, and so ludicrously expensive to make (as a larger percentage of chips will have flaws and imperfections, and there will be fewer saleable chips per wafer because they're so big). Thus it's destined to be a Tesla part first and foremost, where profit margins are obscene. Maybe if we're beyond lucky, we'll get the harvested dregs as a GeForce card, but probably only after they allocate the only-slightly-bad chips for Quadros. But with TSMC's 28nm process improving, don't count on a lot of multiple-flaw chips to make this happen.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Factory Factory posted:

GK110 is basically that - 2880 Kepler shaders at 900 MHz with a 384-bit * 6 GHz memory bus @ 250W TDP. But it's also literally double the size of the 680's GK104 GPU, and so ludicrously expensive to make (as a larger percentage of chips will have flaws and imperfections, and there will be fewer saleable chips per wafer because they're so big). Thus it's destined to be a Tesla part first and foremost, where profit margins are obscene. Maybe if we're beyond lucky, we'll get the harvested dregs as a GeForce card, but probably only after they allocate the only-slightly-bad chips for Quadros. But with TSMC's 28nm process improving, don't count on a lot of multiple-flaw chips to make this happen.

So do you think a GK110-based 690 would have been more expensive than the dual-GPU 690 that was released? It would have been quite a feather in Nvidia's cap to release a single-GPU card that performed as well as whatever AMD could have managed with a dual GPU, not to mention getting rid of driver issues and microstutter.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That's not an easy question for a Joe Schmoe nerd like me to answer.

Everything but the GPU itself would be cheaper on such a card. That would drive the price down.

The GPU itself... For the moment, to simplify, let's say that all the demand for Tesla and Quadro versions of the chip were satisfied, so there is no opportunity cost in putting a GK110 in a GeForce 690B.

The actual cost of the GPU has two significant parts: the actual marginal cost of production, and profit thereon earmarked to pay off the investments to design and validate the chip in the first place. As a first approximation, the design and validation investment is the same for each GPU of a given architecture, regardless of the die size. So we can treat that as a fixed part of the cost side of the equation.

The marginal cost of production, on the other hand, varies significantly with die size. Silicon wafers come in pre-defined sizes, and chips are ordered by the wafer. That means that, if we assume GK110 is twice the size of GK104, each wafer of GK104 produces twice as many chips as a wafer of GK110. So the same fixed per-wafer production cost results in more GPUs.

But lumped on top of that are manufacturing flaws. Semiconductor fabrication is not a perfect process, and flaws are scattered pretty much at random throughout the wafer. These flaws can be minor, such that part of the GPU can't hold up to high clock speeds, or they can be major, such that the entire logical section of the GPU containing the flaw is irreparably damaged. Since GK110 is twice the size of GK104, each GPU is more likely to receive a flaw, and more likely to receive two flaws, as well.

Side note in empirical reality: GK104 already has harvests for 1) full chip, 2) 1 SMX disabled, 3) 1 SMX and one RAM controller disabled, 4) 2 SMX and one RAM controller disabled. At this point, GF110 has had a number of harvests as well: 1) full chip, 2) 1 SMX and 1 memory controller disabled, 3) 2 SMX and 1 memory controller disabled, 4) 6 SMX(!) and 1 memory controller disabled (the GeForce 560 Ti OEM), and 5) 5 SMX and 1 memory controller disabled, plus severely hobbled clock speeds (GeForce 560 OEM). GF104 and GF114 also had a crazy number of cut-downs and harvests. So there are LOTS of flaws per wafer to go around.

The end result is that not only does GK110 produce fewer chips per wafer, but a smaller proportion of those chips are fully functional. So it's a lot more expensive to make the chips for the same number of GeForce 690s as it was to use two GK104s. Therefore, if you price each GK110-based 690B the same as a dual-GK104 690, you're actually making less money to go into paying off the design and validation investment.
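
To put rough numbers on why the big die hurts twice, here's a purely illustrative sketch: a 300mm wafer, a GK104-sized die of about 294mm², a hypothetical die exactly twice that size standing in for GK110, a made-up defect density, and a simple Poisson yield model - not real TSMC figures, just enough to show the shape of the problem.

code:

# Back-of-envelope only: made-up defect density, simple Poisson yield model,
# and a die exactly 2x GK104's ~294 mm^2 standing in for GK110.
import math

WAFER_DIAMETER_MM = 300.0
DEFECTS_PER_CM2 = 0.25  # illustrative, not a real 28nm figure

def gross_dies_per_wafer(die_area_mm2):
    """Crude dies-per-wafer estimate: wafer area over die area, minus an edge-loss term."""
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def flawless_fraction(die_area_mm2):
    """Poisson yield: chance a die picks up zero random defects."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100.0)

for name, area in (("GK104-ish", 294), ("double-size GK110-ish", 588)):
    gross = gross_dies_per_wafer(area)
    good = gross * flawless_fraction(area)
    print(f"{name}: ~{gross} dies/wafer, ~{good:.0f} flawless ({flawless_fraction(area):.0%})")

With those made-up numbers the double-size die gets less than half the gross dies per wafer and under a quarter of the flawless ones, which is the harvesting story in a nutshell.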

And then, after that, you're left with all these flawed GK110s lying around. What do you do with them? They aren't fit to be a GeForce 690B, but they're very expensive to just toss into an acid bath to recover some precious metals.

Well, of course, you can harvest them, sure. Add the GeForce 689, 688, 685, 685 (OEM), etc. etc. to the mix. Now you're getting money for these chips which would have been garbage otherwise! Right?

Not really. While some buyers may appreciate the higher performance and move up to a 685 where they would've been stuck at a 680 before, many more buyers looking at the 690/690B would suddenly have a lot of options that performed in the same ballpark, yet chopped a few hundred dollars off the price tag of the card. Cards which, except for the GPU itself, largely cost the same to produce.

So a 689 and a 690B cost almost exactly the same to produce, but because the 689 retails for $100 less, that's $100 less going into Nvidia's pocket to pay back the cost of developing the chip.

And God help you if AMD releases DoubleTahiti and forces you to cut your prices. There's a lot of room to cut prices on a video card before they start being unprofitable on the margins. But if you want to turn a profit in the long run, you better believe you need a healthy margin on the product.

And that's really the point in a nutshell: a GK110 690 would not command the same profit as a 2xGK104 690.

Of course, there's another side to the coin: GK110 is a revision of Big Kepler. A Big Kepler GPU of the same vintage as GK104 would be GK100. That GPU has not been productized; you don't see it anywhere in the marketplace, probably for the reasons above combined with a ton of other problems in getting the design to work within a reasonable power budget. So a single-GPU 690, released when the regular 690 was, would have required a GPU that didn't exist in a mature and validated state.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

The Lord Bude posted:

Your Pricing is reasonable if I was interested in buying midrange, value for money stuff the way everyone on SA is obsessed with. I'm not. Value for money is irrelevant to me. I simply don't care about spending twice as much for a 10% performance improvement. When it comes to buying things, I have a much simpler philosophy. I identify the best performing products, then I buy them. If I can't afford to do that, I hold off my purchase until I can.

But an i5/z77 setup IS the best, so :confused:

craig588
Nov 19, 2005

by Nyc_Tattoo
He's just trolling this thread, the Phenom was never the best performing option but for some reason he has one.

craig588 fucked around with this message at 16:18 on Aug 28, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

But an i5/z77 setup IS the best, so :confused:

Once someone says that everyone keeping up to date with high-performance stuff is wrong compared to that person's strategy of upgrading every 3 years with the highest-end stuff, that's when you can just call off the whole thing; nothing's going to change that mindset. It's self-affirming/confirmation bias to a degree, since I'm sure the 5970 has been a very solid performer since he got it, but that's more a product of games not being very demanding than anything else.

We did get a win, in that he will be looking into the GTX 670 (or 680, since... you know... extra hundred or more, slightly higher performance, GOTTA HAVE THE BEST!). That's about all we can hope for out of this, he doesn't accept the basic premise that a rather old AMD K10-based processor is holding him back because ___________. It's a wrap.

Verizian
Dec 18, 2004
The spiky one.
Went back a few pages and could only find people asking a similar question to mine, so I think I'll go ahead and order this Palit GTX670 Jetstream tomorrow if nobody posts a negative response by then. The three reviews I found on the first page of search results claim it's comparable to a stock 680, so it's a decent gamble, and UK consumer rights are pretty solid if it's in any way faulty.

Only thing is, they claim it can support 3x DVI displays using the included HDMI-DVI adaptor, but I thought you had to use an active DisplayPort converter for more than two DVI/HDMI displays? Aren't they usually on the same circuit?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

It looks like an interesting unit. I'm not a big fan of the gaudy super-bright LEDs-on-the-card thing, but the custom cooling solution seems like it should be effective and quiet (unless they REALLY MEAN IT when they say it's like "a turbojet", in which case send that sucker back; this generation of hardware from team green runs cool and quiet in reference designs, so there's no need to go noisier rather than quieter with a custom design).

As far as monitors go - bear in mind I have absolutely no experience with the unit at all, or any other card from the brand - they claim the following outputs:

DVI Port: 2 x Dual-Link DVI-I
HDMI Port: 1 x HDMI
DisplayPort: 1 x DisplayPort

Also of note: the manufacturer says it's a 1006MHz/1084MHz unit, which... I really don't know much about, to be honest; wish I could tell you more. Most 670s seem to have a reference base clock of 915MHz. Provided they've binned correctly and carefully, that should be a nice performance boost right out of the gate, though if they've binned very carefully and offer higher-end models as well, it could also mean that your overclocking headroom will be limited, since it's already got a solid 10% from the factory and not every card will push much farther than that (some won't even do that, which still causes issues with some 670s that are sold as factory-overclocked cards - this whole turbo thing with power thresholds limiting at TDP makes the chips tougher to test quickly and accurately).

Verizian
Dec 18, 2004
The spiky one.
Turns out a friend of mine bought one a week ago and he says it's pretty good connected to his old 19" monitor, 27" catleap and HDTV. :iiam:

On the downside this is a pretty cheaparse marketing video. How many rolling lensflares can you have from a cgi metal bezel?

https://www.youtube.com/watch?v=FDTbm_xWZh0

Not too worried about the lack of further overclocking as long as it performs well, and as for the blue LEDs, unfortunately my case is full of them so it's not like a couple more will make much difference.

Verizian fucked around with this message at 01:48 on Aug 29, 2012

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I posted this in the System Building thread but it's probably more applicable here:

So at work I'm trying to drive 12 monitors at once. This is going to be for viewing video feeds from cameras in the lab, or possibly time-lapse pictures, I'm not really sure, but I don't think it's going to be anything 3D. I found the ATI FirePro 2450 card that has 4 DVI out. They even have a model that does PCIe x1. So I'm looking at a Dell Optiplex 7010 that has one PCIe x16 3.0 slot, one PCIe x16 2.0 (physically x16 but wired at x4), and one PCIe x1 2.0. So I want to put a 2450 in each of the x16 slots and a third in the x1 slot.

Can anyone sanity check this? If there's a better way to do this I'm all ears. Currently they have some video distributor box that's never worked properly and I'd like to not make this thing a pile of wasted money.

The two cards I'm looking at specifically:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195083
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195089

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Verizian posted:

Not too worried about the lack of further overclocking as long as it performs well, and as for the blue LEDs, unfortunately my case is full of them so it's not like a couple more will make much difference.

Well, for me it's two things - aesthetically I sort of left the whole CCT and LED thing behind in about 2005, but if you're into it, do your thing, y'know?

More concerning is that spending money on cosmetics is not spending money on functional components. If you look at EVGA's reference and non-reference lineup, it's very much function first, then form - they've got a sleekness to them because they're something of a premium product but they aren't clearly designed to show off. A company that invests in showy stuff and sells a lower priced product does concern me because they've gotta be cutting costs in the first place to sell it at a lower price, it doesn't all come down to brand power - do the blue LEDs mean they decided they could toss out some redundant parts somewhere else that were involved in power delivery to even out the costs there?

Sounds :tinfoil: a bit, I guess, but take into account that major companies have lost standing over bean-counter bullshit like seeing just how little support the VRM stages can get (thus hurting the power delivery and stability of the card) and still function, because every part saved compared to the reference design is less money they have to spend and volume makes it meaningful enough that they paid people to pare it down. That's XFX, in that specific example. I don't really know anything about Palit, but I do know that good custom cooling isn't especially cheap and they've spent some money on LEDs as part of their parts list. I don't find that super encouraging from a brand that's a bit of a question mark in the first place. But it should be fine and as you note, you guys have real consumer protection laws so worst case scenario you deal with the return process and they can't just jack you over. Best case scenario, you can offer us your experiences and help fill in the blank with SA member experiences.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

FISHMANPET posted:

I posted this in the System Building thread but it's probably more applicable here:

So at work I'm trying to drive 12 monitors at once. This is going to be for viewing video feeds from cameras in the lab, or possibly time-lapse pictures, I'm not really sure, but I don't think it's going to be anything 3D. I found the ATI FirePro 2450 card that has 4 DVI out. They even have a model that does PCIe x1. So I'm looking at a Dell Optiplex 7010 that has one PCIe x16 3.0 slot, one PCIe x16 2.0 (physically x16 but wired at x4), and one PCIe x1 2.0. So I want to put a 2450 in each of the x16 slots and a third in the x1 slot.

Can anyone sanity check this? If there's a better way to do this I'm all ears. Currently they have some video distributor box that's never worked properly and I'd like to not make this thing a pile of wasted money.

The two cards I'm looking at specifically:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195083
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195089

I think that would do it. It's an older, slow-as-balls card, but that seems to be all you need.

Now, if you want something that takes fewer slots, has more oomph behind it if needed, and has newer, more flexible connectors, take a look at the FirePro W600. It was basically made for you. It's six mini-DisplayPort connectors on a single-slot card. That's $1100 of card instead of $750, though. Plus, if you already own the monitors and they don't have DP connectors, twelve times $25 for active DP->DVI adapters. Probably not worth it.
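
For what it's worth, here's the quick math behind that "probably not worth it" - a rough sketch using the dollar figures above, where reading "$1100 of card" as two W600s (since one card only drives six displays) is my assumption:

code:

# Rough totals from the figures quoted above. Assumes "$1100 of card" means
# two six-output W600s, and that the adapter line only applies if every
# existing monitor is DVI-only.
firepro_2450_route = 750               # three quad-DVI FirePro 2450s
w600_route = 1100 + 12 * 25            # two W600s + 12 active mDP->DVI adapters
print(firepro_2450_route, w600_route)  # -> 750 1400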

If this doesn't need to be done on any particular timeframe, then someday SoonTM there will be DisplayPort MST hubs available. Then you can break out six monitors from any $100 Radeon 7750 (either 1xDP->4xDP/DVI/HDMI + DVI + HDMI or 2xDP->6xDP/DVI/HDMI). MST hubs (or daisy-chaining DP monitors, for that matter) have been said to be imminent since 2010, though. And while AMD said they're making an extra-special effort to bring them to market this summer, they said that at the Radeon 7970 launch and summer is almost gone.

Gunjin
Apr 27, 2004

Om nom nom

FISHMANPET posted:

I posted this in the System Building thread but it's probably more applicable here:

So at work I'm trying to drive 12 monitors at once. This is going to be for viewing video feeds from cameras in the lab, or possibly time-lapse pictures, I'm not really sure, but I don't think it's going to be anything 3D. I found the ATI FirePro 2450 card that has 4 DVI out. They even have a model that does PCIe x1. So I'm looking at a Dell Optiplex 7010 that has one PCIe x16 3.0 slot, one PCIe x16 2.0 (physically x16 but wired at x4), and one PCIe x1 2.0. So I want to put a 2450 in each of the x16 slots and a third in the x1 slot.

Can anyone sanity check this? If there's a better way to do this I'm all ears. Currently they have some video distributor box that's never worked properly and I'd like to not make this thing a pile of wasted money.

The two cards I'm looking at specifically:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195083
http://www.newegg.com/Product/Product.aspx?Item=N82E16814195089

I'm slightly confused, is it 12 screens with different things on each, or are all 12 screens going to be showing the same thing?

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

Factory Factory posted:

I think that would do it. It's an older, slow-as-balls card, but that seems to be all you need.

Now, if you want something that takes fewer slots, has more oomph behind it if needed, and has newer, more flexible connectors, take a look at the FirePro W600. It was basically made for you. It's six mini-DisplayPort connectors on a single-slot card. That's $1100 of card instead of $750, though. Plus, if you already own the monitors and they don't have DP connectors, twelve times $25 for active DP->DVI adapters. Probably not worth it.

If this doesn't need to be done on any particular timeframe, then someday SoonTM there will be DisplayPort MST hubs available. Then you can break out six monitors from any $100 Radeon 7750 (either 1xDP->4xDP/DVI/HDMI + DVI + HDMI or 2xDP->6xDP/DVI/HDMI). MST hubs (or daisy-chaining DP monitors, for that matter) have been said to be imminent since 2010, though. And while AMD said they're making an extra-special effort to bring them to market this summer, they said that at the Radeon 7970 launch and summer is almost gone.

I'm in the enviable position of being given a huge pile of money but I have to spend it by tomorrow, so waiting is out of the question.

I think I can easily swing the card you posted along with an upgrade to a Precision workstation that has two PCIe 3.0 x16 slots.

Currently we have lovely TN monitors with terrible viewing angles, but they have DVI, though I'm thinking of upgrading to 2412Ms which do have DP. But isn't DP>DVI passive as long as the resolution is 1920x1200 or below?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Short version: No, it doesn't work like that.

A dual-mode DisplayPort with a passive adapter cable will basically "forward" an HDMI/DVI signal, but only if the card has a set of HDMI/DVI signal generators available to make that signal in the first place. The ports on the W600 aren't dual-mode; the card has no HDMI/DVI signal generators.

Converting DP to DVI actively is split up by whether the DVI connection is single-link (1920x1200 and below) or dual-link (higher resolutions). A single-link active adapter can be powered from the port itself, and so it's cheap. A dual-link active adapter requires extra power via USB because the conversion is much more involved, so it's more expensive.
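
In a nutshell, the adapter rules above boil down to something like this toy sketch (the function name and arguments are made up for illustration, not any real API; it just encodes the dual-mode distinction and the single-link cutoff described above):

code:

def dvi_adapter_needed(port_is_dual_mode, width, height):
    """Which adapter gets a DVI monitor running off a DisplayPort output?

    Encodes the rules above: a dual-mode (DP++) port can forward a native
    DVI/HDMI signal through a passive adapter; a pure DP port (like the
    W600's) always needs an active adapter, and the adapter type depends
    on whether the target resolution fits in single-link DVI.
    """
    if port_is_dual_mode:
        return "passive DP->DVI adapter"
    if width * height <= 1920 * 1200:
        return "active single-link adapter (cheap, powered by the port)"
    return "active dual-link adapter (pricier, needs extra USB power)"

print(dvi_adapter_needed(False, 1920, 1200))  # W600-style port, 1920x1200 monitor
print(dvi_adapter_needed(False, 2560, 1440))  # W600-style port, 2560x1440 monitor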

Oh, if you need the DVI adapters, AMD part number 199-999440 is a six piece mini-DP to DVI kit. It looks to be a hair under $20 per adapter, so an okay discount.

Factory Factory fucked around with this message at 03:31 on Aug 29, 2012

Gunjin
Apr 27, 2004

Om nom nom
If all the screens are showing the same feed, I think you're better off with a distribution amplifier, get a 1:12 VGA or DVI. If you have to feed multiple feeds into it get an inexpensive matrix switcher. Something like this:
http://www.bhphotovideo.com/c/product/630446-REG/Kramer_VP_411DS_VP_411DS_4x1_Computer_Graphics.html
that would let you switch between up to 4 VGA feeds leading to something like this:
http://www.bhphotovideo.com/c/product/521916-REG/Kramer_VP_12NHD_VP_12N_1_12_Computer_Graphics.html
which would feed out to 12 monitors via VGA.

If the monitors all need to show different feeds, then it would get more complicated and become something that needed to get a video engineer involved.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I'm pretty sure the cameras are all IP cameras, so it's just a matter of opening however many browser windows are necessary.

I'm not sure what you mean by needing a video engineer. What kind of inputs do you think I'm talking about?

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe
He means an electrical engineer. I wouldn't bother, though, as your planned setup is something completely different. This is the sort of thing the electrical engineers in the office would end up asking me about. Not that they'd need too much help, given most of the office has shifted to two or three screens already. I'm the only one that hasn't bothered, but that's because I'd probably need to replace my work PC.

Gunjin
Apr 27, 2004

Om nom nom
I was thinking of PTZ cameras, something like this:
http://www.bhphotovideo.com/c/product/631379-REG/Sony_BRC_Z330_BRC_Z330_High_Definition_PTZ.html

Yeah, ignore everything I said, I was in a totally different world.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

Gunjin posted:

I was thinking of PTZ cameras, something like this:
http://www.bhphotovideo.com/c/product/631379-REG/Sony_BRC_Z330_BRC_Z330_High_Definition_PTZ.html

Yeah, ignore everything I said, I was in a totally different world.

I'm sure in a next round of funding we could get something like that, but we'll cross that bridge when we get to it.

Verizian
Dec 18, 2004
The spiky one.

Agreed posted:

Well, for me it's two things - aesthetically I sort of left the whole CCT and LED thing behind in about 2005, but if you're into it, do your thing, y'know?

More concerning is that spending money on cosmetics is not spending money on functional components. If you look at EVGA's reference and non-reference lineup, it's very much function first, then form - they've got a sleekness to them because they're something of a premium product but they aren't clearly designed to show off. A company that invests in showy stuff and sells a lower priced product does concern me because they've gotta be cutting costs in the first place to sell it at a lower price, it doesn't all come down to brand power - do the blue LEDs mean they decided they could toss out some redundant parts somewhere else that were involved in power delivery to even out the costs there?

Sounds :tinfoil: a bit, I guess, but take into account that major companies have lost standing over bean-counter bullshit like seeing just how little support the VRM stages can get (thus hurting the power delivery and stability of the card) and still function, because every part saved compared to the reference design is less money they have to spend and volume makes it meaningful enough that they paid people to pare it down. That's XFX, in that specific example. I don't really know anything about Palit, but I do know that good custom cooling isn't especially cheap and they've spent some money on LEDs as part of their parts list. I don't find that super encouraging from a brand that's a bit of a question mark in the first place. But it should be fine and as you note, you guys have real consumer protection laws so worst case scenario you deal with the return process and they can't just jack you over. Best case scenario, you can offer us your experiences and help fill in the blank with SA member experiences.

Good points, but just for the record I'm not a fan of LEDs myself. When buying my case I had to choose between the version I bought, with a side window and fans stuffed with blue LEDs, or paying £40 more for a solid side panel and buying 3rd-party 120mm case fans. There seems to be a tax on good taste over here.

Also, I looked for MSI and EVGA versions first, but the UK version of the Borderlands 2 deal is all kinds of hosed up. Each retailer chooses which card models get the bundled game code; Scan is 660 Tis only, while Ebuyer seems to be using it to dump their old 570 and 580 stock. That's another £29.99 saving.

FSMC
Apr 27, 2003
I love to live this lie
I got my 660ti today. I was really excited to play the Borderlands 2 game that came with it.:woop: ......
:cry:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

FSMC posted:

I got my 660ti today. I was really excited to play the Borderlands 2 game that came with it.:woop: ......
:cry:

Just a few more weeks, just a few more weeks.

Henry Black
Jun 27, 2004

If she's not making this face, you're not doing it right.
Fun Shoe
Speaking of Borderlands 2 - am I likely to be able to squeeze PhysX out of a 680 running a 1440p display?

Being one of those annoying 'must play on ultra' freaks, Sleeping Dogs has me tempted to go 680 SLI, although I'm not sure if that's simply a driver issue, haven't had time to check yet.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

LittleBob posted:

Speaking of Borderlands 2 - am I likely to be able to squeeze PhysX out of a 680 running a 1440p display?

Being one of those annoying 'must play on ultra' freaks, Sleeping Dogs has me tempted to go 680 SLI, although I'm not sure if that's simply a driver issue, haven't had time to check yet.

No to the first question; as for the second thought, don't.

Why:

You pretty much have to have a GPU dedicated to CUDA for PhysX not to carry a fairly large framerate cost if you expect both very high graphics settings and good PhysX performance. It can juggle them okay if you turn your rendering details down, but otherwise, you'll see a huge jump in both framerate and the less tangible "smoothness" of gameplay using a dedicated PhysX card, provided your motherboard has the bandwidth to take advantage of both. There are some guys out there putting out really lovely "benchmarks" that still show some improvement, but you get to the end and lo and behold it turns out the PhysX card was running at PCI-e 2.0 x1 or x2, when anything lower than x8 is going to substantially limit the bandwidth of the CUDA card. You won't find many benchmarks in general for "dedicated GPU + dedicated PhysX processor" because it's a really extravagant setup and pretty absurd. Maybe you could pick up something inexpensive (comparatively speaking), no lower than a GTX 560 to see benefits rather than drawbacks; I'd recommend a 560 Ti or 560 Ti 448 to ensure you're not bottlenecking the 680. But seriously, it's not worth it; there just aren't enough titles to justify it. Unless you happen to have the other card hanging around, just don't, and even then it's not a great idea. How much time do you plan to spend playing PhysX games vs. regular games, after all?


The second "why" is easier. Two 680s? Really? :negative: Should have gone with 670s, at least they're a little more affordable if you're gaming at a high enough resolution to tax one card enough to consider adding a second. But if you've got the means, nobody can stop you. Just going to be regretting it at some point in the future when such a pricey setup doesn't stack up so well against that generation's price-to-performance giants when they're in SLI, but replacing it with one card wouldn't give you real performance improvement. And as games get more demanding (which we can definitely expect them to do, with a new console generation on the horizon), you'll be dealing with more issues related to "slower" cards in SLI, like microstuttering, etc., not to mention compatibility and all that good stuff.

The high end sucks, you have to either just wait it out, or keep pouring money into a furnace if you intend to stay current.

Berk Berkly
Apr 9, 2009

by zen death robot

LittleBob posted:

Speaking of Borderlands 2 - am I likely to be able to squeeze PhysX out of a 680 running a 1440p display?

Being one of those annoying 'must play on ultra' freaks, Sleeping Dogs has me tempted to go 680 SLI, although I'm not sure if that's simply a driver issue, haven't had time to check yet.

It's likely a driver disconnect as far as Sleeping Dogs and Nvidia card performance goes right now. From what I recall, they didn't get a chance to optimize for it yet, just add SLI support.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Sleeping Dogs runs with everything at the highest settings for me at 1920x1200 except SSAA, which I have to turn down one step... so just don't go nuts with the SSAA and it should be fine on a 680.

MiniSune
Sep 16, 2003

Smart like Dodo!
Just on PhysX: I'm looking to do a major upgrade and currently have two 460s in SLI. In a new build, is there any benefit to keeping both cards as dedicated PhysX, or can PhysX only run off one card?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I think it can only run off one card, and PhysX is not that demanding on a 460 anyway if that's its only load. I wouldn't bother unless you loooove PhysX and play a lot of games that use it.
