m3monster
Jan 3, 2009
http://www.pcgamer.com/nvidia-geforce-gtx-titan-x-review/

Here is PC Gamer's full review of the Titan X.
Long story short, it seems to use half as much power as the R9 295X2. At stock clocks it runs around 10-15 percent slower on average frame rates, but in some situations it almost doubles the 295X2's minimum frame rate. When overclocked, the Titan X is only a couple of frames slower on average than the 295X2. And last but not least, you don't have to put up with any CrossFire/SLI-related problems.

It looks like a good card. I can't wait to see how it stacks up against the 390X.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Is there any kind of GPU card or adapter that can drive 9 (nine) 1080p televisions at once in a 3x3 matrix?

I'm thinking of nabbing one of these: http://www.officedepot.com/a/products/940654/Peerless-AV-SmartMount-DS-C555-3X3/ and nine TVs to go with it, $7000 for a 165" death panel which is going to serve as a video conferencing Stargate to our remote offices.

Yeah, projector might be cheaper but they can be a pain in the rear end in their own ways, so I'm weighing my options.

Grim Up North
Dec 12, 2011

I don't know if there are even more specialized OEMs out there, but even Matrox maxes out at 8 monitors. Maybe two Eyefinity Radeon cards would be enough, though.

Gwaihir
Dec 8, 2009
Hair Elf
There are a few of the 6x mini-DP Radeons, and two of them should in theory be able to do a single Eyefinity 3x3 setup, but uh... heh, good luck!

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Zero VGS posted:

Is there any kind of GPU card or adapter that can drive 9 (nine) 1080p televisions at once in a 3x3 matrix?

I'm thinking of nabbing one of these: http://www.officedepot.com/a/products/940654/Peerless-AV-SmartMount-DS-C555-3X3/ and nine TVs to go with it, $7000 for a 165" death panel which is going to serve as a video conferencing Stargate to our remote offices.

Yeah, projector might be cheaper but they can be a pain in the rear end in their own ways, so I'm weighing my options.

As cool as this sounds, you're kind of over-engineering the situation:

1) Bezels. They'll be more annoying than you think, unless you spend a lot extra for thin-bezel TVs.
2) Nine primary points of failure (and nine secondary points of failure in the cables): the chance of getting a lemon display (or two) is higher, you'll run into driver issues with that many displays, etc. And that doesn't count the possibility of a GPU failure, which would cripple a large portion of your Death Panel.
3) You'll have to research TVs that are guaranteed to use the exact same panel, preferably all the same revision as well, so you don't have displays that look different from the others. Calibrating different panels to all look like each other is a pain in the dick.
4) Not power efficient, and it's also going to be a workplace hazard. It will weigh upward of 500 pounds by the time you're done with it.

If you're going to spend seven grand, just buy an 85-90" 1080p TV. No bezels, a lot easier to mount, less hassle, and it still has that 'holy poo poo, wow' factor which will have everyone begging to stay after work to watch/play games on it.

Or, spend a bit more and go this route: http://wwv.crutchfield.com/p_30585U8550/Samsung-UN85HU8550.html (I've seen it about 2k cheaper than this)

BIG HEADLINE fucked around with this message at 11:29 on Mar 18, 2015

LRADIKAL
Jun 10, 2001

Fun Shoe
Unless this is some kind of hacker/skate club, you probably should stick to a projector or one enormous LCD.

Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.
Could Nvidia make a GM200 without the compute stuff to slot in between the 980 and titan x price point?

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
They definitely could, but much later, so as to avoid cannibalizing Titan X sales and to undermine any momentum the R9 300-series can muster this summer/fall.

E: vvv it would have to be a card gimped in performance, not in extra features, since there aren't any, yeah.

Sidesaddle Cavalry fucked around with this message at 01:45 on Mar 18, 2015

Gwaihir
Dec 8, 2009
Hair Elf

Khagan posted:

Could Nvidia make a GM200 without the compute stuff to slot in between the 980 and titan x price point?

GM200 doesn't have any compute stuff in it; that's why it's so different from last gen's Titan. The die space is literally "GTX 980 * 1.5" versus GK110's extra FP64 hardware.

e: To be clear, last gen's big chip was a compromise so it could be used for both the Tesla and Quadro/GeForce GTX lines. The new GM200 will never go into a Tesla card; it looks like Nvidia have separated the lines again. GM200 is all about graphics/FP32 performance through and through.
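Just to put numbers on that "GTX 980 * 1.5" comparison, here's a quick sketch with the rough published unit counts (figures from memory, so treat them as illustrative rather than gospel):

code:

# Rough published unit counts: GM204 (GTX 980) vs GM200 (Titan X).
# Figures from memory, illustrative only; no extra FP64 hardware on GM200
# the way GK110 had.
gm204 = {"cuda_cores": 2048, "smm": 16, "rops": 64, "bus_width_bits": 256}
gm200 = {"cuda_cores": 3072, "smm": 24, "rops": 96, "bus_width_bits": 384}

for key in gm204:
    # every ratio comes out to 1.5x
    print(f"{key}: {gm200[key] / gm204[key]:.1f}x")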

1gnoirents
Jun 28, 2014

hello :)

Khagan posted:

Could Nvidia make a GM200 without the compute stuff to slot in between the 980 and titan x price point?

I saw some "cut GM200" benchmarks somewhere that are supposed to be a 980ti (I believe). If true, its basically exactly where you're talking about, and I'd imagine it'd be priced exactly like the 780ti considering the 980 price.

http://www.game-debate.com/news/?ne...ance%20Analysis

Who knows if that's real but it seems logical.


Swartz posted:

I wouldn't, my post wasn't very clear. I'm at 1080p and using DSR to bring me to 1440p so I can get an idea of what the performance hit will be.

Do you really think there will be a 980Ti? It would be great, but I wonder if they'll do something like that this time around.

Maybe I'll just hold off on getting that 1440p monitor until next year when Pascal comes out. Hopefully it's early in 2016 and not late.


:3: sorry, sometimes DSR chat gets me all worked up even though I love SSAA. Ever since it came out I've read or heard "I have 4k" countless times. Yes, 1440p is fairly rough, but I'm not sure it's comparable to 1440p DSR; I believe that would be a harder hit... but I don't know.

GrizzlyCow
May 30, 2011
edit: what he said^^^

Khagan posted:

Could Nvidia make a GM200 without the compute stuff to slot in between the 980 and titan x price point?

There was a leak a few months ago indicating that there would be a cut-down GM200 card, promising to offer 90% of the Titan X's performance with only half its memory. The numbers for it seem to roughly match the numbers from today's reviews, but I still don't know if it's trustworthy.

Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.
Cut down as in SMMs disabled would suck, but I'm more than pleased that I don't have to pay for an extra 6GB of VRAM that I'm not going to use anyway.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

1gnoirents posted:

:3: sorry, sometimes DSR chat gets me all worked up even though I love SSAA. Ever since it came out I've read or heard "I have 4k" countless times. Yes, 1440p is fairly rough, but I'm not sure it's comparable to 1440p DSR; I believe that would be a harder hit... but I don't know.

If anything, true 1440p should be a (probably insignificantly) lighter hit than 1440p DSR; DSR is just a really naive, but competent, implementation of SSAA. Without knowing the inner workings of DSR, it certainly acts like it creates a virtual monitor and then downscales images from that virtual monitor to display them on the real monitor. So it would have to internally render everything as if it were going to end up on a 1440p monitor, then add an extra step at the end to downscale it.

Unless NVIDIA is taking shortcuts with DSR, which would be more difficult and somewhat defeat the purpose of it. If they were willing to put real effort into DSR, they'd have been better off doing SSAA properly in the first place with a random or at least rotated sample grid. DSR seems like it was a quick and (relatively) easy feature that was added to give Maxwell something to do besides sit idle when playing older DX9 games.
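To make the "virtual monitor" idea concrete, here's a minimal numpy sketch of the render-big-then-filter-down step, assuming an integer 2x factor (DSR 4.00x over 1080p) so a plain box filter works; actual DSR uses a Gaussian filter and handles non-integer factors, so this is just the concept, not NVIDIA's implementation:

code:

import numpy as np

def downscale_2x(frame):
    """Average each 2x2 block of an (H, W, 3) frame into one output pixel,
    a plain box filter standing in for DSR's Gaussian resample."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Pretend the game rendered to a 4K "virtual monitor" (DSR 4.00x over 1080p).
virtual_frame = np.random.rand(2160, 3840, 3).astype(np.float32)

# The extra step DSR adds at the end: shrink it to fit the real 1080p panel.
native_frame = downscale_2x(virtual_frame)
print(native_frame.shape)  # (1080, 1920, 3)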

Desuwa fucked around with this message at 03:02 on Mar 18, 2015

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

I can't get my loving 2 video cards (Nvidia 970 Asus Strix) to connect via SLI; I've tried 5 different cables and none worked.

What the fuuuuuuck I'm going crazy, heeeeelp

Motherboard is a Maximus Hero VII.

I plugged one 970 card into the red slot, then the 2nd card into the other red slot next to it (a very tight fit, btw, doesn't seem right), but that didn't work.
It detects both cards and they work fine; I can use either exclusively for PhysX, but no SLI in sight. Turned the SLI cable around, tried both slots, etc.

I plugged the 2nd card into the other black PCIe slot (not red) just to try it, and same thing.

The only difference I noticed was the first red slot was PCIe "Gen 3" and the other 2 slots show up as Gen 2.

I tried changing both to Gen 3 in the BIOS but it didn't work, then both to Gen 2, and that didn't work either and also killed my frame rate and gave me lots of errors (Skyrim running at 1 fps).

I'm at my wit's end and about to take it to some computer repair store to see if they can figure it out, but I'm going mad trying to figure it out.

I might just leave the 2nd card in the slower slot as a PhysX-exclusive card, but it seems like a waste. Please... Hellllppp

Star War Sex Parrot
Oct 2, 2003

Don Tacorleone posted:

I might just leave the 2nd card in the slower slot as a PhysX-exclusive card, but it seems like a waste.
Yep this is the answer.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Star War Sex Parrot posted:

Yep this is the answer.

Thanks Imma really enjoy those extra floating dust specks that block your vision

For real they're super annoying and sometimes I couldn't even see what was going on with all the snow in Batman Origins, poo poo was ridiculous.

Nvidiaaaaaaa :argh:

Next time don't even talk to me about SLI, I'm just buying the more powerful vidcard

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

Don Tacorleone posted:

I can't get my loving 2 video cards (Nvidia 970 Asus Strix) to connect via SLI; I've tried 5 different cables and none worked.

What the fuuuuuuck I'm going crazy, heeeeelp

Motherboard is a Maximus Hero VII.

I plugged one 970 card into the red slot, then the 2nd card into the other red slot next to it (a very tight fit, btw, doesn't seem right), but that didn't work.
It detects both cards and they work fine; I can use either exclusively for PhysX, but no SLI in sight. Turned the SLI cable around, tried both slots, etc.

I plugged the 2nd card into the other black PCIe slot (not red) just to try it, and same thing.

The only difference I noticed was the first red slot was PCIe "Gen 3" and the other 2 slots show up as Gen 2.

I tried changing both to Gen 3 in the BIOS but it didn't work, then both to Gen 2, and that didn't work either and also killed my frame rate and gave me lots of errors (Skyrim running at 1 fps).

I'm at my wit's end and about to take it to some computer repair store to see if they can figure it out, but I'm going mad trying to figure it out.

I might just leave the 2nd card in the slower slot as a PhysX-exclusive card, but it seems like a waste. Please... Hellllppp

Not sure if I'm missing something or not, but did you turn SLI on in the Nvidia control panel?

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

The Iron Rose posted:

Not sure if I'm missing something or not, but did you turn SLI on in the Nvidia control panel?

It doesn't show up; it only gives me options for PhysX and Surround something. From what I read, the option should show up on the same screen, right? 3D configuration, I believe.

BurritoJustice
Oct 9, 2012

Don Tacorleone posted:

It doesn't show up; it only gives me options for PhysX and Surround something. From what I read, the option should show up on the same screen, right? 3D configuration, I believe.

Use GPU-Z while loading both cards to make sure they are both running at PCIe 3.0 x8, not PCIe 3.0 x16 / PCIe 2.0 x4 like some boards default to. Sounds like a motherboard issue to me.
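If you'd rather sanity-check the link from a script than eyeball GPU-Z, something like this sketch should do it, assuming the pynvml NVML bindings are installed; note the link can drop to gen 1.1 at idle, so check it while the cards are under load:

code:

# Sketch: print the negotiated PCIe link for each GPU via NVML (pynvml).
# Assumes "pip install pynvml" and a working NVIDIA driver.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)  # bytes or str depending on pynvml version
        gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
        width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
        print(f"GPU {i} ({name}): PCIe gen {gen}, x{width}")
finally:
    pynvml.nvmlShutdown()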

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

BurritoJustice posted:

Use GPU-Z while loading both cards to make sure they are both running at PCIe 3.0 x8, not PCIe 3.0 x16 / PCIe 2.0 x4 like some boards default to. Sounds like a motherboard issue to me.

Thanks, will check and get back.

While I'm at it, is leaving the 2nd card PhysX exclusive just a better overall option, or is this just the mods trolling?
Serious question, I don't hang out here and I'm not "hip" to the "injokes" of the jokesters in this hardware subforum

Pyrolocutus
Feb 5, 2005
Shape of Flame



I checked the OP but didn't see anything - are there any reputable and good driver cleaners y'all would recommend?

GrizzlyCow
May 30, 2011
DDU comes to mind.

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.
I'm surprised by how hot the Titan X runs, looks like it's a smidge hotter than Hawaii

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:
Titan X is already sold out direct from NVIDIA.

BurritoJustice
Oct 9, 2012

Don Tacorleone posted:

Thanks, will check and get back.

While I'm at it, is leaving the 2nd card PhysX exclusive just a better overall option, or is this just the mods trolling?
Serious question, I don't hang out here and I'm not "hip" to the "injokes" of the jokesters in this hardware subforum

Mod trolling. Anything over a GTX750 is hideous overkill for dedicated PhysX. And all that dedicated PhysX allows is to really turn up the PhysX effects in games that have them without a drop in performance, compared to the 80-100% performance increase that SLI allows in games that support it (most anything where SLI would be a benefit).

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

BurritoJustice posted:

Mod trolling. Anything over a GTX750 is hideous overkill for dedicated PhysX. And all that dedicated PhysX allows is to really turn up the PhysX effects in games that have them without a drop in performance, compared to the 80-100% performance increase that SLI allows in games that support it (most anything where SLI would be a benefit).

Thanks, mods gonna mod

Btw my card 1 says on GPU-Z:
BUS INTERFACE: PCI-E 3.0 x16 @ x16 3.0 (changes to 1.1)

Card 2 says:
BUS interface: PCI-E 2.0 x16 @ x1 1.1

Any idea how to solve this in the BIOS?

Thanks, feels like I'm closing in on a solution maybe

I'm going to reinstall my drivers in the meantime; something went fucky when switching cards around, lots of errors starting games.

GrizzlyCow
May 30, 2011

calusari posted:

I'm surprised by how hot the Titan X runs, looks like it's a smidge hotter than Hawaii

It runs as hot as the 780 Ti and consumes a little less power than the R9 290X. Where are you looking that it's anywhere close to hotter than Hawaii?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Don Tacorleone posted:

Thanks, mods gonna mod

Btw my card 1 says on GPU-Z:
BUS INTERFACE: PCI-E 3.0 x16 @ x16 3.0 (changes to 1.1)

Card 2 says:
BUS interface: PCI-E 2.0 x16 @ x1 1.1

Any idea how to solve this in the BIOS?

Thanks, feels like I'm closing in on a solution maybe

I'm going to reinstall my drivers in the meantime; something went fucky when switching cards around, lots of errors starting games.

I was worried about the same thing with my 970. GPU-Z was showing my card in 1.1 reduced-speed mode...when I was in GPU-Z. When I alt-tabbed into something demanding, then tabbed back into GPU-Z, it was running at PCIe 2.0 x16. That's also when I learned that even though I've got a PCIe 3.0-capable Z68 motherboard, you only get PCIe 3.0 speeds with an Ivy Bridge chip installed. Oh well.

And yeah, slicking the drivers completely with DDU is a good idea - I made the mistake of dropping my first 970 in after taking out my SLIed 560/448s, thinking the drivers would just recognize the new card over the old, and found Windows was still registering the existence of the 560s in addition to the 970, at least at the driver level.

BIG HEADLINE fucked around with this message at 06:36 on Mar 18, 2015

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

BIG HEADLINE posted:

I was worried about the same thing with my 970. GPU-Z was showing my card in 1.1 reduced-speed mode...when I was in GPU-Z. When I alt-tabbed into something demanding, then tabbed back into GPU-Z, it was running at PCIe 2.0 x16. That's also when I learned that even though I've got a PCIe 3.0-capable Z68 motherboard, you only get PCIe 3.0 speeds with an Ivy Bridge chip installed. Oh well.

And yeah, slicking the drivers completely with DDU is a good idea - I made the mistake of dropping my first 970 in after taking out my SLIed 560/448s, thinking the drivers would just recognize the new card over the old, and found Windows was still registering the existence of the 560s in addition to the 970, at least at the driver level.

Do you happen to have any clue if, on the Maximus Hero VII motherboard, the cards for SLI absolutely have to go into the red slots?

Nothing seems to work and I'm about to give up on this

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Don Tacorleone posted:

Do you happen to have any clue if, on the Maximus Hero VII motherboard, the cards for SLI absolutely have to go into the red slots?

Nothing seems to work and I'm about to give up on this

Possibly a dumb question, but you do have them bridged, right? And if so, you might have a bad bridge. They're not exactly the strongest pieces in the SLI puzzle.

You also have to manually enable SLI in the driver control panel.

And yes, the red slots are the ones you'd use. Another question... what ELSE do you have plugged into the board? You've only got a finite number of PCIe lanes on a Z97 board.

BIG HEADLINE fucked around with this message at 07:10 on Mar 18, 2015

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

BIG HEADLINE posted:

Possibly a dumb question, but you do have them bridged, right? And if so, you might have a bad bridge. They're not exactly the strongest pieces in the SLI puzzle.

You also have to manually enable SLI in the driver control panel.

Yep, tried 4 different SLI cables so far, and I'm constantly checking the control panel, just seems to not work. Maybe cards are bad?

SlayVus
Jul 10, 2009
Grimey Drawer

Don Tacorleone posted:

Yep, tried 4 different SLI cables so far, and I'm constantly checking the control panel, just seems to not work. Maybe cards are bad?

Well, you shouldn't be plugging and unplugging the SLI cable while the computer is on, first of all. General rule of thumb is to turn off the computer before you start unplugging things, unless it has hot-swap capabilities, which video cards do not.

BurritoJustice
Oct 9, 2012

Tri-SLI TitanX's are predictably ridiculous

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

SlayVus posted:

Well, you shouldn't be plugging and unplugging the SLI cable while the computer is on, first of all. General rule of thumb is to turn off the computer before you start unplugging things, unless it has hot-swap capabilities, which video cards do not.

I know I probably come across as completely inept (not far from the truth actually) but yes I turn everything off and unplug before trying a new set of cables or changing cards around...

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Don Tacorleone posted:

I know I probably come across as completely inept (not far from the truth actually) but yes I turn everything off and unplug before trying a new set of cables or changing cards around...

Just to clarify, when you're saying "SLI cables," you mean this (or something that looks exactly like it):



Right?

I'm only asking because generally people don't have a ton of these laying around - usually just the one that came with the motherboard.

The other thing would be to make sure that the second slot is set to max bandwidth in BIOS. Sometimes the second x16 slot can be set to x4 by default. Also, be aware that even if you get it working, it won't be two PCIe 3.0 x16, but two PCIe 3.0 x8. That's just a limitation of the Z97, not the cards.

Just another question...but what's your PSU situation? If it's got the wattage it could be something as simple as a bad PCIe lead, which is forcing the second card into x1 mode.

BIG HEADLINE fucked around with this message at 11:30 on Mar 18, 2015

calusari
Apr 18, 2013

It's mechanical. Seems to come at regular intervals.

GrizzlyCow posted:

It runs as hot as the 780 Ti and consumes a little less power than the R9 290X. Where are you looking that it's anywhere close to hotter than Hawaii?




a few reviews show the memory modules get very toasty and almost every review says overclocks were limited by thermals



guru3d review:

quote:

The GDDR5 memory runs the hottest at 93 Degrees C, that is not rather pleasant at all. It has a lot to do with the massive density (this card has 12 GB memory) versus frequency and injecting the needed voltage for it.

a waterblock will be a good investment

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
If the memory runs that hot, it might actually make sense to buy eVGA's Titan X and spring for an extended warranty. A 12GB graphics card will probably still be viable as a general-use card in five years' time, since we'll probably still be using PCIe of some stripe, or at least a few PCIe slots will still show up on boards, the way legacy PCI used to.

Anyone looking to buy direct from nVidia should make certain that theirs has a 3y parts/labor warranty as well - I remember seeing an nVidia-made 970, and it only carried a 1y P/L warranty.

BIG HEADLINE fucked around with this message at 12:20 on Mar 18, 2015

BurritoJustice
Oct 9, 2012

calusari posted:




a few reviews show the memory modules get very toasty and almost every review says overclocks were limited by thermals



guru3d review:


a waterblock will be a good investment

Anandtech's numbers are total system load, which is a lot more variable, and they are also way out of whack with pretty much every other source. Notice how in that image the 290X is drawing less power than even a 980, which is drat nonsensical.

TPU has a reasonable 50W delta, which makes sense comparing a 250W card and a 300W card. TPU (as well as Tom's Hardware, iirc) uses a specialty setup to measure just the draw from the PCIe power connectors + slot.

I will agree that a water block is an amazing investment, with the card hitting heat limits long before it hits voltage or stability limits (the card's stock voltage is like 1.15V versus a 980's 1.25V). The headroom on Titan Xs is ridiculous.

Truga
May 4, 2014
Lipstick Apathy
I read somewhere that Radeons draw more power themselves but put less stress on other parts, so the total system power difference ends up quite a bit smaller than the difference between the Nvidia and Radeon cards alone.

Whether that's true or not I have no idea, and I can't remember where I read it. I'll try to find the source.

E: Can't find it now so it might just be something I read on a forum. Probably bogus.

Truga fucked around with this message at 13:18 on Mar 18, 2015

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

BurritoJustice posted:

Anandtech's numbers are total system load, which is a lot more variable, and they are also way out of whack with pretty much every other source. Notice how in that image the 290X is drawing less power than even a 980, which is drat nonsensical.

TPU has a reasonable 50W delta, which makes sense comparing a 250W card and a 300W card. TPU (as well as Tom's Hardware, iirc) uses a specialty setup to measure just the draw from the PCIe power connectors + slot.

I will agree that a water block is an amazing investment, with the card hitting heat limits long before it hits voltage or stability limits (the card's stock voltage is like 1.15V versus a 980's 1.25V). The headroom on Titan Xs is ridiculous.

Even AnandTech shows the 290X drawing more; people just aren't looking at the right result: über mode, which is probably how other 290Xs are tested. All aftermarket-cooled cards are effectively running in über mode.
