HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Zero VGS posted:

I thought the MSRP of the R9 290X is like $700, if that $2300 bundle has three of them you're only paying $200 for everything else. Mining or not, that's not as gouged as I'd have thought.

Nuh-uh. MSRP for the 290X is $549.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
My understanding of CrossFire (with 2 cards) is that both cards share the VRAM of the "master" card, correct? Would crossfiring a 270X 4GB and a 270X 2GB work, or do they have to be identical down to the VRAM apportioning?

beejay
Apr 7, 2002

AMD is actually pretty loose with Crossfiring; you can even do different cards. For example, you can Crossfire a 7870 and a 7850. In terms of the memory amounts, I believe it will take the lowest common denominator; in your example you would be limited to 2GB.
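
To make the rule concrete, here's a trivial sketch in Python. This is a simplification (actual behavior depends on the driver), but the memory rule as described above reduces to a min():

```python
# Simplified model of CrossFire memory: each GPU mirrors the frame
# data, so the usable pool is capped by the card with the least VRAM.
def effective_vram_gb(cards_gb):
    """cards_gb: per-card VRAM sizes in GB for a CrossFire group."""
    return min(cards_gb)

# A 270X 4GB paired with a 270X 2GB:
print(effective_vram_gb([4, 2]))  # -> 2
```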

Wistful of Dollars
Aug 25, 2009

In other amusing news, for $850 you can now buy the fastest 780 Ti under the sun from EVGA.

Liquid nitrogen sold separately.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

El Scotch posted:

In other amusing news, for $850 you can now buy the fastest 780 Ti under the sun from EVGA.

Liquid nitrogen sold separately.

This about the kingpin edition? We totally already talked about that, snooooore...

(The weirdest thing about it is that it apparently doesn't have any sort of TDP limit at all, not enforced nor visible nor ... anything; it'll just keep going until presumably something dies?)

Wistful of Dollars
Aug 25, 2009

Nothing new, today's just the day it went on sale.

td4guy
Jun 13, 2005

I always hated that guy.

HalloKitty posted:

I would definitely leave the monitors turned on in software, and just hit their power buttons instead, for now at least.

This isn't actually possible on some monitors when using DisplayPort. http://support.microsoft.com/kb/2625567

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

td4guy posted:

This isn't actually possible on some monitors when using DisplayPort. http://support.microsoft.com/kb/2625567

I had a vague feeling someone might mention this, but the 660 Ti he is using has two DVI ports, one HDMI, and only one DisplayPort.

Unless he has some kind of weird DisplayPort-only monitor, I was relatively confident he'd only be using DVI-based ports.

I know I ran into this very problem myself. I very quickly tossed the DisplayPort cable into a box and hooked up a DVI cable again. I would recommend avoiding DisplayPort for this very reason, but obviously with some configurations that isn't possible.

HalloKitty fucked around with this message at 10:21 on Jan 23, 2014

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Not to mention the bad DisplayPort cable that kept my machine from booting but inexplicably lit up the boot device light instead of the VGA light on my motherboard, causing a few hours of needless troubleshooting.

Swartz
Jul 28, 2005

by FactsAreUseless
So any thoughts on the rumors surrounding the GTX 780 Ti Black edition or GTX 790 that have been surfacing over the past few days?

Here's one such source: http://www.geek.com/games/leaked-specs-reveal-nvidias-dual-gpu-geforce-gtx-790-1583004/

I don't understand GPU stuff that well, but the "leaked" specs for the GTX 790 don't make much sense to me. Why would it have 10GB of RAM? It seems more logical that it would be 6GB or 12GB.

veedubfreak
Apr 2, 2005

by Smythe
Mostly because people make up bullshit to get page hits.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I'm not saying I can predict the future, but I don't think that nVidia wants there to be any 780 Ti cards with more than 3GB of VRAM per GPU. The only point of the Titan now is that it has the DP feature and has that 6GB of VRAM, making it their entry level CUDA development card. Given that it just can't match the fully-enabled, more carefully binned (for stable higher frequency, for power usage) GK110 chips used for the 780Ti, and given that the 780Ti uses faster GDDR5 already, it would be a total "gently caress you, Titan!" for anything not requiring DP if a card were released that could match its memory count but outperform it in every other regard.

I don't think nVidia wants to make Titan pointless for a lot of people. There are already developers who are buying it because of the VRAM count even though they don't really need DP, and it'd be a total rout for anything single precision. Given that a lot of CUDA stuff (and GPGPU in general) can be heavily memory limited, that's not a scenario favorable to the market segmentation that nVidia has established with the variety of GK110-based cards.

Agreed fucked around with this message at 18:37 on Jan 24, 2014

GrizzlyCow
May 30, 2011

Agreed posted:

I'm not saying I can predict the future, but I don't think that nVidia wants there to be any 780 Ti cards with more than 3GB of VRAM per GPU. The only point of the Titan now is that it has the DP feature and has that 6GB of VRAM, making it their entry level CUDA development card. Given that it just can't match the fully-enabled, more carefully binned (for stable higher frequency, for power usage) GK110 chips used for the 780Ti, and given that the 780Ti uses faster GDDR5 already, it would be a total "gently caress you, Titan!" for anything not requiring DP if a card were released that could match its memory count but outperform it in every other regard.

I don't think nVidia wants to make Titan pointless for a lot of people. There are already developers who are buying it because of the VRAM count even though they don't really need DP, and it'd be a total rout for anything single precision. Given that a lot of CUDA stuff (and GPGPU in general) can be heavily memory limited, that's not a scenario favorable to the market segmentation that nVidia has established with the variety of GK110-based cards.

The article did mention the Titan Black Edition, which will supposedly be a 6GB VRAM version of the 780Ti with Titan features. If they do release such a thing, it would surely replace the Titan as the current consumer-grade compute card.


Swartz posted:

So any thoughts on the rumors surrounding the GTX 780 Ti Black edition or GTX 790 that have been surfacing over the past few days?

Here's one such source: http://www.geek.com/games/leaked-specs-reveal-nvidias-dual-gpu-geforce-gtx-790-1583004/

I don't understand GPU stuff that well, but the "leaked" specs for the GTX 790 don't make much sense to me. Why would it have 10GB of RAM? It seems more logical that it would be 6GB or 12GB.

I'm pretty sure it would be limited to 5GB of usable VRAM, considering it is a dual-GPU card. Now, I obviously have no idea why a consumer card would need 5GB of VRAM. I suppose NVIDIA may try to push it as a 4K UHD card, and the memory bus only allowed multiples of 5?
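
The rough arithmetic behind that, sketched in Python (this assumes the usual alternate-frame-rendering setup, where each GPU holds its own mirrored copy of the frame data, so a "10GB" dual-GPU card behaves like a 5GB one):

```python
# On a dual-GPU board the advertised VRAM is split across the GPUs,
# and under AFR each GPU mirrors the same data.
def usable_vram_gb(total_gb, gpu_count):
    return total_gb / gpu_count

# The rumored GTX 790: 10GB advertised across 2 GPUs.
print(usable_vram_gb(10, 2))  # -> 5.0
```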

Wistful of Dollars
Aug 25, 2009

I'll (still) believe any of it when I see it from the horse's mouth.

Duro
May 1, 2013

by Lowtax
Well, I got hosed over by NCIX and they gave me a Sapphire card despite me asking for anything but one. I think they shipped everything too, so I dunno what to do now. I think I'll just accept my fate and go with the flow. It might not be that bad....

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Duro posted:

Well, I got hosed over by NCIX and they gave me a Sapphire card despite me asking for anything but one. I think they shipped everything too, so I dunno what to do now. I think I'll just accept my fate and go with the flow. It might not be that bad....

Return it? What do you mean they gave you a sapphire card? Did you just click a button that said order video card?

Wistful of Dollars
Aug 25, 2009

"Hello good vendor! Give me your finest, cheapest GPU!"

goobernoodles
May 28, 2011

Wayne Leonard Kirby.

Orioles Magician.

goobernoodles posted:

I bought an EVGA GTX 780 ACX/FTW whatever version at the beginning of the month. Looks like I can use their step-up program to hop to a 780 Ti for ~145 bucks - more with the better shipping I'd probably spring for.

Someone tell me I shouldn't do it. (Do the other thing)
Sweet jebus I broke #100 on the step up waiting list finally. Down to #59! I want my 780TI now, damnit.

Duro
May 1, 2013

by Lowtax

Don Lapre posted:

Return it? What do you mean they gave you a sapphire card? Did you just click a button that said order video card?

No, I had posted earlier in the thread and didn't think I'd need to repeat the story

I ordered all the parts for a PC build on Boxing Day. Despite everything being in stock when I ordered, I got an e-mail saying practically half of my items were out of stock, including the MSI R9 270X card I wanted

After waiting forever to hear back from them, they offered me a PowerColor. I inquired about an Asus one, then told them I'd take the PowerColor and that I didn't want the Sapphire. Next thing I know, I get an e-mail saying they chose the Sapphire and that everything was going to ship.

It kind of pisses me off, because I think they didn't give me the PowerColor because it was on sale during those e-mail exchanges and they would have had to refund me some money, while the Sapphire is the same price as the one I ordered. Now the PowerColor is out of stock on their site as well.

It's just a very lovely situation, I don't think I'll ever buy my parts from there again

MrMoo
Sep 14, 2000

Seems quite difficult to drive 4K's worth of pixels well. I have a video wall setup with 8 monitors (NEC kit at a sad 1366x768), and only a pair of AMD FirePro W600s in tandem is managing to keep up at 60fps. Anyone else tried this?

:lol: GPU accelerated webdev.

Matrox cards are popular but have weak drivers, disabling hardware acceleration at 5 monitors and not supporting the minimum OpenGL version for Chrome.
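
For a sense of scale, a back-of-envelope fill calculation for that wall (monitor count and resolution from the post above; 60fps is the stated target):

```python
# Pixel fill demand for 8 monitors at 1366x768, 60 Hz.
width, height, monitors, refresh_hz = 1366, 768, 8, 60

pixels_per_frame = width * height * monitors       # total wall resolution
pixels_per_second = pixels_per_frame * refresh_hz  # sustained fill demand

print(f"{pixels_per_frame:,} px/frame")        # 8,392,704 px/frame
print(f"{pixels_per_second / 1e6:.0f} Mpx/s")  # ~504 Mpx/s
```

So the wall is roughly a 4K display's worth of pixels per frame, which is why even a pro-card pair struggles at 60fps.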

GrizzlyCow
May 30, 2011

Duro posted:

It's just a very lovely situation, I don't think I'll ever buy my parts from there again

poo poo. NCIX is beginning to sound worse than TigerDirect.

Well, if you can get a full refund, take that and buy from a more reputable vendor like Amazon or Newegg.

Duro
May 1, 2013

by Lowtax

GrizzlyCow posted:

poo poo. NCIX is beginning to sound worse than TigerDirect.

Well, if you can get a full refund, take that and buy from a more reputable vendor like Amazon or Newegg.

I'm waiting for a callback now. I'm gonna tell them off but probably accept the shipment as is. I just want it to at least be noted that I never actually ok'ed this particular card in case I ever have problems, and I'm hoping they do something for me but I have a feeling it's one of those companies that doesn't give a poo poo about its clients or making things right.

The only reason I didn't cancel my order is because I got most of my parts discounted by a decent amount, and probably saved like $300 on my entire order.

So, as things stand, I have a Sapphire R9 270X BF4 Edition, whatever that means. I can't find one decent review or benchmark of this card using Google. Hopefully it's ok. Other Sapphire cards seemed to benchmark lower than the competition, and ran hotter, which is the main reason I'm disappointed in the whole situation

veedubfreak
Apr 2, 2005

by Smythe

MrMoo posted:

Seems quite difficult to drive 4K's worth of pixels well. I have a video wall setup with 8 monitors (NEC kit at a sad 1366x768), and only a pair of AMD FirePro W600s in tandem is managing to keep up at 60fps. Anyone else tried this?

:lol: GPU accelerated webdev.

Matrox cards are popular but have weak drivers, disabling hardware acceleration at 5 monitors and not supporting the minimum OpenGL version for Chrome.

Well, I'm running 3 2560x1440 monitors using 3 290X cards. It holds its own when there is actually a profile for it.

For the record, I'm pushing 11.5 million pixels at a bezel-corrected 7990x1440.
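
That pixel count checks out, for what it's worth:

```python
# Bezel-corrected triple-1440p Eyefinity resolution quoted above.
width, height = 7990, 1440
total_pixels = width * height
print(f"{total_pixels:,} pixels (~{total_pixels / 1e6:.1f} million)")  # 11,505,600 (~11.5 million)
```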

MrMoo
Sep 14, 2000

We found 3-4 monitors on an AMD FirePro 2460 or nVidia Quadro worked fine; we're working through different nVidia cards next week.

veedubfreak
Apr 2, 2005

by Smythe
Ok, so I'm using the Unigine Heaven benchmark on extreme settings. Why is this stupid thing only using 1 video card? And to make it even weirder, it's using my GPU 2? The hell, man. I know it's not a glitch in Afterburner because it shows accurate GPU usage in games.

Doyzer
Jul 28, 2013

goobernoodles posted:

Sweet jebus I broke #100 on the step up waiting list finally. Down to #59! I want my 780TI now, damnit.

Same here. I moved to #21 from #69 on the 780 ACX list. Before, I was moving 1 spot a week.

SlayVus
Jul 10, 2009
Grimey Drawer
So the 800M series is supposedly launching this February; what would be a good time frame to actually start seeing laptops released with the 60/70 variants?

Hopefully, they ship with something better than a 128-bit memory bus.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Got my GTX 770 in last night; compared to my old Radeon HD 6950 it's on average a little over a 100% increase in performance. :getin:

I was waiting on an R9 280X, but it just wasn't worth hawking Amazon for the 10-second windows, once every few days, when they were in stock at MSRP.

veedubfreak
Apr 2, 2005

by Smythe
Anyone know why my middle video card would be the one doing all the work in games that don't support crossfire? I have never seen anything like this happen. I thought it was just the unigine benchmark, but apparently anything that isn't using crossfire is using gpu2 now.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

veedubfreak posted:

Anyone know why my middle video card would be the one doing all the work in games that don't support crossfire? I have never seen anything like this happen. I thought it was just the unigine benchmark, but apparently anything that isn't using crossfire is using gpu2 now.

What's your motherboard? Some (many that are built for tri- and quad-SLI/CF) populate ports in a different order than their physical layout, so "GPU 1" to the PC may be different than "GPU 1" to your eyeball.

slidebite
Nov 6, 2005

Good egg
:colbert:

So a question:

Other than the practicality (cords, adapters), does it make a difference to use the HDMI output on your video card versus DVI?

And if you use DVI, what do you lose (other than possible signal degradation) with a VGA adapter?

I've always used a DVI-to-VGA adapter to my monitor, but I re-arranged my room and my speaker cable no longer reaches my tower. So on a whim, I hooked up the PC to the monitor via HDMI, using the mediocre built-in speakers... but for the heck of it I ran 3DMark 11 and noticed the score actually increased about 100 points. Coincidence?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

slidebite posted:

So a question:

Other than the practicality (cords, adapters), does it make a difference to use the HDMI output on your video card versus DVI?

No, other than HDMI carrying audio.

quote:

And if you use DVI, what do you lose (other than possible signal degradation) with a VGA adapter?

"Possible" is too generous. It's analog video - it's degraded just by the digital-to-analog-to-digital conversions. Besides that, most modern cards have a tough time outputting a real 1920x1080 on the analog ports because nobody has really cared about that for a while.

quote:

I've always used a DVI-to-VGA adapter to my monitor, but I re-arranged my room and my speaker cable no longer reaches my tower. So on a whim, I hooked up the PC to the monitor via HDMI, using the mediocre built-in speakers... but for the heck of it I ran 3DMark 11 and noticed the score actually increased about 100 points. Coincidence?

Coincidence. Run it again, then again, then again - there will be variances. *Maybe* there's a tiny effect from audio processing being offloaded from the motherboard's likely-CPU-based audio codec to the video card's more-independent codec.
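
A quick illustration of the run-to-run variance point, using made-up scores (not real 3DMark data): the spread across a handful of runs can easily swallow a 100-point difference.

```python
import statistics

# Hypothetical benchmark scores from five back-to-back runs.
runs = [7450, 7520, 7390, 7480, 7560]

mean = statistics.mean(runs)
spread = max(runs) - min(runs)

print(f"mean={mean:.0f}, spread={spread}")  # mean=7480, spread=170
```

With a spread like that, a single 100-point jump is indistinguishable from noise; you'd want several runs per configuration before crediting the cable.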

veedubfreak
Apr 2, 2005

by Smythe

Factory Factory posted:

What's your motherboard? Some (many that are built for tri- and quad-SLI/CF) populate ports in a different order than their physical layout, so "GPU 1" to the PC may be different than "GPU 1" to your eyeball.

Well, it's this board: http://www.newegg.com/Product/Product.aspx?Item=N82E16813128596

And the only thing that changed was that I added the third card to the third orange slot. So it just seems weird that it would decide that GPU 2 is now the main card, when with only 2 cards GPU 1 was the main card. Not that I have had issues; it's just weird.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
So, counting orange slots only:

Slot 1: x16/x8 (CPU) (first enumerated)

Slot 2: x4 (CPU) (third enumerated)

Slot 3: x8/x4 (CPU) (second enumerated)

Slot 4: x4 2.0 (PCH)

Why slot 2 would be called the primary I have no idea.

veedubfreak
Apr 2, 2005

by Smythe

Factory Factory posted:

So, counting orange slots only:

Slot 1: x16/x8 (CPU) (first enumerated)

Slot 2: x4 (CPU) (third enumerated)

Slot 3: x8/x4 (CPU) (second enumerated)

Slot 4: x4 2.0 (PCH)

Why slot 2 would be called the primary I have no idea.

Slot 2 (the black slot) is actually what they suggest using as your primary slot for 1 or 2 cards, as it is a full x16. This board has the PLX chip, so it should be running 16/8/8, or 8/8/8/8 if you run 4. I just can't understand why adding a third card, while making no other changes, would have made GPU 2 become primary.

slidebite
Nov 6, 2005

Good egg
:colbert:

Factory Factory posted:

No, other than HDMI carrying audio.


"Possible" is too generous. It's analog video - it's degraded just by the digital-to-analog-to-digital conversions. Besides that, most modern cards have a tough time outputting a real 1920x1080 on the analog ports because nobody has really cared about that for a while.


Coincidence. Run it again, then again, then again - there will be variances. *Maybe* there's a tiny effect from audio processing being offloaded from the motherboard's likely-CPU-based audio codec to the video card's more-independent codec.

Thanks for the info. I am really hard pressed to see a visual improvement with HDMI over VGA, though. And I agree on the 3DMark score; 100-ish points is probably within the margin of error/run variances.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

veedubfreak posted:

Well, it's this board: http://www.newegg.com/Product/Product.aspx?Item=N82E16813128596

And the only thing that changed was that I added the third card to the third orange slot. So it just seems weird that it would decide that GPU 2 is now the main card, when with only 2 cards GPU 1 was the main card. Not that I have had issues; it's just weird.

This might sound dumb, but how are your CrossFire bridges hooked up? Try a different combination of connections.

e: try, not trying

vvv I forgot about that. Still, the motherboard was designed before it was discovered that the 290 series could do that. BIOS settings/change maybe?

Sidesaddle Cavalry fucked around with this message at 04:41 on Jan 26, 2014

GokieKS
Dec 15, 2012

Mostly Harmless.

Sidesaddle Cavalry posted:

This might sound dumb, but how are your CrossFire bridges hooked up? Trying a different combination of connections.

New AMD cards actually no longer use physical bridges and instead just go over PCIe.

Tedronai66
Aug 24, 2006
Better to Reign in Hell...
e: aaand wrong thread.

Tedronai66 fucked around with this message at 23:27 on Jan 26, 2014

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

veedubfreak posted:

Slot 2 (the black slot) is actually what they suggest using as your primary slot for 1 or 2 cards, as it is a full x16. This board has the PLX chip, so it should be running 16/8/8, or 8/8/8/8 if you run 4. I just can't understand why adding a third card, while making no other changes, would have made GPU 2 become primary.

Is a monitor on the second gpu?
