|
Zero VGS posted:I thought the MSRP of the R9 290X is like $700, if that $2300 bundle has three of them you're only paying $200 for everything else. Mining or not, that's not as gouged as I'd have thought. Nuh-uh. MSRP for the 290X is $549.
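For what it's worth, the bundle arithmetic at the corrected MSRP is easy to check (the $2300 price and three-card count come from the post above; what else the bundle contains is unknown):

```python
# Value left over in the rumored $2300 triple-card mining bundle once the
# cards are priced at AMD's $549 launch MSRP for the R9 290X.
bundle_price = 2300
card_msrp = 549
num_cards = 3

everything_else = bundle_price - num_cards * card_msrp
print(everything_else)  # -> 653
```

So at the real MSRP there's about $650 of headroom for "everything else", not $200.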
|
# ? Jan 22, 2014 22:07 |
|
My understanding of crossfire (with 2 cards) is that both cards share the VRAM of the "master" card, correct? Would crossfiring a 270X 4GB and a 270X 2GB work, or do they have to be identical down to the VRAM apportioning?
|
# ? Jan 22, 2014 22:17 |
|
AMD is actually pretty loose with Crossfiring, you can do different cards even. For example you can Crossfire a 7870 and a 7850. In terms of the memory amounts I believe it will take the lowest common denominator, in your example you would be limited to 2GB.
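To be concrete, the rule described above can be sketched in one line (a simplified model: in CrossFire each GPU mirrors the frame data in its own memory, so capacity does not pool):

```python
# Effective VRAM in a CrossFire setup: limited by the smallest card,
# since each GPU keeps its own full copy of the working set.
def effective_vram_gb(cards):
    return min(cards)

print(effective_vram_gb([4, 2]))  # 270X 4GB + 270X 2GB -> 2
```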
|
# ? Jan 22, 2014 22:29 |
|
In other amusing news, for $850 you can now buy the fastest 780 Ti under the sun from EVGA. Liquid nitrogen sold separately.
|
# ? Jan 22, 2014 22:39 |
|
El Scotch posted:In other amusing news, for $850 you can now buy the fastest 780 Ti under the sun from EVGA. Is this about the Kingpin edition? We totally already talked about that, snooooore... (The weirdest thing about it is that it apparently doesn't have any sort of TDP limit at all - not enforced, not visible, not ... anything - it'll just keep going until presumably something dies?)
|
# ? Jan 22, 2014 22:57 |
|
Nothing new, today's just the day it went on sale.
|
# ? Jan 22, 2014 23:14 |
|
HalloKitty posted:I would definitely leave the monitors turned on in software, and just hit their power buttons instead, for now at least.
|
# ? Jan 23, 2014 06:20 |
|
td4guy posted:This isn't actually possible on some monitors when using DisplayPort. http://support.microsoft.com/kb/2625567 I had a vague feeling someone might mention this, but the 660 Ti he is using has two DVI ports, one HDMI, and only one DisplayPort. Unless he has some kind of weird DisplayPort-only monitor, I was relatively confident he'd only be using the DVI-based ports. I know I ran into this very problem myself. I very quickly tossed the DisplayPort cable into a box and hooked up a DVI cable again. I would recommend avoiding DisplayPort for this very reason, but obviously with some configurations that isn't possible. HalloKitty fucked around with this message at 10:21 on Jan 23, 2014 |
# ? Jan 23, 2014 10:18 |
|
Not to mention a bad displayport cable keeping my machine from booting, but inexplicably lighting up the boot device light instead of the VGA light on my motherboard, causing a few hours of needless troubleshooting.
|
# ? Jan 23, 2014 15:15 |
|
So any thoughts on the rumors surrounding the GTX 780 Ti Black edition or GTX 790 that have been surfacing over the past few days? Here's one such source: http://www.geek.com/games/leaked-specs-reveal-nvidias-dual-gpu-geforce-gtx-790-1583004/ I don't understand GPU stuff that well, but the "leaked" specs for the GTX 790 don't make much sense to me. Why would it be 10GB of RAM? Seems more logical it would be 6GB or 12GB.
|
# ? Jan 24, 2014 17:32 |
|
Mostly because people make up bullshit to get page hits.
|
# ? Jan 24, 2014 18:28 |
|
I'm not saying I can predict the future, but I don't think that nVidia wants there to be any 780 Ti cards with more than 3GB of VRAM per GPU. The only point of the Titan now is that it has the DP feature and has that 6GB of VRAM, making it their entry level CUDA development card. Given that it just can't match the fully-enabled, more carefully binned (for stable higher frequency, for power usage) GK110 chips used for the 780Ti, and given that the 780Ti uses faster GDDR5 already, it would be a total "gently caress you, Titan!" for anything not requiring DP if a card were released that could match its memory count but outperform it in every other regard. I don't think nVidia wants to make Titan pointless for a lot of people. There are already developers who are buying it because of the VRAM count even though they don't really need DP, and it'd be a total rout for anything single precision. Given that a lot of CUDA stuff (and GPGPU in general) can be heavily memory limited, that's not a scenario favorable to the market segmentation that nVidia has established with the variety of GK110-based cards. Agreed fucked around with this message at 18:37 on Jan 24, 2014 |
# ? Jan 24, 2014 18:34 |
|
Agreed posted:I'm not saying I can predict the future, but I don't think that nVidia wants there to be any 780 Ti cards with more than 3GB of VRAM per GPU. The only point of the Titan now is that it has the DP feature and has that 6GB of VRAM, making it their entry level CUDA development card. Given that it just can't match the fully-enabled, more carefully binned (for stable higher frequency, for power usage) GK110 chips used for the 780Ti, and given that the 780Ti uses faster GDDR5 already, it would be a total "gently caress you, Titan!" for anything not requiring DP if a card were released that could match its memory count but outperform it in every other regard. The article did mention the Titan Black Edition, which will supposedly be a 6GB VRAM version of the 780Ti with Titan features. If they are to release such a thing, it would surely replace the Titan as the current consumer-grade compute card. Swartz posted:So any thoughts on the rumors surrounding the GTX 780 Ti Black edition or GTX 790 that have been surfacing over the past few days? I'm pretty sure it would be limited to 5GB of usable VRAM considering it is a dual-GPU card. Now, I obviously have no idea why a consumer card would need 5GB of VRAM. I suppose NVIDIA may try to push it as a 4K UHD card, and the memory bus only allowed multiples of 5?
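The 5GB figure follows from the same mirroring logic as CrossFire/SLI in general, assuming the rumored spec counts both GPUs' memory together (the 790's real configuration was unconfirmed at this point):

```python
# Why a rumored dual-GPU "10GB" card would be 5GB usable: the advertised
# total counts both GPUs' pools, but each GPU mirrors the frame data,
# so only one pool's worth is effectively available per frame.
advertised_gb = 10
num_gpus = 2

usable_gb = advertised_gb // num_gpus
print(usable_gb)  # -> 5
```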
|
# ? Jan 24, 2014 22:14 |
|
I'll (still) believe any of it when I see it from the horse's mouth.
|
# ? Jan 24, 2014 22:36 |
|
Well, I got hosed over by NCIX and they gave me a Sapphire card despite me asking for anything but one. I think they shipped everything too, so I dunno what to do now. I think I'll just accept my fate and go with the flow. It might not be that bad....
|
# ? Jan 24, 2014 23:19 |
|
Duro posted:Well, I got hosed over by NCIX and they gave me a Sapphire card despite me asking for anything but one. I think they shipped everything too, so I dunno what to do now. I think I'll just accept my fate and go with the flow. It might not be that bad.... Return it? What do you mean they gave you a sapphire card? Did you just click a button that said order video card?
|
# ? Jan 24, 2014 23:21 |
|
"Hello good vendor! Give me your finest, cheapest GPU!"
|
# ? Jan 24, 2014 23:42 |
|
goobernoodles posted:I bought a EVGA GTX 780 ACX/FTW whatever version at the beginning of the month. Looks like I can use their step-up program to hop to a 780ti for ~145 bucks - more for better shipping that I'd probably shoot for.
|
# ? Jan 25, 2014 00:02 |
|
Don Lapre posted:Return it? What do you mean they gave you a sapphire card? Did you just click a button that said order video card? No, I had posted earlier in the thread and didn't think I'd need to repeat the story. I ordered all the parts for a PC and a build on Boxing Day. Despite everything being in stock when I ordered, I got an e-mail saying practically half of my items were out of stock, including the MSI R9 270X card I wanted. After waiting forever to hear back from them, they offered me a PowerColor. I inquired about an Asus one, then told them I'd take the PowerColor and that I didn't want the Sapphire. Next thing I know, I get an e-mail saying they chose the Sapphire and that everything was going to ship. It kind of pisses me off, because I think they didn't give me the PowerColor because it was on sale during those e-mail exchanges and they would have had to refund me some money, but the Sapphire is the same price as the one I ordered. Now the PowerColor is out of stock on their site as well. It's just a very lovely situation, I don't think I'll ever buy my parts from there again.
|
# ? Jan 25, 2014 00:25 |
|
Seems quite difficult to drive 4K's worth of pixels well. I have a wall display setup with 8 monitors (NEC kit at a sad 1366x768) and only an SLI of two AMD FirePro W600s is managing to keep up at 60fps. Anyone else tried this? GPU-accelerated webdev. Matrox are popular but have weak drivers, disabling hardware acceleration at 5 monitors and not supporting the minimum OpenGL version for Chrome.
|
# ? Jan 25, 2014 00:37 |
|
Duro posted:It's just a very lovely situation, I don't think I'll ever buy my parts from there again poo poo. NCIX is beginning to sound worse than TigerDirect. Well, if you can get a full refund, take that and buy from a more reputable vendor like Amazon or Newegg.
|
# ? Jan 25, 2014 01:03 |
|
GrizzlyCow posted:poo poo. NCIX is beginning to sound worse than TigerDirect. I'm waiting for a callback now. I'm gonna tell them off but probably accept the shipment as is. I just want it to at least be noted that I never actually OK'ed this particular card in case I ever have problems, and I'm hoping they do something for me, but I have a feeling it's one of those companies that doesn't give a poo poo about its clients or making things right. The only reason I didn't cancel my order is that I got most of my parts discounted by a decent amount, and probably saved like $300 on my entire order. So, as things stand, I have a Sapphire R9 270X BF4 Edition, whatever that means. I can't find one decent review or benchmark on this card using Google. Hopefully it's OK. Other Sapphire cards seemed to benchmark lower than the competition, and ran hotter, which is the main reason I'm disappointed in the whole situation.
|
# ? Jan 25, 2014 01:10 |
|
MrMoo posted:Seems quite difficult to drive 4K's worth of pixels well. I have a wall display setup with 8 monitors (NEC kit at a sad 1366x768) and only an SLI of two AMD FirePro W600s is managing to keep up at 60fps. Anyone else tried this? Well, I'm running 3 2560x1440 monitors using 3 290X cards. Holds its own when there is actually a profile for it. For the record, I'm pushing 11.5 million pixels at a bezel-corrected 7990x1440.
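The pixel counts being compared here check out, for what it's worth (7990x1440 is the bezel-corrected Eyefinity surface from the post; the wall is 8 panels of 1366x768):

```python
# Pixel throughput of the two multi-monitor setups discussed above.
eyefinity = 7990 * 1440      # bezel-corrected triple-1440p surface
video_wall = 8 * 1366 * 768  # 8-panel NEC wall

print(eyefinity)   # -> 11505600, i.e. ~11.5 million pixels
print(video_wall)  # -> 8392704, roughly one 4K UHD display's worth
```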
|
# ? Jan 25, 2014 02:26 |
|
We found 3-4 monitors on AMD FirePro 2460 or nVidia Quadro worked fine, working through different nVidia cards next week.
|
# ? Jan 25, 2014 03:05 |
|
Ok, so I'm using the Unigine Heaven benchmark on extreme settings. Why is this stupid thing only using 1 video card? And to make it even weirder, it's using my gpu2? The hell, man. I know it's not a glitch in Afterburner because it shows accurate gpu usage in games.
|
# ? Jan 25, 2014 03:25 |
|
goobernoodles posted:Sweet jebus I broke #100 on the step up waiting list finally. Down to #59! I want my 780TI now, damnit. Same here. I moved to #21 from #69 on the 780 ACX list. Before, I was moving 1 spot a week.
|
# ? Jan 25, 2014 04:00 |
|
So the 800m series is supposedly launching this February, what would be a good time frame to actually start seeing laptops released with the 60/70 variants? Hopefully, they ship with something better than a 128-bit memory bus.
|
# ? Jan 25, 2014 16:30 |
|
Got my GTX 770 in last night; compared to my old Radeon HD 6950 it's on average a little over a 100% increase in performance. Was waiting on an R9 280X, but it just wasn't worth hawking Amazon for the 10-second windows once every few days when they were in stock to get one at MSRP.
|
# ? Jan 25, 2014 20:03 |
|
Anyone know why my middle video card would be the one doing all the work in games that don't support crossfire? I have never seen anything like this happen. I thought it was just the unigine benchmark, but apparently anything that isn't using crossfire is using gpu2 now.
|
# ? Jan 26, 2014 01:43 |
|
veedubfreak posted:Anyone know why my middle video card would be the one doing all the work in games that don't support crossfire? I have never seen anything like this happen. I thought it was just the unigine benchmark, but apparently anything that isn't using crossfire is using gpu2 now. What's your motherboard? Some (many that are built for tri- and quad-SLI/CF) populate ports in a different order than their physical layout, so "GPU 1" to the PC may be different than "GPU 1" to your eyeball.
|
# ? Jan 26, 2014 02:00 |
|
So a question: Other than practicality (cords, adapters), does it make a difference to use the HDMI output on your video card versus DVI? And if you use DVI, what do you lose (other than possible signal degradation) with a VGA adapter? I've always used a DVI-to-VGA adapter to my monitor, but I re-arranged my room and my speaker cable no longer reaches my tower. So on a whim, I hooked up the PC to the monitor via HDMI, using the mediocre built-in speakers... but for the heck of it I noticed 3DMark 11 actually increased about 100 points. Coincidence?
|
# ? Jan 26, 2014 02:03 |
|
slidebite posted:So a question: No, other than HDMI carrying audio. quote:And if you use DVI, what do you lose (other than possible signal degradation) with a VGA adapter? "Possible" is too generous. It's analog video - it's degraded just by the digital-to-analog-to-digital conversions. Besides that, most modern cards have a tough time outputting a real 1920x1080 on the analog ports because nobody has really cared about that for a while. quote:I've always used a DVI out to VGA adapter to my monitor, but I re-arranged my room and my speaker cable no longer reaches my tower. So on a whim, I hooked up the PC to the monitor via HDMI and using the mediocre built in speakers.. but for the heck of it I noticed 3dmark 11 actually increase about 100 points. Coincidence? Coincidence. Run it again, then again, then again - there will be variances. *Maybe* there's a tiny effect from audio processing being offloaded from the motherboard's likely-CPU-based audio codec to the video card's more-independent codec.
|
# ? Jan 26, 2014 02:20 |
|
Factory Factory posted:What's your motherboard? Some (many that are built for tri- and quad-SLI/CF) populate ports in a different order than their physical layout, so "GPU 1" to the PC may be different than "GPU 1" to your eyeball. Well, it's this board: http://www.newegg.com/Product/Product.aspx?Item=N82E16813128596 And the only thing that changed was that I added the third card to the third orange slot. So it just seems weird that it would decide that gpu2 is now the main card, when with only 2 cards, gpu1 was the main card. Not that I have had issues, it's just weird.
|
# ? Jan 26, 2014 02:27 |
|
So, counting orange slots only: Slot 1: x16/x8 (CPU) (first enumerated) Slot 2: x4 (CPU) (third enumerated) Slot 3: x8/x4 (CPU) (second enumerated) Slot 4: x4 2.0 (PCH) Why slot 2 would be called the primary I have no idea.
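One way to picture what's going on, as a toy model (the bus addresses below are made up for illustration; on a real machine you'd read them from GPU-Z, Device Manager, or `lspci`):

```python
# Toy model of enumeration order: the OS typically orders GPUs by PCI
# bus address, and on boards that hang slots off a PLX switch that
# order need not match the slots' top-to-bottom physical position.
slot_to_bus = {           # hypothetical bus addresses, not this board's
    "slot 1": "03:00.0",
    "slot 2": "01:00.0",  # lowest bus address -> enumerated first
    "slot 3": "04:00.0",
}

enumerated = sorted(slot_to_bus, key=slot_to_bus.get)
print(enumerated[0])  # -> slot 2 shows up as "GPU 1" to software
```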
|
# ? Jan 26, 2014 02:39 |
|
Factory Factory posted:So, counting orange slots only: Slot 2 (the black slot) is actually what they suggest using as your primary slot for 1 or 2 cards as it is fully 16x. This board has the plx chip so it should be running 16/8/8 or 8/8/8/8 if you run 4. I just can't understand why adding a third card while making no other changes would have made gpu2 become primary.
|
# ? Jan 26, 2014 02:42 |
|
Factory Factory posted:No, other than HDMI carrying audio. Thanks for the info. I am really hard-pressed to see a visual improvement with HDMI over the VGA though. I agree about 3DMark; 100-ish points is probably within the margin of error/run variances.
|
# ? Jan 26, 2014 02:59 |
|
veedubfreak posted:Well its this board. http://www.newegg.com/Product/Product.aspx?Item=N82E16813128596 This might sound dumb, but how are your CrossFire bridges hooked up? Try a different combination of connections. e: try, not trying vvv I forgot about that. Still, the motherboard was designed before it was discovered that the 290 series could do that. BIOS settings/change maybe? Sidesaddle Cavalry fucked around with this message at 04:41 on Jan 26, 2014 |
# ? Jan 26, 2014 04:03 |
|
Sidesaddle Cavalry posted:This might sound dumb, but how are your CrossFire bridges hooked up? Try a different combination of connections. New AMD cards actually no longer use physical bridges; the traffic instead just goes over PCIe.
|
# ? Jan 26, 2014 04:12 |
|
e: aaand wrong thread.
Tedronai66 fucked around with this message at 23:27 on Jan 26, 2014 |
# ? Jan 26, 2014 22:08 |
|
veedubfreak posted:Slot 2 (the black slot) is actually what they suggest using as your primary slot for 1 or 2 cards as it is fully 16x. This board has the plx chip so it should be running 16/8/8 or 8/8/8/8 if you run 4. I just can't understand why adding a third card while making no other changes would have made gpu2 become primary. Is a monitor on the second gpu?
|
# ? Jan 27, 2014 00:31 |