|
Non-prerelease TR and The Division are looking pretty solid for AMD, and of course in Ashes the 390X is halfway from the 970 to the 980 Ti. It's also delivering good performance in clear GOTY Plants vs Zombies: Garden Warfare 2 (which is actually a Frostbite engine game).
|
# ? Feb 29, 2016 18:17 |
|
I'm sorry I triggered that guy by re-using the word "destroy", I should have said "performs better than." Which it should, since it's about $100 more expensive, and it's not going to win any price/performance award.
|
# ? Feb 29, 2016 18:30 |
|
Didn't mean to start trouble. I got the MSI 970, which isn't aftermarket, but it's $280 on Newegg after rebate and comes with The Division. Like I said, this is just a stopgap until Pascal comes out proper. Thanks again for the help.
|
# ? Feb 29, 2016 18:30 |
|
THE DOG HOUSE posted:But im just clicking around. I'm sure a specific test is out there to show what you guys mean.

Overclocking the 950/960 is, however, always a total shitshow, and is probably why GM206 doesn't appear in mobile at all when GM204 mobile can use 2/3 of the power at the same CUDA/TMU/ROP configuration. And on the very last chart the Fury X has substantially less perf/watt than the 980 Ti, which is part of what I meant.

Anime Schoolgirl fucked around with this message at 18:46 on Feb 29, 2016 |
# ? Feb 29, 2016 18:43 |
|
snuff posted:I'm sorry I triggered that guy for re-using the word "destroy", I should have said "performs better than"

Sorry for making an MLG benchmark compilation, I felt pretty dumb about that. But for the record, I was triggered by "not going to google it". Back to a more civil discussion: I disagree that it performs better than at all. It performs equal to (as it is a straight 290X rebrand). The only clear runaway moments were in 4K. I would go so far as to say that the average aftermarket 970 compared to the average aftermarket 390X is the closest comparison out of the lineup between both brands. But of course benchmarks are all over the place due to everything I mentioned above.

Blacktoll posted:Didn't mean to start trouble. I got the MSI 970 which isn't aftermarket but it's 280 on Newegg after rebate and comes with the Division. Like I said this is just a stopgap until Pascal comes out proper. Thanks again for the help.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127833&cm_re=gtx_970-_-14-127-833-_-Product

If you mean that one, it is still aftermarket, it's just not their "Gaming" model. If you choose to, I'm sure it will OC just like the rest of them.

penus penus penus fucked around with this message at 19:05 on Feb 29, 2016 |
# ? Feb 29, 2016 19:00 |
|
My bad that was a typo. Yeah, I'll have to play with it and do some overclocking myself. Does MSI have a tool for doing so or is there something better out there?
|
# ? Feb 29, 2016 19:06 |
|
Blacktoll posted:My bad that was a typo. Yeah, I'll have to play with it and do some overclocking myself. Does MSI have a tool for doing so or is there something better out there?

MSI Afterburner is what you want to use.
|
# ? Feb 29, 2016 19:09 |
|
If you just want VR-ready price/performance, then an MSI/Asus/Gigabyte custom-cooler 290(X) on eBay is going to get you the most FPS for the money; they can be acquired pretty easily for $200-$220. If you were going to actually drop money for The Division, then sure, pretend the GTX 970 is $40 less than it is, as that's what the Division codes are going for on eBay right now. That plus a fresh 3-year warranty (as opposed to the used 290s, which would have about a year left) and it could be considered the better deal.
|
# ? Feb 29, 2016 19:13 |
|
Thank you for the input, gents. I have zero interest in VR and prefer Nvidia for driver support and ShadowPlay, so I went with the MSI 970, and I'm looking forward to playing with the overclock. I was going to purchase The Division anyway, which saves me 40 to 60 bucks too.
|
# ? Feb 29, 2016 19:33 |
|
So uh, I've had my 290X flip out on me the last couple of days by freezing, dropping me to desktop from whatever I am playing, then going back to a black fullscreen, which I then must alt-tab out of and then back into. Wondering if this is a driver issue, cable issue or something with the GPU itself. Wonder if XFX even has a 290X to replace what I have if I RMA.
|
# ? Feb 29, 2016 23:10 |
|
FaustianQ posted:So uh, I've had my 290X flip out on me the last couple of days by freezing, dropping me to desktop from whatever I am playing, then going back to a black fullscreen, which I then must alt-tab out of and then back into. Wondering if this is a driver issue, cable issue or something with the GPU itself. Wonder if XFX even has a 290X to replace what I have if I RMA.

If the card is bad, you usually get an upgrade to a card of equal or greater performance, though companies keep spares on hand for just such an occasion.
|
# ? Feb 29, 2016 23:18 |
|
Alright. AMD's cooking something, and I don't know what. I do know that is a Razer Core, though. https://twitter.com/Thracks/status/704385232489742336 SwissArmyDruid fucked around with this message at 21:20 on Mar 1, 2016 |
# ? Mar 1, 2016 21:18 |
|
SwissArmyDruid posted:Alright. AMD's cooking something, and I don't know what. I do know that is a Razer Core, though. Much like FreeSync, I think they're trying to tape together some technologies that already exist and make it a standard.
|
# ? Mar 1, 2016 22:34 |
|
It's supposed to be a universal eGPU system that requires minimal changes for adoption, if any. So is AMD suggesting Thunderbolt 3 support on all their OEM mobile products? Honestly, it'd be kind of hilarious to have a compatible smartphone product that'd hook up to a Razer Core, but they're probably just talking ultrabooks and laptops.
|
# ? Mar 1, 2016 22:37 |
|
Maybe the Thunderbolt competitor they announced 2 years ago then never mentioned again is coming back? http://www.anandtech.com/show/7755/amds-dockport-given-virtual-overview
|
# ? Mar 1, 2016 22:50 |
|
This is AMD, so it's more likely they want the interfaces for using external GPUs over Thunderbolt 3 (or whatever) to be standardized, instead of having a half-dozen different implementations across devices.
|
# ? Mar 1, 2016 22:52 |
|
Rastor posted:Much like FreeSync, I think they're trying to tape together some technologies that already exist and make it a standard.

Is getting proper plug and play support still a problem that hasn't been solved?
|
# ? Mar 1, 2016 22:53 |
|
xthetenth posted:Is getting proper plug and play support still a problem that hasn't been solved? Plug and play support... for a GPU? Yes, I imagine there may be some problems to solve with that.
|
# ? Mar 1, 2016 22:59 |
|
Anime Schoolgirl posted:This is AMD, so it's more likely they want nobody to buy it
|
# ? Mar 1, 2016 23:15 |
|
lol is nvidia at it again? http://m.hardocp.com/article/2015/05/26/grand_theft_auto_v_image_quality_comparison_part_5/4

see: "our soft shadow solution is literally to use a 50x50 resolution shadow and blur it as gently caress! Yes, we do get higher FPS in GTA 5, why would you ask??" At least it doesn't have a glitch like the Radeon one!

e: Wait, this is old as hell and probably fixed, never mind.

Truga fucked around with this message at 23:47 on Mar 1, 2016 |
# ? Mar 1, 2016 23:44 |
|
Truga posted:lol is nvidia at it again? But it looks the realest
|
# ? Mar 1, 2016 23:50 |
|
Truga posted:lol is nvidia at it again?

http://http.download.nvidia.com/developer/presentations/2005/SIGGRAPH/Percentage_Closer_Soft_Shadows.pdf

The theory is sound. Shadows aren't actually super hard in real life.
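For context, the PCSS presentation linked above boils down to one similar-triangles estimate: the penumbra gets wider as the blocker sits further from the surface receiving the shadow. A minimal sketch of that estimate, with illustrative names and numbers (not from NVIDIA's actual shader code):

```python
# Sketch of the penumbra-width estimate at the heart of PCSS, from the
# linked SIGGRAPH 2005 presentation. Distances are along the light ray;
# the light "size" is the width of the area light source.

def penumbra_width(d_receiver: float, d_blocker: float, light_size: float) -> float:
    """Similar-triangles estimate:
    w_penumbra = (d_receiver - d_blocker) * w_light / d_blocker
    """
    return (d_receiver - d_blocker) * light_size / d_blocker

# A blocker far above the receiving surface casts a wide, soft penumbra;
# a blocker resting almost on the surface casts a nearly sharp edge.
soft = penumbra_width(d_receiver=10.0, d_blocker=2.0, light_size=1.0)
sharp = penumbra_width(d_receiver=10.0, d_blocker=9.5, light_size=1.0)
```

PCSS then scales its percentage-closer filter kernel by this width, which is why the technique itself is defensible even when a particular game pairs it with a low-resolution shadow map.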
|
# ? Mar 1, 2016 23:51 |
|
Yeah, but there's soft shadows and then there's "let's just make it super low resolution". Anyway, here's a better example: the jaggies are super obvious. I noticed this in the DCS 1.5 version and thought it was due to a lovely DX11 shadow implementation or something, but now I'm not sure anymore. Will have to check it out; if I can get DCS 1.5 but with the old shadows, it's going to be glorious.
|
# ? Mar 2, 2016 00:01 |
|
I can't find any reference to DCS using NV's PCSS/ShadowWorks. Do you think this is being forced by the driver or something? It's middleware.
|
# ? Mar 2, 2016 00:26 |
|
Yeah, I thought it was a driver setting, but there is none, and no matter what I do with the ingame settings the jaggies stay. Guess it's just a thing with the DX11 version of DCS... The shadows are jaggy and they constantly move around just by zooming in/out. They didn't do that on the old client (but it performed like poo poo so I wouldn't want to use it anyway). Too bad, I hoped I'd be able to fix that.
|
# ? Mar 2, 2016 01:05 |
|
repiv posted:Maybe the Thunderbolt competitor they announced 2 years ago then never mentioned again is coming back?

Oh poo poo, I'd forgotten about DockPort. Although, hm, I wonder if it isn't redundant now, since VESA just ratified DP 1.4, which added better video stream compression, so *theoretically* you could just use a single Thunderbolt cable for both output and input from your notebook? At least, I'd like that very much.

http://www.vesa.org/featured-articles/vesa-publishes-displayport-standard-version-1-4/

SwissArmyDruid fucked around with this message at 03:35 on Mar 2, 2016 |
# ? Mar 2, 2016 03:33 |
|
I recently bought an EVGA GeForce GTX 750 Ti SC and went looking for a power connector to hook it up to my PSU, but I couldn't find a power socket anywhere on the card, other than the DVI and HDMI ports. I'm just curious: do GTX 750 Ti SC cards need additional power or not?
|
# ? Mar 2, 2016 04:15 |
Some 750 Ti models are bus powered, believe it or not.
|
|
# ? Mar 2, 2016 04:25 |
|
repiv posted:Maybe the Thunderbolt competitor they announced 2 years ago then never mentioned again is coming back? I'm pretty sure that's just USB 3.1 nowadays.
|
# ? Mar 2, 2016 04:32 |
|
Junior Jr. posted:I recently bought an EVGA GeForce GTX 750 Ti SC and I had to check for a power connector slot to hook it up to my PSU, I checked my card and couldn't find a power slot anywhere, other than the DVI and HDMI slots. Some do, some don't. The ones that have aux pins likely do - I think I remember that my EVGA FTW 750 Ti wouldn't power up without a connector, but that's marketed as a high-end gaming version of that chip. Others don't. You can get a Molex->6pin adapter if you need. Paul MaudDib fucked around with this message at 05:40 on Mar 2, 2016 |
# ? Mar 2, 2016 04:33 |
|
Okay, so I just found out that if a graphics card stays under the roughly 75 W a PCIe slot can supply (the 750 Ti runs at about 60 W), it doesn't need additional power. That's great to know, I didn't know the GTX 750 Ti was THAT good. Say, are there any new NVIDIA or Radeon cards that are bus powered too?
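The rule of thumb above can be sketched in a few lines. The 75 W figure is the PCIe x16 slot's power budget from the PCIe spec; the TDPs are the published board-power numbers for these cards, which are a rough proxy since real cards can spike above TDP:

```python
# Rough check of whether a card can run purely off slot power.
# 75 W is the PCIe x16 slot budget; TDPs are published board-power figures.

PCIE_SLOT_BUDGET_W = 75

def needs_aux_power(tdp_watts: float) -> bool:
    """True if board power exceeds what the slot alone can deliver."""
    return tdp_watts > PCIE_SLOT_BUDGET_W

cards = {"GTX 750 Ti": 60, "GTX 950": 90, "GTX 970": 145}
for name, tdp in cards.items():
    label = "needs 6/8-pin" if needs_aux_power(tdp) else "bus-powered"
    print(f"{name} ({tdp} W): {label}")
```

This is why the 60 W 750 Ti gets away without an aux connector while the 90 W GTX 950 mentioned below does not.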
|
# ? Mar 2, 2016 04:37 |
|
I actually didn't know any 750 Tis were bus powered. As far as I know there is nothing better than those, though. Perhaps 14nm will bring more.
|
# ? Mar 2, 2016 04:57 |
|
Junior Jr. posted:Say, are there any new NVIDIA or Radeon cards that are bus powered too? Stay tuned. Polaris and Pascal are both likely to have major performance/watt improvements.
|
# ? Mar 2, 2016 05:19 |
|
Rastor posted:Plug and play support... for a GPU? Yes, I imagine there may be some problems to solve with that. It's understood how to hot-plug CPUs and RAM, what are the outstanding issues for GPUs?
|
# ? Mar 2, 2016 05:41 |
|
Junior Jr. posted:Okay, so I just found out if a graphics card runs at 60-watts, it doesn't need additional power. That's great to know, I didn't know the GTX 750 Ti was THAT good. Not right now, no. Even the 950 needs an AUX supply. The 750 Ti is currently the apex of bus-powered cards.
|
# ? Mar 2, 2016 05:41 |
|
Subjunctive posted:It's understood how to hot-plug CPUs and RAM, what are the outstanding issues for GPUs? Plug and play, not hot-plugging.
|
# ? Mar 2, 2016 05:50 |
|
I don't know why I am attracted to bus powered cards but I am. Can't wait till Pascal.
|
# ? Mar 2, 2016 05:50 |
|
Kramjacks posted:Plug and play, not hot-plugging. Hot plug seems like a superset of plug-and-play, no? I had a laptop with external GPU and I could just plug it in, didn't have to reboot or anything. Maybe I'm misunderstanding what you mean by it. What's the hard part of "plug and play" for GPUs?
|
# ? Mar 2, 2016 06:12 |
|
Subjunctive posted:Hot plug seems like a superset of plug-and-play, no? I had a laptop with external GPU and I could just plug it in, didn't have to reboot or anything.

I'm not the person you originally replied to, but I assume drivers. Like, I can buy an LCD monitor, plug it into my computer, and it works perfectly: full features, no driver installation. For a GPU, I plug it in and get very basic functionality until I install the drivers specific to the card.

Edit: Unless external GPUs do just use generic display drivers already present in the OS?

Kramjacks fucked around with this message at 06:26 on Mar 2, 2016 |
# ? Mar 2, 2016 06:22 |