|
Linux Nazi posted:Is it just me, or is SLI support getting really lovely lately? I've been rocking 1440p and dual 570s for ages now, and I'm definitely starting to feel the performance squeeze. But as for support, it amazes me how many games support SLI. There are a couple of big cases that don't, but it still baffles me when I pull up my Afterburner overlay and both cards are being used in some obscure indie game. I also find it cool that Nvidia still does optimizer profiles for SLI 570s @ 1440p; it seems like such a bizarre and ancient combo, and yet new games always have recommended settings from Nvidia. The biggest problem I'm having is instability crap with my Gigabyte 570; never buying their stuff again.
|
# ? Jun 27, 2014 00:22 |
|
|
Don Lapre posted:Everyone should have a 64-bit OS at this point. So what's up with (MS at least) still releasing 32-bit builds for Win8 and Win9 (yes?)?
|
# ? Jun 27, 2014 01:41 |
|
Shaocaholica posted:So what's up with (MS at least) still releasing 32-bit builds for Win8 and Win9 (yes?)? Some people still need to run 16-bit software, lol.
|
# ? Jun 27, 2014 02:35 |
|
kode54 posted:Some people still need to run 16-bit software, lol. I know you're kidding, but what's stopping MS from supporting that in 64-bit Windows?
|
# ? Jun 27, 2014 02:37 |
|
Shaocaholica posted:I know you're kidding, but what's stopping MS from supporting that in 64-bit Windows? Some 16-bit features were so useless in 64-bit mode that they just shoved them off into a different operating mode. You can do 16+32-bit mode or 32+64-bit mode, but not 16+32+64. E: Late edit, but the CPU operating mode, that is. It's a hardware issue. Factory Factory fucked around with this message at 04:48 on Jun 27, 2014 |
# ? Jun 27, 2014 02:44 |
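The mode split described above can be sketched as a simple lookup. This is a simplification and the names/structure are my own illustration, not anything from Windows or the Intel manuals verbatim; the key fact is that long mode has no virtual-8086 sub-mode, which is what DOS-era 16-bit apps rely on:

```python
# Which guest code widths each x86 CPU operating mode can host (simplified sketch).
X86_MODES = {
    "protected mode": {"16-bit", "32-bit"},  # 16-bit via virtual-8086 / 16-bit segments
    "long mode":      {"32-bit", "64-bit"},  # no virtual-8086, so no DOS-style 16-bit code
}

def can_host(mode, width):
    """Return True if the given CPU operating mode can run code of that width."""
    return width in X86_MODES.get(mode, set())

print(can_host("long mode", "16-bit"))  # False: why 64-bit Windows has no NTVDM
```

So a 32-bit Windows install (protected mode) can keep NTVDM for 16-bit software, while a 64-bit install (long mode) cannot, which is one reason MS kept shipping 32-bit builds.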
|
I wasn't totally kidding, either. For instance, the wholesale company my dad worked for up until March of 2013 ran Windows 98 on all of their office machines until about four years ago, because their sales and inventory tracking database software ran under MS-DOS. They now sport Windows XP on all their machines, running a shiny new Gooey that accepts case-insensitive logins and passwords.
|
# ? Jun 27, 2014 03:14 |
The main inventory system where I work uses software that probably predates DOS. 300 million dollars in assets circulates through this system. I don't know what it actually runs on, though; I telnet into it.
|
|
# ? Jun 27, 2014 03:59 |
|
See also: why X86 is still the dominant architecture after forty loving years. "Oh yeah, we totally have far more efficient designs cooked up, we just can't deploy them because everyone would scream bloody murder about having to spend money."
|
# ? Jun 27, 2014 04:34 |
|
Rime posted:See also: why X86 is still the dominant architecture after forty loving years. Is that why Windows doesn't run on ARM? Oh wait...
|
# ? Jun 27, 2014 04:57 |
BurritoJustice posted:but it still baffles me when I pull up my Afterburner overlay and both cards are being used in some obscure indie game. For example, in Planetside 2 with Crossfire on (in AFR-friendly mode, which is AFAIK the only way to get it to work at all), I'm getting this weird sort of micro-rubberbanding which almost feels like a netcode issue; my view just sort of snaps around about twice a second. I tried turning CF off to fix it, and hooray, it fixed it, but Afterburner still shows both GPUs at an equal amount of usage. What the gently caress is going on there? CF is OFF; the second GPU should just be floating around at like 2% usage.
|
|
# ? Jun 27, 2014 04:58 |
|
Ignoarints posted:The main inventory system where I work uses software that probably predates DOS. 300 million dollars in assets circulates through this system. I don't know what it actually runs on, though; I telnet into it. Not surprised at all. A lot of mines I have been to have ISA-based hardware they paid a lot of money for 30 years ago, and they talk to the system via terminal emulation. ASCII interfaces ain't dead, not by a long shot.
|
# ? Jun 27, 2014 06:12 |
|
Straker posted:Thing is, both cards might not be being used... If I turn off SLI, the second card doesn't even leave idle voltage, which is my surefire test. I still see minor loads on my second card while SLI is off, usually under 10%. If SLI is on, the cards will match load, so they'll both be at load voltage and have the same load percentage. Hell, with frame limiting, using two cards is often quieter than one, as I'll have, for example, two cards at 50%/50% instead of one screaming at 100%. SLI'd Fermi cards are still a noisy experience either way.
|
# ? Jun 27, 2014 06:35 |
|
Factory Factory posted:We missed some news! Someone asked if 290X are fully enabled on the Beyond3d forums a good while ago, to which Dave Baumann answered with a straight yes. Fame Douglas posted:Or adaptive VSync and quieter/cooler cards. So all those people buying 770s instead of 280Xs are paying more for adaptive VSync (which you can get with a third-party app anyway)?
|
# ? Jun 27, 2014 10:59 |
|
BurritoJustice posted:it still baffles me when I pull up my Afterburner overlay and both cards are being used in some obscure indie game. You seem to be under the common misconception that developers have anything to do with SLI/Crossfire. Multi-GPU is entirely hidden away within the driver, without even vendor-specific APIs to manage it. The best you can do as a developer is get the GPU count and basic information about each model. Even games "sponsored" by Nvidia or AMD will, at best, get development support before launch to start optimizing single-GPU driver performance and implement vendor-specific features; very rarely, multi-GPU will also be addressed. The rest of the time, developers just launch their game and wait for driver developers to cobble together a profile for it. The most recent example I can think of is Thief, which is "Powered by AMD" and was supposed to support Mantle out of the box. Well, Mantle only came two months later, and Crossfire was initially plagued with light-flickering issues. In that sense, indies are the most likely to have their game work well with multi-GPU. Most of the time they'll be using popular, common game engines without making too many modifications to the renderer, which means the driver is far more likely to already have a good multi-GPU profile match for the game.
|
# ? Jun 27, 2014 15:20 |
|
Arzachel posted:Someone asked if 290X are fully enabled on the Beyond3d forums a good while ago, to which Dave Baumann answered with a straight yes. Sounds like they were selling faulty 295X2s then, while binning the 100% perfect ones?
|
# ? Jun 27, 2014 15:55 |
|
Looks like AMD's answer to Shadowplay is starting to mature a little bit: http://www.anandtech.com/show/8224/hands-on-with-amds-gaming-evolved-client-game-dvr It's still a bit buggy (especially with the framerate), and it's still attached to Raptr, but it has a ridiculously low performance hit and great quality (although it shares Shadowplay's 4:2:0 color problem).
|
# ? Jun 27, 2014 18:28 |
|
I'm more interested in AMD's answer to GameStream. I really like the idea of being able to hook a Fire TV up and play games on my TV streamed from my PC.
|
# ? Jun 27, 2014 18:47 |
|
Krailor posted:I'm more interested in AMD's answer to GameStream. I really like the idea of being able to hook a Fire TV up and play games on my TV streamed from my PC. Doesn't Steam do game streaming now?
|
# ? Jun 27, 2014 18:53 |
|
Hace posted:Looks like AMD's answer to Shadowplay is starting to mature a little bit: http://www.anandtech.com/show/8224/hands-on-with-amds-gaming-evolved-client-game-dvr There's already an OBS build with it too, very early but still, https://obsproject.com/forum/threads/obs-fork-branch-with-amd-vce-support.13996/
|
# ? Jun 27, 2014 19:03 |
|
deimos posted:Is that why Windows doesn't run on ARM? Oh wait... It doesn't
|
# ? Jun 27, 2014 19:23 |
|
HalloKitty posted:Doesn't Steam do game streaming now? Yeah, I'm just trying to go for the lowest total system cost. The additional money over $99 I'd spend putting together a Steambox for streaming could just be spent going from an AMD to an NVIDIA card in my comp.
|
# ? Jun 27, 2014 19:25 |
|
Wozbo posted:Sounds like they were selling faulty 295X2s then, while binning the 100% perfect ones? To sell salvage chips you have to disable faulty units, but according to Dave, the 290X (and 295X2) are fully enabled. Parametric binning is a different thing entirely.
|
# ? Jun 27, 2014 20:13 |
|
So I got Far Cry 3 recently on Steam. It installed just fine and I got through the cutscene just fine. I get about 1-2 minutes into the game before it hangs for 5 seconds, then black screens, comes back up, and runs for another couple of seconds to a minute before doing it again. I get a message about my display driver crashing. I posted my comp stats below. I find it really hard to believe that the graphics are too straining, considering I ran AC3 on Very High recently. Windows 7 i5-2500K @ 3.3GHz 8GB RAM Radeon HD 6900 Series
|
# ? Jun 27, 2014 20:32 |
|
A driver crash has nothing to do with performance and everything to do with out-of-date drivers and/or unstable hardware. Update your drivers, and if you still get crashes, back off any overclock you may have set. If doing this still doesn't fix it, next you would look into the graphics card overheating or possibly being faulty and needing a warranty replacement.
|
# ? Jun 27, 2014 20:35 |
|
Factory Factory posted:A driver crash has nothing to do with performance and everything to do with out-of-date drivers and/or unstable hardware. Thanks for the quick response. I just updated the drivers this morning and have never overclocked, so I guess the card may be overheating? It doesn't crash in any other games, so that seems to rule out it being faulty?
|
# ? Jun 27, 2014 21:02 |
|
Hace posted:Looks like AMD's answer to Shadowplay is starting to mature a little bit: http://www.anandtech.com/show/8224/hands-on-with-amds-gaming-evolved-client-game-dvr
|
# ? Jun 27, 2014 21:19 |
|
Hace posted:(although it shares Shadowplay's 4:2:0 color problem). Is this really a problem? I like 4:4:4 as much as anyone else, but the truth is that any streaming service is going to downsample to 4:2:0 anyway.
|
# ? Jun 27, 2014 22:28 |
|
Shaocaholica posted:Is this really a problem? I like 4:4:4 as much as anyone else, but the truth is that any streaming service is going to downsample to 4:2:0 anyway. Also, VCE 2.0 supports 4:4:4, so maybe it's something OBS can implement. From some samples I've seen, VCE is no slouch when it comes to encoding; quality-wise it seems oodles better than NVENC at the same bitrate. (e: but still worse than Haswell QS) deimos fucked around with this message at 22:52 on Jun 27, 2014 |
# ? Jun 27, 2014 22:41 |
|
A lot of software players and streaming services don't support anything beyond 4:2:0 encoded H.264. I think it would be nice to have for cutting machinima... maybe. But in all honesty, I think anything above 4:2:0 for game footage capture is a bit extreme. Who's really going to notice?
|
# ? Jun 27, 2014 22:59 |
|
It's all about the editing. H.264 is a delivery format, and pretty much any edit to an H.264 file is going to accentuate artifacts. You want a 4:4:4 uncompressed format for the same reason you want an Illustrator file for your pamphlet instead of a 300x300 JPEG.
|
# ? Jun 27, 2014 23:32 |
|
You don't really degrade the picture much in -editing-. You do in compositing. Macroblocking and other H.264 artifacts are what will degrade the picture if you double up on them. Editing 4:2:0 and encoding another 4:2:0 from that edit shouldn't degrade the picture with respect to color subsampling; the process of subsampling color down from 4:4:4 to 4:2:0 doesn't stack generationally. It's the compression artifacts that will stack on you, but still, we're talking about game footage here. With respect to game footage, I think encoder quality, framerate, and bitrate are way more important than color subsampling.
|
# ? Jun 27, 2014 23:38 |
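The claim above that chroma subsampling doesn't stack generationally can be shown with a toy sketch. Assumptions worth flagging: this uses a plain 2x2 box filter for the downsample and nearest-neighbour for the upsample, which is a simplification of what real encoders do, but the generational argument is the same:

```python
def subsample_420(plane):
    # 4:2:0-style chroma downsample: average each 2x2 block of the chroma plane
    h, w = len(plane), len(plane[0])
    return [[(plane[y][x] + plane[y][x + 1] + plane[y + 1][x] + plane[y + 1][x + 1]) / 4
             for x in range(0, w, 2)] for y in range(0, h, 2)]

def upsample(half):
    # nearest-neighbour reconstruction back to full resolution for "editing"
    return [[half[y // 2][x // 2] for x in range(len(half[0]) * 2)]
            for y in range(len(half) * 2)]

# a tiny 4x4 chroma plane standing in for one channel of a frame
chroma = [[16, 32, 64, 80], [16, 32, 64, 80], [200, 210, 90, 100], [200, 210, 90, 100]]

first = subsample_420(chroma)               # the first 4:2:0 encode loses detail
second = subsample_420(upsample(first))     # re-encoding the edit loses nothing more
print(first == second)                      # True: subsampling an already-subsampled
                                            # plane changes no values
```

The first downsample is the only lossy step for chroma; every later generation reproduces the same values. Generational damage in real workflows comes from the DCT quantization/macroblocking stacking, not from the color subsampling itself.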
|
Again, VCE 2.0 can do 4:4:4 if necessary; in fact, the OBS branch does 4:4:4 downconversion using OpenCL.
|
# ? Jun 28, 2014 05:20 |
|
HalloKitty posted:Doesn't Steam do game streaming now? I made a gigantic thread OP about it: http://forums.somethingawful.com/showthread.php?threadid=3645789
|
# ? Jun 28, 2014 05:44 |
|
So, I just discovered that this thing exists. https://www.youtube.com/watch?v=N9SLjiR7Q4s
|
# ? Jun 28, 2014 13:35 |
|
Since cards won't draw more power than a set amount by design, that seems kinda pointless. Visiontek or PowerColor or some other company made an even more useless card at some point that was literally just a few caps on a PCB, but I can't find the link now.
|
# ? Jun 28, 2014 14:52 |
|
The gently caress is that thing supposed to do? "Clean power to the PCIE bus!" Uhh... ok. There's a thread about it on the EVGA forums and people just seem to want it for no reason, but there are some people saying that there's a problem if you have an EVGA motherboard in that the 24 pin power connector can get "overloaded" and it's dangerous, and they are mad that EVGA is charging $20 for the fix. What is happening.
|
# ? Jun 28, 2014 16:38 |
|
beejay posted:The gently caress is that thing supposed to do? "Clean power to the PCIE bus!" Uhh... ok. There's a thread about it on the EVGA forums and people just seem to want it for no reason, but there are some people saying that there's a problem if you have an EVGA motherboard in that the 24-pin power connector can get "overloaded" and it's dangerous, and they are mad that EVGA is charging $20 for the fix. What is happening. Isn't that kinda like what Gigabyte does by having a SATA power connector for the PCIe lanes? Pretty lovely if you ask me, considering EVGA mobos are ludicrously overpriced and incredibly spartan in features compared to other top-end boards from Asus/MSI/Asrock.
|
# ? Jun 28, 2014 17:04 |
|
td4guy posted:So, I just discovered that this thing exists. 4-pin Molex, wtf year is this, EVGA?
|
# ? Jun 28, 2014 17:39 |
|
cisco privilege posted:Since cards won't draw more power than a set amount by design that seems kinda pointless..
|
# ? Jun 28, 2014 17:40 |
|
The Lord Bude posted:Isn't that kinda like what Gigabyte does by having a SATA power connector for the PCIe lanes? Pretty lovely if you ask me, considering EVGA mobos are ludicrously overpriced and incredibly spartan in features compared to other top-end boards from Asus/MSI/Asrock. I have a board like that. It was important for BIOS-modded SLI overclocking stability, but as far as I could tell it doesn't matter in any normal scenario (and that's on a motherboard that needs auxiliary PCIe power).
|
|
# ? Jun 28, 2014 18:48 |