Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Does it work now? Or no?

nrr
Jan 2, 2007

Well yeah, but it was working before, too. It's just that I had that Windows Action Center thing showing 3 or 4 instances of "unreported problems."

I've been getting OK results in BF4, but not as high as I expected. It was under 60 fps on Ultra, which I thought was a bit weird.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

craig588 posted:

The cheapest card that doesn't use a lot of power and would still offer a performance improvement would be the 650. Kepler has a huge focus on power efficiency. If you use anything earlier than Kepler, the card will still use a whole lot of power even in the low-load conditions of working as a PhysX board.

But you're saying that if I do hook up a 650 as a PhysX board, it will only draw low watts? Has anyone like Tom's Hardware measured this to see how the performance per watt / cost per FPS stacks up?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Average framerates aren't the point so much as minimum frame times and reduced stuttering. Switching between graphics rendering and PhysX's CUDA workload is a wasteful process for a single card to do.

And, small correction, a GTX 650 isn't the minimum card for PhysX, it's the optimum card. It'll handle the highest PhysX loads with aplomb and stay decently low-power for lighter loads. And it is very much a lower-power state, but don't fool yourself that this is a purchase that would pay off in FPS per watt or something. A PhysX card is a wasteful way to get FPS no matter how you cut it; a GTX 650 is just the least wasteful.

tjume
Feb 8, 2009
pcgh.de tests the Sapphire Radeon R9 290 Tri-X OC: http://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Tests/Sapphire-Radeon-R9-290-Tri-X-OC-Test-1102085/

Core: 1000 MHz, memory: 2600 MHz
Gaming: ~267 W, 77°C, 2.9 sone (AMD reference card: 6.9 sone)
Their own Furmark-ish test: ~300 W, 78°C, ~2,200 RPM, 3.1 sone

They did a bit of tweaking:
Overclocking: core 1150 MHz, memory 3000 MHz, +61 mV, 150% PowerTune limit, and 50% fan speed results in 4.7 sone
If you like silent systems, it gets quiet with undervolting: -31 mV and 35% fan speed results in 1.5 sone

Thanks to the factory overclock it almost reaches the performance of a 290X in Uber mode while being less than half as loud.
Hits the market at the start of 2014, probably around 400€.

Setzer Gabbiani
Oct 13, 2004

nrr posted:

Thanks. Is the free version of Driver Fusion enough, or am I going to need the ~*PrEmIuM*~ version? The free version just told me it did what it could but couldn't get everything! (Maybe you should upgrade!) Is this bullshit, or are there likely leftover bits that are going to potentially cause problems?

I really think everyone should just dump Driver Fusion completely for Display Driver Uninstaller. Partly because Fusion is ricey garbage that destroyed everything everyone used to like about Driver Sweeper, especially when it tries to bring OpenCandy along for the ride; and partly because DDU, aside from being free, gets rid of everything (and more) that Fusion expects you to pay for, leftover bits included.

It also scores major points for having an interface that doesn't look like a tablet-friendly app.

http://www.wagnardmobile.com/DDU/
http://forums.guru3d.com/showthread.php?t=379505

Rah
Mar 9, 2006

beejay posted:

Mod Note: This thread is for general GPU and videocard discussion, head over to the parts picking megathread if you just need help picking a card to buy.

You'll get better answers there.

Sorry! Removed it now. I didn't notice the thread for that and just saw GPU and assumed this would be the place to ask. I've posted it in the parts picking thread now.

Rah fucked around with this message at 02:22 on Dec 20, 2013

Wistful of Dollars
Aug 25, 2009


I'm going to leave this to the pros, but AMD CPUs are not a good option these days, especially when it comes to games. I've never heard of the game you're playing, but you can get a very good GPU for much less than 400 quid. A 760 is a great choice (I don't know what your VRAM usage is with that many clients, but it might make a 4GB 760 worth it).

You could probably use the money you saved on the 760 to get a much better CPU.

beejay
Apr 7, 2002

Mod Note: This thread is for general GPU and videocard discussion, head over to the parts picking megathread if you just need help picking a card to buy.

You'll get better answers there.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
AMD released a WHQL driver with the first round of CrossFire fixes, for those of you still waiting on that.

TheRevolution1
Sep 21, 2011
Just bought a GTX 770 and an Nvidia Shield on Newegg, and they decided to tell me after my purchase went through that they don't have AC4 and Splinter Cell in stock, so I won't be getting them.


:argh: :argh:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

TheRevolution1 posted:

Just bought a GTX 770 and an Nvidia Shield on Newegg, and they decided to tell me after my purchase went through that they don't have AC4 and Splinter Cell in stock, so I won't be getting them.


:argh: :argh:

You know the Nvidia Shield was supposed to get a $100-off coupon with the 770, right? They might have run out of those too, but I've seen it mentioned at Microcenter and here: http://www.tomshardware.com/news/nvidia-gtx-780-price-drop,24886.html

TheRevolution1
Sep 21, 2011

Zero VGS posted:

You know the Nvidia Shield was supposed to get a $100-off coupon with the 770, right? They might have run out of those too, but I've seen it mentioned at Microcenter and here: http://www.tomshardware.com/news/nvidia-gtx-780-price-drop,24886.html

It wasn't a coupon; I just got them in a combo and the discount was deducted immediately. So yeah, at least I got that.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Not to kick you when you're down or anything, but I can sell you the Blacklist/Black Flag code for a generous $20 if you want, before I put it up on SA Mart for $25. I'd PM you but you don't have PMs.

TheRevolution1
Sep 21, 2011

Zero VGS posted:

Not to kick you when you're down or anything, but I can sell you the Blacklist/Black Flag code for a generous $20 if you want, before I put it up on SA Mart for $25. I'd PM you but you don't have PMs.

Nah, go ahead and sell it on SA Mart. I was going to sell my code anyways.

EDIT: A little "backordered" note got added to my order. Maybe I'll be getting the code after all.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Installed the newest AMD drivers aaaannndd... my CrossFire option has completely disappeared. There is no mention of the second card in the OverDrive menu, and it's listed as a "disabled adapter" in the info section under hardware.

beejay
Apr 7, 2002

Is it advised to install this WHQL release if you have been running the betas?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
OK, so I just said gently caress it: I went to Best Buy, nabbed a second GTX 770, and jammed it into my PC to see what would happen. My power supply is only 620 watts, which was supposedly one of those "no loving way" deals.

I set things up for SLI and plugged in a Watts Up to measure the watts being pulled from the wall.

FFXIV, max settings, 1080p, 60 fps frame cap: astonishingly, the PC pulls 260-300 watts from the wall, which is oddly the same as or even less than a single card. With the frame cap turned off, the highest draw I could get from the wall was 505 watts at 124 fps.

COD: Ghosts, max settings, 1080p, no frame cap: came in at around 112 fps and pulled 450 watts from the wall at most. Turning on VSync capped the rate back down to 60 fps, and again the draw went back to the low 300s.

I thought the power draw was going to be way crazier than this. I'm not seeing any microstutter, and everything is working fantastically. Without a frame cap the system won't pull above 505 watts, and with a 60 fps cap the cards keep the rate completely locked while drawing only the watts of a single card. Seems like I'd only need a bigger PSU if I overclocked? Neither the 4770K nor the 770s is overclocked.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
The amount of wattage a GPU needs is generally highly exaggerated by the card's specifications. That's to account for the people out there who buy junker PSUs rated at 1000 watts that can't deliver more than half that without exploding.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
drat it...

Now that The Mod's done, I wanted to turn my attention back to my ambilight project. Recently, Nvidia opened up NvAPI (their driver API) for public use, as well as NvENC (the video encode/decode API used by SHIELD and Gaikai streaming). I was really excited because this implied that NvFBC (Framebuffer Capture) would be included. NvFBC would trivialize making an ambilight project.

An ambilight, or more specifically the Adalight DIY project I'm working with, has three stages:
  • 1a) Screen capture
  • 1b) Processing
  • 2) Transmit RGB triplets to Arduino
  • 3) Arduino pipes RGB down an RGBLED chain for display
1a) Screen capture is the most difficult part because there is no one-size-fits-all solution. There's the ultra-slow GDI capture, hooked through some other ultra-slow Java API wrapper, and that's about it for the easy ways. A group of Russians built a program that supports eight different screen capture methods, but it takes configuration and juggling between them, and each one works only in certain situations and never reliably.

NvFBC would trivialize this. I could capture the framebuffer in one line of code no matter what was on screen, then do all the processing in practically two clock cycles flat with CUDA. It would lower the barrier to entry on this project significantly and provide a thousand times better experience.

And it turns out that NvFBC is still a private API only for registered developers. :bang: Everyone who knows how to use it is NDA'd.
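
For the curious, the downstream stages are the easy part. Here's a minimal sketch of stage 2 in Python, assuming pyserial and a list of colors already computed; the header layout follows the public Adalight LEDstream convention, and the port name is just a placeholder:

```python
# Stage 2: pack one (r, g, b) triplet per LED and ship it to the Arduino.
# Adalight header: magic word "Ada", then (LED count - 1) as a 16-bit value,
# then an XOR checksum of those two bytes with 0x55.
import serial

NUM_LEDS = 40
ser = serial.Serial("COM3", 115200)  # placeholder port; match your Arduino

def send_frame(colors):
    """colors: a list of NUM_LEDS (r, g, b) tuples, one per LED."""
    n = NUM_LEDS - 1
    hi, lo = (n >> 8) & 0xFF, n & 0xFF
    frame = bytearray(b"Ada") + bytes([hi, lo, hi ^ lo ^ 0x55])
    for r, g, b in colors:
        frame += bytes([r, g, b])  # some strips want GRB; reorder as needed
    ser.write(frame)
```

Stage 3 is just the stock LEDstream sketch forwarding those bytes down the strip; it's stage 1a that's the mess.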

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Factory Factory posted:

1a) Screen capture is the most difficult part because there is no one-size-fits-all solution. [...] NvFBC would trivialize this. I could capture the framebuffer in one line of code no matter what was on screen. [...] And it turns out that NvFBC is still a private API only for registered developers. :bang: Everyone who knows how to use it is NDA'd.


Use an FPGA to process the raw video signal by splitting your HDMI or DisplayPort cable. It'll probably have lower latency and would work on anything.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Factory Factory posted:

1a) Screen capture is the most difficult part because there is no one-size-fits-all solution. [...] NvFBC would trivialize this. I could capture the framebuffer in one line of code no matter what was on screen. [...] And it turns out that NvFBC is still a private API only for registered developers. :bang: Everyone who knows how to use it is NDA'd.

I used to use a program called Aurora Synaesthesia with my Philips amBX kit back in the XP days; it did full-screen capture in video, games, no matter what. But when I upgraded to 7, I found it didn't work with Aero on, so I put the kit back in the box. It looks like it has been updated since then. I doubt it'll help with your gear, but that was my experience anyway.

Josh Lyman
May 24, 2009


Why do Nvidia drivers install virtual audio by default? That just seems like it's asking for pissed-off users.

AzureSkys
Apr 27, 2003

I mentioned earlier having issues with my 680s in SLI. I messed around a bit with overclocking them and got it to what I thought was stable at around 1200 MHz with memory at 7200 MHz. Benchmarks passed fine.

I thought it was fine, but then when testing some games they'd get a bad FPS stutter that would basically pause them for a moment. I monitored a game menu that was pretty consistent in load, and you can see the drops in GPU power, usage, and FPS (I forgot to set the range higher in the pic; it would drop from ~120 to ~90). In GPU-Z you can see it in GPU Load, Memory Controller Load, and Power Consumption.


Things I've done based on tips read while searching for answers:
-Set all OC settings back to default
-Reinstalled drivers using these methods
-Fully uninstalled all OC apps (EVGA Precision, Afterburner, etc.)
-Set the CPU back to default speeds
-Disabled the HD Audio Controllers in Device Manager
-Enabled "prefer maximum performance" in the NVCP power management mode
-Enabled and disabled VSync and adaptive VSync
-Rotated and reseated the SLI bridge
-Connected one of the GPUs to its own PSU to see if it wasn't getting enough power or something. It uses one molex-to-6-pin adapter, so I don't know if that affects things.
-Disabled SLI and ran stuff on either card: fine, with no stutters

It was fine when I first put the PC together, and now I'm afraid I broke something while testing overclocks. If I disable SLI and test either GPU, they both run fine with no drops. All I can think of next is a full format and reinstall of the OS. I'm about to head out of town for the weekend, though, so I guess I'll get to more testing next week on vacation.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Malcolm XML posted:

Use an fpga to process the raw video signal by splitting your hdmi or display port cable . It'll probably have lower latency and would work on anything

That's a brilliant idea, but would it cost less than $50 and a week of programming (for a programming idiot) to accomplish?

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Factory Factory posted:

That's a brilliant idea, but would it cost less than $50 and a week of programming (for a programming idiot) to accomplish?

How is your VHDL?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I would have to Google VHDL just to tell you whether I know it or not. Computer science is not my forte.

E: I've never seen it before in my life.

I did find a blog post about a guy hacking a $20 HDMI color-timing widget for Buttcoin mining, though - maybe that same device could be hacked up to do what I need, already assembled. Can JTAG be used for communications, not just programming?

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.
My dual-BIOS EVGA 780 will be arriving later today; can't wait to give it a spin in BF4 and AC4. Since I haven't done much GPU overclocking before, are there any performance differences between using a program like EVGA Precision and just flashing the BIOS? I was hoping to give the latter a try, but I need to read up on how to go about doing it.

Edit: Never mind, reading through Agreed's post on overclocking Kepler GPUs.

Haeleus fucked around with this message at 17:53 on Dec 20, 2013

jink
May 8, 2002

Drop it like it's Hot.
Taco Defender

Haeleus posted:

My dual-BIOS EVGA 780 will be arriving later today; can't wait to give it a spin in BF4 and AC4. Since I haven't done much GPU overclocking before, are there any performance differences between using a program like EVGA Precision and just flashing the BIOS? I was hoping to give the latter a try, but I need to read up on how to go about doing it.

Edit: Never mind, reading through Agreed's post on overclocking Kepler GPUs.

Flashing a BIOS *can* increase the base clock, but it's usually done to raise the caps on power targets and voltage. Try overclocking without a flash first; if you run into limits other than thermal ones, you can try flashing to get more voltage out of the card. Just be mindful that BIOS flashing is a risky adventure.

Agreed's overclocking guide is nice. It's a great start. I've written a couple of posts on the subject as well.

Wistful of Dollars
Aug 25, 2009

My dual 290s, both flashed to 290x, are now humming along under water.

It's pretty nice. :unsmith:

I'm going to overclock them this weekend and see where we can go.

Also thanks to Deimos, who helped me simplify things to cram all that poo poo in an mATX case.

Wistful of Dollars fucked around with this message at 18:25 on Dec 20, 2013

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

El Scotch posted:

Also thanks to Deimos, who helped me simplify things to cram all that poo poo in an mATX case.

:pcgaming::hf::pcgaming:

Ardlen
Sep 30, 2005
WoT



Factory Factory posted:

I would have to Google VHDL just to tell you whether I know it or not. Computer science is not my forte.

E: I've never seen it before in my life.

I did find a blog post about a guy hacking a $20 HDMI color-timing widget for Buttcoin mining, though - maybe that same device could be hacked up to do what I need, already assembled. Can JTAG be used for communications, not just programming?

JTAG really is not designed for communications. Most FPGAs have Ethernet though. They also have high-speed serial ports and PCIe if you are feeling ambitious.

Ardlen fucked around with this message at 19:13 on Dec 20, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The FPGA apparently has GPIO headers, but they and all the serial communication are broken out to an unpopulated socket, apparently fitted with a chip and/or headers on dev versions. I'd have to trace the pinout and stick on my own doohickey, but that wouldn't be awful.

Assuming I can get anywhere with VHDL and write my own algorithms for this junk, rather than API_captureFrameBuffer(), graphics_convertColorSpace(), and math_averageBunchOfPixels().

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Oh yeah, maybe I was losing my mind, but with the 770 SLI, not only was it barely taking any more power than a single card when games were capped at 60 FPS, but it seemed like the whole system was actually quieter too.

Does it make any sense that spreading the load across two cards means the fans have less work to do? I'm really only calling on the second 770 to give me 50% more performance over what I was getting, so I assume both cards are working at around 75%. I'm just pleased as punch, because it seems like I got better real-world performance and value than a more expensive card and still wound up with a quiet, low-watt system. The Nvidia Experience thing has so far made the SLI magic invisible to me in all the titles I've tried.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Factory Factory posted:

That's a brilliant idea, but would it cost less than $50 and a week of programming (for a programming idiot) to accomplish?

No, and probably not, but VHDL is not a programming language.

It was facetious, but without doing it in hardware you're gonna get a shitload of lag, which will make color transitions annoying

(unless you get the direct framebuffer thing to work)

2560x1440 * 24 bpp * 60 fps ≈ 660 MB/s alone.
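
For anyone checking that math, it's just width x height x bytes per pixel x frames per second:

```python
# Raw capture bandwidth at 2560x1440, 24 bpp (3 bytes/pixel), 60 fps
w, h, bytes_px, fps = 2560, 1440, 3, 60
per_frame = w * h * bytes_px   # 11,059,200 bytes per frame
per_sec = per_frame * fps      # 663,552,000 bytes/s, i.e. roughly 660 MB/s
```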

Just take http://taksi.sourceforge.net/ and have it output a color instead of saving it via VfW.

Ambilight clones are a dime a dozen.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Oh, my setup works already, but I'm always on the lookout for a better mousetrap.

For what it's worth, I'd be working with 1080p or 1920x1200 rather than 1440p, so the bandwidth isn't quite that insane - 400 MB/s upper limit. And I don't need every frame; every other would still be great, so 200 MB/s. That much data still has to move, but I don't need to manipulate the whole frame. All I need is to hold the frame, then do a few MSAA-style samples in a ~600-pixel area to generate one color per LED, 40 total in my case. Mostly reads and very simple processing. The fanciest I would get would be converting to Lab for the averaging. The least fancy is just dumbly picking the center pixel in the area, which doesn't work too badly.
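
Sketched out, that sampling stage is just a handful of numpy reductions. A rough illustration, assuming `frame` is an HxWx3 array from whatever capture method works; the region layout here (a band along the bottom edge) is simplified, since the real thing wraps the whole border:

```python
# Stage 1b: average one ~600-pixel block per LED to get its color.
import numpy as np

def led_colors(frame, num_leds=40, band_px=13):
    h, w, _ = frame.shape
    step = w // num_leds  # 48 px per LED at 1920 across
    colors = []
    for i in range(num_leds):
        block = frame[h - band_px:h, i * step:(i + 1) * step]  # ~13x48 = ~620 px
        colors.append(tuple(int(c) for c in block.reshape(-1, 3).mean(axis=0)))
    return colors
```

Swapping the mean for the block's center pixel gives the dumb-but-workable version.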

If I can do 20 FPS, I'm beating a PC-side software solution by 5 FPS. I don't find the light delay distracting.

Right now it works by using PlayClaw, a Russian FRAPS clone, and piping its capture to a processing application. I get 10 to 25 FPS depending on the render pipeline of the game (Metro works great, for example - a solid 15 FPS with no noticeable monitor framerate drop).

--

If I could do this with NvFBC and CUDA, that'd open up all new possibilities. I could use a very large LED array to create a low-detail background projection, or a low-res long-distance screen.

Factory Factory fucked around with this message at 22:51 on Dec 20, 2013

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



jink posted:

Ehhh... no. I like Noctua and all, but Gentle Typhoons (especially the GT-15) reliably push more air through a radiator than any Noctua, at lower noise levels and a cheaper price. http://martinsliquidlab.org/2012/05/07/r10-fan-testing-bitfenixnoctuasilenxnoiseblockerphobya/ Unless you need PWM, the GT-15 is the fan to go with. Be quick: Scythe and Nidec have reportedly ended their collaboration, and no more Gentle Typhoon fans are to be produced.
I thought PWM was one of the things Factory was looking for, but apparently I was wrong. It is annoying, though, that the GT-15s don't have PWM control (though a lot of fans still don't), as some software-based fan controllers have DPC issues (with Gigabyte easily the worst in this regard).

I might need to pick up a couple more GT-15s though - the high RPM whine just drove me crazy in my old case. :(

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.
Dammit, my 780 was working just fine until I launched Black Flag. I got to play for maybe 20 minutes before the game crashed, and now it crashes every time I try to launch it (I'll see the barest glimpse of the "Inspired by..." screen before it goes back to desktop). Reinstalling didn't do anything either. WHY!?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Overclocked? Even a factory overclock? Drivers? It's usually something like that if it's just one game.

Haeleus
Jun 30, 2009

He made one fatal slip when he tried to match the ranger with the big iron on his hip.

Dogen posted:

Overclocked? Even factory overclock? Drivers? Usually something like that if it's just one game.

The first time, I had overclocked with Precision X; the second time, it crashed at factory settings. BF4 runs smooth as butter, but after AC4 crashes once, I can't launch either it or BF4 until I reboot.
