jink
May 8, 2002

Drop it like it's Hot.
Taco Defender

SourKraut posted:

I thought PWM was one of the things Factory was looking for, but apparently I was wrong. It is annoying that the GT-15s don't have PWM control (though a lot of fans still don't), as some software-based fan controllers have DPC issues (with Gigabyte being easily the worst in this regard).

I might need to pick up a couple more GT-15s though - the high RPM whine just drove me crazy in my old case. :(

Ah. I missed the PWM requirement. That's why I opted for Corsair SP120s instead of GT-15s; my mobo controls them based on temp. At idle they are silent, but anything above that is noisy as hell. Ah well.

Regardless, the Gentle Typhoons are great. Noctuas are great too! For PWM you can't argue with be quiet!


Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It's not a requirement, it's just what the fan I want offers. All's said and done on that front, FYI.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



I was looking through Martin's site and didn't see where he had tested the Prolimatech PRO-USV14s (his 140mm fan test is rather dated), but per this one very limited test (which is where I originally saw the PRO-USV14 vs. GT-15 comparison), it'd be interesting to see what Martin comes up with. I saw where people "sponsor" tests, but is that as simple as sending in fans for him to use for his tests? If so, I'll send him a couple to use.

Wistful of Dollars
Aug 25, 2009

Braaaaaaains, I seek brains.

One of my 290s (Number 1) is running full tilt, but Number 2 is moseying along at only 1/3 clock speed. Looking at Afterburner, Number 1 has a min/max of 300/1050 and tends to run at max, while Number 2 has a min/max of 300/540 and tends to sit at 300-something.

What's going on here? :smith:

Shaocaholica
Oct 29, 2002

Fig. 5E
Are there any consumer or workstation GPUs that can drive 2x 4k displays over DP? Don't care about 3d performance. Just desktop apps.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Shaocaholica posted:

Are there any consumer or workstation GPUs that can drive 2x 4k displays over DP? Don't care about 3d performance. Just desktop apps.

Anything with two DP 1.2 ports. HIS makes a Radeon 7770 for $100 that should do the trick.

Shaocaholica
Oct 29, 2002

Fig. 5E

Oh great! I didn't realize that 2+ DP cards were a thing yet. Dare I ask if there are 3x and 4x DP cards?

Guni
Mar 11, 2010

Shaocaholica posted:

Oh great! I didn't realize that 2+ DP cards were a thing yet. Dare I ask if there are 3x and 4x DP cards?

There are some that can run 6 monitors IIRC.

Blue On Blue
Nov 14, 2012

Hey everyone,

I see the OP hasn't been updated since 2012, so the GPU recommendations are out of date.

Is the GTX 780 pretty well the fastest single card now? (aside from a titan of course)

And is there any real benefit to getting the TI version?


I have a 560ti now, and am finally starting to notice slowdown in games; I think the 780 would be a good upgrade.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
This is not a parts-picking thread.

Shaocaholica posted:

Oh great! I didn't realize that 2+ DP cards were a thing yet. Dare I ask if there are 3x and 4x DP cards?

Here's a 6x. *Should* be all DP 1.2 ports with full 4K support.

Factory Factory fucked around with this message at 03:30 on Dec 22, 2013

Magic Underwear
May 14, 2003


Young Orc

Sappo569 posted:

Hey everyone,

I see the OP hasn't been updated since 2012, so the GPU recommendations are out of date.

Is the GTX 780 pretty well the fastest single card now? (aside from a titan of course)

And is there any real benefit to getting the TI version?


I have a 560ti now, and am finally starting to notice slowdown in games; I think the 780 would be a good upgrade.

What resolution? 780 is a bit overkill for 1080p. The 780ti is a terrible value, but it is the fastest out there if you must have that.

If you want value, the R9 290 is the best of the high end, at $400. Down from there is the R9 280X (rebadged 7970 GHz Edition), but there has been a lot of price gouging lately due to cryptocurrency miners. Any of the cards I just mentioned will be a big jump from a 560ti.

Blue On Blue
Nov 14, 2012

Magic Underwear posted:

What resolution? 780 is a bit overkill for 1080p. The 780ti is a terrible value, but it is the fastest out there if you must have that.

If you want value, the R9 290 is the best of the high end, at $400. Down from there is the R9 280X (rebadged 7970 GHz Edition), but there has been a lot of price gouging lately due to cryptocurrency miners. Any of the cards I just mentioned will be a big jump from a 560ti.

1920x1200

I also hate having to upgrade every year; that's why I bought the 560ti when it came out.

Blue On Blue fucked around with this message at 03:49 on Dec 22, 2013

Shaocaholica
Oct 29, 2002

Fig. 5E
So it seems like I can do 2x 4K no problem, and I plan to next year for work on some Windows boxes. But... I will also need to do it in Linux. Not sure if I should post this in the Linux thread, but can I do 4K over DP in Linux yet with current (Nvidia) drivers? I say Nvidia because our tools at work are not validated against AMD, and Nvidia cards seem to grow on trees here.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Shaocaholica posted:

So it seems like I can do 2x 4K no problem, and I plan to next year for work on some Windows boxes. But... I will also need to do it in Linux. Not sure if I should post this in the Linux thread, but can I do 4K over DP in Linux yet with current (Nvidia) drivers? I say Nvidia because our tools at work are not validated against AMD, and Nvidia cards seem to grow on trees here.

There shouldn't be a problem as long as you find a sufficient Kepler card. Off the top of my head, the only dual-DP 1.2 card is the MSI GeForce 760 Gaming ITX (the short one).

After some research, it looks like the Quadro K2000 is the cheapest beefy Quadro card with dual DP 1.2, and the Quadro NVS 510 has 4 DP 1.2 ports with next-to-no oomph behind them solely for business display use.

Quadro NVS 510. Book it, buy it, done. Four 4K monitors or four unique displays cloned to eight 2560x or sixteen 1920x screens.

Purgatory Glory
Feb 20, 2005
about 15 minutes in but so far a very interesting talk:
Oxide Games AMD Mantle Presentation and Demo http://www.youtube.com/watch?v=QIWyf8Hyjbg

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Purgatory Glory posted:

about 15 minutes in but so far a very interesting talk:
Oxide Games AMD Mantle Presentation and Demo http://www.youtube.com/watch?v=QIWyf8Hyjbg

This was posted a few days ago. Still "We can't talk about that" for anything meaningful (non-naive implementations question), still banging the draw calls drum.

Cool advert for a pretty neat-sounding engine, though. Given the growth in calculation power, it makes sense to move from raw primitive throughput to explicit shader TFLOPS instead of conventional rasterization. Kinda neat there.

Nobody thinks "man, D3D's draw call overhead is awesome!" Also, nobody thinks "a proprietary solution to the draw call issue rules." And it is de facto proprietary from all the info we have so far. Call it otherwise, but they dramatically overstate how similar Kepler vs. GCN 1 and GCN 1.1 actually are in situ. It's just a way for AMD to wink and nod and say HEY, it's not REALLY proprietary; all you have to do is ~basically alter your architecture to be more like ours and you too can use Mantle!

Agreed fucked around with this message at 14:22 on Dec 22, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Unexpected The Mod hiccup: the VRM fan on the GPU can't get enough air.

Option 1: Replace case window with mesh, allow airflow. Might have to cut mesh somewhat.

Option 2: Replace fan with low-profile fan. Would have to disassemble The Mod and remount.

E: It looks like a company called ModRight makes cheap mesh panels. I can replace the window with that stuff... DE: If I could find it. May just have to buy some Staples home office mesh thing and cut it up.

Factory Factory fucked around with this message at 05:50 on Dec 22, 2013

Purgatory Glory
Feb 20, 2005

Agreed posted:

This was posted a few days ago. Still "We can't talk about that" for anything meaningful (non-naive implementations question), still banging the draw calls drum.

Cool advert for a pretty neat-sounding engine, though. Given the growth in calculation power, it makes sense to move from raw primitive throughput to explicit shader TFLOPS instead of conventional rasterization. Kinda neat there.

Nobody thinks "man, D3D's draw call overhead is awesome!" Also, nobody thinks "a proprietary solution to the draw call issue rules." And it is de facto proprietary from all the info we have so far. Call it otherwise, but they dramatically overstate how similar Kepler vs. GCN 1 and GCN 1.1 actually are in situ. It's just a way for AMD to wink and nod and say HEY, it's not REALLY proprietary; all you have to do is ~basically alter your architecture to be more like ours and you too can use Mantle!

I'm a complete novice, but by the end of this video I was scratching my head wondering what their demo was showing and how it compares to what we can currently do. Maybe show the demo choking some other system?

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party
For those of you interested in Mantle, this thread on Beyond3D may be interesting (primarily about draw call performance and CPU work dispatch versus GPU work dispatch). Andrew Lauritzen is a graphics researcher at Intel who knows more about shadowing systems than maybe anyone else and an all around Very Smart Dude. Several other industry developers in there too.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Professor Science posted:

For those of you interested in Mantle, this thread on Beyond3D may be interesting (primarily about draw call performance and CPU work dispatch versus GPU work dispatch). Andrew Lauritzen is a graphics researcher at Intel who knows more about shadowing systems than maybe anyone else and an all around Very Smart Dude. Several other industry developers in there too.

Man, this thread rules. I thought this post was a little funny.

quote:

OpenGL 4.4 adds a new feature (ARB_indirect_parameters) to fetch the "maximum" draw call count from a GPU buffer, but it doesn't guarantee that the driver will do any fewer draw calls than the parameter set by the CPU. So it's kind of an optimization hint. Also, currently only Nvidia supports OpenGL 4.4 (beta driver). Neither AMD nor Intel has announced any plans to support OpenGL 4.4. ARB_indirect_parameters is a very good feature indeed (assuming it actually cuts draw calls), but unfortunately a production-quality GPU-driven rendering pipeline cannot be built on top of the promise that sometime in the future maybe we will get broad support for a critical enabler feature.

:ironicat:
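For anyone curious what that extension actually buys you, here's a minimal sketch (editor's illustration, not production code) of the ARB_indirect_parameters idea: the draw commands and the draw count both live in GPU buffers, and the CPU only supplies an upper bound. It assumes an OpenGL 4.x context where the extension is exposed (here via GLEW); the compute pass that fills the buffers and all other setup are omitted.

[code]
// Minimal sketch: GPU-sourced draw count via ARB_indirect_parameters.
// Assumes a GL 4.x context with the extension available; VAO, shaders,
// and the contents of the indirect/parameter buffers are set up elsewhere.
#include <GL/glew.h>

struct DrawArraysIndirectCommand {
    GLuint count;         // vertices per draw
    GLuint instanceCount; // instances per draw
    GLuint first;         // first vertex
    GLuint baseInstance;  // first instance
};

void drawWithGpuCount(GLuint indirectBuf, GLuint parameterBuf, GLsizei maxDraws) {
    // Indirect buffer: an array of DrawArraysIndirectCommand, filled on the GPU.
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuf);
    // Parameter buffer: holds the actual draw count, also written on the GPU.
    glBindBuffer(GL_PARAMETER_BUFFER_ARB, parameterBuf);

    // The CPU only provides maxDraws as an upper bound; the real count is read
    // from the parameter buffer - the "optimization hint" semantics the quote mentions.
    glMultiDrawArraysIndirectCountARB(GL_TRIANGLES,
                                      0,         // offset into the indirect buffer
                                      0,         // offset into the parameter buffer
                                      maxDraws,
                                      sizeof(DrawArraysIndirectCommand));
}
[/code]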

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
'Bout time. EVGA just released an MST hub for $100. The only hiccup is that apparently you can't enable Nvidia Surround on the MST'd displays. Since it's MST and thus splits bandwidth from a single DP 1.2(a) port, the maximum resolutions out of the hub are officially 1920x1080 per monitor, but it should definitely support 1920x1200 and maybe 2x 1080p + 1x 1440p.
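A quick back-of-the-envelope check of that claim (editor's sketch, not from the post): DP 1.2 HBR2 carries roughly 17.28 Gbit/s of video payload after 8b/10b encoding, and at 24 bpp with the usual CEA/CVT-RB pixel clocks the three-monitor combinations above stay comfortably under that. Real MST packing adds some overhead, so treat the numbers as rough.

[code]
// Back-of-the-envelope MST bandwidth check (illustrative only).
// Assumes 24 bpp and common pixel clocks; actual MST link budgeting has extra overhead.
#include <cstdio>

int main() {
    const double dp12_payload_gbps = 5.4 * 4 * 0.8; // HBR2: 4 lanes x 5.4 Gbit/s, 8b/10b -> 17.28
    const double bpp = 24.0;

    // Approximate pixel clocks in MHz: 1080p60 (CEA), 1920x1200@60 (CVT-RB), 1440p60 (CVT-RB)
    const double px_1080p = 148.5, px_1200p = 154.0, px_1440p = 241.5;
    auto gbps = [&](double mhz) { return mhz * 1e6 * bpp / 1e9; };

    std::printf("DP 1.2 payload:           %.2f Gbit/s\n", dp12_payload_gbps);
    std::printf("3x 1920x1200@60:          %.2f Gbit/s\n", 3 * gbps(px_1200p));                 // ~11.1
    std::printf("2x 1080p60 + 1x 1440p60:  %.2f Gbit/s\n", 2 * gbps(px_1080p) + gbps(px_1440p)); // ~12.9
    return 0;
}
[/code]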

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
I've noticed that there are DP to VGA converters, would this be perfect for my needs since I can't use a DVI-VGA converter on my new video card? I'm not entirely sure how DP works, I've never used it.

By the way, is it still early days as far as driver support goes with the R9 290? Is it possible we could be seeing further performance gains in most games? I was playing Assassin's Creed 4 and either the driver performance is bad or the game is just awfully optimised. I can't seem to get more than 50fps stably in cities with drops to 40fps quite common. Kind of a bummer.

I checked GPU-Z while it was running and it was using 97% of my GPU but only roughly 60% of my CPU. I'm not sure that matters much but it didn't seem processor limited.

Oh also, since 13.12 is out I installed it and was able to turn the fan up so it's not hitting 94C all the time. I mean it's rated for 95 but I worry it's causing increases in ambient temps.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

cat doter posted:

Assassin's Creed 4 ... is just awfully optimised.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

cat doter posted:

I've noticed that there are DP to VGA converters, would this be perfect for my needs since I can't use a DVI-VGA converter on my new video card? I'm not entirely sure how DP works, I've never used it.

I currently use a DP to VGA adapter myself for my secondary monitor (also used a Mini-DP to VGA before as well), but in my experience results have been mixed and at one point they have involved 1. Intel graphics drivers and 2. my terribly-optimized computer that at one point had those drivers installed, and even had their low-level settings screwed with using LucidLogix Virtu MVP. Don't touch that program; it is made of snake oil, and the issues I currently have may or may not be related to it.

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Agreed posted:

Man, this thread rules. I thought this post was a little funny.
That is pretty much a textbook example of why standards don't mean nearly as much as people claim, especially when it comes to GPUs.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I set up SLI 770's for the first time Friday and they worked flawlessly. Now I came back to my house today to play and I'm getting this pronounced stutter almost exactly once per second. I googled it and this looks like exactly the same problem as here:

http://www.tomshardware.com/forum/386158-33-setup-rhythmic-stutter-vsync-enabled

Turning off V-Sync fixes the issue but then I get screen tearing even when I use Precision X to lock the FPS at 60 (and that's weird because without a framerate cap my rate is around 120 solid, so there's no excuse for the tearing).

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Zero VGS posted:

I set up SLI 770's for the first time Friday and they worked flawlessly. Now I came back to my house today to play and I'm getting this pronounced stutter almost exactly once per second. I googled it and this looks like exactly the same problem as here:

http://www.tomshardware.com/forum/386158-33-setup-rhythmic-stutter-vsync-enabled

Turning off V-Sync fixes the issue but then I get screen tearing even when I use Precision X to lock the FPS at 60 (and that's weird because without a framerate cap my rate is around 120 solid, so there's no excuse for the tearing).

Try using the driver to enable double buffering as well. That'll toggle between drawn frames rather than overwriting while drawing.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Factory Factory posted:

Try using the driver to enable double buffering as well. That'll toggle between drawn frames rather than overwriting while drawing.

Where's that? I only see Triple Buffering in Nvidia Control Panel. There is also "Smooth" V-Sync, which says it reduces stuttering in SLI; is that a good idea with my setup?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Might need a fancier program, like Nvidia Inspector, to force double buffering on; otherwise, check the game settings. For now: you have a 120 Hz monitor? Try Adaptive VSync (half refresh). Triple buffering is a VSync-on setting. You could also try the Smooth setting and see how it works, but as a non-SLI user I've never used it.

Straker
Nov 10, 2005
Finally home to play with my pair of HIS 290s... not unlockable, a little annoying since they're like, the last of the early cards and didn't come with bf4 keys or anything but whatever, can't complain since I basically got both cards for $400 :)

edit: I hope the next desktop CPUs released are actually worth upgrading to from SB... maybe then I'll finally give in and go with server + desktop instead of just one beefy desktop. 8 hard drives + SSD etc + crossfired high end video cards is cramped and ridiculous as gently caress in any ordinary ATX case... but until now, every time I upgraded, my old poo poo was so crappy as to not even be worth foisting on anybody. and this time, my old C2D and 120GB Intel G2 etc. ended up in a hand me down I built for the girlfriend so she could play on facebook while her kid plays PS2/BF4 on the new $2000 PC I built for her a few months ago :v:

Straker fucked around with this message at 06:04 on Dec 23, 2013

AzureSkys
Apr 27, 2003

Zero VGS posted:

I set up SLI 770's for the first time Friday and they worked flawlessly. Now I came back to my house today to play and I'm getting this pronounced stutter almost exactly once per second. I googled it and this looks like exactly the same problem as here:

http://www.tomshardware.com/forum/386158-33-setup-rhythmic-stutter-vsync-enabled

Turning off V-Sync fixes the issue but then I get screen tearing even when I use Precision X to lock the FPS at 60 (and that's weird because without a framerate cap my rate is around 120 solid, so there's no excuse for the tearing).

That's similar to the issue I posted. What I don't get is that it worked fine for over two weeks. When I started testing OC settings, which often led to driver crashes until I figured out the right range, I started to have the stutter. It's about every 5 seconds for me:


When I get home I'll try triple buffering and smooth settings. Adaptive vsync didn't change anything for me.

td4guy
Jun 13, 2005

I always hated that guy.

Sidesaddle Cavalry posted:

I currently use a DP to VGA adapter myself for my secondary monitor (also used a Mini-DP to VGA before as well), but in my experience results have been mixed and at one point they have involved 1. Intel graphics drivers
Same. I bought this DP to VGA converter and it would randomly stop working at any resolution higher than 1680x1050. Using an Intel video chipset.

Ham Sandwiches
Jul 7, 2000

AzureSkys posted:

That's similar to the issue I posted. What I don't get is that it worked fine for over two weeks. When I started testing OC settings, which often led to driver crashes until I figured out the right range, I started to have the stutter. It's about every 5 seconds for me:


I had a similar experience with a single 680.

My GeForce 570 died, and rather than wait for the RMA return I decided to also buy a 680 in the meantime - and sell the 570 warranty replacement.

As soon as I put the 680 in, I started getting a rhythmic stutter that happened every few seconds. I ran driver sweeper and did everything short of a full reinstall, and it never went away - even though I was going Nvidia to Nvidia. I returned the 680 to Amazon, switched back to the RMA 570, and no stutter ever again. When I researched the issue I found lots of threads complaining about stutter on the 680, and not just within games - in various simple tasks like video playback. There was no resolution in any of the threads I found, though some people reported power management tweaks and vsync helped.

The prevailing theory I read was that the issue may be caused by the power management introduced in the 680. Most people who complain in threads seem to have the 680, but the 670 can be affected too. I didn't find many complaints about the 7-series cards.

If you Google "geforce 680 stutter" you'll find most of the threads I read; you're welcome to draw your own conclusions.

Ham Sandwiches fucked around with this message at 10:38 on Dec 23, 2013

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Huh, and a 680 and 770 are pretty much the same thing (680 can be BIOS flashed into a pseudo-770) so that would make sense.

It's weird because COD: Ghosts doesn't have the issue at all with 60hz V-Sync, but FFXIV: Realm Reborn does.

Gwaihir
Dec 8, 2009
Hair Elf
I have a GTX680 and have never seen issues like that, with power management set to the high performance settings. :iiam:

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Rakthar posted:

The prevailing theory I read was that the issue may be caused by the power management introduced in the 680. Most people who complain in threads seem to have the 680, but the 670 can be affected too. I didn't find many complaints about the 7-series cards.

If you Google "geforce 680 stutter" you'll find most of the threads I read; you're welcome to draw your own conclusions.

Makes me wonder if it's a similar issue to what some AMD/ATI users had with HD 6xxx cards - the ULPS feature would lower the clocks in 2D mode but would sometimes cause weird screen flickering, regardless of what was being done. I had the issue with an old 6850 card, and even after that with a 6970, and the "fix" was disabling ULPS with MSI Afterburner and via a registry key. Just a really odd issue; I initially thought my cards were bad because I'd never seen issues like that unless the card was either dying or had an unstable overclock.

I picked up a 670 FTW recently from a friend who upgraded to a 780 (somehow got a deal on Amazon for $300; wish I'd seen that and snagged one) and haven't had any issues thus far. It overclocks pretty well on stock, and I was surprised how cool it stays: it usually idles around 27°C, and even during gaming with the fan fixed at 55%, I've never seen it go above the low 60s playing stuff like BioShock: Infinite @ 1920x1080 with pretty much everything maxed out.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Ozz81 posted:

Makes me wonder if it's a similar issue to what some AMD/ATI users had with HD 6xxx cards - the ULPS feature would lower the clocks in 2D mode but would sometimes cause weird screen flickering, regardless of what was being done. I had the issue with an old 6850 card, and even after that with a 6970, and the "fix" was disabling ULPS with MSI Afterburner and via a registry key. Just a really odd issue; I initially thought my cards were bad because I'd never seen issues like that unless the card was either dying or had an unstable overclock.

On Nvidia, if it's an idiot power state problem, there is a tool called multiple monitor power saver included in Nvidia Inspector that you can use to set thresholds to change the power state manually.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I'm partway through the threads/presentation that Agreed linked, and after talking with him about it so far, there's one big-rear end conclusion that's already come up:

The reason why CPU matters for minimum frame times in video games is not game sim, but inefficient render workflow memory management.

That's why Mantle works, and works well, and it's why Battlefield 4 can take the same engine as Battlefield 3, update it to be even shinier, and still make it gofast on basically everything including some old Phenom IIs and the new-gen consoles.

Every time the GPU makes a draw call to render part of the scene, it needs all the data necessary to render that part. Complex scenes can have dozens, hundreds, even thousands of draw calls - the current push in programming and hardware is to achieve a steady 1 million draws at 60 FPS. There are two major sources of performance bottleneck in draw calls:

1) Memory management, specifically fetching multiple copies of the same resource. The CPU -> PCIe -> GPU pathway is very high-latency and slow, and multiple transfers create a huge bottleneck. Nvidia is working on this (Agreed's link).

2) API overhead, especially in DirectX. When the GPU does make a resource request that has to take the slow way through the CPU, Windows doesn't guarantee anything in terms of how fast that process is tended to. AMD is working on this, reducing the average service time for draw calls - that's Mantle. Nvidia is doing some work here in OpenGL, but they haven't tried to push it like AMD has pushed Mantle.

Both of these factors can give an order-of-magnitude speedup to the time it takes to complete a draw call. If one or both are properly addressed, the result would be that the CPU is no longer a bottleneck to GPU performance.
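To put toy numbers on that last point (editor's sketch; the per-draw costs below are made-up illustrative figures, not anything from the presentation or the threads): if a fat-API draw costs the CPU tens of microseconds of driver/runtime work and a thin-API or batched draw costs a few microseconds, the CPU side of a draw-heavy frame shrinks by roughly an order of magnitude - which is exactly the headroom that shows up as better minimum frame times.

[code]
// Toy model of CPU-side draw submission cost (illustrative numbers only).
// fat_api_us and thin_api_us are hypothetical per-draw overheads, not measurements.
#include <cstdio>

int main() {
    const double fat_api_us     = 40.0;    // hypothetical per-draw CPU cost, fat-API path
    const double thin_api_us    = 4.0;     // hypothetical per-draw CPU cost, thin/batched path
    const double frame_budget_us = 16667.0; // 60 FPS frame budget

    const int counts[] = {1000, 5000, 10000};
    for (int draws : counts) {
        double fat  = draws * fat_api_us;
        double thin = draws * thin_api_us;
        std::printf("%5d draws: fat API %.1f ms (%.0f%% of 60 FPS budget), thin API %.1f ms (%.0f%%)\n",
                    draws, fat / 1000.0, 100.0 * fat / frame_budget_us,
                    thin / 1000.0, 100.0 * thin / frame_budget_us);
    }
    return 0;
}
[/code]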

Factory Factory fucked around with this message at 19:37 on Dec 23, 2013

Phuzun
Jul 4, 2007

Got my MSI GTX 780 installed last Friday. Stock is 902 MHz (1050 boost)/3004 MHz, and with the factory air cooling I got it up to 1176 MHz (+126)/3499 MHz with a 25 mV voltage bump. After a good heat soak from Folding@home, it was getting up to 76°C with the stock fan config. The EK full-coverage block just showed up in the mail and will be going on this week.

Also picked up some new fans, as the ones I had been using were developing a really annoying bearing noise. These fans are pretty nice: low airflow for my quiet power radiators, and they don't seem to mind being mounted horizontally or vertically.


MUFFlNS
Mar 7, 2004

Bit of an odd question here, but I was wondering which of these two mobile GPUs would offer superior performance?

GeForce GT 750M 1GB GDDR5 @ 1920x1080
or
GeForce GTX 775M 2GB GDDR5 @ 2560x1440

It's my understanding that the 775M having access to twice the amount of memory would compensate for the increase in resolution, and thus offer the superior performance of the two, but I'm not 100% sure, so I thought I'd ask here, since I'm under the impression that the main use of more memory on GPUs is supporting higher resolutions and multi-monitor gaming.
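As a rough sanity check on the memory side (editor's back-of-the-envelope, not a full answer to the question): the framebuffer itself is a tiny slice of VRAM at either resolution, so extra memory mostly matters for textures and render targets at higher settings rather than for the bare resolution bump. The sketch below just does the multiplication.

[code]
// Rough framebuffer memory math (illustrative; ignores driver overhead, AA buffers, etc.).
#include <cstdio>

int main() {
    struct Mode { const char* name; int w, h; };
    const Mode modes[] = { {"1920x1080", 1920, 1080}, {"2560x1440", 2560, 1440} };
    const double bytes_per_px = 4.0; // 32-bit color
    const int buffers = 2;           // double-buffered front/back

    for (const Mode& m : modes) {
        double mb = m.w * m.h * bytes_per_px * buffers / (1024.0 * 1024.0);
        std::printf("%s: ~%.1f MB for the swap chain alone\n", m.name, mb);
    }
    return 0;
}
[/code]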
