Gwaihir
Dec 8, 2009
Hair Elf

Jago posted:

Heat IS atomic vibrations.


Correct! At the old place where I used to work we had an absolute-zero test lab that had to incorporate stuff like a Faraday cage and a huge vibration isolation/dampening system for the actual experimental platform. Otherwise you end up with random things screwing up your measurements.

Josh Lyman
May 24, 2009


Back in 2003, Intel said quantum tunneling is why Moore's Law would hit a wall between 2013 and 2018.

Blorange
Jan 31, 2007

A wizard did it

I suppose the theoretical limit on transistor switching speed is set by its size and the speed of light. At the 22 nm process, I suppose that puts the maximum switching frequency somewhere in the petahertz range.
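As a rough sanity check on that light-speed argument (back-of-envelope only; the real limits from RC delay and carrier drift velocity are orders of magnitude lower, which is why we're stuck in gigahertz):

```python
# Back-of-envelope: a signal can't cross a transistor faster than light,
# so the gate length puts a hard ceiling on switching frequency.
C = 299_792_458       # speed of light, m/s
GATE = 22e-9          # 22 nm feature size, m

transit_time = GATE / C       # ~7.3e-17 s
f_max = 1 / transit_time

print(f"light transit time: {transit_time:.1e} s")
print(f"ceiling on switching frequency: {f_max / 1e15:.1f} PHz")   # ~13.6 PHz
```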

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
From last page, if anyone has any idea:

---
My previous GPU setup was two 770s in SLI, running two instances of Realm Reborn, one on each 1080p monitor, for a friend and me. Is there an easy way in Windows to set things up so one instance of a game runs on one 750 Ti and the other instance runs on the second 750 Ti on the other monitor? I know they can't do SLI, but that would be the next best thing, since it's the only time my system goes full blast; when I'm not running two copies of a game I could dedicate the second card to PhysX so it's not totally useless.
---

Rastor
Jun 2, 2001

Does the game support windowed mode?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Yeah, windowed and borderless windowed... You're saying I can plug one monitor into each card and each game will automatically render on that card when I drag to that monitor?

Trickyrive
Mar 7, 2001

I'm a semiconductor engineer, so I figured I'd jump in. A couple of things. The wall you hit from quantum tunneling once transistors get small enough means you can't effectively trap the charge in there anymore, which is where leakage current comes in (and it's the reason a transistor being off doesn't mean it's at 0 volts). That also means there's definitely heat in the system, because it's not just one transistor, it's a few billion of them put together, and that's the other problem. Heat messes a ton of things up; suffice to say, when things heat up, your typical electron slows down. Why is this bad? Because now we're losing electrons to recombination (at any defects or whatnot in the semiconductor), and it lowers drift velocity, as one guy said. Also, electrons don't travel at the speed of light, so you have to think about how fast they're moving inside the semiconductor, the metal, and the junction between them. That movement also generates heat.

Also, shrinking just means you can cram more transistors on a chip. Other things scale down with it, and we try to scale down voltage and power too (which, once again, produce heat), so blame heat!

tl;dr: Too much current means too much heat, which means degradation of the physical device and in turn device performance, so find a happy medium.

Trickyrive fucked around with this message at 04:43 on Feb 22, 2014
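For anyone who wants numbers behind the leakage/heat story, here's a sketch using the standard subthreshold leakage model; the threshold voltages and ideality factor below are illustrative, not from any real process:

```python
import math

# Subthreshold leakage grows exponentially as threshold voltage shrinks:
#   I_off ∝ exp(-Vth / (n * kT/q))
# Constants below are illustrative, not from any real process.
k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C
n = 1.5               # subthreshold ideality factor (typically ~1.3-1.7)

def relative_leakage(vth_volts, temp_kelvin):
    vt = k * temp_kelvin / q          # thermal voltage, ~25.9 mV at 300 K
    return math.exp(-vth_volts / (n * vt))

# Scaling Vth from 400 mV down to 300 mV (what shrinking nodes push you toward):
cool_high_vth = relative_leakage(0.400, 300)
cool_low_vth = relative_leakage(0.300, 300)
print(f"leakage increase from 100 mV lower Vth: {cool_low_vth / cool_high_vth:.0f}x")

# And the feedback loop: the same transistor leaks more as the die heats up.
hot_low_vth = relative_leakage(0.300, 360)    # ~87 C die temperature
print(f"further increase from 300 K -> 360 K: {hot_low_vth / cool_low_vth:.0f}x")
```

The two effects compound: a lower threshold leaks more, and the extra heat raises the thermal voltage, which makes it leak more still.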

forbidden dialectics
Jul 26, 2005

So here's my best result for a 4770k @ 4.7 and a Classified 780ti @ 1.5/1950.

http://www.3dmark.com/fs/1719246

This is a 24/7 stable config (which also serves as a space heater), and with an 80+ Gold PSU it uses just under 900 watts under full load.

Don't do it. You have enough FPS. Don't....go....down....this...path. Free....Mars....
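For the curious: if that 900 W figure is measured at the wall, the 80 Plus Gold spec (roughly 87% efficiency at full load) puts the actual DC load a fair bit lower. Quick sketch, with the efficiency figure assumed from the spec:

```python
# 80 Plus Gold means roughly 87% efficiency at full load (115 V spec).
# If "just under 900 watts" is measured at the wall, the DC side is lower.
wall_watts = 900
efficiency = 0.87      # assumed from the 80 Plus Gold full-load figure

dc_load = wall_watts * efficiency
print(f"~{dc_load:.0f} W to the components, ~{wall_watts - dc_load:.0f} W lost as heat in the PSU")
```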

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

Nostrum posted:

So here's my best result for a 4770k @ 4.7 and a Classified 780ti @ 1.5/1950.

http://www.3dmark.com/fs/1719246

This is a 24/7 stable config (which also serves as a space heater), and with an 80+ Gold PSU it uses just under 900 watts under full load.

Don't do it. You have enough FPS. Don't....go....down....this...path. Free....Mars....

Holy crap is that on water? And I'm going to go with the Classified because I hate money

SlayVus
Jul 10, 2009
Grimey Drawer
So EVGA Precision X starts the EVGA Voltage Control exe when it launches. VoltCont uses 17.8% CPU. Every time I change the settings it spawns another VoltCont exe, which means another process eating another 17.8% CPU for no apparent reason. Anyone have any clues? There seem to be similar cases on the EVGA forums with no response from EVGA.
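Until EVGA fixes it, a stopgap is to kill the duplicate helper processes by hand or with a script. A sketch using psutil; the process name here is a guess, so check Task Manager for the real one on your system:

```python
import psutil  # third-party: pip install psutil

# Stopgap: keep the first instance of the voltage-control helper, kill the rest.
# "evgavoltagecontrol.exe" is a guess at the process name -- check Task Manager
# for the actual name before running this.
TARGET = "evgavoltagecontrol.exe"

instances = [p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() == TARGET]

for proc in instances[1:]:          # leave the oldest instance running
    print(f"terminating duplicate PID {proc.pid}")
    proc.terminate()
```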

Straker
Nov 10, 2005

Nostrum posted:

So here's my best result for a 4770k @ 4.7 and a Classified 780ti @ 1.5/1950.

http://www.3dmark.com/fs/1719246

This is a 24/7 stable config (which also serves as a space heater), and with an 80+ Gold PSU it uses just under 900 watts under full load.

Don't do it. You have enough FPS. Don't....go....down....this...path. Free....Mars....
I still don't understand why anyone would bother overvolting like that and wasting so much power blowing their poo poo up for nothing. It's like, do you really need to play Planetside 2 at 4K or something? Why not just use two cards? :(

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Straker posted:

I still don't understand why anyone would bother overvolting like that and wasting so much power blowing their poo poo up for nothing. It's like, do you really need to play Planetside 2 at 4K or something? Why not just use two cards? :(

This is exactly how I feel. Followed by an undeniable twinge of "heeeey I wish mine would do that, that's 250MHz more than my card and it only costs a little more :saddowns:." You guys talking about OCing being fun are right, it is a bit of a thrill to push things to see what you can get out of them and a really high result is exciting :shobon: It also damages the useful life of the product substantially, as my 2600K can attest after two years of 4.7GHz finally becoming unstable in measurable ways and having to get bumped down to 4.5GHz to attain stability again, for now. I do have a new system build ready to go if/when the processor takes a dump, at least.

Speaking of redundancy and worry for hardware failure... Tell you what, though, not to poo poo on you at all Nostrum - but I am a little bit :raise: at the label "24/7 stable" for your clocks. It just seems like a pretty bold claim to make with those clocks and voltage; maybe you just got a golden chip (they exist, and you did get the right card to go hunting for one at this point in the exercise of trying to nab 'em), but with all we know about GDDR5's instability and all that, I'd be curious how it holds up to the aforementioned OpenCL memtest? I know there's not a very good way to validate the core, just kinda eyeball it and see if you can spot any artifacts while gaming or if it has any crash and recovery cycles.

Are you on water or air? I can't keep straight who's using what at this point for cooling, since we've had several folks do "The Mod" and others just transition to closed loop or full liquid cooling in the past several months.

Edit: My FS Extreme score with a 2600K@4.5GHz and a 780Ti@1.25GHz core &etc. / 1825MHz GDDR5 is 5680, so there's definitely a real performance delta - partially attributable to the higher quality and higher clocked CPU, partially to the higher quality and higher clocked graphics card, but either way, big rear end performance difference.

Agreed fucked around with this message at 16:44 on Feb 22, 2014

INTJ Mastermind
Dec 30, 2004

It's a radial!
Can someone explain to me the difference between Burn-In and Benchmark on FurMark?

I have an ASUS GTX 770 DirectCU II that I've OCed to 1260 core and 7700 memory. On Burn-In, it runs at max settings, approx 77% TDP with a temp of 69°C. On Benchmark, it's at 100% TDP, throttling the core down to 1200, and it gets up to 80°C.

Also, if the card is stable at a core of 1260, the max GPU Boost allows, is there any advantage to increasing the GPU voltage from stock? IIRC, it doesn't help memory overclocking.

INTJ Mastermind fucked around with this message at 19:52 on Feb 23, 2014
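One way to actually watch GPU Boost hit the power target during Burn-In vs. Benchmark is to poll nvidia-smi while the test runs. A sketch; it assumes your card and driver expose these query fields (some GeForce cards report N/A for power draw):

```python
import subprocess, time

# Poll core clock, power draw, and temperature once a second while FurMark
# runs, to watch GPU Boost pull clocks down as the power target is reached.
QUERY = "clocks.gr,power.draw,temperature.gpu"

for _ in range(60):   # watch for a minute
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)        # e.g. "1200 MHz, 230.10 W, 80"
    time.sleep(1)
```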

GWBBQ
Jan 2, 2005


I just had blocks like this start jumping around on the screen, plus grey triangles extending from the center of the screen out to the edges flickering while playing Fallout New Vegas. Does it look like my graphics card is going bad? It's a ~3 year old Radeon HD 5770 and has been overclocked by about 10% using the AMD Overdrive utility. If it is going, can underclocking it and turning down graphics settings prolong its life for at least another few weeks?

forbidden dialectics
Jul 26, 2005

Straker posted:

I still don't understand why anyone would bother overvolting like that and wasting so much power blowing their poo poo up for nothing. It's like, do you really need to play Planetside 2 at 4K or something? Why not just use two cards? :(

I don't really. For most things I turn it down to around 1398 MHz, which is stable at about 1.3725 V and under load in a game draws only about 500-600 watts. The performance difference is huge; compared to stock it IS almost as fast as an SLI setup.


Agreed posted:

This is exactly how I feel. Followed by an undeniable twinge of "heeeey I wish mine would do that, that's 250MHz more than my card and it only costs a little more :saddowns:." You guys talking about OCing being fun are right, it is a bit of a thrill to push things to see what you can get out of them and a really high result is exciting :shobon: It also damages the useful life of the product substantially, as my 2600K can attest after two years of 4.7GHz finally becoming unstable in measurable ways and having to get bumped down to 4.5GHz to attain stability again, for now. I do have a new system build ready to go if/when the processor takes a dump, at least.

Speaking of redundancy and worry for hardware failure... Tell you what, though, not to poo poo on you at all Nostrum - but I am a little bit :raise: at the label "24/7 stable" for your clocks. It just seems like a pretty bold claim to make with those clocks and voltage; maybe you just got a golden chip (they exist, and you did get the right card to go hunting for one at this point in the exercise of trying to nab 'em), but with all we know about GDDR5's instability and all that, I'd be curious how it holds up to the aforementioned OpenCL memtest? I know there's not a very good way to validate the core, just kinda eyeball it and see if you can spot any artifacts while gaming or if it has any crash and recovery cycles.

Are you on water or air? I can't keep straight who's using what at this point for cooling, since we've had several folks do "The Mod" and others just transition to closed loop or full liquid cooling in the past several months.

Edit: My FS Extreme score with a 2600K@4.5GHz and a 780Ti@1.25GHz core &etc. / 1825MHz GDDR5 is 5680, so there's definitely a real performance delta - partially attributable to the higher quality and higher clocked CPU, partially to the higher quality and higher clocked graphics card, but either way, big rear end performance difference.

I've tested it with both OpenCL Memtest and G80Memtest. The memory is definitely the sticking point; no matter what, even 1 MHz higher and it artifacts like crazy or crashes. It is, however, completely stable at 1950 MHz. It's a full water setup with EK blocks, a Phobya 2x 200mm radiator, and an XSPC 420. The 4770K is definitely an "above average" chip: it'll run IBT all day long at 4.7 (with temps in the high 70s). It's "stable" at 4.8 but crashes after an hour or so. At 4.9 the computer is usable, but IBT crashes after a couple of loops. 5.0 will boot to desktop but bluescreens almost instantly.
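For reference, the core idea of the memtests mentioned above is just write-pattern/read-back/compare; the real tools add moving-inversion patterns and kernels that hammer the memory bus. A minimal sketch of the idea in PyOpenCL:

```python
import numpy as np
import pyopencl as cl

# Minimal VRAM check in the spirit of the OpenCL memtests discussed above:
# write a random pattern to a GPU buffer, read it back, count mismatches.
# Real memtests add moving-inversion patterns, stress kernels, and long loops.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

rng = np.random.default_rng()
pattern = rng.integers(0, 2**32, size=(256 * 1024 * 1024) // 4, dtype=np.uint32)  # 256 MiB

buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, pattern.nbytes)
cl.enqueue_copy(queue, buf, pattern)

readback = np.empty_like(pattern)
cl.enqueue_copy(queue, readback, buf)
queue.finish()

errors = int(np.count_nonzero(pattern != readback))
print(f"{errors} corrupted words out of {pattern.size}")
```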

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

GWBBQ posted:

I just had blocks like this start jumping around on the screen, plus grey triangles extending from the center of the screen out to the edges flickering while playing Fallout New Vegas. Does it look like my graphics card is going bad? It's a ~3 year old Radeon HD 5770 and has been overclocked by about 10% using the AMD Overdrive utility. If it is going, can underclocking it and turning down graphics settings prolong its life for at least another few weeks?


Try a different game; New Vegas is like the buggiest poo poo on the planet.

Seriously though, that's almost certainly a failing card; when AMD cards start doing that it's usually when they heat up. Turning down all the game settings and rebooting might get it back to normal for a little bit while you shop for a new card.

GWBBQ
Jan 2, 2005


Haha, I know how buggy New Vegas is, that's why I was asking. KSP and Minecraft seemed OK even though the card sounds like a jet taking off on all 3. I underclocked it by about 10% and ran the FurMark burn-in test for 10 minutes keeping it at 99°C and there were no problems, and I played another hour of Fallout with the settings turned down a few notches without problems. I'm crossing my fingers that it was just a one-time glitch or a problem with the replacement textures I have installed, but one of my friends gave me his old GTX 260 in case my card dies.

INTJ Mastermind
Dec 30, 2004

It's a radial!

GWBBQ posted:

FurMark burn-in test for 10 minutes keeping it at 99°C

I'm no expert on GPUs, but won't that high of a temp shorten the lifespan of your card significantly? Not just the semiconductors; I'm sure the fan lubricant oils don't like high temps either.

veedubfreak
Apr 2, 2005

by Smythe

Nostrum posted:

I don't really. For most things I turn it down to around 1398 MHz, which is stable at about 1.3725 V and under load in a game draws only about 500-600 watts. The performance difference is huge; compared to stock it IS almost as fast as an SLI setup.


I've tested it with both OpenCL Memtest and G80Memtest. The memory is definitely the sticking point; no matter what, even 1 MHz higher and it artifacts like crazy or crashes. It is, however, completely stable at 1950 MHz. It's a full water setup with EK blocks, a Phobya 2x 200mm radiator, and an XSPC 420. The 4770K is definitely an "above average" chip: it'll run IBT all day long at 4.7 (with temps in the high 70s). It's "stable" at 4.8 but crashes after an hour or so. At 4.9 the computer is usable, but IBT crashes after a couple of loops. 5.0 will boot to desktop but bluescreens almost instantly.

poo poo like this is why I have been contemplating selling my 290s and buying a single Classified. Knock it off.

Nephilm
Jun 11, 2009

by Lowtax

GWBBQ posted:

Haha, I know how buggy New Vegas is, that's why I was asking. KSP and Minecraft seemed OK even though the card sounds like a jet taking off on all 3. I underclocked it by about 10% and ran the FurMark burn-in test for 10 minutes keeping it at 99°C and there were no problems, and I played another hour of Fallout with the settings turned down a few notches without problems. I'm crossing my fingers that it was just a one-time glitch or a problem with the replacement textures I have installed, but one of my friends gave me his old GTX 260 in case my card dies.

INTJ Mastermind posted:

I'm no expert on GPUs, but won't that high of a temp shorten the lifespan of your card significantly? Not just the semiconductors; I'm sure the fan lubricant oils don't like high temps either.

IIRC the thermal tolerance on the old Radeons was something stupid like 115°C, but a 5770 shouldn't be going over 80 under load; if it is, that's symptomatic of cooling issues with either the card or the case. Additionally, if it sounds like a jet engine I'd replace it with a new card on principle; much more powerful modern cards are whisper quiet.

Gwaihir
Dec 8, 2009
Hair Elf

Nephilm posted:

IIRC the thermal tolerance on the old Radeons was something stupid like 115°C, but a 5770 shouldn't be going over 80 under load; if it is, that's symptomatic of cooling issues with either the card or the case. Additionally, if it sounds like a jet engine I'd replace it with a new card on principle; much more powerful modern cards are whisper quiet.

Even the legendarily hot GTX 480 wasn't supposed to go over 105 degrees.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Yeah, you might take a look and see if you need to take a can of duster to it and/or redo the paste. Duster and paste are both really cheap so you're not out much and you might keep the thing alive longer.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy

veedubfreak posted:

poo poo like this is why I have been contemplating selling my 290s and buying a single Classified. Knock it off.

It's made me lock in a pre-order from PCcasegear for the 780 Ti Classified. I was trying to hold out and see what the 290X Lightning would do over the reference PCB, but I don't think even two 8-pins and a 6-pin power connector will make the 290X match a 780 Ti Classified.

Evil Crouton
Oct 4, 2004

The Amish scare me
I think my 7950 has developed really bad coil whine under light load. I haven't changed anything in about six months, but starting yesterday it began making a loud, high-pitched screeching noise when playing simple games like KSP or Banished. It doesn't happen when I'm doing anything more taxing, but a few minutes into simple games like those it starts a painfully loud whine that doesn't go away until I stress the card or shut the system down. Is this coil whine? It's far louder and more painful than anything Google has shown me. And is there anything I can do about it?

norg
Jul 5, 2006
Factory Factory: I think I saw you say a few pages back that you had a Corsair H55 on your GTX 680. Is that in your Prodigy? If so how did you fit it? I was looking at putting one in my Prodigy with an NZXT G10 bracket, but it doesn't seem like it'll fit in the case.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Evil Crouton posted:

I think my 7950 has developed really bad coil whine under light load. I haven't changed anything in about six months, but starting yesterday it began making a loud, high-pitched screeching noise when playing simple games like KSP or Banished. It doesn't happen when I'm doing anything more taxing, but a few minutes into simple games like those it starts a painfully loud whine that doesn't go away until I stress the card or shut the system down. Is this coil whine? It's far louder and more painful than anything Google has shown me. And is there anything I can do about it?
If it is coil whine you can isolate the coil(s) at fault and use hot glue or nail polish to seal them. Alternatively the card may still be under warranty. Have you made sure it's not the power supply instead?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

cisco privilege posted:

If it is coil whine you can isolate the coil(s) at fault and use hot glue or nail polish to seal them. Alternatively the card may still be under warranty. Have you made sure it's not the power supply instead?

I'd never heard of that before, but it sounds like a nice idea. Googling it further, OEMs use paraffin wax to reduce coil whine. Now I can use all my leftover Valentine's candles!

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

norg posted:

Factory Factory: I think I saw you say a few pages back that you had a Corsair H55 on your GTX 680. Is that in your Prodigy? If so how did you fit it? I was looking at putting one in my Prodigy with an NZXT G10 bracket, but it doesn't seem like it'll fit in the case.

I used a Dwood bracket, which is functionally identical to a G10, and I sawed off part of it with a Dremel.



If you have a windowed side panel, you will need a low-profile fan, too. A vented side panel will work fine with the G10's stock 25mm-thick fan. I actually replaced my panel's window with some mesh from a cheap letter holder, so, that's a third option, I guess.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
TechReport has a little recap of some Game Developer Conference sessions that might be of interest. Bottom line: it looks like the DirectX and OpenGL folks saw Mantle and want to do that poo poo.

some Windows/DirectX guy posted:

For nearly 20 years, DirectX has been the platform used by game developers to create the fastest, most visually impressive games on the planet.

However, you asked us to do more. You asked us to bring you even closer to the metal and to do so on an unparalleled assortment of hardware. You also asked us for better tools so that you can squeeze every last drop of performance out of your PC, tablet, phone and console.

Come learn our plans to deliver.

Emphasis added.

Another Direct3D guy posted:

In this session we will discuss future improvements in Direct3D that will allow developers an unprecedented level of hardware control and reduced CPU rendering overhead across a broad ecosystem of hardware.

Meanwhile, OpenGL's session is about reducing driver CPU overhead to zero and features folks from AMD, Intel, and Nvidia together.

Nephilm
Jun 11, 2009

by Lowtax
DirectCape, coming to you with Windows 8.2 this fall. Windows 7 need not apply.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Nephilm posted:

DirectCape, coming to you with Windows 8.2 this fall. Windows 7 need not apply.

I don't know whether OpenGL's fractured extension support is any better, honestly.

ShaneB
Oct 22, 2002


I currently have a 270X but am debating selling it to some miner and getting a GTX770 for a little resolution futureproofing. Is this a bad time to get a 770?

The_Franz
Aug 8, 2003

Factory Factory posted:

TechReport has a little recap of some Game Developer Conference sessions that might be of interest. Bottom line: it looks like the DirectX and OpenGL folks saw Mantle and want to do that poo poo.


Emphasis added.


Meanwhile, OpenGL's session is about reducing driver CPU overhead to zero and features folks from AMD, Intel, and Nvidia together.

OpenGL has actually had things like multi-draw-indirect for a few years now, which lets you concatenate thousands or even tens of thousands of draw calls into one. People are only just noticing now because Valve has started dragging the entire industry away from its Windows-centric viewpoint.

What will be really nice is when all of the vendors finally support ARB_BINDLESS and concepts like texture units and texture array limitations become a distant bad memory.

The_Franz fucked around with this message at 19:33 on Feb 26, 2014
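To make the multi-draw-indirect point concrete, here's a sketch of the buffer layout glMultiDrawElementsIndirect consumes, following the spec's DrawElementsIndirectCommand struct. The GL context, VAO, and shader setup are omitted, and the draw counts are made up:

```python
import numpy as np

# Layout of one GL_DRAW_INDIRECT_BUFFER record, per the OpenGL spec's
# DrawElementsIndirectCommand struct (five GLuints). Pack thousands of
# draws into one array; a single glMultiDrawElementsIndirect() call then
# replaces that many individual glDrawElements() calls.
cmd_dtype = np.dtype([
    ("count",          np.uint32),  # indices per draw
    ("instance_count", np.uint32),
    ("first_index",    np.uint32),
    ("base_vertex",    np.uint32),
    ("base_instance",  np.uint32),
])

draws = np.zeros(10_000, dtype=cmd_dtype)   # 10,000 draws, counts made up
draws["count"] = 36                          # e.g. one indexed cube per draw
draws["instance_count"] = 1
draws["first_index"] = np.arange(10_000) * 36

# Uploading draws.tobytes() into a GL_DRAW_INDIRECT_BUFFER and calling
#   glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT, 0, 10000, 0)
# issues all 10,000 draws with one driver round-trip.
print(f"{draws.nbytes} bytes of indirect commands for {draws.size} draws")
```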

Ignoarints
Nov 26, 2010

ShaneB posted:

I currently have a 270X but am debating selling it to some miner and getting a GTX770 for a little resolution futureproofing. Is this a bad time to get a 770?

I mean, it's better in every way, but it's also like $100 more. I've been following 660, 660 Ti, 760, and 770 prices, and the 770s have been the most stable. I asked earlier in this thread when the 870s are supposed to come out; probably Q3 this year at the earliest.

veedubfreak
Apr 2, 2005

by Smythe

Ignoarints posted:

I mean, it's better in every way, but it's also like $100 more. I've been following 660, 660 Ti, 760, and 770 prices, and the 770s have been the most stable. I asked earlier in this thread when the 870s are supposed to come out; probably Q3 this year at the earliest.

I honestly wouldn't expect anything that is supposed to be on 20nm until Thanksgiving.

ShaneB
Oct 22, 2002


Ignoarints posted:

I mean, it's better in every way, but it's also like $100 more. I've been following 660, 660 Ti, 760, and 770 prices, and the 770s have been the most stable. I asked earlier in this thread when the 870s are supposed to come out; probably Q3 this year at the earliest.

I guess I honestly should just think about running another 270X in Crossfire if I really want to do 1440p gaming without replacing it with a much more expensive card. I should save up for the monitor first. :)

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

ShaneB posted:

I guess I honestly should just think about running another 270X in Crossfire if I really want to do 1440p gaming without replacing it with a much more expensive card. I should save up for the monitor first. :)

You might also catch an HD 7870 on sale for like 150 bucks that crossfires with the 270X just fine.

Nephilm
Jun 11, 2009

by Lowtax
Wouldn't an R9 270X 2GB Crossfire setup be memory-constrained at 1440p?
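Rough numbers, for what it's worth: the render targets themselves barely dent 2 GB at 1440p; it's texture assets that fill the card. Illustrative math only:

```python
# Rough VRAM math for 2560x1440. The render targets themselves are cheap;
# texture assets are what actually push a 2 GB card at 1440p.
W, H = 2560, 1440
BPP = 4                                  # RGBA8, bytes per pixel

color = W * H * BPP                      # one color buffer
depth = W * H * 4                        # 24-bit depth + 8-bit stencil
msaa4 = (color + depth) * 4              # 4x MSAA multiplies the targets

print(f"color+depth: {(color + depth) / 2**20:.0f} MiB")
print(f"with 4x MSAA: {msaa4 / 2**20:.0f} MiB of a 2048 MiB card")
```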

ShaneB
Oct 22, 2002


Nephilm posted:

Wouldn't an R9 270X 2GB Crossfire setup be memory-constrained at 1440p?

All the benchmarks look pretty drat good to me.

veedubfreak
Apr 2, 2005

by Smythe
With a single monitor, not really.
