|
Jago posted:Heat IS atomic vibrations. Correct! At my old workplace we had a near-absolute-zero test lab that had to incorporate stuff like a Faraday cage and a huge vibration isolation/damping system for the actual experimental platform. Otherwise you end up with random things screwing up your measurements.
|
# ? Feb 21, 2014 22:46 |
|
|
# ? Apr 23, 2024 21:02 |
|
Back in 2003, Intel said quantum tunneling is why Moore's Law would hit a wall between 2013 and 2018.
|
# ? Feb 21, 2014 23:10 |
|
I suppose the theoretical limit on a transistor's switching speed is set by its size and the speed of light. At the 22 nm process, that puts the maximum switching frequency somewhere in the petahertz range.
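The arithmetic behind that estimate, treating 22 nm as the distance a signal has to cross at the speed of light (a loose upper bound only; real switching is limited by carrier transport and gate delay, not light):

```python
# Upper bound on switching frequency if the only limit were light
# crossing a 22 nm feature: f_max = c / d.
c = 2.998e8      # speed of light, m/s
d = 22e-9        # 22 nm feature size, m
f_max = c / d    # ~1.4e16 Hz
print(f"{f_max / 1e15:.1f} PHz")  # 13.6 PHz
```

So "petahertz range" checks out as an absolute ceiling, even though actual transistors fall far short of it.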
|
# ? Feb 22, 2014 00:10 |
|
From last page, if anyone has any idea: --- My previous GPU setup was two 770's in SLI to run two instances of Realm Reborn, one on each 1080p monitor, for a friend and me. Is there an easy way in Windows to set things up so one instance of the game runs on one 750 Ti and the other instance runs on the other 750 Ti on another monitor? I know they can't do SLI, but that would be the next best thing, since it's the only time my system goes full blast; when I'm not running two copies of a game I could dedicate one card to PhysX so it's not totally useless. ---
|
# ? Feb 22, 2014 00:32 |
|
Does the game support windowed mode?
|
# ? Feb 22, 2014 02:35 |
|
Yeah, windowed and borderless windowed... You're saying I can plug one monitor into each card and each game will automatically render on that card when I drag to that monitor?
|
# ? Feb 22, 2014 03:36 |
|
I'm a semiconductor engineer, so I figured I'd jump in. A couple of things. You'll hit a wall due to quantum tunneling once you get small enough: once the transistors get really small, you can't effectively trap the charge in there, which is where leakage current comes in (and it's the reason a transistor being off doesn't mean it's at 0 voltage). This also means there's definitely heat in the system, because it's not just one transistor, it's a few billion all put together, so there's the other problem. Heat messes a ton of things up. Suffice to say, when things heat up your typical electron slows down. Why is this bad? Because now we're losing electrons to recombination (at defects or whatnot in the semiconductor), and heat lowers drift velocity, as one guy said. Also, electrons don't travel at the speed of light, so you have to think about how fast they're moving inside the semiconductor, the metal, and the junction between them. This movement also causes heat. Also, shrinking just means you can cram more transistors on a chip; other things scale down too, and we try to scale down voltage and power (which, once again, produce heat), so blame heat! tl;dr Too much current means too much heat, which means degradation of the physical device and in turn device performance, so find a happy medium. Trickyrive fucked around with this message at 04:43 on Feb 22, 2014 |
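To put one number on that leakage point: at room temperature an ideal transistor's off-state current changes by roughly 10x for every ~60 mV of gate voltage, which is why scaled-down threshold voltages leak so badly and why heat makes it worse. A sketch of that textbook relation (ideal subthreshold swing, ignoring the body-effect factor):

```python
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # elementary charge, C

def subthreshold_swing(temp_k):
    """Ideal subthreshold swing in V/decade: the gate-voltage change
    needed to move the off-state current by a factor of 10."""
    return math.log(10) * K_B * temp_k / Q_E

def leakage_ratio(delta_vth, temp_k=300.0):
    """Factor by which off-state leakage grows when the threshold
    voltage is lowered by delta_vth volts."""
    return 10 ** (delta_vth / subthreshold_swing(temp_k))

# ~60 mV/decade at room temperature...
print(f"{subthreshold_swing(300) * 1e3:.1f} mV/decade")
# ...so shaving 60 mV off Vth costs roughly 10x the leakage,
print(f"{leakage_ratio(0.060):.1f}x")
# and the swing degrades as the chip heats up (leakier when hot).
print(f"{subthreshold_swing(350) * 1e3:.1f} mV/decade at 350 K")
```

Multiply that per-transistor leakage by a few billion transistors and the "blame heat" conclusion writes itself.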
# ? Feb 22, 2014 04:40 |
|
So here's my best result for a 4770k @ 4.7 and a Classified 780ti @ 1.5/1950. http://www.3dmark.com/fs/1719246 This is a 24/7 stable config (which also serves as a space heater), and with an 80+ Gold PSU it uses just under 900 watts under full load. Don't do it. You have enough FPS. Don't....go....down....this...path. Free....Mars....
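For scale on what "just under 900 watts" at the wall means for the components themselves: an 80 Plus Gold unit is certified for roughly 87-90% efficiency at the loads involved, so a quick conversion (the efficiency figures here are the certification range, not measured values for this particular PSU):

```python
def dc_load(wall_watts, efficiency):
    """DC power actually delivered to the components, given wall
    draw and PSU efficiency."""
    return wall_watts * efficiency

# 80 Plus Gold requires roughly 87-90% efficiency at 50-100% load,
# so ~900 W at the wall is ~780-810 W of actual component draw.
print(f"{dc_load(900, 0.87):.0f} - {dc_load(900, 0.90):.0f} W")  # 783 - 810 W
```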
|
# ? Feb 22, 2014 08:39 |
|
Nostrum posted:So here's my best result for a 4770k @ 4.7 and a Classified 780ti @ 1.5/1950. Holy crap is that on water? And I'm going to go with the Classified because I hate money
|
# ? Feb 22, 2014 09:18 |
|
So EVGA Precision X starts the EVGA Voltage Control exe when it starts. VoltCont uses 17.8% CPU. Every time I change the settings it loads another VoltCont exe, which means another process eating up another 17.8% CPU for no apparent reason. Anyone have any clues? There seem to be similar cases in the EVGA forums with no response from EVGA.
|
# ? Feb 22, 2014 15:05 |
Nostrum posted:So here's my best result for a 4770k @ 4.7 and a Classified 780ti @ 1.5/1950.
|
|
# ? Feb 22, 2014 15:40 |
|
Straker posted:I still don't understand why anyone would bother overvolting like that and wasting so much power blowing their poo poo up for nothing, it's like do you really need to play Planetside 2 at 4K or something? Why not just use two cards? This is exactly how I feel. Followed by an undeniable twinge of "heeeey, I wish mine would do that, that's 250MHz more than my card and it only costs a little more." You guys talking about OCing being fun are right; it is a bit of a thrill to push things to see what you can get out of them, and a really high result is exciting. It also damages the useful life of the product substantially, as my 2600K can attest after two years at 4.7GHz finally becoming unstable in measurable ways and having to get bumped down to 4.5GHz to attain stability again, for now. I do have a new system build ready to go if/when the processor takes a dump, at least. Speaking of redundancy and worry about hardware failure... Tell you what, though, not to poo poo on you at all Nostrum - but I am a little bit skeptical at the label "24/7 stable" for your clocks. It just seems like a pretty bold claim to make with those clocks and voltage; maybe you just got a golden chip (they exist, and you did get the right card to go hunting for one at this point in the exercise of trying to nab 'em), but with all we know about GDDR5's instability and all that, I'd be curious how it holds up to the aforementioned OpenCL memtest. I know there's not a very good way to validate the core; just kinda eyeball it and see if you can spot any artifacts while gaming, or whether it has any crash-and-recovery cycles. Are you on water or air? I can't keep straight who's using what for cooling at this point, since we've had several folks do "The Mod" and others transition to closed-loop or full liquid cooling in the past several months. Edit: My FS Extreme score with a 2600K @ 4.5GHz and a 780Ti @ 1.25GHz core / 1825MHz GDDR5 is 5680, so there's definitely a real performance delta - partially attributable to the higher-quality, higher-clocked CPU, partially to the higher-quality, higher-clocked graphics card, but either way, big rear end performance difference. Agreed fucked around with this message at 16:44 on Feb 22, 2014 |
# ? Feb 22, 2014 16:33 |
|
Can someone explain to me the difference between Burn-In and Benchmark on FurMark? I have an ASUS GTX 770 DirectCU II that I've OC'ed to 1260 core and 7700 memory. On Burn-In it runs at max settings, at approximately 77% TDP with a temp of 69°C. On Benchmark it's at 100% TDP, throttling the core down to 1200, and gets to 80°C. Also, if the card is stable at a core of 1260, the max GPU Boost allows, is there any advantage to increasing the GPU voltage from stock? IIRC, it doesn't help memory overclocking. INTJ Mastermind fucked around with this message at 19:52 on Feb 23, 2014 |
# ? Feb 23, 2014 19:32 |
|
I just had blocks like this start jumping around on the screen, plus grey triangles extending from the center of the screen out to the edges flickering while playing Fallout New Vegas. Does it look like my graphics card is going bad? It's a ~3 year old Radeon HD 5770 and has been overclocked by about 10% using the AMD Overdrive utility. If it is going, can underclocking it and turning down graphics settings prolong its life for at least another few weeks?
|
# ? Feb 24, 2014 00:32 |
|
Straker posted:I still don't understand why anyone would bother overvolting like that and wasting so much power blowing their poo poo up for nothing, it's like do you really need to play Planetside 2 at 4K or something? Why not just use two cards? I don't, really. For most things I turn it down to around 1398 MHz, which is stable at about 1.3725V and uses only about 500-600 watts under gaming load. The performance difference is huge. Compared to stock it IS almost as fast as an SLI setup. Agreed posted:This is exactly how I feel. Followed by an undeniable twinge of "heeeey I wish mine would do that, that's 250MHz more than my card and it only costs a little more ." You guys talking about OCing being fun are right, it is a bit of a thrill to push things to see what you can get out of them and a really high result is exciting It also damages the useful life of the product substantially, as my 2600K can attest after two years of 4.7GHz finally becoming unstable in measurable ways and having to get bumped down to 4.5GHz to attain stability again, for now. I do have a new system build ready to go if/when the processor takes a dump, at least. I've tested it with both OpenCL Memtest and G80Memtest. The memory is definitely a sticking point; no matter what, even 1 MHz higher and it artifacts like crazy or crashes. It is, however, completely stable at 1950MHz. It's a full water setup with EK blocks, a Phobya 2x 200mm radiator and an XSPC 420. The 4770k is definitely an "above average" chip. It'll run IBT all day long at 4.7 (with temps in the high 70s). It's "stable" at 4.8 but crashes after an hour or so. At 4.9 the computer is usable, but IBT crashes after a couple of loops. 5.0 will boot to desktop but bluescreens almost instantly.
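That gap between ~900 W flat-out and ~500-600 W at the daily clocks tracks roughly with first-order CMOS dynamic power scaling, P ∝ f·V². A quick sketch (the voltage behind the full 1500 MHz overclock isn't stated in the thread, so the 1.5 V figure below is purely an assumed number for illustration):

```python
def dynamic_power_ratio(f1, v1, f2, v2):
    """First-order CMOS dynamic power scaling: P ~ f * V^2.
    Returns P2 / P1. Ignores leakage and everything else."""
    return (f2 / f1) * (v2 / v1) ** 2

# Daily setting from the post: 1398 MHz @ 1.3725 V. The full-OC
# voltage is a guess (1.5 V) for illustration only.
ratio = dynamic_power_ratio(1398, 1.3725, 1500, 1.5)
print(f"{ratio:.2f}x")  # ~1.28x more GPU-side power for ~7% more clock
```

The point being that the last hundred MHz costs disproportionately much, because it drags the voltage term up with it, squared.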
|
# ? Feb 24, 2014 00:36 |
|
GWBBQ posted:I just had blocks like this start jumping around on the screen, plus grey triangles extending from the center of the screen out to the edges flickering while playing Fallout New Vegas. Does it look like my graphics card is going bad? It's a ~3 year old Radeon HD 5770 and has been overclocked by about 10% using the AMD Overdrive utility. If it is going, can underclocking it and turning down graphics settings prolong its life for at least another few weeks? Try a different game, New Vegas is like the buggiest poo poo on the planet. Seriously though, that's almost certainly a failing card; when AMD cards start doing that it's usually when they heat up. Turning down all the game settings and rebooting might get it back to normal for a little while you shop for a new card.
|
# ? Feb 24, 2014 06:34 |
|
Haha, I know how buggy New Vegas is, that's why I was asking. KSP and Minecraft seemed OK even though the card sounds like a jet taking off on all 3. I underclocked it by about 10% and ran the FurMark burn-in test for 10 minutes keeping it at 99°C and there were no problems, and I played another hour of Fallout with the settings turned down a few notches without problems. I'm crossing my fingers that it was just a one-time glitch or a problem with the replacement textures I have installed, but one of my friends gave me his old GTX 260 in case my card dies.
|
# ? Feb 24, 2014 19:12 |
|
GWBBQ posted:FurMark burn-in test for 10 minutes keeping it at 99°C I'm no expert on GPUs, but won't that high a temp shorten the lifespan of your card significantly? Not just the semiconductors; I'm sure the fan lubricant oils don't like high temps either.
|
# ? Feb 24, 2014 19:15 |
|
Nostrum posted:I don't really. For most things I turn it down to around 1398 MHz which is stable at about 1.3725V, which during load playing a game, uses only about 500-600 Watts. The performance difference is huge. Compared to stock it IS almost as fast as an SLI setup. poo poo like this is why I have been contemplating selling my 290s and buying a single Classified. Knock it off.
|
# ? Feb 24, 2014 19:19 |
|
GWBBQ posted:Haha, I know how buggy New Vegas is, that's why I was asking. KSP and Minecraft seemed OK even though the card sounds like a jet taking off on all 3. I underclocked it by about 10% and ran the FurMark burn-in test for 10 minutes keeping it at 99°C and there were no problems, and I played another hour of Fallout with the settings turned down a few notches without problems. I'm crossing my fingers that it was just a one-time glitch or a problem with the replacement textures I have installed, but one of my friends gave me his old GTX 260 in case my card dies. INTJ Mastermind posted:I'm no expert on gpus but won't that high of a temp shorten the lifespan of your card signifiantly? Not just the semiconductors but I'm sure the fan lubricant oils don't like high temps. IIRC the thermal tolerance on the old Radeons was something stupid like 115°C, but a 5770 shouldn't be going over 80 under load; if it is, that's symptomatic of cooling issues with either the card or the case. Additionally, if it sounds like a jet engine I'd replace it with a new card out of principle; much more powerful modern cards are whisper quiet.
|
# ? Feb 24, 2014 19:53 |
|
Nephilm posted:IIRC the thermal tolerance on the old radeons was something stupid like 115°C, but a 5770 shouldn't be going over 80 on load, and would be symptomatic of cooling issues with either the card or the case. Additionally, if it sounds like a jet engine I'd replace it with a new card out of principle; much more powerful modern cards are whisper quiet. Even the legendarily hot GTX 480 wasn't supposed to go over 105 degrees.
|
# ? Feb 24, 2014 21:39 |
|
Yeah, you might take a look and see if you need to take a can of duster to it and/or redo the paste. Duster and paste are both really cheap so you're not out much and you might keep the thing alive longer.
|
# ? Feb 24, 2014 21:46 |
|
veedubfreak posted:poo poo like this is why I have been contemplating selling my 290s and buying a single Classified. Knock it off. It's made me lock in a pre-order from PCCaseGear for the 780 Ti Classified. I was trying to hold out and see what the 290X Lightning would do over the reference PCB, but I don't think even two 8-pins and a 6-pin power connector will make the 290X match a 780 Ti Classified.
|
# ? Feb 24, 2014 23:15 |
|
I think my 7950 has developed really bad coil whine under slight load. I haven't changed anything in about 6 months but starting yesterday it started making a loud high-pitched screeching noise when playing simple games like KSP or Banished. It doesn't occur if I'm doing anything more taxing but a few minutes into simple games like I mentioned it starts a painfully loud whine that doesn't go away until I stress the card or shut the system down. Is this coil whine? It is far louder and more painful than anything google has shown me. And is there anything I can do about it?
|
# ? Feb 25, 2014 03:24 |
|
Factory Factory: I think I saw you say a few pages back that you had a Corsair H55 on your GTX 680. Is that in your Prodigy? If so how did you fit it? I was looking at putting one in my Prodigy with an NZXT G10 bracket, but it doesn't seem like it'll fit in the case.
|
# ? Feb 25, 2014 12:52 |
|
Evil Crouton posted:I think my 7950 has developed really bad coil whine under slight load. I haven't changed anything in about 6 months but starting yesterday it started making a loud high-pitched screeching noise when playing simple games like KSP or Banished. It doesn't occur if I'm doing anything more taxing but a few minutes into simple games like I mentioned it starts a painfully loud whine that doesn't go away until I stress the card or shut the system down. Is this coil whine? It is far louder and more painful than anything google has shown me. And is there anything I can do about it? If it is coil whine you can isolate the coil(s) at fault and use hot glue or nail polish to seal them. Alternatively the card may still be under warranty. Have you made sure it's not the power supply instead?
|
# ? Feb 25, 2014 18:35 |
|
cisco privilege posted:If it is coil whine you can isolate the coil(s) at fault and use hot glue or nail polish to seal them. Alternatively the card may still be under warranty. Have you made sure it's not the power supply instead? I never heard of that before, but it sounds like a nice idea. Googling further, it seems OEMs use paraffin wax to reduce coil whine. Now I can use all my leftover Valentine's candles!
|
# ? Feb 25, 2014 20:48 |
|
norg posted:Factory Factory: I think I saw you say a few pages back that you had a Corsair H55 on your GTX 680. Is that in your Prodigy? If so how did you fit it? I was looking at putting one in my Prodigy with an NZXT G10 bracket, but it doesn't seem like it'll fit in the case. I used a Dwood bracket, which is functionally identical to a G10, and I sawed off part of it with a Dremel. If you have a windowed side panel, you will need a low-profile fan, too. A vented side panel will work fine with the G10's stock 25mm-thick fan. I actually replaced my panel's window with some mesh from a cheap letter holder, so, that's a third option, I guess.
|
# ? Feb 25, 2014 21:03 |
|
TechReport has a little recap of some Game Developer Conference sessions that might be of interest. Bottom line: it looks like the DirectX and OpenGL folks saw Mantle and want to do that poo poo. some Windows/DirectX guy posted:For nearly 20 years, DirectX has been the platform used by game developers to create the fastest, most visually impressive games on the planet. Emphasis added. Another Direct3D guy posted:In this session we will discuss future improvements in Direct3D that will allow developers an unprecedented level of hardware control and reduced CPU rendering overhead across a broad ecosystem of hardware. Meanwhile, OpenGL's session is about reducing driver CPU overhead to zero and features folks from AMD, Intel, and Nvidia together.
|
# ? Feb 26, 2014 18:13 |
|
DirectCape, coming to you with Windows 8.2 this fall. Windows 7 need not apply.
|
# ? Feb 26, 2014 18:29 |
|
Nephilm posted:DirectCape, coming to you with Windows 8.2 this fall. Windows 7 need not apply. I don't know whether OpenGL's fractured extension support is any better honestly
|
# ? Feb 26, 2014 19:08 |
|
I currently have a 270X but am debating selling it to some miner and getting a GTX770 for a little resolution futureproofing. Is this a bad time to get a 770?
|
# ? Feb 26, 2014 19:29 |
|
Factory Factory posted:TechReport has a little recap of some Game Developer Conference sessions that might be of interest. Bottom line: it looks like the DirectX and OpenGL folks saw Mantle and want to do that poo poo. OpenGL has actually had things like multi-draw-indirect for a few years now, which lets you concatenate thousands or even tens of thousands of draw calls into one. People are only just noticing now because Valve has started dragging the entire industry away from its Windows-centric viewpoint. What will be really nice is when all of the vendors finally support ARB_BINDLESS and concepts like texture units and texture-array limitations become a distant bad memory. The_Franz fucked around with this message at 19:33 on Feb 26, 2014 |
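For the curious, multi-draw-indirect really is just a buffer full of fixed-layout command records that the GPU chews through in a single glMultiDrawElementsIndirect call. A sketch of the record layout from the GL 4.3 spec, packed from Python purely for illustration (no GL context involved; a real app would upload this blob to a GL_DRAW_INDIRECT_BUFFER):

```python
import struct

# DrawElementsIndirectCommand per the OpenGL 4.3 spec: five 32-bit
# fields - count, instanceCount, firstIndex, baseVertex (signed),
# baseInstance - tightly packed, little-endian.
INDIRECT_CMD = struct.Struct("<IIIiI")

def pack_draws(draws):
    """Pack an iterable of 5-tuples into one bytes blob suitable for
    a GL_DRAW_INDIRECT_BUFFER; thousands of draws then cost a single
    glMultiDrawElementsIndirect call instead of one call each."""
    return b"".join(INDIRECT_CMD.pack(*d) for d in draws)

# e.g. two draws: 36 indices x 10 instances, then 720 indices x 1
blob = pack_draws([(36, 10, 0, 0, 0), (720, 1, 36, 24, 0)])
print(len(blob))  # 40 bytes: 20 per command
```

The CPU-side cost of a frame stops scaling with draw count, which is the same trick Mantle was selling.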
# ? Feb 26, 2014 19:30 |
ShaneB posted:I currently have a 270X but am debating selling it to some miner and getting a GTX770 for a little resolution futureproofing. Is this a bad time to get a 770? I mean it's better in every way, but it's also like $100 more. I've been following 660, 660 Ti, 760, and 770 prices, and 770s have been the most stable. I asked earlier in this thread when 870s are supposedly coming out, and it's probably third quarter this year at the earliest.
|
|
# ? Feb 26, 2014 19:36 |
|
Ignoarints posted:I mean it's better in every way but it's also like $100 more. I've been following the 660, 660 TI, 760, and 770 prices and 770's have been the most stable. I asked earlier in this thread when 870's are supposedly coming out and it's probably 3rd quarter this year at the earliest. I honestly wouldn't expect anything that is supposed to be on 20nm until Thanksgiving.
|
# ? Feb 26, 2014 19:59 |
|
Ignoarints posted:I mean it's better in every way but it's also like $100 more. I've been following the 660, 660 TI, 760, and 770 prices and 770's have been the most stable. I asked earlier in this thread when 870's are supposedly coming out and it's probably 3rd quarter this year at the earliest. I guess I honestly should just think about running another 270X in Crossfire if I really want to do 1440p gaming without replacing it with a much more expensive card. I should save up for the monitor first.
|
# ? Feb 26, 2014 20:04 |
|
ShaneB posted:I guess I honestly should just think about running another 270X in Crossfire if I really want to do 1440p gaming without replacing it with a much more expensive card. I should save up for the monitor first. You might also catch an HD 7870 on sale for like 150 bucks that crossfires with the 270X just fine.
|
# ? Feb 26, 2014 20:19 |
|
Wouldn't an R9 270X 2GB Crossfire setup be memory-constrained at 1440p?
|
# ? Feb 26, 2014 20:29 |
|
Nephilm posted:Wouldn't a r9 270x 2GB crossfire be memory constrained for 1440p? All the benchmarks look pretty drat good to me.
|
# ? Feb 26, 2014 20:34 |
|
In single monitor, not really.
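Rough numbers behind the "not really" (a sketch only; real VRAM use is dominated by game-specific texture and render-target budgets, which this ignores):

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Memory for the swap-chain color buffers alone, in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

# Triple-buffered 8-bit RGBA at 2560x1440 is tiny next to 2 GB...
print(f"{framebuffer_mib(2560, 1440):.0f} MiB")  # 42 MiB
# ...so at single-monitor 1440p the 2 GB mostly goes to textures
# and render targets, which is why it usually isn't the bottleneck.
```

Multi-monitor or heavy texture mods change the picture, which is presumably the "not really" hedge.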
|
# ? Feb 26, 2014 20:37 |