|
The GTX 770 has a max temp of 98C, so 81C is not too bad under heavy load for an extended time. An aftermarket cooler might be an improvement, but it doesn't seem like you need an immediate solution. If you want to watercool, just spend some time researching and planning a layout.
|
# ¿ Dec 11, 2013 18:18 |
|
cat doter posted:Doesn't seem to be, I ran OCCT again with GPU-Z and under 51% load for 3 minutes it reached 91C. The only memory options I see in OCCT are for the CPU. For GPU, there is MemtestCL and MemtestG80. But I don't know how well this works, since I can put a several-hundred-MHz increase on my memory and not see an error, yet games will absolutely poo poo themselves the second they try to render at the same memory speed. https://simtk.org/home/memtest/ As far as the issue you are having, that sounds exactly like the video driver stopping. Try different drivers with clean installs. Other things to try: re-seat the card in the motherboard and swap in another power supply (preferably larger).
|
# ¿ Dec 12, 2013 14:38 |
|
Blurbusters has a good demonstration to give you an idea of how G-sync will look. Just ordered up the MSI GTX 780 Twin Frozr that is on sale on Newegg, and an EK waterblock for it. Comes with a mild OC already, so I'll see how much further I can push it. I'll be moving my current GTX 570 down a few slots and just let it fold non-stop. Also, nice write-up on the differences in the products, Agreed. Seems like everyone only focuses on the fps/benchmarks and doesn't consider the rest of the card's functions. Shadowplay/the built-in encoder is something that kept me with nVidia this upgrade. Phuzun fucked around with this message at 19:20 on Dec 17, 2013 |
# ¿ Dec 17, 2013 19:16 |
|
Got my MSI GTX 780 installed last Friday. Stock is 902mhz(1050 boost)/3004mhz and with the factory air cooling, I got it up to 1176mhz(+126)/3499mhz with a 25mv voltage bump. After a good heat soak from folding@home, it was getting up to 76C with the stock fan config. The EK full coverage block just showed up in the mail and will be going on this week. Also picked up some new fans as the ones I had been using were developing a really annoying bearing noise. These fans are pretty nice, low air flow for my quiet power radiators and don't seem to mind being horizontal or vertical for mounting.
|
# ¿ Dec 23, 2013 20:45 |
|
Well I had both my GTX 570 and 780 installed with the idea that I'd keep the 570 folding 24/7 or have it for PhysX games, but I ran into several issues and pulled it back out of this machine. First, folding@home does not use the normal index for the GPUs... but it does for CUDA/OpenCL indexes. This results in all kinds of fuckery with the config.xml to find which settings will get them to fold, but then I ran into issues with GPUs folding and the work units just disappearing when pausing them. Second, the waterblocks don't line up (pretty much expected this), so the water was looping through the 780 before hitting the restrictive 570 block instead of having the flow split between them. This added about 15-20C onto my CPU and 780 temperatures. While there was some performance improvement from the dedicated PhysX card, it was insignificant next to the raw power of the 780. Batman: Arkham Origins already runs maxed out in the 60-90fps range, and with the dedicated PhysX card it gained around 10fps. I'm going to throw that 570 into another machine at this point and just work on overclocking/BIOS tweaking the 780. e: on the topic of Shadowplay. Has anyone encountered an issue where the audio is not captured? Or if it is captured, it is only the rear/surround audio? Figured this out. The Asus Xonar DSX has an option called GX, and this is what was causing audio to go missing in Shadowplay. Phuzun fucked around with this message at 17:09 on Dec 30, 2013 |
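For anyone hitting the same thing: the config.xml fuckery looks roughly like this. The slot layout and option names are real FAHClient config syntax, but the index values below are purely illustrative — the correct mapping has to be read out of your own client's log, since FAHClient's gpu-index ordering may not match the CUDA/OpenCL device order.

```xml
<config>
  <!-- One slot per GPU. gpu-index follows FAHClient's own GPU list,
       while cuda-index/opencl-index follow the driver's device order;
       on a multi-GPU box these can disagree. -->
  <slot id="0" type="GPU">
    <gpu-index v="0"/>     <!-- hypothetical values: check the -->
    <cuda-index v="1"/>    <!-- device enumeration printed in -->
    <opencl-index v="1"/>  <!-- FAHClient's startup log -->
  </slot>
  <slot id="1" type="GPU">
    <gpu-index v="1"/>
    <cuda-index v="0"/>
    <opencl-index v="0"/>
  </slot>
</config>
```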
# ¿ Dec 30, 2013 14:22 |
|
sedaps posted:I just purchased 2 X-Star DP2710 monitors hoping for a dual setup with both running 1440. Unfortunately I only have one DVI port on my Radeon 7870. I was about to just get a Mini-DP to DVI adapter that's about $100, but then I thought what if I get a new card? Would these cables not work for your card/display? They are much cheaper than $100. http://www.monoprice.com/Category?c_id=102&cp_id=10246&cs_id=1024604
|
# ¿ Dec 30, 2013 15:30 |
|
sedaps posted:I think those are single link DVI. This is the monoprice version of what I need, although the reviews are not so great on them from some research. Yeah, looking at them now, they do appear to be single link. You had mentioned the GTX 600 series, but AMD also has cards that feature 2 dual-link DVI outputs, including other 7870s. Instead of buying an adapter, you could get a second 7870 for the extra performance and even have a third DVI output for later.
|
# ¿ Dec 30, 2013 16:54 |
|
With the way releases tend to be staggered between the two brands, there is simply always something around the corner. Upgrade when you have the money and the desire for a bit more performance. The 760 recently got a price drop, thanks to the ATI release. Don't expect an 860 to be as cheap as the 760s are now when it comes out, whether that's in March or, just as easily, May.
|
# ¿ Jan 3, 2014 21:02 |
|
craig588 posted:You can install a second card and flash the original bios you backed up if you messed up any of your edits. The 780ti still uses basically the same bios as the Titan, which is still incredibly similar as far back as the 680. You really should edit your bios yourself instead of flashing someone else's. Here's the Google Translate version of the tool that makes Kepler bios editing really easy. I don't remember any of the offsets to do it manually anymore since that tool makes it so easy. I wouldn't recommend using the recently released software patches to force higher voltages through afterburner, because you're fighting against the power management implementation and are probably going to waste a whole lot of heat. A potentiometer hard mod would be a better idea if you really wanted more than 1.21. That way you're just giving the GPU a set offset voltage and the driver/software doesn't know anything about it. This tool doesn't seem to allow you to modify the voltage table. Everything else seems open, but that is pretty important if you want to overclock higher. There are BIOS files that have been modified with an increased voltage table. Without going into further fuckery with voltages, I can hit 1320 core, 3456 memory on my board. I haven't tried it, but 1.4v is possible with BIOS and softmods. It absolutely crushes things at 1920x1200, and that has really kept me from doing further tinkering.
|
# ¿ Jan 13, 2014 17:43 |
|
craig588 posted:You really don't want to do that with softmods. If you want more voltage you really want to do a mod with a potentiometer that the software and driver don't see. The important thing to edit in the bios is the power limits. Those premodded bioses have boost disabled, which was one of the best power-saving things that the Kepler generation started. I guess that isn't my concern. I always keep it folding or gaming when running, which just happens to be always. e: always
|
# ¿ Jan 13, 2014 18:14 |
|
The R9 270X is a rebranded 7870, if I remember correctly. I think these are still desired for coin mining. This could explain the lack of supply.
|
# ¿ Jan 14, 2014 08:35 |
|
ItBurns posted:I'm pretty sure I've seen other MSI cards listed as being discontinued, so there might be something there. NCIX in general is slow and unresponsive compared to Amazon or Newegg, not unbearably so but they're not really 'on the ball'. The combination of the two is just asking for trouble. I had the same issue with the 280xes before the holiday, and have just decided to stick with my card until something better comes out or the ltc miners get their power bill and throw in the towel. Well he does want that BF4 pack-in. Newegg.ca has the XFX 270X with the game code and is the cheapest available. I certainly wouldn't wait on NCIX any further at this point.
|
# ¿ Jan 14, 2014 14:58 |
|
Duro posted:Well, since I already paid NCIX I figured I'd stick with them for now and just make them switch the card out I've used XFX several times back when they made nvidia products. They never let me down, and the cards still worked after being swapped into the water loop.
|
# ¿ Jan 14, 2014 20:20 |
|
Don Lapre posted:Id just spend the money on a better quadro card and game on it. Unless I am misunderstanding the naming format for Quadro cards, you go from $1,799 for the 1536-core K5000 to $4,999 for the 2880-core K6000. You could get a K5000 for the specialized tasks plus a GTX 780 Ti for gaming, and still come out well under the cost of an equivalent Quadro.
|
# ¿ Jan 22, 2014 16:09 |
|
Gyrotica posted:I have a question about the GTX Titan. Are there any neat/fun/useful things one can do with a Titan that makes use of its compute capacity? My rig is primarily built for gaming and I'm not a molecular biologist, but I can't shake the feeling that I should find other uses for it since I have the capability. Folding@home or the various grid projects, if you'd like to contribute to science without expecting a personal return.
|
# ¿ Jan 28, 2014 21:43 |
|
Adding a second NVIDIA card for PhysX processing is as simple as installing it in an available slot, then reinstalling the drivers. The default is for the driver to choose the best card, but you can force a specific one in the NVIDIA control panel.
|
# ¿ Jan 30, 2014 07:17 |
|
exquisite tea posted:Is it normal for the Geforce Experience optimization tool to randomly switch around its "optimal" configuration without any changes to the system? Last week it was telling me to play AC4 with high shadows, no SSAO and FXAA, now it's saying medium shadows with HBAO 2x and MSAA, neither of which seem to play any better or worse than the settings I was using initially. Maybe check the setting that auto-updates the game configurations. Never used that myself, but that sounds like what it says it does.
|
# ¿ Jan 30, 2014 15:00 |
|
Agreed posted:It's a totally different matter talking about stability under GPGPU conditions where math errors manifest more readily - a Memtest86-like program for GDDR5 would be pretty damned revealing even when pointed at overclocks that we as gamers probably consider stable, throwing around bit errors left and right. There is a MemtestCL that works on GPU memory. https://simtk.org/home/memtest/
|
# ¿ Feb 4, 2014 07:22 |
|
The waterblock they are using on the Hydro Copper version is garbage as well. No VRM cooling at all, if that was even a remote consideration.
|
# ¿ Feb 21, 2014 10:17 |
|
Until every single chip that is made works at its very best potential speed, overclocking will exist. Overclocking gets distorted by the internet and the instant gratification of community benchmarks. It isn't that exciting, because the only people who have to constantly make changes to their machines either don't know what they are doing or only care about pushing it as far as they can for some benchmark. Set it up and test over the first few weeks, then let it go. A good overclock lasts for years. Don't mistake careful marketing and binned processes for actual processor capabilities; they are all unique.
|
# ¿ Feb 21, 2014 17:47 |
|
Nephilm posted:The discussion here is an effort/benefit balance. As the efficiency of "auto overclock" on CPUs/GPUs improves and thermal constraints get narrower, for the end user the cost entry bar gets higher while the potential gains lower. Pretty soon it's just going to be "get good cooling and let Boost do its thing" (and the definition of good cooling is also changing as parts need less heat dissipation), with little sense in getting costlier unlocked components or BIOS tweaking except for, as you mentioned, push-it-to-the-limit enthusiasts. No, this is just the mainstream being confused about power saving circuitry. Running something below a known max speed, then dynamically increasing to that under load, is not overclocking.
|
# ¿ Feb 21, 2014 19:47 |
|
Nephilm posted:...correct me if I'm completely wrong, but for example Nvidia's GPU Boost, it isn't a "throttling down from a known max speed", but boosting clocks up until hitting temperature or TDP constraints - the former being a variable factor based upon cooling solutions implemented (which is what I mentioned), and the latter a manufacturer-specified limit that, as has been brought up by other people, has an ever narrowing fringe between the spec and how much a user can push beyond it without causing actual hardware damage. Boost clocks are set by the manufacturer and are binned specs, not individually tested and set. Overclocking is pushing it further than the manufacturer's specifications, and each chip is unique in its capabilities. Not trying to handwave anything. The technology for running these cards efficiently is great. But overclocking has a definition, and it isn't the same as boost. Phuzun fucked around with this message at 20:20 on Feb 21, 2014 |
# ¿ Feb 21, 2014 20:15 |
|
Nephilm posted:Did I forget to put the quotes on my post? Oh poo poo, I missed the quotes. I didn't realize I was arguing against a generalization, guess I'm just too loving anal. Thanks.
|
# ¿ Feb 21, 2014 21:22 |
|
This is going to be a nice upgrade from my current GTX 780. Just gotta wait for the 3rd-party cards and water blocks so I can keep using my custom loop.
|
# ¿ May 17, 2016 17:25 |
|
This Founders Edition drama has been so great. They have been doing the same thing for a while now, where they send out reference cards to 3rd parties for the initial release, and then the partners are supposed to push out custom stuff once they are supplied with only the gpu chip. The additional $100 is new, though it is hard to blame a corporation for taking advantage of demand, since they are heartless entities run on profits. They should release a no-heatsink model for an extra $200 just for the press and pubbie drama. Was honestly hoping they'd give us a Titan card with HBM2 right away, though I guess I should be relieved that my wallet isn't taking that hit. I have no intention of waiting for a Ti model, since I got my 780 when the Ti came out and kick myself for not making the jump a year earlier, since the price drop wasn't huge.
|
# ¿ May 18, 2016 21:24 |
|
They can't make their own? I'll wait and see, though it is usually more expensive with the pre-installed block, and then you have no air cooler for selling it used or putting it in another computer later.
|
# ¿ May 29, 2016 15:45 |
|
An EVGA GTX1080 SC was up for $649; snagged one, and when I refreshed a few minutes later, it was already sold out again. Pure luck that I happened to check. Should be a notable upgrade from a 780.
|
# ¿ Jun 10, 2016 22:46 |
|
My last 3 have been Zotac and I had no issues with them. I do like this warranty, since I plan to put a water block on the card, though I've never had to warranty a video card and the last 3 had blocks as well. The EVGA was actually in stock and not a founder's edition.
|
# ¿ Jun 10, 2016 22:59 |
|
Got my EK waterblock installed and let it run through the 3GB memory burner in ScannerX after some quick (no-voltage) overclocking. After 45 minutes it was at 2100MHz (max 2151) and memory held stable at 5500MHz; the GPU hit a maximum of 57C (also had IntelBurnTest running on the 2600k at 4.5GHz). This is a +150 offset on core and +550 offset on memory for an EVGA GTX1080 SC. I'm pretty happy with this after running a GTX780 since the price drop/release of the 780ti.
|
# ¿ Jun 18, 2016 08:52 |
|
THE DOG HOUSE posted:???? ?????? The manufacturing is cheap; the R&D going into the cards is expensive, which is why our costs are so high. FE wouldn't be so bad if we saw fewer of them from the aftermarket companies now that they've been shipping for a few weeks.
|
# ¿ Jun 20, 2016 23:35 |
|
spasticColon posted:But I don't have a G-sync or a Freesync monitor. I do plan on getting a new monitor soon but it's still only going to be 1080p like my current monitor but with the refresh rate bumped up to 120Hz. But will a 1060 or RX480 cut the mustard for 1080p 120Hz gaming? There are some newer games that aren't able to maintain 120 on the GTX1080 at 1080p (maxed). Either should be fine if you are okay with lowering details.
|
# ¿ Jul 11, 2016 07:37 |
|
THE DOG HOUSE posted:look at that loving thing Sad (or ironic) that it doesn't overclock any better than other 1080s. Certainly not worth the cost unless it is going in a system where the owner won't manually OC and price isn't an issue.
|
# ¿ Jul 11, 2016 23:12 |
|
Yes, it needs to be put in the game code. Nothing is supported yet.
|
# ¿ Jul 14, 2016 10:58 |
|
Deuce posted:They just need to add pre filled extra radiators with the QDC and there's a golden age of custom water loops available for even novice builders. I've had a custom loop that has been upgraded and changed through the past decade. I'm not even going to argue the cost, because air has it beat without a doubt. If you are willing to spend the time to plan out your build and not rush it, I doubt you'll have major issues. Test-fit the install and cut tubes, remove everything and fill/bleed/leak check, then install for use. Right now I run 6 Enermax 14dB fans and a quiet 200mm side panel fan. 240mm and 360mm radiators with quick disconnects between each component in the loop. It is very quiet at this point, and doing a single component change is painless. The days prior to quick disconnects meant pulling the entire setup out to avoid leaking on your system, so I'd recommend them if they're in budget. Bare minimum would be a block (or multiple), pump, radiator, reservoir, something to kill algae like a silver plug or coil, tubing, enough fittings for everything, and fans. Be mindful of the tube size you're buying and match it on all parts that need it. Sorry, I don't know of any guide links; never really needed 'em.
|
# ¿ Jul 17, 2016 23:07 |
|
This generation of Nvidia doesn't really seem to benefit from the fancier editions. Best to look at the cooler to see how loud it is, as they all seem to land within 100MHz of each other. Plus, with the way boosting works now, it is hard to pinpoint the actual maximum.
|
# ¿ Jul 22, 2016 12:13 |
|
twxabfn posted:My new TV does 1080p@120Hz and I've seen a couple occasions where my 1070 will drop into the 90s, but in that case it's not actually noticeable during gameplay - I can only tell if I actually have the frame counter up on the screen. I tried DOOM at 4k for a few minutes (sitting 10.5' from a 70" TV) and didn't notice any difference other than 1/3 the FPS. I might try 4k in games with 60FPS locks (EDF 4.1, if I ever play FO4 again), but if I have to turn down 4k settings to hit 60 I'll probably just stick with 1080p. Seeing the additional 4k detail becomes harder with distance, very quickly. 120Hz is noticeable at any distance, which is exactly why I prefer it in my TV as well.
|
# ¿ Aug 4, 2016 10:24 |
|
As much as you can afford, then use DSR to make it look really good. Without getting into that, yeah, a 1070 is likely good enough. My 1080 is on a TV that takes 120Hz at 1080p and plays most games at 120fps with max detail. I imagine a 1070 will maintain 60 in everything. Plus it saves some money for the games you'll likely want to buy.
|
# ¿ Mar 11, 2017 01:16 |
|
SinineSiil posted:Nvidia GFE driver installer chooses express install even though I clicked custom. I've noticed it once before but then I thought I just misclicked. Really want a version that just has Shadowplay, no game optimizing or driver updates. I always try to hang back on drivers to make sure nothing breaks folding, and GFE updates always seem to introduce problems versus manual downloads.
|
# ¿ Mar 11, 2017 14:37 |
|
kuroiXiru posted:Well, you can just use GFE, turn off auto updates, and ignore optimizations. Lots of unnecessary stuff still running and causing issues. The latest was breaking the Xbox One wireless adapter due to one of the GFE services.
|
# ¿ Mar 11, 2017 16:10 |
|
The Science Goy posted:Driver chat: Usually you can get away with installing over the existing driver and not using the clean install option (custom, if you don't need/want the 3D Vision drivers). For new hardware, I'd uninstall everything from the Windows control panel and use the clean install option. Same thing if you end up with issues. If you don't run custom settings or resolutions through the Nvidia control panel, there isn't a huge loss in using clean install all the time.
|
# ¿ Mar 11, 2017 16:42 |