|
Agreed posted:
I'm trying to resist this temptation right now myself. I picked up an EVGA GTX 480 on some insane Dell deal for $360 back when they came out (I know, I know, GTX 480, but I had recently gotten a 3008WFP and my old 260 was realllly not cutting it at that point). I've got a huge Thermalright Spitfire on it so the thing is at least silent, and it OCs to 800 MHz on stock voltage, but I can 100% still use more graphics horsepower.
|
# ¿ May 31, 2012 18:54 |
|
A P6T is old enough to likely be out of warranty anyhow, so go for it! For real though, it looks like there's no more risk in doing this than in updating the BIOS normally, which is to say there shouldn't be much risk at all, so long as your power doesn't go out in the middle of the flash.
|
# ¿ Jun 29, 2012 18:11 |
|
Practically speaking, the only way to make the CPU the bottleneck in nearly any game is to run at a super low resolution (1024x768, maybe) with no details. There are a few games that are more CPU-bound than others (Blizzard games, Civ games, stuff like that), but in general the GPU is going to be the limiting factor at modern resolutions. You'd have to do something like run a CPU that's 2+ generations old for it to really choke your performance.
|
# ¿ Jul 9, 2012 20:55 |
|
Agreed posted:(snip) Man, BL2 really has been a pleasant surprise. I upgraded from a GTX 480 (don't laugh too hard, I was struggling with a GTX 260 driving a 30" screen, and got the 480 for $350) to a 680. I was considering doing the even more laughably silly version of your setup and leaving the 680 in there, but ultimately, after trying it both ways, I didn't see enough of a slowdown on the 680 to justify leaving the 480 in the machine sucking power. I'm looking forward to trying all the tweaks / DX11 rendering path tonight though. On a related note, the Asus DirectCU II cards are really damned nice. I've had basically nothing but EVGA cards till now, but wanted one that was at least as quiet as my 480 was- I had a Thermalright Spitfire on it with a 140mm fan, because the stock 480 is probably in the same realm of volume as the infamous FX dustbuster. The Asus card delivers: no coil whine or anything else annoying, and no louder under gaming load than the Spitfire's slow fan was.
|
# ¿ Sep 19, 2012 19:59 |
|
Mid 70s are perfectly fine for GPU temperatures. The hottest recent GPU (GTX480) regularly ran up to 95 degrees C. Most cards have a hard shutoff around 105-120 though.
|
# ¿ Oct 1, 2012 21:58 |
|
I also upgraded from a 480 to a 680, also at 2560x1600 with an i7-920 @ 3.4 GHz, but didn't see a huge increase until I upgraded to an Ivy Bridge chip at 4.5 GHz a few weeks back. You're probably CPU limited.
|
# ¿ Mar 7, 2013 21:23 |
|
There are very rarely instances where the latest driver isn't the best, and the last one of those that I can remember was years ago. Random crashes while benching/burn testing/etc. point more to a wheezing power supply or a slightly too ambitious overclock than to drivers, these days.
|
# ¿ May 21, 2013 16:38 |
|
It doesn't really work like that in practice; none of the single-GPU cards are powerful enough to match up to twin GTX 680s, except in cases where it's a lack of RAM causing the issue. The new AMD card is not going to be enough of an improvement over a Titan or GTX 780 to measure up.
|
# ¿ Oct 17, 2013 22:24 |
|
"GTX 780Ti", Good job NV. This g-sync initiative sounds pretty damned cool, if I didn't need to get a new monitor to use it.
|
# ¿ Oct 18, 2013 16:36 |
|
Yea, this seems like something I would really like to try- it would just mean selling my new 3014 and upgrading from the GTX 680 to something that could push those very high frame rates at high resolutions. From the list of monitor makers they've signed on, it seems like they'll likely be using the same 144 Hz TN panels though, which is sorta.. eh. I dunno.
|
# ¿ Oct 18, 2013 16:46 |
|
Agreed posted:I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004). Yea, it's legit making me regret JUUUUST buying my U3014. I can't realistically turn on vsync, because I can't maintain a solid 60 FPS with only one OCed GTX 680, so I get lots of tearing- but turning vsync on leads to all those weird-feeling lags and stutters in motion. Hopefully Asus at the very least puts it in one of their IPS models, because gently caress if I want to go back to a TN 144 Hz screen from a full-sized 30" just to get this cool new tech.
|
# ¿ Oct 21, 2013 22:53 |
|
Yes, kneejerking the other way is dumb.
|
# ¿ Oct 23, 2013 14:28 |
|
Rahu X posted:Well, I got my 780 in today. In pure excitement and relief, I plugged it in, installed the drivers, and enjoyed THE loving DISPLAY ISSUES I HAD BEFORE AGAIN. Yea, it absolutely sounds like it's not the GPU if you got the same thing from two different cards. Can you take a pic of the actual corruption and artifacting on the screen and post it? Some artifacts are more obviously cable-related vs. the memory-corruption style.
|
# ¿ Nov 1, 2013 00:33 |
|
A card that's both louder and hotter than the GTX 480 was is pretty impressive. As someone who used a GTX 480 for all of a week (it was only $350 thanks to a mistake on Dell's site, don't look at me like that) before getting a Thermalright Spitfire, holy poo poo, do not get this thing unless you have water or a similarly outrageous air cooler ready to go.
|
# ¿ Nov 5, 2013 14:25 |
|
Zero VGS posted:Is there a way to set a global framerate cap with NVidia? Because I want to run Final Fantasy at 120hz, but the in-game menu only has framerate caps for 30 and 60 fps, and if I uncap it my GTX 770 goes right for infinity FPS and revs up like a vacuum cleaner. Not sure about the Nvidia drivers themselves, but video recording programs like DxTory usually have built-in adjustable FPS caps. I know in DxTory you can set it to whatever you want; maybe MSI Afterburner and FRAPS will do the same thing?
|
# ¿ Nov 6, 2013 16:44 |
|
Nope.
|
# ¿ Nov 15, 2013 18:20 |
|
Agreed posted:Man EVGA get your poo poo together and ship this puppy, I don't want to have to be on a 650Ti for more than like a day, tops, 30% of the performance of a 780? Pffft what am I gonna do, play browser tower defense games? (yes) (also after the 780Ti comes in, have y'all HEARD of Gemcraft Labyrinth? That poo poo is off the chain) Gemcraft Labyrinth is loving legit, no arguments there
|
# ¿ Nov 19, 2013 17:33 |
|
mobby_6kl posted:Sadly, while a 6200LE can render Windows solitaire, it chokes badly on 1080p video, or pretty much any video in windows media player, for that matter. Old post, but yea- if you installed the latest drivers for a 650 Ti, they don't support an ancient 6200LE (Jesus, how old is that card now, 10, 11 years?). The most recent driver series that supports those GPUs is the 300s, and I think we're on 331.something now. http://www.geforce.com/drivers/results/57491
|
# ¿ Dec 2, 2013 15:16 |
|
El Scotch posted:Perhaps we could start a pool on how long it takes Agreed to get one after it's released. Going to go with "the instant someone else can talk themselves into buying his current 780 Ti."
|
# ¿ Dec 2, 2013 23:32 |
|
Agreed posted:Ok, god drat it, I sold my old card for a moderate loss to someone who is very happy with it because it's a high-performing model within the model family, and I spent that money plus $300ish tops out of pocket to get a 780Ti direct. One 780Ti. In this particular, specific transaction, I'm on the hook for that $300 or so, and I get a really great card for my comparably extremely sane graphics desires. Veedubfreak is in a class of silly all his own, so he just gets ignored as a ~clearrrr~ outlier. You're still close enough to the edge of sanity to make it worth poking fun at
|
# ¿ Dec 3, 2013 22:15 |
|
BITCOIN MINING RIG posted:although I don't know if I really want to assist idiots in giving themselves thermally-induced brain damage. How is this even a question? Of COURSE you do!
|
# ¿ Dec 5, 2013 15:28 |
|
PC LOAD LETTER posted:Unless you're using software that needs a Quadro or FirePro I don't think you'd want a laptop that used either for gaming. The driver difference for games on Quadro cards is almost nonexistent. I've never run into anything that presented issues in the slightest on my ThinkPad with a Quadro. At worst, you'll get maaaaybe 3-5 fewer FPS. Don't worry about playing games on a Quadro. The FirePro M5100 will be the "best," but for actually using the laptop as a laptop, the Nvidia card is going to come out far ahead thanks to Optimus.
|
# ¿ Dec 6, 2013 16:08 |
|
Meltycat posted:Just a note -- on the m4800, if you get the QHD screen, Optimus is completely disabled as far as I know. The QHD m4800 looks like a super nice laptop, other than the fact that battery life is ~3 hours due to the lack of Optimus. Wow, really? That's loving bizarre. I wonder why they did that, and whether we'll see Lenovo do the same thing on the new T/W540 with its similar screen?
|
# ¿ Dec 6, 2013 16:43 |
|
Factory Factory posted:Sorry, Deimos. Hm, I'm not sure if they still offer it, but a Thermalright Spitfire might be an option. It's loving huge and, given the orientation of that case, might necessitate a low-profile CPU cooler, but if you've got a CLC on the CPU it could work. Notably, it requires zero space in the adjoining slots, instead moving the cooler over the top area of the card, to where your CPU heatsink likely is. (Pictured is my old GTX 480, which the Spitfire took from a noisy as gently caress 95 degrees down to a silent ~60 degrees at load.) This thing actually comes with a support system because it's so large and heavy: there's a bracket that mounts over where the RAM is on most motherboards, with an extension rod to hold up the other end of the cooler since it's so lopsided. Gwaihir fucked around with this message at 00:53 on Dec 10, 2013 |
# ¿ Dec 10, 2013 00:50 |
|
Zero VGS posted:Did they mean compared to V-Sync off? I thought if V-Sync is on, there isn't really going to be tearing anyways? 5% lower framerates than that other thing that eliminates tearing makes it awfully hard to justify the premium they charge when it can be put towards a GPU upgrade. And I'm one of the guys who has the upgradable Asus monitor and was looking forward to this. Vsync on eliminates tearing, but at the cost of introducing lag and stuttering any time your FPS lands on something other than your monitor's refresh rate or an even divisor of it (60 FPS, 30 FPS, etc.). G-Sync also eliminates tearing, but with no lag or stuttering at all, so long as your FPS stays between 30 FPS and your screen's maximum refresh (60-144 Hz depending on the panel). Since it's way more common, especially for folks running 2560 monitors, to have FPS ranging between 35 and 55ish, G-Sync is a godsend. Of course, they haven't introduced any 2560 IPS monitors with G-Sync yet. e: The other thing is, with this tech, so long as you're getting over 30 FPS, it seems like extra FPS is effectively wasted? I know with current monitors I see a big difference between 30 and 60 FPS, but I wonder if that will still be true with a G-Sync screen? It seems possible, at least. (I never turn on vsync presently; I play on a 30" screen and don't have the GPU power to keep it at 60 FPS all the time with a single GTX 680.) Gwaihir fucked around with this message at 16:13 on Dec 12, 2013 |
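To put rough numbers on that, here's a toy sketch (not from the original post; the 60 Hz refresh and the 30-144 Hz variable-refresh window are assumptions for illustration): with vsync, a finished frame has to wait for the next fixed scanout, so a GPU rendering at ~56 FPS effectively gets displayed at 30 FPS, while a variable-refresh panel puts frames up as soon as they're done.

```python
# Toy model of the trade-off described above (illustrative numbers, not measurements):
# a 60 Hz fixed-refresh panel with vsync vs. a variable-refresh (G-Sync-style) panel
# assumed to refresh anywhere between 30 and 144 Hz.
import math

REFRESH_HZ = 60.0
SCAN_MS = 1000.0 / REFRESH_HZ          # ~16.7 ms between fixed scanouts
VRR_MIN_HZ, VRR_MAX_HZ = 30.0, 144.0   # assumed variable-refresh window

def vsync_frame_ms(render_ms):
    # With vsync, a finished frame waits for the next fixed scanout, so its
    # on-screen time rounds up to a whole number of scan intervals.
    return math.ceil(render_ms / SCAN_MS) * SCAN_MS

def vrr_frame_ms(render_ms):
    # With variable refresh, the panel scans out when the frame is ready,
    # clamped to what the panel can physically do.
    return min(max(render_ms, 1000.0 / VRR_MAX_HZ), 1000.0 / VRR_MIN_HZ)

for render_ms in (15.0, 18.0, 22.0, 28.0):   # ~67, ~56, ~45, ~36 FPS out of the GPU
    print(f"GPU takes {render_ms:4.1f} ms/frame -> "
          f"vsync shows each frame for {vsync_frame_ms(render_ms):4.1f} ms, "
          f"variable refresh for {vrr_frame_ms(render_ms):4.1f} ms")
```

Every render time between ~17 ms and ~33 ms collapses to the same 33 ms vsync cadence, which is exactly the 35-55 FPS range mentioned above- that's where the stutter comes from, and where variable refresh helps most.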
# ¿ Dec 12, 2013 16:08 |
|
We have a "post the pictures of your ~real~ desktop" thread; what kind of hardware forum would we be without a "post the pictures of your desktop (PC this time)" thread?
|
# ¿ Dec 12, 2013 16:52 |
|
Yea, I dunno. I don't really mind tearing so much, but I hate lurching/stuttering. So I play with vsync disabled and usually get between 40 and 60 FPS, which is about as much as you can reasonably get on a 30" screen without spending $1000s on video cards. A G-Sync module would basically mean I keep the same lack of lurching/stuttering, but get rid of tearing too. If the price was reasonable I'd go for it. Considering I'm a crazy person who already has a 30" monitor, I guess I'd consider $75, maybe $100, reasonable for the monitor module? I think it just comes down to having to see it in person. This is such a personal-preference thing, and some people just can't see the differences at all, for who knows what reason.
|
# ¿ Dec 12, 2013 18:10 |
|
Yea, tearing happens at all FPS other than "exactly 30/60 all the time" (for 60 Hz monitors).
|
# ¿ Dec 12, 2013 18:44 |
|
movax posted:Go for it, and if it turns into a twisted parody of [H] build threads, I'm OK with that too. We could use more threads than just the megathreads floating around. Veedub, obviously you need to start this thread then. I enjoy both nice meticulous builds as well as "Living like poo poo" builds, so hey, room for all to post!
|
# ¿ Dec 12, 2013 19:43 |
|
Do you play with vsync enabled?
|
# ¿ Dec 12, 2013 21:08 |
|
Without vsync turned on you still have tearing; it likely just doesn't bother you very much. It's never really bothered me a ton either. It's likely to be much more noticeable if you're playing FPS games vs. something like an RTS, though.
|
# ¿ Dec 12, 2013 21:29 |
|
Mad_Lion posted:Would it be possible for somebody to manufacture a device that sits between your video card and your regular monitor and does the same thing? That would be nice for those who like their current monitor but want to get in on G-Sync. I also wonder if this sort of tech will happen for AMD cards (possibly using a device like the one I described)? A man-in-the-middle type device wouldn't work, from what I understand; it has to be a chip connected directly to the LCD panel in order to control the refresh timing and to report back to the video card / allow it to adjust timing to match frame render times.
|
# ¿ Dec 13, 2013 16:10 |
|
I don't think Ghostpilot is talking about a literal resolution scaler; he's talking about the performance scaling of CrossFire configs at 4K+ resolutions compared to SLI (CrossFire 290s seeing a generally larger % increase in performance over a single card than SLI does at 4K+ resolutions).
|
# ¿ Dec 17, 2013 15:51 |
|
Tearing and stuttering both still exist on 120 Hz screens. Higher refresh shortens the interval between frames on the monitor, but that still doesn't guarantee that those frames line up with the GPU's frame times. Also, vsync and 120 Hz is not a good combination, because it's quite hard to get enough GPU power to peg a game at 120 FPS and keep it there. Most people with 120 Hz screens are likely gaming with vsync off and just dealing with tearing.
|
# ¿ Dec 17, 2013 18:15 |
|
I have a GTX680 and have never seen issues like that, with power management set to the high performance settings.
|
# ¿ Dec 23, 2013 15:24 |
|
eggyolk posted:Can anyone explain what the deal is with the Tegra K1? It seems to be taking up a lot of headlines but hasn't been mentioned here yet. It's the first ARM-based SoC that builds in a full desktop Kepler core. It's a smartphone chip with a whole Kepler-based compute unit, so it supports everything a real desktop GPU does, instead of the much more limited feature set most mobile GPUs offer. e: Notably, it's got just about the same raw stats, compute-power-wise, as an Xbox 360-class GPU, although most implementations will have slightly less memory bandwidth (although vastly more actual memory). So you have quite good potential for running last-gen-quality console ports on something like a Shield v2. Gwaihir fucked around with this message at 02:09 on Jan 15, 2014 |
# ¿ Jan 15, 2014 02:04 |
|
Factory Factory posted:TechReport has a little on the Oculus Rift "Crystal Cove" prototype. Still a 1080p screen, but this time with an AMOLED panel and a lot of enhancements to avoid motion sickness. In the vein of rifts, space sims, and collaboration between the two, you really owe it to yourself to check out this one: http://forums.somethingawful.com/showthread.php?threadid=3530373&userid=0&perpage=40&pagenumber=25 Stream of the dev working on it: http://www.twitch.tv/marauderinteractive/b/495077760 There's also a great Rift demo up on youtube (I would link it, but can't search youtubes on my work machine). e: I think this is it: https://www.youtube.com/watch?v=oSKeseN4uJk
|
# ¿ Jan 17, 2014 20:04 |
|
The veedub method
|
# ¿ Feb 4, 2014 22:00 |
|
Meanwhile, R9 290Xes on Newegg are selling for $899. GTX 780 Tis are a value!
|
# ¿ Feb 14, 2014 18:59 |
|
Rastor posted:The new nVidia cards are now officially announced: The Maxwell parts are extremely impressive. Remember, this chip is the replacement for the plain GTX 650- not the 650 Ti or Ti Boost, those are GK106 parts vs. the GK107 plain 650. That, and the OCing results, put a SIXTY WATT TDP $150 card right up on the same tier as a *140* watt $230 GTX 660 in everything but Crysis 3. The potential mobile SoCs using this chip, plus the high-end cards, should own. Yea, they're still (usually) slower than the AMD card at the same price point, but use far less power.
|
# ¿ Feb 18, 2014 18:58 |