Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:


Edit: Also I broke down and ordered a fuckin' GTX 680. I just want to turn up the graphics... :negative:/:rock:


I'm trying to resist this temptation right now myself :(

I picked up an EVGA GTX 480 on some insane Dell deal for $360 back when they came out (I know, I know, GTX 480, but I had recently gotten a 3008WFP and my old 260 was realllly not cutting it at that point). I've got a huge Thermalright Spitfire on it so the thing is at least silent, and it OCs to 800 MHz on stock voltage, but I can 100% still use more graphics horsepower.

Gwaihir
Dec 8, 2009
Hair Elf
A P6T is old enough to likely be out of warranty anyhow, so go for it!

For real though, it looks like there's no more risk in doing this than in a normal BIOS update, which is to say there shouldn't be much of any risk at all, so long as your power doesn't go out in the middle of the flash.

Gwaihir
Dec 8, 2009
Hair Elf
Practically speaking, the only way to make the CPU the bottleneck in nearly any game is to run at a super low resolution (1024*768, maybe) with no details. A few games are more CPU-bound than others (Blizzard games, Civ games, stuff like that), but in general the GPU is going to be the limiting factor at modern resolutions. You'd have to use a 2+ generation old CPU for it to really choke you performance-wise.

Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:

snip

-----

All I've got so far, but it's looking really nice and playing smooth as hell.

Man, BL2 really has been a pleasant surprise. I upgraded from a GTX 480 (don't laugh too hard, I was struggling with a GTX 260 driving a 30" screen, and got the 480 for $350) to a 680. I was considering doing the even more laughably silly version of your setup and leaving the 480 in there, but ultimately, after trying it both ways, I didn't see enough of a slowdown on the 680 to justify leaving the 480 in the machine sucking power. I'm looking forward to trying all the tweaks / DX11 rendering path tonight though.

On a related note, the Asus DirectCU II cards are really damned nice. I've had basically nothing but EVGA cards till now, but wanted one that was at least as quiet as my 480 was; I had a Thermalright Spitfire on it with a 140mm fan, because the stock 480 is probably in the same realm of volume as the infamous FX dustbuster. The Asus card delivers: no coil whine or anything else annoying, and no louder under gaming load than the Spitfire's slow fan was.

Gwaihir
Dec 8, 2009
Hair Elf
Mid 70s are perfectly fine for GPU temperatures. The hottest recent GPU (GTX480) regularly ran up to 95 degrees C. Most cards have a hard shutoff around 105-120 though.

Gwaihir
Dec 8, 2009
Hair Elf
I also upgraded from a 480 to a 680, also at 2560*1600 with an i7-920 @ 3.4 GHz, but didn't see a huge increase until I upgraded to an Ivy Bridge chip @ 4.5 GHz a few weeks back. You're probably CPU limited.

Gwaihir
Dec 8, 2009
Hair Elf
There are very rarely instances where the latest driver isn't the best one, and the last of those that I can remember was years ago. These days, random crashes while benching/burn testing/etc. point more to a wheezing power supply or a slightly too ambitious overclock than to drivers.

Gwaihir
Dec 8, 2009
Hair Elf
It doesn't really work like that in practice: none of the single-GPU cards are powerful enough to match up to twin GTX 680s, except in cases where it's a lack of RAM causing the issue. The new AMD card is not going to be enough better than a Titan or GTX 780 to measure up.

Gwaihir
Dec 8, 2009
Hair Elf
"GTX 780Ti", Good job NV. This g-sync initiative sounds pretty damned cool, if I didn't need to get a new monitor to use it.

Gwaihir
Dec 8, 2009
Hair Elf
Yea, this seems like something I would really like to try. It would just mean selling my new 3014 and upgrading from the GTX 680 to something that could push those very high frame rates at high resolutions. From the list of monitor makers they have signed on, though, it seems like they'll likely be using the same 144 Hz TN panels, which is sorta... eh. I dunno.

Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:

I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004).

This is going to be some cool tech.

Yea, it's legit making me regret JUUUUST buying my U3014. I can't realistically turn on vsync, because I can't maintain a solid 60 FPS with only one OCed GTX 680 and vsync below that leads to all those weird-feeling lags and stutters in motion, so instead I get lots of tearing. Hopefully Asus at the very least puts it in one of their IPS models, because gently caress if I want to go back to a TN 144 Hz screen from a full sized 30" just to get this cool new tech.

Gwaihir
Dec 8, 2009
Hair Elf
Yes, kneejerking the other way is dumb.

Gwaihir
Dec 8, 2009
Hair Elf

Rahu X posted:

Well, I got my 780 in today. In pure excitement and relief, I plugged it in, installed the drivers, and enjoyed THE loving DISPLAY ISSUES I HAD BEFORE AGAIN.

gg NVIDIA. :argh:

Needless to say, I requested a replacement RMA to see if it's maybe the GPU itself and not the drivers. One reason why I think it may be the GPU is that I once logged in to strange graphical artifacts and color flashing.

If the replacement does the same thing, then I have no loving idea what the problem is. Any of you guys have any ideas what it might be?

The only things I can think of at this point are my motherboard or my PSU, both of which are relatively new.

Yea, that absolutely sounds like it's not the GPU if you got the same thing from two different cards. Can you take a pic of the actual corruption and artifacting on the screen and post it? Some artifacts are more obviously cable-related vs. the memory-corruption style.

Gwaihir
Dec 8, 2009
Hair Elf
A card that's both louder and hotter than the GTX 480 was is pretty impressive. As someone who used a GTX 480 for all of a week (it was only $350 thanks to a mistake on Dell's site, don't look at me like that) before getting a Thermalright Spitfire: holy poo poo, do not get this thing unless you have water or a similarly outrageous air cooler ready to go.

Gwaihir
Dec 8, 2009
Hair Elf

Zero VGS posted:

Is there a way to set a global framerate cap with NVidia? Because I want to run Final Fantasy at 120hz, but the in-game menu only has framerate caps for 30 and 60 fps, and if I uncap it my GTX 770 goes right for infinity FPS and revs up like a vacuum cleaner.

Not sure about the Nvidia drivers themselves, but video recording programs like DxTory usually have built-in adjustable FPS caps. I know with DxTory you can set it to whatever you want; maybe MSI Afterburner and FRAPS will do the same thing?
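For what it's worth, every framerate cap works on the same basic idea: figure out the per-frame time budget and sleep off whatever's left of it before presenting the next frame, so the GPU isn't rendering frames nobody will ever see. A minimal sketch of the concept (not any particular tool's actual code; the names and numbers are just for illustration):

```python
import time

TARGET_FPS = 120                  # the cap you want, e.g. to match a 120 Hz screen
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~8.3 ms per frame at 120 FPS

def run_capped(render_frame):
    """Call render_frame() in a loop, but never more often than TARGET_FPS."""
    deadline = time.perf_counter()
    while True:
        render_frame()                       # stand-in for the game's render/present call
        deadline += FRAME_BUDGET
        spare = deadline - time.perf_counter()
        if spare > 0:
            time.sleep(spare)                # GPU idles here instead of revving to "infinity FPS"
        else:
            deadline = time.perf_counter()   # we fell behind; don't try to catch up in a burst
```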

Gwaihir
Dec 8, 2009
Hair Elf
Nope.

Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:

Man EVGA get your poo poo together and ship this puppy, I don't want to have to be on a 650Ti for more than like a day, tops, 30% of the performance of a 780? Pffft what am I gonna do, play browser tower defense games? (yes) (also after the 780Ti comes in, have y'all HEARD of Gemcraft Labyrinth? That poo poo is off the chain)

I'd like to note that the only reason this is an outrageous expenditure at all is because I'm doing it twice, but that I recouped about 2/3rds of what I paid for the 780 and that knocks the 780Ti down to under $300 out of pocket, and also G-Sync is seriously gonna rule

:kiddo:

veedubfreak, man, are you making GBS threads me that the ones that unlock actually have 1337 imprinted on them? Haha, if that's true I don't even know what to think anymore. Also, can you confirm or deny that any of your reference coolers were blemished and scored to poo poo from the factory like the ones that tomshardware wrote about? That was pretty outrageous, some serious lack of contact considering the transistor density and very high watt/transistor...

At least we can count on Tom's to get to the bottom of any ATI AMD related issues, no matter what they might be or how hard you have to look to find them! Although this time around, AMD did give official notice that the 290 fan controller issue was a real issue and while I'm sure they'd have gotten around to it, Tom's probably got them to move faster. Still, they seem to have picked their side a long, long time ago.

Gemcraft Labyrinth is loving legit, no arguments there :colbert:

Gwaihir
Dec 8, 2009
Hair Elf

mobby_6kl posted:

Sadly, while a 6200LE can render Windows solitaire, it chokes badly on 1080p video, or pretty much any video in windows media player, for that matter.

Speaking of which, are Nvidia's drivers incapable of dealing with two cards of different generations? When I put in the 650 Ti BOOST, I moved the old card to the second slot. It then looked like the latest drivers installed fine, but after restarting, the card appeared as "not working properly" and any attempts to reinstall the drivers failed completely. Only pulling the old card, booting into safe mode, and running Driver Sweeper allowed the new drivers to be installed properly.

Which tools are used nowadays for monitoring and overclocking Nvidia cards? I used to use RivaTuner for this, but it seems like it hasn't been updated in years :(

Old post, but yea: if you installed the latest drivers for a 650 Ti, they don't support an ancient 6200LE (Jesus, how old is that card now, 10, 11 years?). The most recent driver series that supports those GPUs is the 300s, and I think we're on 331.something now.

http://www.geforce.com/drivers/results/57491

Gwaihir
Dec 8, 2009
Hair Elf

El Scotch posted:

Perhaps we could start a pool on how long it takes Agreed to get one after it's released. :getin:

Going to go with "the instant someone else can talk themselves into buying his current 780 Ti."

Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:

Ok, god drat it, I sold my old card for a moderate loss to someone who is very happy with it because it's a high-performing model within the model family, and I spent that money plus $300ish tops out of pocket to get a 780Ti direct. One 780Ti. In this particular, specific transaction, I'm on the hook for that $300 or so, and I get a really great card for my comparably extremely sane graphics desires.

So I have a question.

How am I the face of :pcgaming: to you people?!

I just want as close to the best gaming experience as is possible without going to extremes. There's a lunatic fringe. I am not part of it. veedubfreak is. Let him carry around the $-letter for a while, I haven't earned it. I do things that are generally on the expensive side of normal for gaming, and I like to have an overall very good computing and gaming experience with really good sound, really good peripheral interface, really good monitor (hence waiting for G-Sync tech to go practical). I don't want or need like $2500 worth of gaming stuff, I leave that to crazy people. So why am I crazy people to you turkeys? :psyduck:

Veedubfreak is in a class of silly all his own, so he just gets ignored as a ~clearrrr~ outlier.

You're still close enough to the edge of sanity to make it worth poking fun at :haw:

Gwaihir
Dec 8, 2009
Hair Elf

BITCOIN MINING RIG posted:

although I don't know if I really want to assist idiots in giving themselves thermally-induced brain damage.

How is this even a question? Of COURSE you do!

Gwaihir
Dec 8, 2009
Hair Elf

PC LOAD LETTER posted:

Unless you're using software that needs a Quadro or FirePro, I don't think you'd want a laptop that used either for gaming.

Those are workstation GPUs, and their drivers are "tuned" for stuff like CAD rather than gaming.

AMD mobile GPUs have good performance and are fine as far as driver stability goes, but they tend to be power hogs compared to the Nvidia GPUs. AMD really needs to work the bugs out of Enduro, which have persisted for quite a while now. Until they do, a gaming-oriented Nvidia GPU would be the best to get overall.

Only get a laptop with an AMD GPU if it's an APU (power saving works fine with those) or if you can get a good deal on it and can live with much less battery time. Generally speaking though, any gaming laptop with a mid- to high-end dGPU is going to have poor battery life compared to one with only an iGPU.

The driver difference for games on Quadro cards is almost nonexistent. I've never run into anything that presented issues in the slightest on my Thinkpad with a Quadro.

At worst, you'll get maaaaybe 3-5 fewer FPS. Don't worry about playing games on a Quadro.

The Firepro M5100 will be the "best," but for actually using the laptop as a laptop, the Nvidia card is going to come out far ahead thanks to Optimus.

Gwaihir
Dec 8, 2009
Hair Elf

Meltycat posted:

Just a note -- on the m4800, if you get the QHD screen, Optimus is completely disabled as far as I know. The QHD m4800 looks like a super nice laptop, other than the fact that battery life is ~3 hours due to the lack of Optimus.

Wow, really? That's loving bizarre. I wonder why they did that, and whether we'll see Lenovo do the same thing on the new T/W540 with the similar screen?

Gwaihir
Dec 8, 2009
Hair Elf

Factory Factory posted:

Sorry, Deimos. :(

--

I'm getting really frustrated. Need and cashflow combined just right that I'm looking at putting a CLC on my 680 in my Bitfenix Prodigy, and Goddamn nothing will fit and it's running more expensive than I thought it would be or should be. Arctic Accelero Hybrid misses by 14 mm. The NZXT Kraken G10 looks like it should fit, except proper mounting makes it float way above the PCB so I'd probably need to cut parts off and replace the fan. All the custom air coolers are 3-slot affairs.

And on top of all this, I have to account for replacing my optical drive with a shorter (i.e. slim) model - more cost, unless I want a hole in the case.

I look into doing The Mod, because that would be cheaper, and Dwood has stopped selling mounting brackets. Also, part of the reason I wanted to do this CLC thing is because the stock fan was starting to fail, but The Mod requires leaving the stock fan for VRM cooling. So if I did The Mod, I'd have to RMA this thing for a rattle-fan and get a fresh chip lottery; I might as well RMA it, sell the returned card, and buy a new 770 with a semicustom cooler.

Argh. :mad:

Any advice? All I want is for this thing to be quiet, and the GPU is the loudest thing in it by far.

Hm, I'm not sure if they still offer it, but a Thermalright Spitfire might be an option. It's loving huge, and given the orientation of that case it might necessitate a low-profile CPU cooler, but if you've got a CLC on the CPU it could work. Notably, it requires zero space in the adjoining slots, instead moving the cooler over the top area of the card, to where your CPU heatsink likely is:

(Pictured is my old GTX480, which the Spitfire took from a noisy as gently caress 95 degrees down to a silent ~60 degrees at load).

This thing actually comes with a support system because it's so large and heavy: there's a bracket that mounts over where the RAM is on most motherboards, with an extension rod to hold up the other end of the cooler since it's so lopsided.

Gwaihir fucked around with this message at 00:53 on Dec 10, 2013

Gwaihir
Dec 8, 2009
Hair Elf

Zero VGS posted:

Did they mean compared to V-Sync off? I thought if V-Sync is on, there isn't really going to be tearing anyways? 5% lower framerates than that other thing that eliminates tearing makes it awfully hard to justify the premium they charge when it can be put towards a GPU upgrade. And I'm one of the guys who has the upgradable Asus monitor and was looking forward to this.

Vsync on eliminates tearing, but at the cost of introducing lag and stuttering any time your FPS is something other than your monitor's refresh rate or an even fraction of it (60 FPS, 30 FPS, etc. on a 60 Hz screen). Gsync also eliminates tearing, but with no lag or stuttering at all, so long as your FPS stays anywhere between 30 and your screen's maximum refresh (60-144 Hz depending on the model).

Since it's way more common, especially for folks running 2560 monitors, to have FPS floating around 35-55ish, gsync is a godsend.

Of course, they haven't introduced any 2560 IPS monitors with gsync yet.

:smith:

e: The other thing is, with this tech, so long as you're getting over 30 FPS, it seems like extra FPS is effectively wasted? I know with current monitors I see a big difference between 30 and 60 FPS, but I wonder if that will still be true with a Gsync screen? It seems possible, at least. (I never turn on vsync presently, I play on a 30" screen and don't have the GPU power to keep it at 60 FPS all the time with a single GTX680).
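If it helps to picture the difference, here's a toy model of frame pacing (the frame times are invented purely for illustration, and the vsync case is simplified to plain double-buffering): with vsync, any frame that misses the 16.7 ms deadline has to wait for the next refresh, so you see a mix of 16.7 ms and 33.3 ms gaps between displayed frames, i.e. stutter, while a gsync-style display just refreshes whenever the frame is done.

```python
# Toy model of frame pacing on a 60 Hz panel. With vsync, a finished frame waits for the
# next refresh tick (and the next frame can't start until the buffer flips); with a
# gsync-style display, the panel refreshes as soon as the frame is ready.
REFRESH = 1000.0 / 60                          # 16.7 ms between refreshes
frame_ms = [14, 15, 18, 17, 14, 20, 15]        # invented GPU render times, hovering near budget

def display_gaps(frame_ms, vsync):
    shown, last_shown = [], 0.0
    for t in frame_ms:
        done = last_shown + t                  # next frame starts once the previous one is up
        if vsync:
            last_shown = (int(done // REFRESH) + 1) * REFRESH   # wait for the next tick
        else:
            last_shown = done                  # variable refresh: show it immediately
        shown.append(last_shown)
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

print("vsync:", display_gaps(frame_ms, True))   # alternates 16.7 / 33.3 ms -> visible stutter
print("gsync:", display_gaps(frame_ms, False))  # gaps simply track the real frame times
```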

Gwaihir fucked around with this message at 16:13 on Dec 12, 2013

Gwaihir
Dec 8, 2009
Hair Elf
We have a "post pictures of your ~real~ desktop" thread; what kind of hardware forum would we be without a "post pictures of your desktop (PC edition this time)" thread?

Gwaihir
Dec 8, 2009
Hair Elf
Yea, I dunno. I don't really mind tearing so much, but I hate lurching/stuttering. So I play with vsync disabled and usually get between 40 and 60 FPS, which is about as much as you can reasonably get on a 30" screen without spending $1000s on video cards. A gsync module would basically mean I keep that same lack of lurching/stuttering but also get rid of the tearing. If the price was reasonable I'd go for it. Considering I'm a crazy person who already has a 30" monitor, I guess I'd consider $75, maybe $100, reasonable for the monitor module?

I think it just comes down to having to see it in person. This is such a personal preference thing, and some people just can't see the difference at all, for who knows what reason.

Gwaihir
Dec 8, 2009
Hair Elf
Yea, tearing happens at every FPS other than "exactly 30/60 all the time" (for 60 Hz monitors).

Gwaihir
Dec 8, 2009
Hair Elf

movax posted:

Go for it, and if it turns into a twisted parody of [H] build threads, I'm OK with that too. We could use more threads than just the megathreads floating around. :science:

Veedub, obviously you need to start this thread then. I enjoy both nice meticulous builds as well as "Living like poo poo" builds, so hey, room for all to post!

Gwaihir
Dec 8, 2009
Hair Elf
Do you play with vsync enabled?

Gwaihir
Dec 8, 2009
Hair Elf
Without vsync turned on you still have tearing; it likely just doesn't bother you very much. It's never really bothered me a ton either. It's likely to be much more noticeable in FPS games vs. something like an RTS though.

Gwaihir
Dec 8, 2009
Hair Elf

Mad_Lion posted:

Would it be possible for somebody to manufacture a device that sits between your video card and your regular monitor and does the same thing? That would be nice for those who like their current monitor but want to get in on G-Sync. I also wonder if this sort of tech will happen for AMD cards (possibly using a device like the one I described)?

I've got a 7850 that overclocks well (almost to 7870 GHz Edition levels of performance) and I'm about to upgrade to an i7 rig. I imagine I would sit perfectly in the 30-60 fps range for most games with this setup, and I really like what this can do for smoothness. I really like my video card though, and I'm certainly not going to buy a new monitor just yet.

A man-in-the-middle type device wouldn't work, from what I understand: it has to be a chip connected directly to the LCD panel in order to control the refresh timing and to report back to the video card so it can adjust its timing to match frame render times.

Gwaihir
Dec 8, 2009
Hair Elf
I don't think Ghostpilot is talking about a literal resolution scaler; he's talking about the performance scaling of going to Crossfire configs at 4K+ resolutions compared to SLI (Crossfire 290s seeing a generally larger % increase in performance over a single card vs. SLI at 4K+ resolutions).

Gwaihir
Dec 8, 2009
Hair Elf
Tearing and stuttering both still exist on 120 Hz screens. Higher refresh shortens the interval between frames on the monitor, but it still doesn't guarantee that those frames line up with the GPU's frame times.

Also, vsync and 120 Hz are not a good combination, because it's quite hard to get enough GPU power to peg a game at 120 FPS and keep it there. Most people with 120 Hz screens are likely gaming with vsync off and just dealing with tearing.
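Just to put rough numbers on why pegging 120 FPS is so hard, here's the per-frame budget at each refresh rate (pure arithmetic, no assumptions about any particular GPU):

```python
# Per-frame render budget needed to hold vsync at a given refresh rate.
for hz in (60, 120, 144):
    budget_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> every frame must finish in {budget_ms:.2f} ms")
# 60 Hz  -> 16.67 ms
# 120 Hz ->  8.33 ms
# 144 Hz ->  6.94 ms
```

Halving the budget from 16.7 ms to 8.3 ms is effectively asking for double the GPU, which is why most 120 Hz owners just live with the tearing instead.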

Gwaihir
Dec 8, 2009
Hair Elf
I have a GTX680 and have never seen issues like that, with power management set to the high performance settings. :iiam:

Gwaihir
Dec 8, 2009
Hair Elf

eggyolk posted:

Can anyone explain what the deal is with the Tegra K1? It seems to be taking up a lot of headlines but hasn't been mentioned here yet.

It's the first ARM-based SoC that builds in a full desktop Kepler core: a smartphone chip with a whole Kepler-based compute unit, so it supports everything a real desktop GPU does instead of the much more limited feature set most mobile GPUs offer.

e: Notably, its raw compute power is just about on par with an Xbox 360 class GPU, although most implementations will have slightly less memory bandwidth (and vastly more actual memory). So there's quite good potential for running last-gen-quality console ports on something like a Shield v2.
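Rough napkin math behind the "Xbox 360 class" claim; the unit counts and clocks below are the commonly quoted approximate figures, not official spec sheets, so treat this as ballpark only:

```python
# Ballpark FP32 throughput comparison (all figures approximate / commonly quoted).
k1_cores, k1_clock_ghz = 192, 0.95            # Kepler SMX in Tegra K1; peak clock varies by device
k1_gflops = k1_cores * 2 * k1_clock_ghz       # FMA counts as 2 FLOPs per core per cycle
print(f"Tegra K1 GPU: ~{k1_gflops:.0f} GFLOPS FP32")   # ~365

xenos_gflops = 240                            # the figure usually cited for the 360's Xenos GPU
print(f"Xbox 360    : ~{xenos_gflops} GFLOPS FP32")
```

The catch is bandwidth: K1 devices share a fairly narrow LPDDR3 bus across the whole SoC, versus the 360's dedicated GDDR3 plus eDRAM, hence "slightly less memory bandwidth" above.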

Gwaihir fucked around with this message at 02:09 on Jan 15, 2014

Gwaihir
Dec 8, 2009
Hair Elf

Factory Factory posted:

TechReport has a little on the Oculus Rift "Crystal Cove" prototype. Still a 1080p screen, but this time with an AMOLED panel and a lot of enhancements to avoid motion sickness.

The reason it's in this thread, though, is a little tidbit that they're working on something G-sync like, since that's a natural ally for something like a Rift. Unfortunately, they aren't talking about it any more than saying just that much.

I am also super hype that the tie-in launch game for the Rift will be EVE Valkyrie, but that's neither here nor there. SPACE SIMS! :byodood:

But there was some intimation about AMD being all up ons this thing. AMD had a Rift at their CES booth accompanied by a prototype positional audio headset with TrueAudio acceleration. Audio on the Rift is another notable "Yes, it's a thing, but we're not talking about it" subject.

In the vein of rifts, space sims, and :flashfap: collaboration between the two, you really owe it to yourself to check out this one: http://forums.somethingawful.com/showthread.php?threadid=3530373&userid=0&perpage=40&pagenumber=25

Stream of the dev working on it: http://www.twitch.tv/marauderinteractive/b/495077760

There's also a great Rift demo up on youtube (I would link it, but can't search youtubes on my work machine).

e: I think this is it: https://www.youtube.com/watch?v=oSKeseN4uJk

Gwaihir
Dec 8, 2009
Hair Elf
The veedub method :golfclap:

Gwaihir
Dec 8, 2009
Hair Elf
Meanwhile, R9 290Xes on Newegg are selling for $899. GTX 780 Tis are a value!

Gwaihir
Dec 8, 2009
Hair Elf

Rastor posted:

The new nVidia cards are now officially announced:

The first Maxwells: The GTX 750 and GTX 750 Ti

The fully armed and operational Kepler GK110: The GTX Titan Black

The Maxwell parts are extremely impressive. Remember, this chip is the replacement for the plain GTX 650, not the 650 Ti or Ti Boost (those are GK106 parts vs. the GK107 in the plain 650). That, plus the OCing results, puts a SIXTY WATT TDP, $150 card right up on the same tier as a *140* watt, $230 GTX 660 in everything but Crysis 3. The potential mobile SoCs using this chip, plus the high-end cards, should own.

Yea, they're still slower (usually) than the AMD card at the same price point, but use far less power.
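Using the numbers quoted above (60 W / $150 for the 750 Ti against 140 W / $230 for the GTX 660), the perf-per-watt gap is the striking part even if the 750 Ti gives up a little raw performance. A quick sketch; the relative-performance figure is just a placeholder you'd fill in from real benchmarks, with 0.9 only meaning "a bit behind the 660":

```python
# Rough perf/watt and perf/dollar comparison using the wattage and price figures quoted above.
# rel_perf is a placeholder ratio -- substitute a real benchmark average.
cards = {
    "GTX 750 Ti": {"tdp_w": 60,  "price_usd": 150, "rel_perf": 0.9},
    "GTX 660":    {"tdp_w": 140, "price_usd": 230, "rel_perf": 1.0},
}
for name, c in cards.items():
    per_watt = c["rel_perf"] / c["tdp_w"]
    per_dollar = c["rel_perf"] / c["price_usd"]
    print(f"{name}: {per_watt:.4f} perf/W, {per_dollar:.4f} perf/$")
# Even if the 750 Ti lands ~10% behind, it comes out roughly 2x ahead per watt
# and ~40% ahead per dollar on these numbers.
```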
