|
Carecat posted:Is the TOP version of the Asus 2GB GeForce GTX 670 DirectCU II different apart from the higher clock speeds? All the reviews are of the TOP version, and I'm mainly after the low temperature and fan noise when the card is already extremely fast. The cooler is identical on both DCU II cards.
|
# ? Jul 13, 2012 19:17 |
|
|
Star War Sex Parrot posted:Wait what?! Yep, dead as a doornail. Stone dead. Ex-development. It's not finished, it's just shelved for good. Timothy Lottes, former FXAA guy, now TXAA guy posted:TXAA is slower and much higher quality, FXAA is fast and low quality. There is no mixing of FXAA and TXAA (FXAA on top of TXAA would reduce the quality). Yes I'm moving on from FXAA, no plans to update. My goals moving forward are to provide higher IQ making use of hardware features and depending on rendering lower-than-native resolution to scale down to lower performance devices. TXAA relates more to the quality of 4xSGSSAA with a wider filter for cost similar to 4xMSAA. As for "texture quality", as SGSSAA with super-sampling is softer than point-sampling with MSAA, and add on top of this a wider filter as common to CG film industry, TXAA is softer but has much better filtering quality than prior methods. If you'd rather have something sharper with more aliasing, then something like CSAA with transparency super-sampling is probably what you are looking for. However, if you are looking to approach what you'd expect from a Blu Ray video, TXAA is working towards that direction. What we've got is all we'll get.
|
# ? Jul 13, 2012 19:18 |
|
Heads up for any of you nerds with Nvidia forum accounts: dey got hacked. quote:Our investigation has identified that unauthorized third parties gained access to some user information, including:
|
# ? Jul 14, 2012 12:23 |
|
Hi guys, I tried reading the OP and honestly am a little overwhelmed by all the information. I have a simple question: how much better is the GTX 560 compared to the GTX 460?
|
# ? Jul 14, 2012 13:00 |
|
taupoke posted:Hi guys, I tried reading the OP and honestly am a little overwhelmed by all the information. I have a simple question: how much better is the GTX 560 compared to the GTX 460? http://www.anandtech.com/bench/Product/542?vs=543 But most people will talk about the 560 Ti, not the regular 560.
|
# ? Jul 14, 2012 13:13 |
|
Got my wife a new computer that came with a Radeon 7850. YouTube videos seem to crash the graphics driver when played in HD mode and then I click to another tab or application. Anyone come across this before? I haven't tried lots, but so far games seem to be working absolutely fine with no TDRs. I'm hoping this is a driver/Chrome/YouTube/Flash problem and not indicative of the video card. It's not overclocked, and the PSU is more than capable and a brand name. Just thought I'd post here before going to the Haus of Technical Support. Edit: Seems purely Flash related. Found a few folks over at the Tom's Hardware forum complaining of exactly the same issue. Lord Dekks fucked around with this message at 18:10 on Jul 14, 2012 |
# ? Jul 14, 2012 15:16 |
|
Yes, the new Flash versions added a Protected Mode to reduce the risk of unpatched exploits, but it seems to have murdered both performance and reliability. I'd make sure you have the Catalyst 12.7 Beta drivers installed, the very latest version of Flash, and the very latest version of Firefox. If you still experience issues, there are instructions in the Firefox thread for how to disable Flash's Protected Mode.
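If you do go the Protected Mode route, it's controlled through Flash's `mms.cfg` settings file. The path and flag below are how I understand it works for 32-bit Flash on 64-bit Windows; treat them as an assumption and follow the Firefox thread's instructions as canonical.

```
# C:\Windows\SysWOW64\Macromed\Flash\mms.cfg
# (assumed path for 32-bit Flash on 64-bit Windows;
#  use C:\Windows\System32\Macromed\Flash on 32-bit Windows)
ProtectedMode=0
```

Restart Firefox after saving the file for the change to take effect.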
|
# ? Jul 14, 2012 18:55 |
|
So I think I'll try here as well about my problem. A few months ago I was playing all my games perfectly, usually at max settings (e.g. Skyrim, which probably isn't really resource intensive), with an NVIDIA 285M graphics card on the 296.10 driver. After each consecutive update, I've noticed a decrease in performance and increased microstutter in Skyrim, and FPS loss in other games, with the addition of my card running hotter and hotter. Now on the current 304.79 beta drivers, it's just outright stuttering in that game, and all my other games, old and new, run my card very hot to the touch despite not doing this before. Turning off anti-aliasing and using just FXAA seems to keep the card from running that hot and makes the stuttering less noticeable. Running games in windowed mode makes them run fine with no problems. I have clean-installed the beta. My question is: despite the card, I had been playing Skyrim and other games in the past with no problems, but after each beta update it gets worse and worse. Is this due to NVIDIA adding support for FXAA at all, or adaptive vertical sync? Less optimization for the games? Anyone else experiencing these problems or have an idea? I know there's a thread on both the Official and Steam forums about this, but I doubt GPU people visit those awful forums often.
|
# ? Jul 14, 2012 19:18 |
|
Dead Man Posting posted:So I think I'll try here as well about my problem. A few months ago I was playing all my games perfectly, usually at max settings (e.g. Skyrim, which probably isn't really resource intensive), with an NVIDIA 285M graphics card on the 296.10 driver. After each consecutive update, I've noticed a decrease in performance and increased microstutter in Skyrim, and FPS loss in other games, with the addition of my card running hotter and hotter. Now on the current 304.79 beta drivers, it's just outright stuttering in that game, and all my other games, old and new, run my card very hot to the touch despite not doing this before.
|
# ? Jul 14, 2012 20:51 |
|
Thanks Aleron. I didn't really mean to make that post a troubleshooting one, I was really just asking if it's possible that games become un-optimized in newer drivers because of newer features, or if it's the game updating, or various other factors, but reading more about it I'm getting more answers.
|
# ? Jul 15, 2012 04:31 |
|
Is the 7970 GHz Edition available anywhere? Kind of feels like a marketing stunt in the absence of any product.
|
# ? Jul 16, 2012 09:09 |
|
It's not just yet, and it's a standard trick: the "paper launch." Lots of top-end video cards have been getting them this year, and some SSDs as well (like Crucial's mSATA M4 drives).
|
# ? Jul 16, 2012 09:31 |
|
Factory Factory posted:It's not just yet, and it's a standard trick: the "paper launch." Lots of top-end video cards have been getting them this year, and some SSDs as well (like Crucial's mSATA M4 drives). I remember the reviews saying they'd be out in early July. I imagine it might be taking longer to do custom board designs as the reference design got fairly bad noise scores.
|
# ? Jul 16, 2012 10:02 |
|
Moses posted:I remember the reviews saying they'd be out in early July. I imagine it might be taking longer to do custom board designs as the reference design got fairly bad noise scores. What I'm worried about is partners pricing it out of the market. nVidia still seems to have wiggle room on prices and aren't cutting it so close to the bone. Every GHz Edition is a carefully binned part, and ATI's costlier design can't be good for distribution channels. Profit margins move product as much as folks' ideas of products... ATI should be having a totally baller generation here at this point, they've had price:performance parity since they cut a bunch off the MSRP, but I mainly see (and myself own) nVidia in the high end regardless. FF's right (as usual) that it's a paper launch, and in my opinion the most interesting consequence may be the "regular" 7970, which may still be an outrageously good overclocker, since the binning is for undervolting capability (which anyone who has overclocked substantially in the past knows does not necessarily correlate to overvolting capability) at a given stock clock rate. There are still going to be a bunch of those gigantic, >4-billion-transistor Tahiti chips out there that don't meet "undervolt to x volts + specified clock rate" but which might be great overclockers. And they'll be the full Tahiti architecture, pricey GPU and tons of VRAM and all. That price reduction is potentially more significant than just introducing a part that can reclaim single-GPU performance status sans overclocking.
|
# ? Jul 16, 2012 10:31 |
|
Moses posted:Is the 7970 Ghz Edition available anywhere? AnandTech posted:Finally, we’ve been asking AMD about the status of the new 7970 GHz Edition, which has so far been missing in action. After originally being scheduled to have limited availability in late June with wider availability in early July, the 7970GE has slipped by at least a couple of weeks – an unusual thing to happen to what has otherwise been a punctual AMD. At this time AMD is telling us that most of their partners have decided to launch the 7970GE on their customized premium cards, which has resulted in availability being pushed back. If all goes according to plan, AMD is expecting XFX and Sapphire to have cards available early next week. However prices will bear keeping an eye on since it’s unlikely that partners will stick to the $499 MSRP if they’re using the 7970GE for their premium cards.
|
# ? Jul 16, 2012 14:56 |
|
Are there any major differences between the various board manufacturers? I don't generally OC or buy cutting edge stuff (I'm looking to go from my current GTX 460 to a 7850, for example). Does it just boil down to which card fits my setup and hits my price range? Or are there "go-to" vendors that are known for quality?
|
# ? Jul 16, 2012 19:27 |
|
Tolan posted:Are there any major differences between the various board manufacturers? I don't generally OC or buy cutting edge stuff (I'm looking to go from my current GTX 460 to a 7850, for example). Does it just boil down to which card fits my setup and hits my price range? Or are there "go-to" vendors that are known for quality? quote:8. What brand should I buy once I've decided on a video card?
|
# ? Jul 16, 2012 19:39 |
|
I get a lot of crashes to desktop / BSOD while playing SW:ToR with my GTX 680. I'm pretty sure it's the drivers. My build is brand new, the PSU is solid, I have no problems in other games, and my wife's computer [which is my old i5-750 + HD5850 build] never crashes at all. *shakes fist at nvidia drivers*
|
# ? Jul 16, 2012 20:10 |
|
tijag posted:I get a lot of crashes to desktop / BSOD while playing SW:ToR with my GTX 680. I'm pretty sure it's the drivers. My build is brand new, the PSU is solid, I have no problems in other games, and my wife's computer [which is my old i5-750 + HD5850 build] never crashes at all.
|
# ? Jul 17, 2012 00:09 |
|
I don't have it overclocked, although I did slide the power target to 132%. However, I don't think it ever gets higher than 1000MHz while playing the game, and I made the fan really, really aggressive in ramping up speed. I think it runs at 100% fan speed if the GPU gets to 60C.
|
# ? Jul 17, 2012 00:32 |
|
Tunga posted:The general parts picking thread has the advice you are looking for: Thanks--I'm a dumbass; I did read the thread, just must have missed that part.
|
# ? Jul 17, 2012 02:05 |
|
tijag posted:I don't have it overclocked, although I did slide the power to 132%. If you're not overclocking it, there's no real point to setting a higher-than-100% power target. And while it does very much love to be kept cool and overclocks best if you can keep it from ever seeing 70°C, you can probably relax a bit from "fan hits max speed at 60°C" as that's a little overkill. For a non-overclocked card, to keep it cool and relatively quiet, you could set it to run with, say, 5% less fan than the temperature, with the "crank it up" big fan speed boost coming at 68°C to keep it from hitting 70°C. With 2°C of hysteresis that should be ample (once it hits 68°C, it won't reduce the fan speed until it's at 66°C, allowing demanding sections to get the cooling you need without ramping the fan up quite so aggressively for what is in general not an aggressively tuned setup). What brand and model is the card? I'm generally wary of blaming drivers for crashes, because Windows has a pretty robust mechanism in place for recovering from a too-aggressive overclock (it'll say something like "the display driver stopped working and has recovered"). Next time you get a BSOD, write down the failure code and look it up. Check your error logs as well. It could be you're looking for the wrong culprit.
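The boost-with-hysteresis behavior described above can be sketched in a few lines of Python. The thresholds mirror the numbers in the post; the function itself is purely illustrative, since real fan control happens inside a vendor tool like Precision X, not user code.

```python
def fan_speed(temp_c: float, boosted: bool) -> tuple[int, bool]:
    """Illustrative fan curve: run ~5 points below the temperature reading,
    boosting to 100% at 68C and not dropping back out until 66C."""
    if boosted:
        boosted = temp_c > 66   # 2 degrees of hysteresis: stay boosted until 66C
    else:
        boosted = temp_c >= 68  # kick in at 68C to avoid ever seeing 70C
    if boosted:
        return 100, boosted
    # normal curve: 5 points under the temperature, with a sane floor
    return max(30, int(temp_c) - 5), boosted
```

The hysteresis is what keeps the fan from oscillating: a card hovering around 67°C stays at 100% fan once the boost has triggered, rather than flapping between the two regimes on every reading.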
|
# ? Jul 17, 2012 02:26 |
|
Any news on Saints Row the Third performance on the 7000 series? Due to it being my most played and loved game I grabbed a GTX 680 and have been playing it flawlessly at 60fps (can drop near the zombie area with all the smoke though, no biggie). I've always liked the design and aesthetics of my 7970 so I'd love to chuck it back in, but if my favorite game still runs like hell it's a no go. Big thanks if anyone has any input on this situation.
|
# ? Jul 17, 2012 05:21 |
|
That's bizarre, I run Saints Row 3 on an unlocked 6950 at 1920x1200 with the settings cranked way up, and it runs fine. I haven't bothered to check frame rates, but it's more than playable. A quick search does yield reports of people having lower framerates on AMD cards, though. I don't know what the benchmarks are like or whether a specific driver had a fix for it.
|
# ? Jul 17, 2012 08:45 |
|
AvatarSteve posted:Any news on Saints Row the Third performance on the 7000 series? Due to it being my most played and loved game I grabbed a GTX 680 and have been playing it flawlessly at 60fps (can drop near the zombie area with all the smoke though, no biggie). I've always liked the design and aesthetics of my 7970 so I'd love to chuck it back in, but if my favorite game still runs like hell it's a no go. I recently moved from 5850 to 7950, and my SR3 performance went from working decently at 1920x1200 on Medium, to 2560x1440 on Ultra or whatever. This was only a brief check as I've been focused on other games, and I hadn't really checked performance in a while, so I don't know how much is card and how much drivers. Either way, it felt really smooth in some brief play with everything cranked up.
|
# ? Jul 17, 2012 11:28 |
|
What causes weird spiking of polygons and odd lighting drop-outs? Is that a sign of bad hardware? Or is that more a driver issue?
|
# ? Jul 19, 2012 00:26 |
|
Bad video RAM on the graphics card. Hardware failure, needs to be replaced.
|
# ? Jul 19, 2012 00:30 |
|
Looking at the pictures for this GPU: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130604 It has 2 power cables and 2 ports on the back of the card. Does this mean you have to use both of them, and then have 4 open 4-pin Molex connections to power it with?
|
# ? Jul 19, 2012 16:19 |
|
Porkchop Express posted:Looking at the pictures for this GPU: The Molex connectors are for backwards compatibility with older power supplies that do not have the PCI-Express 6-pin cables. I do not suggest running a video card with this power draw from an older power supply like that (unless it has a single high-capacity 12V rail that the Molex connectors run off of). You should have two 6-pin cables that plug directly into the video card. You do not need to use the Molex adapters as well.
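As a back-of-the-envelope check on the power side: per the PCI-Express spec, the slot itself supplies up to 75 W and each 6-pin plug another 75 W (an 8-pin, 150 W). The helper below is just that arithmetic, not a real compatibility checker; the function name is made up for illustration.

```python
# PCI-Express power limits per the spec: slot 75 W, 6-pin 75 W, 8-pin 150 W
CONNECTOR_WATTS = {"6pin": 75, "8pin": 150}
SLOT_WATTS = 75

def max_board_power(*connectors: str) -> int:
    """Upper bound on sustained board power for a card with the given plugs."""
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

print(max_board_power("6pin", "6pin"))  # dual 6-pin card like the one linked: 225
```

So a PSU needs two free 6-pin PCIe leads (or Molex adapters fed from a healthy 12V rail) to cover up to 225 W for this class of card.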
|
# ? Jul 19, 2012 16:29 |
|
My current power supply is new but doesn't have those; I got it on the cheap because I was in a pinch. But that's good to know. See, this is what happens when you don't do anything with PCs for a long time, everything goes and changes on you. Until the past month, the last time I did anything with PC components, AGP and regular PCI slots were the only options you had. Now I feel like a confused old person!
|
# ? Jul 19, 2012 16:33 |
|
Yeeah, that probably means it doesn't have enough juice to even run a decent GPU, unfortunately.
|
# ? Jul 19, 2012 16:39 |
|
No, it runs what I have and that's about it; when I got it, that was all I needed. But it wasn't really meant to be a long-term PSU, just one to get me by until a little later.
|
# ? Jul 19, 2012 16:41 |
|
Porkchop Express posted:No, it runs what I have and that's about it; when I got it, that was all I needed. But it wasn't really meant to be a long-term PSU, just one to get me by until a little later. AnandTech recently put out an article rounding up cheap quality power supplies: http://www.anandtech.com/show/6013/350450w-roundup-11-cheap-psus They run them through load tests, noise measurements, etc.
|
# ? Jul 19, 2012 19:00 |
|
jink posted:AnandTech recently put out an article with cheap quality power supplies: Thanks, but my next one is going to be a quality one! (Interesting read though.)
|
# ? Jul 19, 2012 19:36 |
|
I have a quick question about Nvidia PhysX. I played some Mirror's Edge on my new PC and enjoyed looking at all the broken glass and other particles with PhysX enabled; however, I noticed that after several seconds, or when there was too much stuff flying around, these particles would disappear. That's completely understandable because there's only so much a machine can render, but I want to know what determines how many different objects PhysX will keep on the screen at the same time. Is the limit set by the game, the video memory, or what?
|
# ? Jul 19, 2012 20:40 |
|
Bonus follow up PhysX question: I now have a spare GF 430 and was curious to see if it is worth putting in with my GTX 680 or not. From what little googling I've done, I've found its probably marginal, but was wondering what you guys think.
|
# ? Jul 19, 2012 21:44 |
|
mayodreams posted:Bonus follow up PhysX question: I now have a spare GF 430 and was curious to see if it is worth putting in with my GTX 680 or not. From what little googling I've done, I've found its probably marginal, but was wondering what you guys think. haha no, that thing has like 96 CUDA cores I think. No use. Even with the less impressive compute performance of the 6xx series the 680 is light years ahead.
|
# ? Jul 19, 2012 21:58 |
|
I have an EVGA GTX 570 HD. When I run The Witcher 2 I'm getting temps around 90C in EVGA Precision X. This seems a bit too hot to me; I'm running the recommended settings at 1920x1080. Is this a safe temp? If I crank the fan up, it drops down to about 78 or so. It's LOUD though. I'm posting from my phone so I can't look at the numbers right now. Do I risk damage to my GPU running it like this? I've found conflicting answers elsewhere online as to whether or not this is safe.
|
# ? Jul 19, 2012 22:05 |
|
Despite the magic NVIDIA worked with the 5 series over the 4, Fermi was always destined to be a very hot beast. I'd say as long as your case is well ventilated and the card is not overclocked, you can probably trust the default fan curve. I can see why you'd be concerned with 90°C though; my unlocked 6950 will usually top out at about 80, but I did make a custom fan curve in MSI Afterburner.
|
# ? Jul 19, 2012 22:08 |
|
|
OK, I don't plan on OCing this card; from what I recall you don't see much improvement anyway. My case has pretty good ventilation, 2 intake and 1 exhaust. Would I be better off flipping the exhaust around and over-pressurizing my case? I have a SilverStone PS07B, if that matters. Also, the case sits on a wooden floor and there's no dust filter for the rear fan, easily rectified though.
|
# ? Jul 19, 2012 22:29 |