|
slidebite posted:What's Zotac like for a manufacturer for 6xx series? I've never heard of them until recently. I've had a Zotac GTX 460 1GB for almost 17 months now with no problem. I have it in SLI with a Sparkle GTX 460 1GB that I traded an HD 4870 X2 for. The Sparkle was about 6 months old when I traded for it. Now, it's about 3 years old.
|
# ¿ Mar 10, 2013 16:16 |
|
|
iuvian posted:Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues. Although I still had to deal with driver issues and sli profiles even with a dual gpu single slot card. Never again. Yeah, the biggest draw of single-card, dual-GPU solutions is that they let you get more GPU performance in a smaller area. You can put two 7990s into just two PCI-E slots and get quad-CFX, whereas, I believe, you can only do tri-CFX with single-GPU cards. Same thing for SLI: you can get tri-SLI with single-GPU cards, but dual-GPU cards let you go quad-SLI. However, as far as I'm aware, the performance gain from a 2nd to a 3rd card was something like 40-50% with Titan. Now imagine quad-SLI with Titan: you'd probably only get something in the range of a 20-30% increase in performance for another $1,000. Edit: I don't know how effective tri-SLI is on Titan now, but when it first came out it was less than a 30% increase in performance going from SLI to tri-SLI, whereas going from a single card to SLI with Titan was about an 80% increase. SlayVus fucked around with this message at 13:32 on Apr 26, 2013 |
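For anyone who wants to sanity-check that, here's a quick Python sketch of the scaling arithmetic. The percentages are just the rough figures from this post (plus an assumed ~25% for a hypothetical 4th card), not measured benchmarks:

```python
# Rough multi-GPU scaling arithmetic. Gains are assumed figures from the post,
# not benchmark data: +80% for a 2nd Titan, +30% for a 3rd, ~+25% for a 4th.
def relative_perf(gains):
    """Performance relative to a single card after each extra GPU is added."""
    perf = [1.0]
    for gain in gains:
        perf.append(perf[-1] * (1.0 + gain))
    return perf

for cards, p in enumerate(relative_perf([0.80, 0.30, 0.25]), start=1):
    print(f"{cards} card(s): {p:.2f}x")
```

Each extra card costs another full card's price for a shrinking slice of FPS, which is the whole point.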
# ¿ Apr 26, 2013 13:26 |
|
Mad_Lion posted:Re: Dual GPU cards. Some games, because of the way they are coded, can't actually run in SLI; they get worse performance than with a single GPU. Even a game like that will have an SLI profile that disables SLI. There are also different ways of rendering the frames. I think AMD has more methods than Nvidia does though. These are the basic ones:
SlayVus fucked around with this message at 17:28 on Apr 26, 2013 |
# ¿ Apr 26, 2013 17:25 |
|
What's the best way to go about determining the performance difference between video cards that are several generations apart? Like an HD 7000 series and an HD 4000? They hardly use the same benchmarks anymore, and I would think you'd need to rerun the older cards on the newer drivers to make sure they were given a fair shot.
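The best idea I've got is chaining relative results through a card that shows up in two overlapping reviews. A quick Python sketch of the idea; the ratio numbers are placeholders I made up, not real benchmark results:

```python
# Bridge generations by chaining relative results through a "bridge" card that
# appears in two overlapping reviews. All ratio values here are invented
# placeholders for illustration, not real benchmark data.
ratios = {
    ("HD 7870", "HD 5870"): 1.6,  # hypothetical: same benchmark, same review
    ("HD 5870", "HD 4870"): 1.7,  # hypothetical
}

def chained_ratio(new, bridge, old):
    """Estimate new/old relative performance via a shared bridge card."""
    return ratios[(new, bridge)] * ratios[(bridge, old)]

print(f"HD 7870 vs HD 4870: ~{chained_ratio('HD 7870', 'HD 5870', 'HD 4870'):.2f}x")
```

Driver differences still muddy this, which is why I'd want the older cards rerun on current drivers.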
|
# ¿ Jun 26, 2013 01:39 |
|
Zotix posted:I have a hard time finding what my stock(from EVGA) numbers are. According to Newegg: Core Clock 967 MHz, Boost Clock 1020 MHz.
|
# ¿ Jun 26, 2013 01:56 |
|
So I am buying a http://www.newegg.com/Product/Product.aspx?Item=N82E16814130771 used from someone online for $360 shipped. Is this a good deal?
|
# ¿ Jun 26, 2013 22:08 |
|
Alereon posted:A brand new GTX 770 (a better GTX 680) is $399.99 shipped. However, that's still a 2GB card, it's probably not very smart to spend more than $300 on a card with only 2GB of RAM, because you'll have an awesome fast high-end card that doesn't have enough memory to play new games with the settings turned up. I would think the smart choice would be to either buy a 4GB GTX 770 or a 2GB GTX 760. Well, if I can get close to what I want for my old cards, then I figure the GTX 680 would actually cost me about $240-$260 after selling them. That puts it in GTX 760 price range, but it outperforms the 760.
|
# ¿ Jun 27, 2013 03:28 |
|
So I saw the overclocking guide for the 700 series posted. Is the 600 series any different? I don't think I need to right now, but I would eventually like to overclock. Can I just follow the guide for my 680?
|
# ¿ Jul 12, 2013 21:01 |
|
So I haven't updated my drivers in a bit; I'm sitting on 314.22. I think the reason I didn't update was that there were problems with the cards not downclocking while idle, among other things. Should I update or stay where I am? Single GTX 680.
|
# ¿ Sep 1, 2013 07:13 |
|
Anyone have a preferred program, preferably free, that can edit ShadowPlay videos? I tried the default Windows Movie Maker, but it can't read them. They play just fine in VLC with the K-Lite codec pack.
|
# ¿ Nov 6, 2013 05:26 |
|
Only thing I can suggest is making sure you're running in fullscreen and not windowed or borderless fullscreen windowed mode. ShadowPlay doesn't seem to work on programs that aren't fullscreen.
|
# ¿ Nov 7, 2013 08:51 |
|
If ASIC quality actually means anything like it should, couldn't we compare, say, an EVGA Classified or Superclocked card against a regular EVGA card of the same model? If Superclocked cards and the like are supposed to be a higher bin, we could start making decent logical guesses at how ASIC quality works.
|
# ¿ Nov 9, 2013 08:32 |
|
CrazyB posted:Can you setup an AMD card with a Nvidia as Physx still? I can only find forum threads on how to do it in Windows 7 and they are dated a while back so I'm not sure if it's something you can do with Windows 8 or with new PhysX stuff. Alereon posted:nVidia blocks this from working so its a pain in the rear end, but here's how you can hack the drivers to get it working if you want to put forth the effort. This was when I was running an HD 4870 X2 (dual-GPU card), but I have used Hybrid PhysX before and it worked. Some games require messing with the PhysX files to get it to work, i.e. deleting one file and maybe replacing another with a modified one, but it worked. I did a few YT videos of it back in the day when it was a major thing and Nvidia wasn't actively trying to block it with developer help. More games can no longer be played with Hybrid PhysX because Nvidia is actively pushing developers to block it, which means there is no file workaround; it's hard-coded into the game. For instance, the new Batman single-player game won't start PhysX, but the multiplayer will. Here is a list of games that DON'T work:
ARMA 3
Assassin's Creed IV
Batman: Arkham Origins
Star Citizen
Star Trek
Star Trek: D-A-C
CellFactor: Combat Training
The Witcher 3: Wild Hunt
CellFactor: Revolution
City of Villains
Call of Duty: Ghosts
Hawken
The Great Kulu
Mortal Kombat Komplete Edition
Warframe
Warmonger - Operation: Downtown Destruction
Velvet Assassin
The Secret World
|
# ¿ Jan 10, 2014 02:14 |
|
veedubfreak posted:That's what I meant. Should you not also clean your pumps? If your heatsink got that gunked up from the new radiators, I would hate to think what happened to the pumps.
|
# ¿ Jan 12, 2014 06:16 |
|
So the 800m series is supposedly launching this February, what would be a good time frame to actually start seeing laptops released with the 60/70 variants? Hopefully, they ship with something better than a 128-bit memory bus.
|
# ¿ Jan 25, 2014 16:30 |
|
So, if Nvidia isn't going to step up the VRAM in their consumer line because of "the bottom line", won't that make AMD the go-to for multiple-display setups? It just seems like Nvidia is shooting themselves in the foot.
|
# ¿ Jan 31, 2014 06:16 |
|
MMOs are CPU-bound, but usually because their processes aren't very parallel-friendly. Rift, for instance, can use multiple CPU cores, but the main process doesn't scale well across threads. So to increase FPS in Rift, you want faster single-core speed. I doubt this is something that could be fixed with Mantle, but I would like to be wrong. I don't see any reason for Blizzard to try it with WoW, as it is not as graphically intense as other MMOs. I think for RTS games the best you can do to help frame rates is to limit the number of units on a map. With Supreme Commander, an older game I know, 8 players with a 200-unit cap would drag the frame rate down fast even if you weren't looking at anything but water. SlayVus fucked around with this message at 08:23 on Feb 3, 2014 |
# ¿ Feb 3, 2014 08:20 |
|
I only remember the GPU that could be overvolted with a pencil.
|
# ¿ Feb 19, 2014 22:04 |
|
EVGA bins their GPUs really extensively. The Classified lineup is their second-highest bin; the KingPin edition is their highest.
|
# ¿ Feb 21, 2014 10:11 |
|
So EVGA Precision X starts the EVGA Voltage Control exe when it starts, and VoltCont uses 17.8% CPU. Every time I change the settings it loads another VoltCont exe, which means another process eating up another 17.8% CPU for no apparent reason. Anyone have any clues? There seem to be similar cases in the EVGA forums with no response from EVGA.
|
# ¿ Feb 22, 2014 15:05 |
|
Straker posted:I understand perfectly well, point is merely that gox dying yesterday didn't kill off demand for AMD cards. price on other exchanges went down but not as drastically, a lot of these people are desperate to make any amount of money without working, and even if they were smart enough to walk away it's only been a day or two, so there's that How could you actually expect an online exchange called Magic: The Gathering Online Exchange to be reliable? /Sarcasm On the subject of VRAM on GPUs, Nvidia has got to step up the amount on their enthusiast lineup. 2 or 3GB on a high-end enthusiast card is not enough for maxing new games on Surround when you're talking 6.2 million to 12.2 million pixels. A single 4K display has from 7 million to 16.38 million pixels. With three 4K WHXGA (5120x3200) displays in Surround you're talking FORTY NINE MILLION pixels. SlayVus fucked around with this message at 06:20 on Feb 27, 2014 |
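Quick Python check of those pixel counts, in case anyone thinks I'm exaggerating (resolutions only, nothing vendor-specific):

```python
# Pixel counts behind the VRAM argument, using the resolutions from the post.
def megapixels(width, height, displays=1):
    """Total pixels across a multi-display surround setup, in millions."""
    return width * height * displays / 1e6

print(megapixels(1920, 1080, 3))   # triple-1080p Surround: ~6.2 MP
print(megapixels(2560, 1600, 3))   # triple-1600p Surround: ~12.3 MP
print(megapixels(5120, 3200))      # single WHXGA display: ~16.4 MP
print(megapixels(5120, 3200, 3))   # WHXGA Surround: ~49.2 MP
```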
# ¿ Feb 27, 2014 06:17 |
|
veedubfreak posted:I bought this PSU probably 5 years ago when I ran my TRI-SLI 285 setup, I think this might be part of your PSU woes. It's 5 years old and probably saw high load a few hours at a time. I doubt my 4/5-year-old Corsair 1000HX could still do 1000W, probably only like 940-950W now. The biggest load I put on it was SLI'd 460s or my HD 4870 X2 with an 8800 GTX for PhysX. Don't bust a cap man! No one deserves getting a cap busted on them!
|
# ¿ Mar 4, 2014 21:54 |
|
Happy_Misanthrope posted:With the rumoured "Bing Edition", $15 OEM licenses to sub-$250 PC's, Nadella's focus on services and the realities of the market, I think $100-$200 Windows licenses will not exist in short order, certainly not for consumers. There's no way a $100 OS license will be tenable as a product that any significant number of consumers would purchase in the future with so many computing options at their disposal now. If Microsoft is going to run such a short development cycle on the OS front, I feel they need to drop the price to something like $50-$60. Going from 7 to 8 felt too fast to justify the $100 price tag when you compare it to the gap between XP and Vista/7.
|
# ¿ Mar 7, 2014 06:18 |
|
Michymech posted:I heard mention of giant cases? I think mine fits that description (can is for scale reference) My case is pretty big and heavy by itself: 26"x24.25"x11.5", 33 lbs. Silverstone Raven RV01
|
# ¿ Mar 12, 2014 05:17 |
|
So the GM107 GTX 860M performs about the same as a GTX 760M. However, the GK104 GTX 860M seems to be a GTX 775M with 6 SMX units instead of 7. The 870M is a 775M with an increased clock speed, and the same goes for the 880M and 780M. From what I saw on rumor sites while researching the 800 series, I figured the 800s wouldn't offer much in the way of a speed boost.
|
# ¿ Mar 13, 2014 16:36 |
|
Pros of SLI and CrossFire
Cons of SLI and CrossFire
SlayVus fucked around with this message at 16:24 on Apr 8, 2014 |
# ¿ Apr 8, 2014 16:22 |
|
veedubfreak posted:Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage. Why not wait for the 6GB versions? Don't you have a high res setup?
|
# ¿ Apr 10, 2014 19:02 |
|
Ignoarints posted:and likely complete bullshit specs. They also look like they are using a single aluminum heatsink under all those fans.
|
# ¿ Apr 14, 2014 05:15 |
|
What's the release schedule for the 6GB 780s/Tis? Tri-SLI 6GB Tis would rock.
|
# ¿ May 11, 2014 02:59 |
|
Anyone know what causes frame freezing in ShadowPlay videos? A couple of months ago, my gameplay videos would still play audio, but the video would get all garbled up then continue playing a few seconds later.
|
# ¿ May 19, 2014 08:07 |
|
So I really thought that my GTX 680 would be able to handle Wolfenstein and Watch_Dogs at full tilt, but sadly I am limited by the VRAM on my card. A measly 2GB of VRAM doesn't seem like it's going to cut it in this day and age of console ports. Wolfenstein eats upwards of ~1990 MB of VRAM. Watch_Dogs hits 2020 MB of VRAM usage at 1080p with max settings. SlayVus fucked around with this message at 15:33 on May 27, 2014 |
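Here's the squeeze in numbers. The MB figures are my own in-game readings from above, not official requirements:

```python
# Back-of-envelope VRAM headroom on a 2GB card, using my readings from the
# post (assumed/observed figures, not official requirements).
card_vram_mb = 2048  # GTX 680 2GB
usage_mb = {"Wolfenstein": 1990, "Watch_Dogs": 2020}

for game, used in usage_mb.items():
    headroom = card_vram_mb - used
    print(f"{game}: {used} MB used, {headroom} MB to spare ({headroom / card_vram_mb:.1%})")
```

Barely 1-3% headroom before the driver starts swapping textures over PCI-E.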
# ¿ May 27, 2014 15:31 |
|
Khagan posted:Might as well go to GDDR7 in keeping with the odd number theme. We had GDDR4, it just didn't stick around because GDDR5 came out pretty quickly afterwards. AMD used GDDR4 on at least two dozen of their cards. The latest one was the HD 4850.
|
# ¿ Jul 19, 2014 13:43 |
|
First GPU I had was an FX 5200. It didn't play Tribes: Vengeance all that well. When I finally built my first computer, an AMD Athlon X2 5200 system, I went all out with SLI 7600 GTs. Not even a year later, the 8800 GTX came out. I bought one from my local PC shop for $600.
|
# ¿ Sep 8, 2014 07:40 |
|
Factory Factory posted:Maybe building upwards is less relevant to GPUs and CPUs than to NAND. Mobile power needs disfavor re-using older processes, but they also don't rule it out. All I'm saying is that there's some more to be had. Maybe not much. Maybe none from process nodes. Maybe progress has slowed. But there's a little more progress yet to be had. Doesn't 3D stacking allow for more transistors in a smaller package though? I thought this was why Intel was going to build CPUs in 3D. Rime posted:At this point the vast gains need to come in the form of code being written in a non-lazy and non-lovely fashion anyways. There's oceans of performance left to be wrung out of the hardware we have today, hundreds of miles worth of fancy-rear end instruction sets that nobody bothers to use, favoring poo poo that dates back to the 90's or worse. It's depressing to see how much CPU and GPU power is just left sitting on the table, unused, each generation. Freaking computer games! Always using two cores when we have up to 16 available. Maybe current-gen consoles will make multi-threaded CPU processing more common on PC since they resemble PCs so closely. SlayVus fucked around with this message at 00:37 on Sep 9, 2014 |
# ¿ Sep 9, 2014 00:34 |
|
So if I want a quiet, not likely to buzz, small GTX 970, which one should I go for?
|
# ¿ Nov 14, 2014 02:17 |
|
Radio Talmudist posted:I just realized why my R9 290's fans were going crazy. It's my case. I removed the side panel and now they're barely ramping up at all even under heavy load. Front fans set as intake. Rear fan set as exhaust. Top fans as exhaust. Side fans as intake. Edit: How long ago did the MSI 970 Gold come out? I can't find any info on the card except for press releases. I'm wondering if the full-copper heatsink is worth it over the regular MSI aluminum heatsink.
|
# ¿ Nov 20, 2014 05:15 |
|
Radio Talmudist posted:Do you guys have case fan recommendations? I mean in terms of manufacturers. Is there a CFM I should be aiming for, or some other factor I need to consider? Air Penetrator fans from SilverStone provide good directional airflow. Noctua makes industrial, very high static pressure fans. Delta makes really loud, really high-CFM fans. It's all about what you want.
|
# ¿ Nov 21, 2014 01:04 |
|
The best I've gotten so far with my new 4790K @ 4.5 GHz (stock cooler) and GTX 970 is 9988 in Fire Strike. I'm still trying to learn the ins and outs of GPU overclocking, but I started at a score of 9521 in FS.
|
# ¿ Nov 22, 2014 18:29 |
|
joe football posted:I am running at stock clocks and have 8 gb of ram. My graphics score on my 9988 run is 11,612; yours is 10,774. However, your Physics score leaves something to be desired at 6,390 compared to my sultry 12,188.
|
# ¿ Nov 22, 2014 18:43 |
|
|
smushroomed posted:I'm only getting 11k firestrike score with 970 sli and a 4670k @ 4.5 Check your SLI settings; something's going on there, I think. I just did a 10,993 run with a single GTX 970/4790K.
|
# ¿ Nov 23, 2014 04:50 |