SlayVus
Jul 10, 2009
Grimey Drawer

slidebite posted:

What's Zotac like for a manufacturer for 6xx series? I've never heard of them until recently.

I've had a Zotac GTX 460 1GB for almost 17 months now with no problem. I have it in SLI with a Sparkle GTX 460 1GB that I traded an HD 4870 X2 for. The Sparkle was about 6 months old when I traded for it. Now, it's about 3 years old.


SlayVus
Jul 10, 2009
Grimey Drawer

iuvian posted:

Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues. Although I still had to deal with driver issues and sli profiles even with a dual gpu single slot card. Never again.

Yeah, the biggest draw of single-card, dual-GPU solutions is that you get more GPU performance in a smaller area. You can put two 7990s into just two PCI-E slots and get quad-CFX, whereas, I believe, you can only do tri-CFX with single-GPU cards. Same thing for SLI: you can get tri-SLI with single-GPU cards, but dual-GPU cards let you go quad-SLI.

However, as far as I am aware, the performance gain from a 2nd to a 3rd card was something like 40-50% with Titan. Then imagine doing quad-SLI with Titan: you would probably only get something in the range of a 20-30% increase in performance for another $1000. Edit: I don't know how effective tri-SLI is on Titan now, but when it first came out it was less than a 30% increase in performance going from SLI to tri-SLI, whereas going from a single card to an SLI setup with Titan was about an 80% increase.
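Quick back-of-the-envelope on that scaling (the percentages are the rough figures from above, and the base framerate is a made-up number, not benchmark data):

```python
# Rough SLI scaling math using the approximate gains mentioned above.
# All figures are illustrative guesses, not measured benchmarks.
base_fps = 60.0                 # hypothetical single-Titan framerate
sli_fps  = base_fps * 1.80      # ~80% gain adding a 2nd card
tri_fps  = sli_fps * 1.30      # ~30% gain adding a 3rd
quad_fps = tri_fps * 1.25      # assuming ~20-30% for a 4th

for n, fps in enumerate((base_fps, sli_fps, tri_fps, quad_fps), start=1):
    print(f"{n} card(s): {fps:.1f} FPS ({fps / base_fps:.2f}x a single card)")
```

You pay full price for each card but get a smaller fraction of a card's worth of performance each time.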

SlayVus fucked around with this message at 13:32 on Apr 26, 2013

SlayVus
Jul 10, 2009
Grimey Drawer

Mad_Lion posted:

Re: Dual GPU cards.

The only one I ever owned was the Voodoo 5500 back in the day, and I don't remember anything about SLI profiles. It just worked, and you got about double the performance of a single 4500 card. Since NVidia bought 3DFX (including the SLI technology), why is it that NVidia cards require SLI profiles and tweaking now? Is it just that games have gotten more complicated, or what?

Some games, because of the way they are coded, can't actually run in SLI; they get worse performance than a single GPU. Even a game like that will have an SLI profile, one that disables SLI. There are also different ways of dividing up the frame rendering between cards. I think AMD has more modes than Nvidia does, though.

These are the basic ones:

  • Alternate frame rendering: the GPUs take turns rendering whole frames. GPU 1 renders frame 1 while GPU 2 renders frame 2.
  • Scissor, vertical: each GPU renders half the screen. GPU 1 renders the left side of frame 1 while GPU 2 renders the right side.
  • Scissor, horizontal: GPU 1 renders the top half of frame 1 while GPU 2 renders the bottom half.
  • Checkerboard: the screen is split into a checkerboard and each GPU renders every other square.
  • Scan-line interleave: GPU 1 renders every even-numbered scan-line while GPU 2 renders every odd-numbered one. This was the original frame rendering technique of the 3dfx Voodoo2; after Nvidia bought 3dfx, SLI came to stand for Scalable Link Interface instead of Scan-Line Interleave.
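As a toy illustration of how those split modes divide the work (made-up sketch code, nothing like actual driver internals):

```python
def assign_work(mode, width, height, frame_no):
    """Return {gpu_index: (x, y, w, h)} screen regions for one frame."""
    full = (0, 0, width, height)
    if mode == "afr":        # alternate frame rendering: GPUs take turns
        return {frame_no % 2: full}
    if mode == "scissor_v":  # vertical split: left half vs right half
        half = width // 2
        return {0: (0, 0, half, height), 1: (half, 0, width - half, height)}
    if mode == "scissor_h":  # horizontal split: top half vs bottom half
        half = height // 2
        return {0: (0, 0, width, half), 1: (0, half, width, height - half)}
    raise ValueError(f"unknown mode: {mode}")

print(assign_work("afr", 1920, 1080, 0))        # GPU 0 takes even frames
print(assign_work("afr", 1920, 1080, 1))        # GPU 1 takes odd frames
print(assign_work("scissor_v", 1920, 1080, 0))  # each GPU gets 960 columns
```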

SlayVus fucked around with this message at 17:28 on Apr 26, 2013

SlayVus
Jul 10, 2009
Grimey Drawer
What's the best way to go about determining the performance difference between video cards that are several generations apart? Like an HD 7000 series and an HD 4000? They hardly use the same benchmarks anymore, and I would think you'd need to rerun the older cards on the newer drivers to make sure they were given a fair shot.

SlayVus
Jul 10, 2009
Grimey Drawer

Zotix posted:

I have a hard time finding what my stock(from EVGA) numbers are.

According to Newegg

Core Clock: 967 MHz
Boost Clock: 1020 MHz

SlayVus
Jul 10, 2009
Grimey Drawer
So I am buying a http://www.newegg.com/Product/Product.aspx?Item=N82E16814130771 used from someone online for $360 shipped. Is this a good deal?

SlayVus
Jul 10, 2009
Grimey Drawer

Alereon posted:

A brand new GTX 770 (a better GTX 680) is $399.99 shipped. However, that's still a 2GB card, it's probably not very smart to spend more than $300 on a card with only 2GB of RAM, because you'll have an awesome fast high-end card that doesn't have enough memory to play new games with the settings turned up. I would think the smart choice would be to either buy a 4GB GTX 770 or a 2GB GTX 760.

Well, if I can get close to what I want for my old cards, then the GTX 680 would actually cost me about $240-$260 after selling them. Which puts it in the GTX 760 price range, but it outperforms the 760.

SlayVus
Jul 10, 2009
Grimey Drawer
So I saw the overclocking guide for the 700 series posted. Is the 600 series any different? I don't think I need to right now, but I would eventually like to overclock. Can I just follow the guide for my 680?

SlayVus
Jul 10, 2009
Grimey Drawer
So I haven't updated my drivers in a bit, sitting on 314.22. I think the reason I didn't update was because there were problems with the cards not downclocking when idle, among other things. Should I update or stay where I am? Single GTX 680.

SlayVus
Jul 10, 2009
Grimey Drawer
Anyone have a preferred program, preferably free, that can edit ShadowPlay videos? I tried the default Windows Movie Maker, but it can't read them. They play just fine in VLC with the K-Lite codec pack.

SlayVus
Jul 10, 2009
Grimey Drawer
Only thing I can suggest is making sure you're running in full screen and not windowed mode or full-screen windowed mode. ShadowPlay doesn't seem to work on programs that aren't full screen.

SlayVus
Jul 10, 2009
Grimey Drawer
If ASIC quality actually means anything like it should, couldn't we compare, say, an EVGA Classified or Superclocked against a regular EVGA card of the same model? Because if Superclocked cards and whatnot are supposed to be a higher bin, we could start making decent logical guesses about how ASIC quality works.

SlayVus
Jul 10, 2009
Grimey Drawer

CrazyB posted:

Can you setup an AMD card with a Nvidia as Physx still? I can only find forum threads on how to do it in Windows 7 and they are dated a while back so I'm not sure if it's something you can do with Windows 8 or with new PhysX stuff.

Alereon posted:

nVidia blocks this from working so its a pain in the rear end, but here's how you can hack the drivers to get it working if you want to put forth the effort.

This was when I was running an HD 4870 X2 (dual-GPU card), but I have used hybrid PhysX before and it worked. Some games require messing with the PhysX files to get it going, i.e. deleting one file and maybe replacing another with a modified one, but it worked. I did a few YouTube videos of it back in the day when it was a big thing and Nvidia wasn't actively trying to block it with developer help.

Fewer games can be played with hybrid PhysX now because Nvidia is actively pushing developers to block it, which means there's no file workaround; it's hard-coded into the game. For instance, the new Batman game won't start PhysX in single player, but the multiplayer will.

Here is a list of games that DON'T work:

ARMA 3
Assassin's Creed IV
Batman: Arkham Origins
Star Citizen
Star Trek
Star Trek: D-A-C
CellFactor: Combat Training
The Witcher 3: Wild Hunt
CellFactor: Revolution
City of Villains
Call of Duty: Ghosts
Hawken
The Great Kulu
Mortal Kombat Komplete Edition
Warframe
Warmonger - Operation: Downtown Destruction
Velvet Assassin
The Secret World

SlayVus
Jul 10, 2009
Grimey Drawer

veedubfreak posted:

That's what I meant.

Anyhoo, I cleaned out my cpu block last night because my loop had poo poo for waterflow even with dual 655 pumps. It was jammed up with what I assume was flux from the new radiators. I also put my old fan controller back in, as the new fan controller apparently can't run 3 fans per channel and undervolt without overheating itself. My cpu temps playing Bf4 on ultra at 7880x1440 never go over 49c now, and the gpus never go over 45 now. Water temp is a steady 28c :)

Lesson, don't be lazy when building your loop. Flush your poo poo out properly :)

Should you not also clean your pumps? If your heatsink got that gunked up from the new radiators, I would hate to think what happened to the pumps.

SlayVus
Jul 10, 2009
Grimey Drawer
So the 800m series is supposedly launching this February, what would be a good time frame to actually start seeing laptops released with the 60/70 variants?

Hopefully, they ship with something better than a 128-bit memory bus.

SlayVus
Jul 10, 2009
Grimey Drawer
So, if Nvidia isn't going to step up the VRAM in their consumer line because of "the bottom line," won't that make AMD the go-to for multiple-display setups? It just seems like Nvidia is shooting themselves in the foot.

SlayVus
Jul 10, 2009
Grimey Drawer
MMOs are CPU bound, usually because their processes aren't very parallel-friendly. Rift, for instance, can use multiple CPU cores, but the main process doesn't run well multi-threaded. So to increase FPS in Rift, you want faster single-core speed.

I doubt that this is something that could be fixed with Mantle, but I would like to be wrong. I don't see any reason for Blizzard to try it with WoW as it is not as graphically intense as other MMOs.

I think for RTS games the best you can do to help frame rates is to just limit the number of units on a map. With Supreme Commander, an older game I know, 8 players with a 200 unit cap would drag the frames down fast even if you weren't looking at anything but water.
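The single-thread bottleneck above is basically Amdahl's law; a quick sketch with a made-up parallel fraction shows why extra cores barely help when the main loop is serial:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Maximum speedup when only part of the work can spread across cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assume (made-up figure) only 30% of an MMO's frame loop parallelizes:
for cores in (1, 2, 4, 8, 16):
    print(f"{cores:>2} cores -> {amdahl_speedup(0.30, cores):.2f}x")
# Even 16 cores tops out under 1.4x; a faster single core lifts the whole loop.
```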

SlayVus fucked around with this message at 08:23 on Feb 3, 2014

SlayVus
Jul 10, 2009
Grimey Drawer
I only remember the GPU that could be overvolted with a pencil.

SlayVus
Jul 10, 2009
Grimey Drawer
EVGA bins their GPUs really extensively. The Classified lineup is their second-highest bin; the KingPin edition is their highest.

SlayVus
Jul 10, 2009
Grimey Drawer
So EVGA Precision X starts the EVGA Voltage Control exe when it starts. VoltCont uses 17.8% CPU. Every time I change the settings it loads another VoltCont exe, which means another process eating up another 17.8% CPU for no apparent reason. Anyone have any clues? There seem to be similar cases in the EVGA forums with no response from EVGA.

SlayVus
Jul 10, 2009
Grimey Drawer

Straker posted:

I understand perfectly well, point is merely that gox dying yesterday didn't kill off demand for AMD cards. price on other exchanges went down but not as drastically, a lot of these people are desperate to make any amount of money without working, and even if they were smart enough to walk away it's only been a day or two, so there's that :)

If anything, the hysteria kinda started fading away as soon as December when litecoins lost about half their value...

How could you actually expect an online exchange called Magic The Gathering Online Exchange to be reliable? /Sarcasm

On the subject of VRAM on GPUs, Nvidia has got to step up the amount on their enthusiast lineup. 2 or 3GB on a high-end enthusiast card is not enough for maxing new games on Surround when you're talking 6.2 million to 12.2 million pixels.

A single 4K display has around 8.3 million pixels, and a WHXGA (5120x3200) display has 16.38 million. With three WHXGA displays in surround you're talking FORTY-NINE MILLION pixels.
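The arithmetic behind those pixel counts:

```python
# Pixel counts for the resolutions mentioned above.
px_1080p  = 1920 * 1080      # 2,073,600
px_4k_uhd = 3840 * 2160      # 8,294,400
px_whxga  = 5120 * 3200      # 16,384,000
px_triple = 3 * px_whxga     # 49,152,000 -- the "forty nine million"

# A single 32-bit framebuffer at that size (4 bytes/pixel), before any
# textures, geometry, or extra buffers even enter the picture:
print(f"{px_triple * 4 / 2**20:.1f} MiB for one bare frame")  # 187.5 MiB
```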

SlayVus fucked around with this message at 06:20 on Feb 27, 2014

SlayVus
Jul 10, 2009
Grimey Drawer

veedubfreak posted:

I bought this PSU probably 5 years ago when I ran my TRI-SLI 285 setup,

I think this might be part of your PSU woes. It's 5 years old and probably saw high loads a few hours at a time. I doubt my 4-5 year old Corsair 1000HX could still do 1000W; probably only like 940-950W now. The biggest load I put on it was SLI'd 460s, or my HD 4870 X2 with an 8800 GTX for PhysX.

Don't bust a cap man! No one deserves getting a cap bust on them!

SlayVus
Jul 10, 2009
Grimey Drawer

Happy_Misanthrope posted:

With the rumoured "Bing Edition", $15 OEM licenses to sub-$250 PC's, Nadella's focus on services and the realities of the market, I think $100-$200 Windows licenses will not exist in short order, certainly not for consumers. There's no way a $100 OS license will be tenable as a product that any significant number of consumers would purchase in the future with so many computing options at their disposal now.

If Microsoft is going to do such short development cycles on the OS front, I feel they need to drop the price to like $50-$60. 7 to 8 felt too fast to justify the $100 price tag when you look at the gap from XP to Vista/7.

SlayVus
Jul 10, 2009
Grimey Drawer

Michymech posted:

I heard mention of giant cases? I think mine fits that description (can is for scale reference)


My case is pretty big and heavy by itself.

26"x24.25"x11.5" - 33 Lbs.

Silverstone Raven RV01

SlayVus
Jul 10, 2009
Grimey Drawer
So the GM107 GTX 860M performs the same as a GTX 760M. However, the GK104 GTX 860M seems to be a GTX 775M with 6 SMX units instead of 7. The 870M is a 775M with an increased clock speed. Same goes for the 880M and 780M.

From what I saw on rumor sites while researching the 800 series, I figured the 800s wouldn't offer much in the way of a speed boost.

SlayVus
Jul 10, 2009
Grimey Drawer
Pros of SLI and CrossFire
  • Higher minimum framerates over even the single fastest GPUs in some situations
  • Higher average framerates for graphically demanding games
  • You can purchase a 2nd card later to add SLI/CFX, usually at a cheaper price, when you want more performance
  • Two lower-end cards in SLI/CFX can sometimes match the framerates of one higher-end card for less money, parts permitting
  • More capable of handling multi-monitor and very high resolution setups

Cons of SLI and CrossFire
  • Requires a more suitable power supply to handle the extra video card(s)
  • Requires purchase planning when buying or building a computer, without the right motherboard SLI/CFX is useless
  • Some games don't work with SLI/CFX
  • Some games require driver fixes for stability
  • Some games perform worse with SLI/CFX
  • Video cards can be harder to find sometimes later in their life, if you had planned to upgrade later
  • Screen tearing and micro-stuttering can occur

SlayVus fucked around with this message at 16:24 on Apr 8, 2014

SlayVus
Jul 10, 2009
Grimey Drawer

veedubfreak posted:

Hey Agreed, what speed were you getting on your normal 780 before you sold it? I'm going to pop the one I got from microcenter in over the weekend and see how far it goes on air and decide if I should get a waterblock for it or not. My 290s don't overclock for poo poo and since I'm going back to single card, just wondering if an overclocked 780 or 290 is the one to keep. I tried 1100 on my core last night and started getting artifacts and driver crashes after just 2-3 games. Temps aren't my issue, the 290 just doesn't overclock for poo poo without giving it tons of voltage.

Why not wait for the 6GB versions? Don't you have a high res setup?

SlayVus
Jul 10, 2009
Grimey Drawer

Ignoarints posted:

and likely complete bullshit specs.

They also look like they are using a single aluminum heatsink under all that fan.

SlayVus
Jul 10, 2009
Grimey Drawer
What's the release schedule for the 6GB 780s/Tis? Tri-SLI 6GB Tis would rock.

SlayVus
Jul 10, 2009
Grimey Drawer
Anyone know what causes frame freezing in ShadowPlay videos? A couple months ago, my videos of gameplay would still play audio, but the video would get all garbled up, then continue playing a few seconds later.

SlayVus
Jul 10, 2009
Grimey Drawer
So I really thought that my GTX 680 would be able to handle Wolfenstein and Watch_Dogs full tilt, but sadly I am limited by the VRAM on my card. A measly 2GB of VRAM doesn't seem like it's going to cut it in this day and age of console ports.

Wolfenstein eats upwards of 1990~ MB of VRAM. Watch_Dogs hits 2020 MB of VRAM usage at 1080p with max settings.

SlayVus fucked around with this message at 15:33 on May 27, 2014

SlayVus
Jul 10, 2009
Grimey Drawer

Khagan posted:

Might as well go to GDDR7 in keeping with the odd number theme.

We had GDDR4, it just didn't stick around because GDDR5 came out pretty quickly afterwards.

AMD used GDDR4 on at least two dozen of their cards. The latest one was the HD 4850.

SlayVus
Jul 10, 2009
Grimey Drawer
First GPU I had was an FX 5200. It didn't play Tribes: Vengeance all that well. When I finally built my first computer, an AMD Athlon X2 5200 system, I went all out with SLI 7600 GTs. Not even a year later, the 8800 GTX came out. I bought one from my local PC shop for $600.

SlayVus
Jul 10, 2009
Grimey Drawer

Factory Factory posted:

Maybe building upwards is less relevant to GPUs and CPUs than to NAND. Mobile power needs disfavor re-using older processes, but they also don't rule it out. All I'm saying is that there's some more to be had. Maybe not much. Maybe none from process nodes. Maybe progress has slowed. But there's a little more progress yet to be had.

Doesn't building in 3D allow for more transistors in a smaller package, though? I thought that was why Intel was going to build CPUs in 3D.

Rime posted:

At this point the vast gains need to come in the form of code being written in a non-lazy and non-lovely fashion anyways. There's oceans of performance left to be wrung out of the hardware we have today, hundreds of miles worth of fancy-rear end instruction sets that nobody bothers to use, favoring poo poo that dates back to the 90's or worse. It's depressing to see how much CPU and GPU power is just left sitting on the table, unused, each generation. :(

Freaking computer games! Always using two cores when we have up to 16 available. Maybe current-gen consoles will make multi-threaded CPU processing more likely on PC, since they resemble PCs so closely.

SlayVus fucked around with this message at 00:37 on Sep 9, 2014

SlayVus
Jul 10, 2009
Grimey Drawer
So if I want a quiet, not likely to buzz, small GTX 970, which one should I go for?

SlayVus
Jul 10, 2009
Grimey Drawer

Radio Talmudist posted:

I just realized why my R9 290's fans were going crazy. It's my case. I removed the side panel and now they're barely ramping up at all even under heavy load.

I know nothing about airflow and how to remedy this.

Front fans set as intake. Rear fan set as exhaust. Top fans as exhaust. Side fans set as intake.

Edit: How long ago did the MSI 970 Gold come out? I can't find any info on the card except press releases.

I am wondering if the full-copper heatsink is worth it over the regular MSI aluminum heatsink.

SlayVus
Jul 10, 2009
Grimey Drawer

Radio Talmudist posted:

Do you guys have case fan recommendations? I mean in terms of manufacturers. Is there a CFM I should be aiming for, or some other factor I need to consider?

Air Penetrator fans from SilverStone provide good directional airflow. Noctua makes industrial, very high static pressure fans. Delta makes really loud, really high-CFM fans. It's all about what you want.

SlayVus
Jul 10, 2009
Grimey Drawer
The best I've gotten so far with my new 4790K @ 4.5 (stock cooler) and GTX 970 is 9988 in Fire Strike. I'm still trying to learn the ins and outs of GPU overclocking, but I started at a score of 9521 in FS.

SlayVus
Jul 10, 2009
Grimey Drawer

joe football posted:

I am running at stock clocks and have 8 gb of ram.

So I decided to switch the card to my other pci slot and...got a 1000 point improvement? :confused:



It's almost high enough that I could stop worrying and start overclocking, but not quite

Your Physics score is dragging you down.

My graphics score on my 9988 run is 11,612; yours is 10,774.

However, your Physics leaves something to be desired at 6,390 compared to my sultry 12,188.
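3DMark combines its subscores with a weighted harmonic mean, which is why one weak subscore drags the overall down so hard. The weights below are illustrative only, not Futuremark's published formula:

```python
def weighted_harmonic_mean(scores, weights):
    """A weak component pulls a harmonic mean down much harder than an average."""
    return sum(weights) / sum(w / s for w, s in zip(weights, scores))

w = (0.85, 0.15)  # assumed (graphics, physics) weights -- illustrative only
balanced = weighted_harmonic_mean((11612, 12188), w)
lopsided = weighted_harmonic_mean((10774, 6390), w)
print(f"balanced run: {balanced:.0f}, weak-physics run: {lopsided:.0f}")
```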


SlayVus
Jul 10, 2009
Grimey Drawer

smushroomed posted:

I'm only getting 11k firestrike score with 970 sli and a 4670k @ 4.5

Any ideas?

Check your SLI settings; something's going on there, I think. I just did a 10,993 run with a single GTX 970/4790K.
