Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Gigabyte is one of those companies that, down the line, quietly replaces the reference design in their lineup with a semicustom card with a bare-minimum VRM and other corners cut. They probably single-handedly led to a prior erroneous belief that the GeForce 570 wasn't a good overclocker in general, because all the folks saying that happened to be using cut-down cards.

So a company with a history of bad power delivery on motherboards decides to cut quality on its video cards' power delivery. Why yes, let's buy the poo poo out of that.


TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Factory Factory posted:

Gigabyte is one of those companies that, down the line, quietly replaces the reference design in their lineup with a semicustom card with a bare-minimum VRM and other corners cut. They probably single-handedly led to a prior erroneous belief that the GeForce 570 wasn't a good overclocker in general, because all the folks saying that happened to be using cut-down cards.

So a company with a history of bad power delivery on motherboards decides to cut quality on its video cards' power delivery. Why yes, let's buy the poo poo out of that.
I can't speak to any of the previous generation, and I wouldn't bother attempting to argue with a self-styled expert who was present at the time, but writing off all their products out of hand as uniformly bad is just as shortsighted as refusing to believe scandalous rumors after substantial evidence of past wrongdoing has come to light.

Now, I'm not for supporting crooked business practices. Hell, I even had a bum Gigabyte mobo once for my Q6600, and I wouldn't buy another from them. But when considering the relative virtues of 670s as appraised by paid experts, their product consistently stood out, and thus far it certainly doesn't perform like a component with its corners cut and wings clipped. However, it's safe to say it's a good thing that the state of things is not "down the line" yet! :xd:

TheRationalRedditor fucked around with this message at 00:52 on Sep 4, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Well, the logic is the same as with the SSD thread's stance on OCZ: Other companies provide similar products at competitive prices but without the shenanigans, so even if the particular product isn't one you're statistically likely to regret, there's still little reason to give that company your business.

I'm also personally upset with Gigabyte because they told me that a motherboard giving out-of-spec +12V and +5V after taking in-spec power from the PSU was perfectly fine and normal. They said this both times I RMA'd the board because some chips had burnt out, causing my NAS's OS to no longer boot with no recent backups (because the USB 3.0 controller had also burnt out weeks before).

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
I actually completely agree with the SSD thread's stance because I had OCZ products twice, years back (performance RAM and an HSF), and they both ate poo poo and failed within months.

On that same note of consumer disillusionment, that personal anecdote of yours suddenly explains why your last post was so fiery. Two RMAs; that's awful, and I totally understand the seething. Judging from the overwhelmingly positive reviews of this GPU generation, it seems like Nvidia's performance-tier video cards are the only thing Gigabyte does right, as they appear to have one of the most promising 660 Tis as well.

Personally speaking, the 670 is everything I was hoping for thus far, and I'd recommend it to someone shopping around. At the same time, I'd never tell anyone not to boycott a manufacturer who has burned them more than once on components (we're talking technology here, and we all know its potentially fickle foibles, so the old axiom becomes "Fool me thrice, shame on me!")

HonorableTB
Dec 22, 2006
Ugh, I wish I had bought my GTX 670 a bit earlier because I hate having to wait for federal holidays to end. Good news is that it should be here tomorrow. On a second note, I'll be selling my GTX 470 at a fair price, if anyone is looking to upgrade and doesn't want to drop $300+ on a new GPU. It runs everything I've been playing on maximum settings and I've taken good care of it.

craig588
Nov 19, 2005

by Nyc_Tattoo
After buying a dual-fan Gigabyte 670, I wouldn't buy a second one. The fan bracket wasn't solid enough and allowed the fans to vibrate. It's not something you'd normally hear, but I already have a Noctua D14 with low-speed fans, so it was really annoying. I made a little brace from a cut-up credit card and epoxied it to the fan bracket and the edge of the card, and that quieted it down.

I'd say go with the MSI aftermarket cooler if you want large slow fans, they've been doing it longer and probably have more bugs worked out.

I think the card landscape gets a lot more interesting when the MVK Tech guys finish up their BIOS reverse engineering. From the looks of it right now, power target percentages are entirely arbitrary, and there's no reason one card couldn't have 100% be 200 watts while another's 122% is only 175 watts (excepting, of course, the limitations of the card's own power delivery system). It explains how some people are able to overclock and mess with voltages for days and barely break 80%, but other people need 145% to not even hit the same speeds. The fallout will be revealing which cards are designed to handle a lot of power and which are built to be just enough.

I have a feeling my Gigabyte will be on the lower end of the scale; it's constantly pegging out at 122% and pulling voltage and clock speed down. (Not below stock or anything really crazy, but at under 60C it will only break ~1280MHz when it's dropping the voltage to 1.137V or even lower, and if you force it up it'll drop way down to around 1000MHz.)
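The power target arithmetic above can be sketched in a few lines. The base-power figures here are made-up illustrations to show how the same percentage can mean very different wattages, not measured values for any real card:

```python
# Sketch: power target percentages are relative to each card's own base
# board power, so identical percentages can mean very different watts.
# Base-power numbers below are hypothetical illustrations.

def target_watts(base_power_w: float, target_pct: float) -> float:
    """Convert a power target percentage into an absolute wattage."""
    return base_power_w * (target_pct / 100.0)

# A card whose vendor defined 100% as 200 W:
card_a = target_watts(200.0, 100.0)

# A card with a lower base, so even its 122% ceiling is less power:
card_b = target_watts(143.0, 122.0)

print(f"Card A at 100%: {card_a:.1f} W")  # 200.0 W
print(f"Card B at 122%: {card_b:.1f} W")  # ~174.5 W
```

Which is exactly why a "145% power target" on one board can still be less headroom than "100%" on another.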

Verizian
Dec 18, 2004
The spiky one.
So I've had this Palit GTX 670 Jetstream for a few days now and while there's an obvious improvement over my old AMD 6870 I figured it was time to run a benchmark of some kind.

Unfortunately, both Batman: AC and BF3 are refusing to run at all, so for now here's a quick FurMark benchmark while I run Driver Cleaner and Driver Sweeper, and do a clean install if that doesn't fix it.

Dammit what happened to all the old apps that cleaned up residual driver files? Did they all turn to poo poo?

Verizian fucked around with this message at 04:37 on Sep 4, 2012

KillHour
Oct 28, 2007


FISHMANPET posted:

I'm pretty sure the cameras are all IP cameras, so it's just a matter of opening however many browser windows is necessary.

I'm not sure what you mean by needing a video engineer. What kind of inputs do you think I'm talking about?

You're viewing a ton of IP cameras by opening them in browser windows? :wtc:

How many cameras are you trying to do? There's no way you can decode enough video streams to fill 16 monitors at anything resembling a decent frame rate with a single PC.

You want this with 4 PCs each driving 4 monitors:

http://www.milestonesys.com/productsandsolutions/xprotectaddonproducts/xprotectsmartwall/

https://www.youtube.com/watch?v=07J7mtTDuYQ


Edit:

Out of curiosity, I tried to see if my computer (heavily overclocked i5 3570K @ >4.5GHz with 16GB of RAM) could decode 16 HD H.264 streams. I have a 720p version of The Hunger Games, so I put it on my SSD and opened up as many copies as I could. After 6 or so, I started dropping frames and the GUI was getting sluggish. After 12, the computer was barely usable and, as you can see, the stats were blinking in and out. The video was tearing pretty badly as well.



TL;DR: Don't try to run a 16 screen video wall with 1 computer. Bad things happen.
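A back-of-the-envelope sketch of why the decode load falls over: raw pixel throughput grows linearly with stream count, so 12 streams is 12x the work of one. Numbers assume 720p24 files like the test above; they're illustrative, not a codec benchmark:

```python
# Rough illustration: total decoded pixel throughput scales linearly
# with the number of simultaneous streams. Assumes 720p at 24 fps,
# roughly matching the movie files used in the experiment above.

def pixel_rate(streams: int, width: int, height: int, fps: int) -> int:
    """Total decoded pixels per second across all streams."""
    return streams * width * height * fps

one = pixel_rate(1, 1280, 720, 24)
twelve = pixel_rate(12, 1280, 720, 24)

print(f"1 stream:   {one / 1e6:.1f} Mpix/s")   # ~22.1 Mpix/s
print(f"12 streams: {twelve / 1e6:.1f} Mpix/s") # ~265.4 Mpix/s
```

And that's before accounting for entropy decoding, the GUI compositor, and disk I/O all competing for the same cores.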

KillHour fucked around with this message at 07:35 on Sep 4, 2012

Boten Anna
Feb 22, 2010

For shits and giggles I tried with a copy of Black Swan using VLC while Final Fantasy XIV was still running (a hog of a game, even though I'm just standing in my small inn room). I have a 3770K and a GTX 670.

At 16, everything was artifacting heavily though my computer was still usable; 12 was choppy, but I've seen people think worse is acceptable.

I closed FFXIV and tried again and it is almost watchable but there is still artifacting which I think is an issue with disk I/O (256GB Crucial M4 SSD notwithstanding) as it doesn't start until I pick random seek points. It seems I could do 13 videos smoothly; still kind of choppy but not as bad as 12 with XIV open.

I think you'd need two of my computers to run a 16 screen video wall well, but it'd be cheaper to use four lesser specced ones I think.

Boten Anna fucked around with this message at 09:39 on Sep 4, 2012

KillHour
Oct 28, 2007


Boten Anna posted:

For shits and giggles I tried with a copy of Black Swan using VLC while Final Fantasy XIV was still running (a hog of a game, even though I'm just standing in my small inn room). I have a 3770K and a GTX 670.

At 16, everything was artifacting heavily though my computer was still usable; 12 was choppy, but I've seen people think worse is acceptable.

I closed FFXIV and tried again and it is almost watchable but there is still artifacting which I think is an issue with disk I/O (256GB Crucial M4 SSD notwithstanding) as it doesn't start until I pick random seek points. It seems I could do 13 videos smoothly; still kind of choppy but not as bad as 12 with XIV open.

I think you'd need two of my computers to run a 16 screen video wall well, but it'd be cheaper to use four lesser specced ones I think.

Do note that HD movies are generally 720p, as well. The monitors for video walls tend to be at least 1920x1080, which is more than double the pixels.

Edit: Also, it takes 160Mbps of network bandwidth to stream 16 1080p cameras. Hope you have a beefy network.

KillHour fucked around with this message at 13:25 on Sep 4, 2012

Number19
May 14, 2003

HOCKEY OWNS
FUCK YEAH


I'm hoping that someone here might know the answer to this:

We're trying to make CUDA available on one of our Bamboo agents to help speed up compile times on a component. This agent doesn't run in an interactive session and therefore doesn't actually see a CUDA device as available. CUDA works as expected when a user is logged in and the build process is triggered manually.

The card in the agent workstation is a GTX 560Ti. It's nothing amazing but gets the job done. Everything I've read seems to indicate that I need a Quadro card instead which can be put into CUDA-only mode. From there the automated build process should be able to detect the CUDA device and use it.

Is this right? Is there any way to make a GeForce card behave in this manner? I can get a Quadro if I have to but I'd like to get a proof of concept working first before I ask for new equipment.
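As a cheap probe before buying a Quadro, one option might be to have the Bamboo agent shell out to `nvidia-smi -L` and log whether the service session enumerates any GPU at all. The sketch below only parses that output; the sample string mimics the usual `nvidia-smi -L` format and should be treated as an assumption, not a guaranteed interface:

```python
import re

# Hedged sketch: parse `nvidia-smi -L`-style output to see which GPUs a
# session can enumerate. The regex assumes the common output shape
# "GPU N: <name> (UUID: ...)", which may differ across driver versions.

def parse_gpu_list(smi_output: str) -> list[str]:
    """Extract GPU names from `nvidia-smi -L`-style output."""
    return re.findall(r"^GPU \d+: (.+?) \(UUID:", smi_output, flags=re.M)

# Hypothetical captured output from the agent's session:
sample = "GPU 0: GeForce GTX 560 Ti (UUID: GPU-01234567-89ab-cdef-0123-456789abcdef)\n"
print(parse_gpu_list(sample))  # ['GeForce GTX 560 Ti']
```

If the list comes back empty when the agent runs unattended but populated when triggered from a logged-in session, that would support the session-zero/Quadro theory before spending money on new hardware.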

Lowclock
Oct 26, 2005
Is it normal for GPU-Z to be telling me my new 660ti is at PCI-E 3.0 x2? From stuff I've read, it should go up to 16 when I run the test or something else intensive, but it always says x2. I'm running a 3570k and Sabertooth Z77 motherboard, so it should be capable. Is this just a bug in the beta 660ti driver or GPU-Z, or am I missing something or have something set wrong?

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Which slot did you put it in on your motherboard? If I'm remembering right, you always want to start with the highest (the one closest to the CPU on most boards) as it can actually provide the 16x bandwidth.

However, on your mobo it looks like either of the two 3.0 slots should give 16x bandwidth, so that might just be GPU-Z being lovely.

edit: did you really spend $240 on a motherboard?

Lowclock
Oct 26, 2005
Yeah, I have it in the top slot and all that. It says 2x in the BIOS now that I look, too, and I even tried updating the BIOS on the card and the board with no change. Even if I had it in the wrong slot, it should be able to do at least 4x.

Wozbo
Jul 5, 2010
You aren't plugging it into the super small slot up top, above the huge 16x one, right? The light brown one?

The other thing it could be: the card may not be seated right. If it's not fully inserted, it could be running at 2x because that's all it has full contacts for.

I'd suggest fully unseating and reseating the card.

Martello
Apr 29, 2012

by XyloJW
I also suggest taking several photos of the inside of your case, focusing on getting the whole card into the photos including where it's plugged into the board and the power connectors. That way it will be easy for people to see what you're doing wrong, if anything.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Tried using a different program to check it? Maybe GPUZ is reading it wrong.

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
My ATi Radeon HD 5870 is on its way towards death. Home remedies can only keep this thing ticking for so long, and I really think we're at the end of the line now. While it was a beautifully fast and strong card for me, I had nightmares to no end with the drivers. I'm a little scared of choosing ATi again; it's been my only experience with their brand, and NVIDIA cards were generally problem-free.

What Nvidia cards are out right now that are comparable to the 5870 in terms of performance? I'd rather "replace" the card instead of upgrade, as funds are a little low.

craig588
Nov 19, 2005

by Nyc_Tattoo
How low? Even approximately lateral replacements are going to be over $200 unless you want to get into buying used previous-generation cards. Nvidia doesn't really have a card worth considering below $300 right now. An AMD 7850 would be a cheaper option that performs slightly better than your current card. Unfortunately, anything below the Nvidia 660 or the AMD 7850 will be a dramatic performance drop compared to what you have.

Lowclock
Oct 26, 2005
I tried a few other programs, and other cards and slots too, and always get 2x in the top slot. If I move it to the middle one it shows 8x like it should, and benches higher, and it shows up as 16x in another motherboard. I tried messing with bios settings for a while, which didn't really make any difference either. I think it actually might be something wrong with the motherboard.

Boten Anna
Feb 22, 2010

Avocados posted:

My ATi Radeon HD 5870 is on its way towards death. Home remedies can only keep this thing ticking for so long, and I really think we're at the end of the line now. While it was a beautifully fast and strong card for me, I had nightmares to no end with the drivers. I'm a little scared of choosing ATi again; it's been my only experience with their brand, and NVIDIA cards were generally problem-free.

What Nvidia cards are out right now that are comparable to the 5870 in terms of performance? I'd rather "replace" the card instead of upgrade, as funds are a little low.

The reasons you've mentioned are exactly why I've been loyal to nVidia for some time now. I'm really enjoying my GTX 670 but that's probably a bit out of your price range at the moment.

If you can hang on and save up, you'll probably be happy with a 660 Ti or greater for quite some time.

movax
Aug 30, 2008

Lowclock posted:

I tried a few other programs, and other cards and slots too, and always get 2x in the top slot. If I move it to the middle one it shows 8x like it should, and benches higher, and it shows up as 16x in another motherboard. I tried messing with bios settings for a while, which didn't really make any difference either. I think it actually might be something wrong with the motherboard.

Yeah, it's possible there's a cold solder joint somewhere on one of the muxes or Tx-side caps on the mobo. If you can get Asus to advance-RMA it, you can run at x8 in the meantime without a huge performance loss.

Grim Up North
Dec 12, 2011

Avocados posted:

What Nvidia cards are out right now that are comparable to the 5870 in terms of performance? I'd rather "replace" the card instead of upgrade, as funds are a little low.

Not out right now, but the GTX 660 should launch in less than a week, be comparable to a Radeon HD 7850, and have an MSRP of $229.

Echostorm
Apr 7, 2003

I've written a small monograph upon the subject...
I have a Palit GTX 560 Ti that seems to have broken down. While it still works, it constantly runs hot, and when I put it under load (even just refreshing my Windows Experience score) it crashes my box. Power isn't an issue; I've got an 850W PSU.

I would like to replace it with something slightly less demanding and that doesn't sound like a jet taking off when I try to play Skyrim or Civ V.

I understand that the 560s were well regarded and that maybe this is just a bad card as it started off working fine over the first year.

Thoughts?

Disgustipated
Jul 28, 2003

Black metal ist krieg

Echostorm posted:

I have a Palit GTX 560 Ti that seems to have broken down. While it still works, it constantly runs hot, and when I put it under load (even just refreshing my Windows Experience score) it crashes my box. Power isn't an issue; I've got an 850W PSU.

I would like to replace it with something slightly less demanding and that doesn't sound like a jet taking off when I try to play Skyrim or Civ V.

I understand that the 560s were well regarded and that maybe this is just a bad card as it started off working fine over the first year.

Thoughts?
Have you tried blowing out the fan/heatsink with a can of compressed air?

Echostorm
Apr 7, 2003

I've written a small monograph upon the subject...

Disgustipated posted:

Have you tried blowing out the fan/heatsink with a can of compressed air?

Yep, took out the whole card and made sure it was clear. Five well-placed case fans (all new), liquid-cooled CPU, nothing else is running above 42°C.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Hmm, maybe the heatsink is a little loose or something? Just a random thought, but this sounds like maybe symptoms of a bad thermal interface.

If nothing else, you could always get a 660 Ti. Since it sounds like your case has good airflow, if you do get a new card I'd pick up one with a non-reference cooler, like a Twin Frozr or similar.

Revitalized
Sep 13, 2007

A free custom title is a free custom title

Lipstick Apathy
So the last time I actually 'upgraded' a component in my desktop was like... 6 or 7 years ago I think. I got a new EVGA SC 560 (vanilla 560 I assume) to replace my 9600GT, and I was just wondering, do the standard procedures still apply?

(Uninstall NVIDIA drivers, use DriverCleaner or whatever is fancy these days, turn off/replace card, turn on and install drivers)

Mainly I'm just wondering about the second step with the 3rd party driver cleaner software, but is that overall about right?

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Revitalized posted:

So the last time I actually 'upgraded' a component in my desktop was like... 6 or 7 years ago I think. I got a new EVGA SC 560 (vanilla 560 I assume) to replace my 9600GT, and I was just wondering, do the standard procedures still apply?

(Uninstall NVIDIA drivers, use DriverCleaner or whatever is fancy these days, turn off/replace card, turn on and install drivers)

Mainly I'm just wondering about the second step with the 3rd party driver cleaner software, but is that overall about right?
Install new card, update the drivers. Really, that's all that should be necessary if you're coming from the same manufacturer.

Revitalized
Sep 13, 2007

A free custom title is a free custom title

Lipstick Apathy

Happy_Misanthrope posted:

Install new card, update the drivers. Really, that's all that should be necessary if you're coming from the same manufacturer.

Going from Gigabyte to EVGA.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Install the drivers from nVidia directly, and do a "Clean Installation."

The whole bullshit you used to have to go through when doing so much as changing manufacturers is pretty much a thing of the past. Hell, even when switching teams entirely, you just have to make sure to uninstall the display drivers of the previous card before removing it and plugging in the new card powered by the competing GPU maker. It's only when you have multiple display drivers installed at once that you run into problems.

Further, in your case, you can and should go ahead and install the new drivers now, really; just switching cards, they'll take care of the "back end" when they see you've swapped out. (Then once you've actually moved the card out and put the new one in, do a clean install anyway just to be sure, but that's basically crossing your heart or fingers or whatever superstition you prefer at that point, just something I do when moving generations that have notable feature changes.)

Lowclock
Oct 26, 2005
My problem did end up being a bad motherboard. I got it exchanged, and now it reports x16 3.0.

Gotta overclock it again.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Speaking of overclocking, man, it's crazy what will show some peculiar instability in an overclock. I've been rock-solid with Furmark (not a great one to test this generation's long clocks, but still), Heaven, 3dmark11, Crysis 2 DX11, Metro 2033 DX10 and DX11, and every DX9 game like it's nothing - but I recently got back into Starcraft II, and the transparency effect of motherships crashed my overclock. Had to bump it down by 4MHz on the core, 2.5MHz on the VRAM. Now it's stable.

I know it may seem dumb but I always go for "worst case" as my stable clocks. It's just really odd what those worst-case scenarios are going to be, you never know what game is going to be the one that taxes your card's rendering pathways in just the right way to gently caress you over. With my GTX 580, it was DX10 that was the limiting factor, it'd eat DX9 at 960MHz core/4100MHz VRAM; DX11 at 930MHz core/4100MHz VRAM; but DX10 titles weren't stable unless it was set to 925MHz core, 4080MHz VRAM.

I guess it shouldn't be strange, but it is. I've got additional AA going on, but still, SC2 is supposed to be CPU limited, I guess that's barring a fleet full of Carriers each holding 8 fighters entering and exiting transparency really really fast on the campaign map where you're fighting endless zerg as a last stand.

So, now my stable overclock on the GTX 680, with no issues in any games that I can find yet (and absolutely no artifacts, that's easy to weed out in initial testing before you start using it in games :v:) turns out to be, on top of the EVGA SC+ built-in boosts, +76MHz core/+295MHz VRAM. Just a few MHz made all the difference for stability. Still a good overclock, I'm not griping in the least, it's just a little frustrating to think you're solid and then some DX9 game that isn't exactly known for its graphical complexity just tosses an engine quirk curve-ball at you and you don't have a stable OC after all.

Daedalus1134
Sep 14, 2005

They see me rollin'


craig588 posted:

I'm pretty sure it has almost always been true, except when 3D cards were a relatively new concept. I can remember as far back as the GeForce 2 having a "halo" tier for around $400 and a cheaper version with 25% lower stock clock speeds that only had 32MB of memory vs. 64MB, but the cards were otherwise identical.


It's a page back, but I love that video :allears:

https://www.youtube.com/watch?v=V6VPjF7BaE4

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
:stare:

I can honestly say that I thought the phrase "halo tier" had a different etymology.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Here's Skyrim running on the Intel Haswell GT3 GPU.

https://www.youtube.com/watch?v=uohmFVIAASU

Left: HD 4000 (IVB GT2), 1366x768 Medium detail preset
Right: Haswell GT3, 1920x1080 High detail preset

Those chips are running at the same TDP. Haswell takes the IVB execution unit uarch and doubles the performance per watt.

Next year, "good enough" computing will mean Skyrim at 1080p/high detail :circlefap:

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Factory Factory posted:

Here's Skyrim running on the Intel Haswell GT3 GPU.

https://www.youtube.com/watch?v=uohmFVIAASU

Left: HD 4000 (IVB GT2), 1366x768 Medium detail preset
Right: Haswell GT3, 1920x1080 High detail preset

Those chips are running at the same TDP. Haswell takes the IVB execution unit uarch and doubles the performance per watt.

Next year, "good enough" computing will mean Skyrim at 1080p/high detail :circlefap:

:catstare:
Give me one NOW.

Whale Cancer
Jun 25, 2004

Me and another guy have very similar setups: P8Z77 mobos, i5 3570Ks, and 8 gigs of RAM. The main difference is he is running an MSI 660 Ti and I'm running the EVGA 660 Ti. We both ran Heaven 3.0 to get a comparison.

Here are his results with the MSI 660


Here are my results with the EVGA 660


As you can see, he beats me in everything except min FPS, where my card doubles his, which I found very interesting and couldn't figure out why. Another guy is running basically the same setup except with a 2500K chip and the Gigabyte 660, and his results are very, very similar to the MSI card's.

Berk Berkly
Apr 9, 2009

by zen death robot

Rigged Death Trap posted:

:catstare:
Give me one NOW.

Avatar/Post combo win.

Haswell is due out around June/July next year? That almost feels too good to be true. At that point I'm curious whether we'll even have cards like the 7750 or even the 7770, when you can get very decent quality graphics without a discrete card at native 1080p.

quote:

Me and another guy have very similar setups: P8Z77 mobos, I5 3570k's and 8 gigs of ram. The main difference is he is running an MSI 660ti and I'm running the EVGA 660ti. We both ran Heaven 3.0 to get a comparison.

Whatever the difference is, I like the results of your setup better. A 30 FPS minimum means you should hold up much better during the harshest, most demanding points of gameplay. Anything over 60 FPS has greatly diminishing returns in terms of visual experience, so a difference of tens of frames up in the hundreds is trivial.
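The diminishing-returns point is easiest to see in frame times: each step up in FPS saves fewer milliseconds per frame than the step before it. A quick sketch:

```python
# Frame time illustrates why FPS gains above 60 matter less and less:
# each doubling of FPS halves the milliseconds saved per frame.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given frame rate."""
    return 1000.0 / fps

for low, high in [(30, 60), (60, 120), (120, 240)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} FPS saves {saved:.2f} ms per frame")
```

Going 30 to 60 FPS saves about 16.7 ms per frame; 120 to 240 saves barely 4 ms, which is why a min-FPS advantage matters far more than a max-FPS one.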

Berk Berkly fucked around with this message at 14:31 on Sep 12, 2012


Whale Cancer
Jun 25, 2004

Berk Berkly posted:


Whatever the difference is, I like the results of your setup better. A 30 FPS minimum means you should hold up much better during the harshest, most demanding points of gameplay. Anything over 60 FPS has greatly diminishing returns in terms of visual experience, so a difference of tens of frames up in the hundreds is trivial.

I completely agree. I just don't know why my EVGA card is keeping the min FPS so much higher.
