craig588
Nov 19, 2005

by Nyc_Tattoo
I meant volume in terms of space. Noise is caused by turbulence and has a lot to do with the construction of the blades and where the air is being forced to go. There's also stuff like bearing whine and housing vibration, but in most circumstances those are overwhelmed by the sound of air being forced to change direction. To be really thorough, though: in general you probably could make blowers quieter, because they move air through restrictions better, so a baffling system could be more effective than with a conventional axial fan. There's also the benefit of high pressure letting you put the fan at the far end of a duct somewhere. It doesn't really work out like that with computers, though; a large, slow, thin fan fits the space available on a video card better than a blower and noise-reducing baffles would.
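If you want to see why high static pressure matters behind a baffle, here's a toy sketch. All the numbers and the linearized fan curves are invented for illustration, not taken from any real fan datasheet:

```python
import math

# Toy model: a fan's pressure-flow curve (linearized here) meets the
# system's restriction curve (pressure drop ~ flow squared) at the
# operating point. All constants below are made up for illustration.

def operating_flow(p_max, q_max, k):
    """Flow where p_max * (1 - q/q_max) equals k * q**2."""
    # Solve k*q^2 + (p_max/q_max)*q - p_max = 0, take the positive root.
    a, b, c = k, p_max / q_max, -p_max
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

axial = dict(p_max=2.0, q_max=100.0)   # low pressure, high free-air flow
blower = dict(p_max=10.0, q_max=60.0)  # high pressure, lower free-air flow

for k, label in [(0.0001, "open case"), (0.002, "heavy baffling")]:
    print(f"{label}: axial {operating_flow(k=k, **axial):.0f} CFM, "
          f"blower {operating_flow(k=k, **blower):.0f} CFM")
```

With a wide-open case the axial fan wins on raw flow; add enough restriction and the blower comes out ahead, which is the point about baffles above.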


Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme
I just went from a slightly-underclocked 9800 GT to a GTX 660, and let me tell you -- the performance increase is huge compared to the previous upgrade (7900 GT -> 9800 GT) I did. I can increase detail level and resolution/AA and it's still noticeably faster than before. And it only uses slightly more power than the previous card. On the other hand, it's a testament to the G92 chip that it could run most games at playable frame rates almost 5 years after its release.

chippy
Aug 16, 2006

OK I DON'T GET IT

Rosoboronexport posted:

I just went from a slightly-underclocked 9800 GT to a GTX 660, and let me tell you -- the performance increase is huge compared to the previous upgrade (7900 GT -> 9800 GT) I did. I can increase detail level and resolution/AA and it's still noticeably faster than before. And it only uses slightly more power than the previous card. On the other hand, it's a testament to the G92 chip that it could run most games at playable frame rates almost 5 years after its release.

I went even further than that, 8800 GTX -> GTX 680, and I was utterly blown away. Having said that, I too was really impressed at how well the 8800 was still holding up. The only things that really pushed me to replace it were getting a new monitor at 1920 x 1200, which was really taxing it, and the occasional artefacting I was getting from overheating, which was mostly alleviated by giving it a drat good clean.

GRINDCORE MEGGIDO
Feb 28, 1985


Does anybody have a 7850 card and have a problem where the fans spin up every 10 seconds or so when the card is in long idle (black screen)?

I have an MSI Twin Frozr IV 7850 that does this, and I'm not convinced getting a replacement is going to solve it (seems very much like the Radeon spin-up bug).

buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord
If I'm going to get a GTX 660, should I just fork over more for a 660 Ti then? I'm coming from an HD 5870.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Tab8715 posted:

Are the blower-style fans actually quieter than their standard counterparts?

No, they are louder as a general rule.

Coffee Jones
Jul 4, 2004

16 bit? Back when we was kids we only got a single bit on Christmas, as a treat
And we had to share it!

Factory Factory posted:

Some consoles already enjoy heterogeneous compute. The PS3, for example: the Cell processor is a big execution core (CPU-like) with a number of coprocessors (GPU-like) on a single die. And the PS3 is a real powerhouse for it, what with the US military building compute clusters out of them and all.
I think that sort of research has moved on to CUDA clusters at this point, what with Nvidia/ATI having actual support for this instead of PS3 Linux being an afterthought.


quote:

Right now, for a CPU to send data and commands to a GPU, that requires a context switch. That is, a thread must be halted and stored, the GPU thread must be read from cache (or worse, RAM) and resumed, and then data must be copied from CPU-addressed RAM (i.e. main system RAM) to the GPU's video RAM.
I think many games on the Xbox use a dedicated core for handling this.
Oh, so in a modern PC a GPU still has to go through the northbridge.

It'd be cool if a GPU could send commands to storage to retrieve textures to be sent to its VRAM.
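To put rough numbers on that overhead, here's a toy latency model. The 10 µs fixed cost and 8 GB/s bandwidth are illustrative guesses, not measurements of any real bus or driver:

```python
# Toy model of a CPU -> GPU copy: a fixed per-transfer overhead (driver
# call, context switch, DMA setup) plus a bandwidth-proportional term.
# The constants are illustrative guesses, not measured values.

def transfer_time_us(nbytes, overhead_us=10.0, gb_per_s=8.0):
    return overhead_us + nbytes / (gb_per_s * 1e3)  # bytes/(GB/s), in us

print(f"4 KiB:  {transfer_time_us(4 * 1024):.1f} us (mostly fixed overhead)")
print(f"64 MiB: {transfer_time_us(64 * 1024 * 1024):.1f} us (mostly bandwidth)")
```

Small uploads are dominated by the fixed cost, which is why batching commands and keeping textures resident in VRAM matters so much.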

Wozbo
Jul 5, 2010
FYI guys Gigabyte's OC Guru fucks with the blizzard launcher something fierce and causes it to crash.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Wozbo posted:

FYI guys Gigabyte's OC Guru fucks with the blizzard launcher something fierce and causes it to crash.
Yeah, I tried using it like two times, then switched to MSI Afterburner, and just yesterday switched to EVGA's Precision X, which is widely considered to be the best because its voltage control slider actually seems to work.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

wipeout posted:

Does anybody have a 7850 card and have a problem where the fans spin up every 10 seconds or so when the card is in long idle (black screen)?

I have an MSI Twin Frozr IV 7850 that does this, and I'm not convinced getting a replacement is going to solve it (seems very much like the Radeon spin-up bug).

I'm wondering if I'm having this issue with both of my Sapphire HD7850 cards. I have one in my main i5-2500K rig and one in my i3-2120 rig hooked up to my HDTV. After the screen goes black I hear the fans spin up for about a second or two every 10-20 seconds. If I set the monitor to never go into sleep mode it stops but then I have to manually turn off the monitor or my HDTV.

Both cards overclock to 1050MHz without issue so I can't complain too much. I just wonder why AMD never released a HD7850 GHz Edition but I guess that would decrease the value of the HD7870.

GRINDCORE MEGGIDO
Feb 28, 1985


spasticColon posted:

I'm wondering if I'm having this issue with both of my Sapphire HD7850 cards. I have one in my main i5-2500K rig and one in my i3-2120 rig hooked up to my HDTV. After the screen goes black I hear the fans spin up for about a second or two every 10-20 seconds. If I set the monitor to never go into sleep mode it stops but then I have to manually turn off the monitor or my HDTV.

Sounds like a similar thing.
I'm more worried about when I come to sell the card, if it ends up being returned.

Some guy on the MSI forum updated the video card BIOS and fixed it, but mine is apparently the latest and still does it (different model card to theirs). Might be worth seeing if yours has the latest BIOS, though?

If you ever fix it somehow, please let me know. I wonder whether, if I return it, the new card will do the same.

Mr.Hotkeys
Dec 27, 2008

you're just thinking too much
I have a GTX 570 and I'm trying to run a second monitor to watch Netflix, but when I start up my computer with the second one plugged in I don't see anything. The first monitor, my main one, is plugged in through HDMI and the second through DVI (through a converter to VGA). Does anyone have any idea what I need to do to get this to work?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Try the other DVI-I port, maybe? Sometimes only one has the analog pins hooked up, even though both look capable, because it saves a tenth of a cent per video card to order DVI-I connectors in bulk instead of both DVI-I and DVI-D ports.
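For reference, here's which DVI variants a passive VGA adapter can work on. This is general connector-spec behavior, not specific to any one card:

```python
# Which DVI variants carry the analog (VGA) pins. A passive DVI->VGA
# adapter only rewires pins, so it needs the analog set to be present.
DVI_HAS_ANALOG = {
    "DVI-I": True,   # integrated: digital + analog pins
    "DVI-D": False,  # digital only; passive VGA adapters can't work
    "DVI-A": True,   # analog only (rare)
}

def vga_adapter_works(port: str) -> bool:
    return DVI_HAS_ANALOG.get(port, False)

print(vga_adapter_works("DVI-I"))  # True
print(vga_adapter_works("DVI-D"))  # False
```

So a card can expose two identical-looking DVI ports where only the DVI-I one will drive a VGA monitor through a passive adapter.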

Mr.Hotkeys
Dec 27, 2008

you're just thinking too much
Problem is I can't, because the DVI-to-VGA converter is too huge and would block the HDMI port. But if that's the solution, I guess I'll have to sand it down or something.

Also, I should clarify: I don't get video from either monitor when I plug the second one into the DVI port. Don't know if that changes anything. Might just be the video card trying to stay in single-monitor mode and prioritizing the DVI port over the HDMI.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Might be a bum port. Also, buhhh? What does this converter look like? A DVI-I port only needs a tiny pin adapter for VGA.

Mr.Hotkeys
Dec 27, 2008

you're just thinking too much
Like your normal DVI adapter but the wide parts at the ends that give it that hourglass shape are wider and stylized enough for it to not work with the micro HDMI plug. It's honestly just a bunch of plastic I could sand off.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Ah. The ones I'm used to are much more svelte.

Mr.Hotkeys
Dec 27, 2008

you're just thinking too much
And cheap, apparently.



loving radioshack

texting my ex
Nov 15, 2008

I am no one
I cannot squat
It's in my blood
I haven't really followed hardware stuff since the 680 was released. Is there any date / real info about the next line, the GTX 700 / Radeon 8000 series?

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Skilleddk posted:

I haven't really followed hardware stuff since the 680 was released. Is there any date / real info about the next line, the GTX 700 / Radeon 8000 series?

Apparently there will be no 7xx until after March 2013, the first release being the 780.

Radeon 8xxx will be released after Computex in June. Apparently they will bring back the XT suffix for the 8970, and they will also make 8930s out of the crappier 8950 silicon. Both cards will also sport 3GB of memory by default.

However, they will finish them ahead of schedule, so they can release early in case Nvidia does something drastic or amazing.

Wozbo
Jul 5, 2010
Also, I believe the 8xx series from Nvidia will be the first to have its own dedicated mini-CPU for talking to the main CPU (if I'm not mistaken, this is to make things like context switching easier and enable the whole GPGPU thing).

GRINDCORE MEGGIDO
Feb 28, 1985


Wozbo posted:

Also, I believe the 8xx series from Nvidia will be the first to have its own dedicated mini-CPU for talking to the main CPU (if I'm not mistaken, this is to make things like context switching easier and enable the whole GPGPU thing).

Got any more info on that? Sounds interesting.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
AMD lifted the NDA on the GPU portion of the A10-5800K "Trinity" APU.

The NDA is still present on the CPU portion, and AMD isn't saying more on pricing than "about the same as a Core i3," so it's a bit difficult to interpret the results exactly. Nevertheless, that's some really healthy performance from the IGP, far higher than HD 4000. Unless the chip is adopted in volume, it probably won't have much impact on the desktop market, though - it performs a bit under the really-weak GeForce GT 640, suggesting that a Core i3 plus any worthwhile GPU (GT 650+ or HD 7750+) will still be a huge performance increase for not a lot more money.

Mobile market, now you're talking.

Wozbo
Jul 5, 2010
http://www.xbitlabs.com/news/cpu/display/20110119204601_Nvidia_Maxwell_Graphics_Processors_to_Have_Integrated_ARM_General_Purpose_Cores.html

This is the first article that came up (there are many more; look up the Maxwell architecture), but basically it's automating away all the stuff that you currently have to do for the GPU with the CPU, plus some nice things like preemption. I think they are also going to skip a fab step and go smaller, but I'm a bit too lazy to look right now. If this pans out the way they say it does, it's going to make something like 8K UHD resolution viable with Crysis <foo> going full blast. More likely it will be a 30-50% boost in the first generation, with the "tock" generation right after adding another 20-30% on top of normal gains as they figure out what to optimize. If I remember correctly, it's currently on track to be something like 8x the power of the current 6xx series, but not out till late 2014-2015. Would be cool to see prototypes in the new consoles, so we all get some awesome graphics for the next xxx years.
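Taking the post's own speculative figures at face value, quick arithmetic shows how far two compounded generational boosts fall short of the rumored 8x:

```python
import math

# Compounding the rumored gains from the post (speculative numbers,
# not anything Nvidia has announced):
first_gen = 1.50  # optimistic end of the "30-50%" first-generation boost
tock = 1.30       # optimistic end of the "+20-30%" follow-up

print(round(first_gen * tock, 2))  # 1.95x after two generations

# How many straight +50% generations would 8x actually take?
print(round(math.log(8) / math.log(first_gen), 1))  # ~5.1
```

So even the optimistic ends of both claims compound to under 2x; the 8x figure would need something like five +50% generations in a row.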

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Factory Factory posted:

AMD lifted the NDA on the GPU portion of the A10-5800K "Trinity" APU.

The NDA is still present on the CPU portion, and AMD isn't saying more on pricing than "about the same as a Core i3," so it's a bit difficult to interpret the results exactly. Nevertheless, that's some really healthy performance from the IGP, far higher than HD 4000. Unless the chip is adopted in volume, it probably won't have much impact on the desktop market, though - it performs a bit under the really-weak GeForce GT 640, suggesting that a Core i3 plus any worthwhile GPU (GT 650+ or HD 7750+) will still be a huge performance increase for not a lot more money.

Mobile market, now you're talking.

Take the HD 4000 numbers and multiply them by 1.5x, because that's at least where Haswell GT2 will stand, with CPU power and power management that AMD can only dream of. Suddenly things don't look so rosy on the AMD side when you can see the second-to-last bastion of AMD crumbling: iGPU performance. The last bastion being price.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Incredible. Suddenly everything seems possible.

syzygy86
Feb 1, 2008

Palladium posted:

Take the HD 4000 numbers and multiply them by 1.5x, because that's at least where Haswell GT2 will stand, with CPU power and power management that AMD can only dream of. Suddenly things don't look so rosy on the AMD side when you can see the second-to-last bastion of AMD crumbling: iGPU performance. The last bastion being price.

Sure Haswell will have better performance than the HD4000, but Haswell isn't due out until what, spring 2013? It's not really fair to compare a chip due out next month to what will be out in 6+ months.

The AMD Trinity chips seem like they'd be good for HTPC usage, and the mobile versions are competitive with the mobile HD4000 in games.

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Intel has had demo Haswell chips out for some time now; they've been at a couple of big shows. They're not anywhere near consumer-ready yet, but the chip exists and it works as well as they say it does, so the comparison is quite a bit more valid.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
The writing is on the wall: Intel is coming

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
Anandtech just put up their Alienware M18X R2 review, and if you're the kind of person who really, really wants SLI'd 680Ms then it's the machine for you. I didn't realize just how beefy those things are: they're essentially 670s with a lower clock, and there are two of them. In a laptop. :drat:

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

syzygy86 posted:

Sure Haswell will have better performance than the HD4000, but Haswell isn't due out until what, spring 2013? It's not really fair to compare a chip due out next month to what will be out in 6+ months.

The AMD Trinity chips seem like they'd be good for HTPC usage, and the mobile versions are competitive with the mobile HD4000 in games.

Trinity and Llano are overkill for strictly HTPC use, and the iGPU is laughably weak for gaming compared to a cheap 7750/7770. It does have a point in laptops, but AMD always has a very limited selection in this segment, and it's very possible to find a much more capable SB/IB + discrete GPU laptop at around the same price.

http://www.dailytech.com/IDF+2012+Haswell+GT2GT3+Processors+Run+Skyrim/article27656.htm

quote:

The GT3 is running "Skyrim" at 1920x1080 resolution with High settings, while the HD 4000 GPU next door is running the same game at the same frame rate, but at Medium settings and a 1366x768 resolution.

2x the pixels + High over Medium = more than 2x the performance over HD 4000, way beyond Trinity territory. If I were Nvidia, even I would be scared if Intel can keep pulling off 50%+ GPU performance every tick or tock.

I don't see why it isn't valid to compare Trinity to Haswell. It took AMD 1.5 years to release Trinity after Llano for a minor iGPU increase, which means we are very likely to be stuck with Trinity-level AMD performance for the next 1.5 years, too. Haswell will crush it like a bug till then.
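The "2x the pixels" figure from the quoted demo is easy to verify:

```python
# Pixel counts from the two demo setups in the quote:
px_gt3 = 1920 * 1080      # Haswell GT3 demo, High settings
px_hd4000 = 1366 * 768    # HD 4000 demo, Medium settings

print(round(px_gt3 / px_hd4000, 2))  # 1.98, so "2x the pixels" holds
```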

Palladium fucked around with this message at 07:28 on Sep 28, 2012

texting my ex
Nov 15, 2008

I am no one
I cannot squat
It's in my blood

Wozbo posted:

http://www.xbitlabs.com/news/cpu/display/20110119204601_Nvidia_Maxwell_Graphics_Processors_to_Have_Integrated_ARM_General_Purpose_Cores.html

This is the first article that came up (there are many more; look up the Maxwell architecture), but basically it's automating away all the stuff that you currently have to do for the GPU with the CPU, plus some nice things like preemption. I think they are also going to skip a fab step and go smaller, but I'm a bit too lazy to look right now. If this pans out the way they say it does, it's going to make something like 8K UHD resolution viable with Crysis <foo> going full blast. More likely it will be a 30-50% boost in the first generation, with the "tock" generation right after adding another 20-30% on top of normal gains as they figure out what to optimize. If I remember correctly, it's currently on track to be something like 8x the power of the current 6xx series, but not out till late 2014-2015. Would be cool to see prototypes in the new consoles, so we all get some awesome graphics for the next xxx years.

:stare: looks like I'm waiting. Hope my 5970 still has some juice left in it

syzygy86
Feb 1, 2008

Palladium posted:

Trinity and Llano are overkill for strictly HTPC use, and the iGPU is laughably weak for gaming compared to a cheap 7750/7770. It does have a point in laptops, but AMD always has a very limited selection in this segment, and it's very possible to find a much more capable SB/IB + discrete GPU laptop at around the same price.

Sure, but at this point I think it's only reasonable to consider gaming on integrated graphics in laptops, or in a limited fashion on desktops (which for me is the occasional game on the HTPC). In the mobile space, the Trinity chips that came out in May look competitive overall (generally good GPU, weaker CPU), but as you mention have had a limited impact on the market.

Palladium posted:

I don't see why it isn't valid to compare Trinity to Haswell. It took AMD 1.5 years to release Trinity after Llano for a minor iGPU increase, which means we are very likely to get stuck with Trinity level AMD performance for the next 1.5 years, too. Haswell will crush it like a bug till then.

I'm not trying to say that Haswell won't be a significant improvement, Intel has clearly demonstrated that, and I'm sure it'll only get better with driver improvements and such. All I really meant is if someone is looking to purchase something sooner, Trinity can be a valid option in certain use cases/price points while Haswell is too far out.

Farecoal
Oct 15, 2011

There he go
Edit: nvm, wrong thread

Kramjacks
Jul 5, 2007

So apparently MSI was overvolting its GTX 660 Ti and 670 Power Edition cards, which gave them performance gains but also caused some systems to fail to POST or get black screens after a change in load.

http://www.tomshardware.com/news/MSI-GTX-660-670-overvolting-PowerEdition,18013.html

Squibbles
Aug 24, 2000

Mwaha ha HA ha!

Kramjacks posted:

So apparently MSI was overvolting its GTX 660 Ti and 670 Power Edition cards, which gave them performance gains but also caused some systems to fail to POST or get black screens after a change in load.

http://www.tomshardware.com/news/MSI-GTX-660-670-overvolting-PowerEdition,18013.html

Ugh that combined with my experience with something similar on my 570 has convinced me never to buy MSI cards (3+ RMA replacements running my card at stock voltage/settings in the course of a year).

Rap Game Goku
Apr 2, 2008

Word to your moms, I came to drop spirit bombs


Squibbles posted:

Ugh that combined with my experience with something similar on my 570 has convinced me never to buy MSI cards (3+ RMA replacements running my card at stock voltage/settings in the course of a year).

Is there any brand that hasn't had some issue like this? ASUS?

InstantInfidel
Jan 9, 2010

BEST :10bux: I EVER SPENT
EVGA, ASUS, and Sapphire usually make top-notch products.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Jesus, that's insane. I can't believe MSI would pull some poo poo like that. I hope this is a one-time blip rather than something they'll be revisiting in the future; that's like XFX's bean-counter parts harvesting combined with Gigabyte's dangerously out-of-spec power delivery, rolled into one.

I guess "gently caress it, just overvolt it like crazy and drat the consequences" is one way to get past the hassle of carefully binning chips for given performance categories. :stare:

Sticking with EVGA; they've been good to me many times. No obfuscated overvolting, and my 680 is both appropriately binned (runs perfectly well in any torture test at stock settings) AND overclocks really well, too. I'll be extremely disappointed if I ever find EVGA's gone off the reservation like that, but I trust them not to.

Thing is, I'd have said that about MSI until today.

Agreed fucked around with this message at 21:08 on Oct 1, 2012


Squibbles
Aug 24, 2000

Mwaha ha HA ha!

Athenry posted:

Is there any brand that hasn't had some issue like this? ASUS?

From what I've read, I guess the MSI problem I had on my cards was actually happening on all 570s that used the Nvidia reference design. It was probably much worse in my case because the card was OC'd from the factory. Still, a poor decision by MSI to release a factory-overclocked card that was virtually guaranteed to fail. I eventually just bought an ASUS 570 and have had zero problems since.

I'm kinda surprised that there was never a recall or something on those cards, though. As far as I know they stopped selling them, and some manufacturers started releasing newer-edition cards with more VRMs or something that solved the problem, but those first-gen ones were just awful. Just look at the average reviews: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127552

Edit: Also, as for MSI support: my first card started getting artifacts and crashing in games after a month or two. I RMA'd it, and the replacement had artifacts/crashing right out of the gate. Their support team told me to overvolt it and even gave me a custom BIOS set to the higher voltage so I wouldn't have to run software to do so. That actually worked for several months before it started crashing again. The third one I didn't even try; I just gave it away to a co-worker, who has since informed me that it too developed the same problem. Not sure if he bothered with the RMA process.

Squibbles fucked around with this message at 21:12 on Oct 1, 2012
