Factory Factory
Mar 19, 2010

Hi, I'm Factory Factory, your co-OP for this thread. I was the technical writer to Movax's engineer. That's not a euphemism.

E: Quoting this:

movax posted:

Shadows



Factory Factory
Mar 19, 2010

Reserved so I have a place for news and content.

You may now poo poo up this thread.


Factory Factory
Mar 19, 2010

You don't have a PayPal receipt or copy of the auction page that you can dig up?

Factory Factory
Mar 19, 2010

Practically, it's 1) power savings and 2) CUDA/GPGPU performance. The consumer 600 series will just be worse at CUDA than the 500 series. Otherwise, you can divide the CUDA core count on a 600 series card by 2 and get a pretty good first approximation of how it relates to a 500-series card in GPU-bound scenarios.
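If numbers help, the heuristic is literally just this (a throwaway Python sketch; the core counts are from public spec sheets, and the mapping is only a rough first approximation, not a benchmark):

```python
# Throwaway sketch of the divide-by-two heuristic above. Core counts are
# from public spec sheets; the result is a rough first approximation only.

KEPLER_CORES = {
    "GeForce GTX 680": 1536,
    "GeForce GTX 670": 1344,
}

def fermi_equivalent(kepler_cores: int) -> int:
    """Approximate 500-series-equivalent CUDA core count for a Kepler card."""
    return kepler_cores // 2

for card, cores in KEPLER_CORES.items():
    print(f"{card}: {cores} Kepler cores ~= {fermi_equivalent(cores)} Fermi-equivalent")
```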

As for the purchasing questions: not only is there a price/performance chart linked at the top of the OP that will tell you everything you need to know, it's also quoted in the post directly above yours. But that said, SWSP said he wants to keep this thread more on info and less on buying assistance, so let's try to keep that kind of question in the system building sticky.

Nierbo posted:

Yes you're right, I just found it.
"Date Ordered: Monday 14 March, 2011"

Guess I'm forking out for a new card right? :(

They come with a two year warranty?


Factory Factory
Mar 19, 2010

HIS cards come with a two-year warranty. The confusion may just be a matter of miscounting the years.

Factory Factory
Mar 19, 2010


Verizian posted:

  • Why do drivers suck so much?

    Mainly an AMD thing from my experience with OpenGL loving up for idTech5 and the Windows 8 CP drivers requiring manual registry editing.
    Also heard some horror stories about nVidia drivers too but no first hand knowledge.

Everybody's drivers suck, just at different times. Intel's Windows drivers sucked until a few months ago; now they merely sip. Nvidia has constant problems, like a massive Shogun 2 performance bug at high resolutions and 3D Vision generally being a slightly bigger horror than AMD's stereoscopic 3D. AMD sucks at Rage and is riddled with small bugs and performance issues, which take them a bit longer than Nvidia to iron out because they don't give game devs a bunch of free hardware in return for marketing and early access for driver optimizations.

quote:

  • Aftermarket cooling and replacing stock TIM.

    Is the thermal goop really that bad and why shouldn't you just replace it with a dot of Arctic Silver?
    How do you pick a good aftermarket cooler apart from searching "Aftermarket cooling MSI 6870" then trawling opinionated blog and forum posts that often contradict each other?

With current gen cards, you will not apply better TIM than the manufacturer can, period. Get the cooler you want the first time around if you want TIM perfection.

If you have to have aftermarket cooling, Arctic Cooling makes pretty much the only good coolers.

quote:

  • Multiple displays and non-standard resolutions.

    Are there any problems with running multiple displays at different resolutions and using them for gaming or content creation? Do you still just multiply X*Y then add them together for each screen to get the total number of pixels?
    If so, for 8-10MP total display area would you aim for a single 7970, 680, 670 or is CF/SLI required after 6MP?

For gaming (Eyefinity and Surround), the monitors must have the same resolution. For non-full-screen multi-monitor, the resolutions may be different; however, if they are, most video cards will clock up to "low 3D" clocks, which raises idle noise and power consumption significantly. You can sum up height * width resolutions for how many megapixels your card is pushing out, but that's only relevant to workload for gaming.
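The megapixel arithmetic really is just summing width × height per panel; a trivial sketch (the resolutions are just the common examples from this discussion):

```python
# Minimal sketch of the megapixel arithmetic: sum width * height per panel.

def total_megapixels(monitors: list[tuple[int, int]]) -> float:
    """Total pixels pushed across all panels, in megapixels."""
    return sum(w * h for w, h in monitors) / 1_000_000

print(total_megapixels([(1920, 1080)] * 3))  # 6.2 MP: triple 1080p Surround
print(total_megapixels([(2560, 1440)] * 3))  # 11.1 MP: triple 1440p
```

So triple 1080p already puts you past the 6MP mark where CF/SLI starts looking attractive.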

Suggestions for full detail/AA are in post 2, after the model number rundown. AnandTech also benchmarks at 1920x1080x3 surround, and according to its numbers, if you are willing to drop AA and often some detail, you can get 40+ FPS out of a GeForce 680. For triple 2560x1440 or the like, you will definitely need more horsepower for a high-detail gaming experience.

E: Quote != edit

Factory Factory
Mar 19, 2010

AnandTech has a cool article up about using Thunderbolt and Virtu i-Mode together on a Wintel platform. Windows Thunderbolt isn't fully mature just yet (missing hot-add after boot, though it has hot-remove), but basically poo poo just works.

Factory Factory
Mar 19, 2010


hobbesmaster posted:

I doubt you'll see hot-swappable Thunderbolt much. eSATA isn't hot-swappable either, and desktop PCs can take the removal of hard drives better than PCIe cards.

It says right in the article that Intel is requiring hot-swap as part of the certification process, and it's just a matter of drivers. And eSATA is indeed hot-swappable, with the right drivers. You can also fudge it by forcing a hardware re-scan in Device Manager.

Factory Factory
Mar 19, 2010

There isn't a way to turn it off, currently.

Factory Factory
Mar 19, 2010


Longinus00 posted:

Why do you keep mentioning the nationality of the monitor? Why would that make any difference? Everything else that you've said is already covered in the OP.

"Korean monitors" is a catchall used in the monitor megathread for imported 27" IPS screens, factory seconds that didn't make it into 27" iMacs. There are at least four brands being imported that have not-very-memorable names. Except for Catleap. Hence they're being referred to by nationality.

Factory Factory
Mar 19, 2010


movax posted:

This is awesome, but guess who has a P67 :downs:

:smith::hf::smith:

--

Re: quiet GPUs, the biggest offender is definitely the blower-style fan. Blowers are very utilitarian and robust - they work even if you cram cards right next to each other in a case with poor ventilation.



Open-air coolers (i.e. without the tight-fitting shroud) generally provide lower temperatures and much lower noise characteristics because the fans can be larger, and the heatsinks can have more surface area. However, this comes at the cost of requiring better case ventilation, since air no longer moves over the card unidirectionally. Where a blower sucks air in from the front of the case and spews it out the back, an open-air cooler sprays hot air everywhere locally (though some gets ducted out the back slots).

The quietest, best-of-the-best coolers are triple-slot dealies, open-air designs which take up three expansion slots on the motherboard. These have a ton of room for heatsinks and high-end fans. Many high-end boards will space their CF/SLI-capable slots far enough apart to accept these cards.

One particularly quiet card that came up recently in the system building thread is the Asus GeForce 670 DirectCU II TOP. That fucker has a load fan noise of 25 dBA. It's the quietest high-performance video card I've ever seen, and it's quieter than any current blower-based reference design until you get down to the level of a Radeon 5670. TechPowerUp reviews.


Factory Factory
Mar 19, 2010

AnandTech will be doing a Q&A with a GPGPU sexpert, so submit some questions.

And by "sexpert," I mean he was a founder of Aegia (of PhysX fame), became one of Nvidia's top CUDA guys, and is now with AMD spearheading their hetereogeneous systems architecture, i.e. the seamless integration of highly-parallel cores (i.e. GPUs) with complex serial cores (i.e. CPUs).

Factory Factory
Mar 19, 2010

Mirror's Edge, too. Cloth, shattering glass, and partpickles.

https://www.youtube.com/watch?v=w0xRJt8rcmY

Factory Factory
Mar 19, 2010


Agreed posted:

EVGA is -KR'ing the crap out of the 600-series, too. Where's my lifetime warranty, EVGA? Come on! :mad:

Extended warranties will be things you can buy for any card. I think you can max it out at 10 years. Single retail SKU, simpler.

Factory Factory
Mar 19, 2010


Soul Glo posted:

Quick question, would a 1 GB Radeon HD 7570 be okay for games around Diablo 3 caliber?

Check it out yourself.

Factory Factory
Mar 19, 2010

At the dev level, sure. But when you're targeting a console with 512 MB of RAM for both game and graphics, not changing that behavior for other platforms is the :effort: thing to do.

Factory Factory
Mar 19, 2010

The 6770 is a VLIW5 part and 6850 is VLIW4. D3 might be one of those rare games that uses that fifth codepath and gets more mileage out of it than just having more SIMD cores.



Whatever the hell Special Functions is, maybe it's important?

Factory Factory
Mar 19, 2010


4 Day Weekend posted:

I just read the SLI bit in the OP. Just to clarify, they need the same model (ie GF110) only? Memory/clock speed doesn't affect compatibility?

BFE's mistaken. SLI requires both cards to have the same GPU (e.g. GF110), the same number of active SMs/SMXs (i.e. the same CUDA core count) and ROPs, the same memory bus width, and the same amount of memory.

Examples:
    GeForce 460 768MB + GeForce 460 (current SKU)
  • Same GPU: No (GF104 + GF114)
  • Same CUDA cores: Yes (336)
  • Same memory bus: Yes (192 bit)
  • Same memory amount: No (768MB + 1GB)
  • SLI: No
    GeForce 570 + GeForce 560 Ti-448
  • Same GPU: Yes (GF110)
  • Same CUDA cores: No (480 + 448)
  • Same memory bus: Yes (320 bit)
  • Same memory amount: Yes (1.25GB)
  • SLI: No
    GeForce 460 1024MB + GeForce 460 768MB
  • Same GPU: Yes (GF104)
  • Same CUDA cores: Yes (336)
  • Same memory bus: No (256 bit + 192 bit)
  • Same memory amount: No (1GB + 768MB)
  • SLI: No
    GeForce 580 1.5GB + GeForce 580 3GB
  • Same GPU: Yes (GF110)
  • Same CUDA cores: Yes (512)
  • Same memory bus: Yes (384 bit)
  • Same memory amount: No (1.5GB vs. 3GB)
  • SLI: No
You can use the Coolbits software package to soft-disable the extra resources of the better GPU until both cards sit at the lowest common denominator, but this is not an automatic thing like it is with AMD, and Nvidia does not recommend doing it. That's probably because Coolbits hasn't been updated since 2004.
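For the rule-followers, the matching requirements boil down to a four-field comparison. A hedged Python sketch (the Card fields and example specs are illustrative stand-ins pulled from the examples above):

```python
# Hedged sketch of the SLI matching rules above. The Card fields and the
# example specs are illustrative stand-ins taken from this post's examples.

from dataclasses import dataclass

@dataclass
class Card:
    gpu: str          # die, e.g. "GF110"
    cuda_cores: int   # active CUDA core count
    bus_bits: int     # memory bus width
    mem_mb: int       # memory amount

def sli_compatible(a: Card, b: Card) -> bool:
    """SLI needs the same die, core count, bus width, and memory amount."""
    return (a.gpu, a.cuda_cores, a.bus_bits, a.mem_mb) == \
           (b.gpu, b.cuda_cores, b.bus_bits, b.mem_mb)

gtx570      = Card("GF110", 480, 320, 1280)
gtx560ti448 = Card("GF110", 448, 320, 1280)
print(sli_compatible(gtx570, gtx560ti448))  # False: CUDA core counts differ
```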


Factory Factory
Mar 19, 2010

Holy poop on a stick. Nvidia announced Tesla hardware based on GK104 and GK110.

Full details aren't out yet, but it looks like the full GK110 die packs 2880 Kepler CUDA cores (1440 Fermi-equivalent) in 15 SMXs, eight times the number of FP64-enabled cores per SMX vs. GK104, triple the L2 cache vs. GK104, 32 simultaneous work queues at the card level (Hyper-Q, loosely analogous to hyperthreading on Intel processors), the ability for GPU threads to spawn additional GPU threads without CPU intervention, a 384-bit memory bus, and 7.1 billion loving transistors per chip.

Factory Factory
Mar 19, 2010

It might even run Crysis.

Factory Factory
Mar 19, 2010


Biggest human being Ever posted:

It's passively cooled too, looks like a good choice for a HTPC.

They're passively cooled as long as they're in a 110 dBA forced-air server enclosure, sure.

Factory Factory
Mar 19, 2010

Pretty sure that was sarcasm.

--

In other news, AnandTech will have an article on Nvidia GeForce Grid later today. It's, well, cloud-based gaming, like OnLive. The idea is to run PC games at full details in a server farm and pipe I/O to any device - console, PC, Mac, tablet, etc.

Factory Factory
Mar 19, 2010

Well, Intel's WiDi uses the QuickSync engine to compress the frame buffer to H.264. Ivy Bridge's QuickSync is fast enough to compress 1080p30 and push it over 802.11n. The idea is there and it works; it's just a matter of making it work over lower bandwidth.
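Some napkin math on why the compression step is the whole ballgame (the H.264 bitrate and real-world 802.11n throughput are assumptions, not measurements):

```python
# Napkin math: raw vs. compressed 1080p30. Bitrate and Wi-Fi numbers are
# ballpark assumptions, not measurements.

width, height, fps = 1920, 1080, 30
bytes_per_pixel = 4                   # uncompressed RGBA frame buffer

raw_mbps = width * height * bytes_per_pixel * fps * 8 / 1e6
print(f"Raw 1080p30: {raw_mbps:.0f} Mbps")  # ~1991 Mbps - no Wi-Fi can take that

h264_mbps = 15      # assumed decent-quality H.264 stream
wifi_n_mbps = 100   # assumed realistic 802.11n throughput
print(f"H.264 at {h264_mbps} Mbps fits in ~{wifi_n_mbps} Mbps of 802.11n")
```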

Factory Factory
Mar 19, 2010

Raster unit (Render OutPut unit / Raster Operations Pipeline); it's basically the final conversion stage between texture mapping/shaders and the final color value for a pixel. This was more important to know back when ROPs, texture units, and shaders all came in equal numbers on each GPU, which they no longer do.

AFAIK, no GPUs within one family change ROPs unless they change SM(X)/CUDA cores, too, so it's possible to get by without knowing anything about them, really.

Edit for your edit: No problems I can think of.

Factory Factory
Mar 19, 2010

DX11.1 isn't going to be a huge release; it's mostly behind-the-scenes stuff for performance and API integration. It will include stereoscopic 3D support, though, so, hypothetically, every game will get S3D without fiddly vendor-specific implementations.

Factory Factory
Mar 19, 2010

Plus I'm sure we'll get the same stilted and/or recycled motion capture we always do. Why can't more games use Euphoria?

Factory Factory
Mar 19, 2010

N.B. you have to order the GTX 555 version of the X51 to get a 330W power brick, which can juuuuuust handle a non-overclocked GeForce 670 with its 170W TDP/141W power target.
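Napkin math on that power budget (the brick wattage and GPU numbers are from above; the rest-of-system draw is a made-up placeholder, so treat the headroom figure as illustrative):

```python
# Napkin math for the 330W brick. GPU numbers are from the post above; the
# rest-of-system draw is an assumed placeholder, not a measurement.

BRICK_W = 330
GPU_TDP_W = 170          # GeForce 670 TDP (worst case; power target is 141W)
REST_OF_SYSTEM_W = 140   # assumption: CPU + board + drives under load

headroom = BRICK_W - (GPU_TDP_W + REST_OF_SYSTEM_W)
print(f"Headroom: {headroom} W")  # 20 W: "juuuuuust" handles it
```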

You might prefer checking out Star War Sex Parrot's posts in the system building thread about the SilverStone Sugo-based mini-ITX box he built; it's a similar volume to a console, though not a similar shape the way an X51 is.

--

E: Hey guess what! Nvidia is repackaging the lovely Fermi cards as GeForce 600 series! Again!

GeForce GT 610
  • GF119 (1 SM/48 Fermi core)
  • Formerly known as GeForce GT 520
  • Similar specs to GeForce GT 620 (OEM variant)
  • Outclassed by Intel HD 4000
GeForce GT 620
  • Either GF108 or GF117, who knows? 2 SM/96 Fermi core
  • Formerly known as GeForce GT 530 (OEM variant)
  • Not related to GeForce GT 620 (OEM variant), which has half the cores
  • lovely 64-bit memory bus
GeForce GT 630
  • GF108 (2 SM/96 Fermi core), DDR3 and GDDR5 variants
  • Formerly known as GeForce GT 440
  • Not related to GeForce GT 630 (OEM variant), a GK107 Kepler-based card
:downsbravo:

E2: The GeForce GT 610 costs $60 shipped at Newegg :wtc: That must be the same price-performance curve as the $110 Radeon 6450 with 2GB of VRAM.


Factory Factory
Mar 19, 2010

Lord, Metro 2033. I discovered - hey, not only do we have quote linking, but we've got reply drafts and character counters? - I discovered that the big performance hog on a Radeon 6850 CF setup was depth of field, of all things, and that the game ran very smoothly once I turned it off.

I also discovered that my graphics overclock is stable for Furmark but not for Metro. Good lord, that game. :gonk:

Whatever your video card, it will have the poo poo kicked out of it by Metro 2033.

Factory Factory
Mar 19, 2010

DX9: yes
DX10: yes
DX11 with advanced DoF: yes
DX11 without ADoF and decent framerates: nnnnnope.

Factory Factory
Mar 19, 2010

Yeah, well, I'm two-loops-of-Unigine-Heaven stable, too. Frickin' Metro.

Factory Factory
Mar 19, 2010

System building/parts picking thread is ^^^^ thataway.

Factory Factory
Mar 19, 2010

There is none; low-end Nvidia cards are terrible values. And this still isn't the parts-picking thread.

Factory Factory
Mar 19, 2010

A few modern games, too, like LA Noire. The facial animation system is basically a 30 FPS video normal map, so the engine locks its framerate in sync with that.

Also, this is neither here nor there, but every time I type "LA Noire" I very nearly typo "LA Norse," and I imagine the most wonderful Skyrim mashup.

Factory Factory
Mar 19, 2010

Why not go for full SSAA then? Render internally at 10x resolution and then downscale. :getin:

Factory Factory
Mar 19, 2010

2GB of VRAM doesn't seem to limit the 680 in SLI when working with 3x1920x1080, so I don't think a single 2560x1440 monitor will pose any VRAM issues whatsoever.

Factory Factory
Mar 19, 2010

The Folding@home (FaH) guys don't have a client that will fold on a 680 yet, so all of that is moot until there's a software update.

Factory Factory
Mar 19, 2010

They can by using Lucid Virtu. Otherwise they cannot, as there's no DisplayPort connection from the video card to the Thunderbolt adapter.

https://www.youtube.com/watch?v=O1t7Rc9qFgI

Factory Factory
Mar 19, 2010

AnandTech has a dual feature on an industrial PC and the "Cedar Trail" Intel Atom that runs it. Cedar Trail is a rarity among Intel chips - its IGP is not Intel-designed, but rather an IP block from PowerVR, the SGX 545. The GPU is branded as the GMA 3600/3650, depending on clock speed.

The PowerVR Series 5 (a.k.a. SGX) is one of those low-power-optimized architectures. It is entirely DirectX 10.1 compliant, but like Intel's HD Graphics GPUs, much is accomplished in efficient fixed-function hardware. It's used extensively in smartphones and tablets, such as the iPhone 4, first iPad, Palm Pre, BlackBerry Playbook, and Samsung Galaxy S. It runs at very low clockspeeds - the SGX 545 is the most powerful unit in Series 5, and its as-designed clockspeed is only 200 MHz. Intel runs the GPU at up to 650 MHz in Cedar Trail, however.


Block diagram of an SGX GPU

You might think it's odd that a phablet GPU is being put in a netbook platform. Well, this third-gen Atom core is identical to the first-gen Atom core, just clocked higher; it's not much faster than top-end phablet CPUs at this point, so why not give it graphics to match?


The Cedarview SoC, including the Cedar Trail CPU core and PowerVR GPU IP block

That second block diagram and other material promise a lot from this GPU and its associated hardware for this update to the Atom platform:
  • DirectX 10.1 support
  • Hardware-accelerated HD video decoding
  • Twice the performance of the previous-generation Atom's graphics
But guess what? None of that loving works.

Intel has provided the shittiest of drivers for the GMA 3650. The launch drivers had major problems with screen tearing and stuttering... on the Goddamn desktop. The GPU can't handle a Windows desktop, regardless of settings - any resolution, Aero on or off. And the update package for newer drivers hoses your OS install and prevents you from entering Windows at all. You have to flatten and reinstall if you're updating from the launch drivers.

Once you have those new drivers installed, things are improved... somewhat. You can now display a blank desktop properly, but if you get saucy and move a window, the system lags to hell. HD video decode? A solid "Almost." 720p YouTube works. 1080p YouTube stutters all over and drops frames. Netflix and Hulu are SD-only.

And did I mention? Windows 7 32-bit only. No 64-bit, no other OS, not even Linux.

Intel can write better drivers. The phone version of Cedar Trail, Medfield, has fantastic Android support, and HD 4000 is a paragon of functionality compared to the GMA 3650. But they have not written better drivers. That the GMA 36x0's drivers are in this state on a shipping Intel product in 2012 is just crazy.

Factory Factory
Mar 19, 2010

The chip is targeting industrial appliances and $200 netbooks, here. It's still unacceptable, but context, people, context. It's not like they're replacing their entire product line top to bottom with this stuff and forcing you to buy it.

$200 netbooks. That's the price new.


Factory Factory
Mar 19, 2010

It's DDR SDRAM. 6.008 "GHz" = 3.004 GHz * two transfers per clock.

GDDR5 is actually a bit more complex than that, but that's the gist of it.

Complex part: GDDR5 has two different clocks: command clock (CK) and write clock (WCK). For hypothetical 6 GHz GDDR5, CK runs at 1.5 GHz (1/4) and WCK runs at 3 GHz. CK and WCK are synchronized, so you can think of GDDR5 as being able to do four I/Os per command. Good for highly parallel, bandwidth-intensive workloads like graphics!
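To make the clock relationships concrete, here's the arithmetic for the hypothetical 6 GHz part (the 256-bit bus in the bandwidth line is just an example width, not any particular card):

```python
# The GDDR5 clock relationships above, for the hypothetical "6 GHz" part.

effective_gtps = 6.0           # marketed "6 GHz" = 6 GT/s per pin
wck_ghz = effective_gtps / 2   # data moves DDR on the write clock -> 3 GHz
ck_ghz = effective_gtps / 4    # command clock runs at a quarter rate -> 1.5 GHz
print(f"WCK = {wck_ghz} GHz, CK = {ck_ghz} GHz")

# Bandwidth for an example 256-bit bus at this data rate:
bus_bits = 256
print(f"{effective_gtps * bus_bits / 8:.0f} GB/s")  # 192 GB/s
```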
