Animal
Apr 8, 2003

I'm not giving my money to scalpers. My 560 Ti 448 is being a trooper at 1440p; the only game it can't play smoothly is Metro. I'm just playing the less demanding games in my backlog, and I have plenty.


Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~

Agreed posted:

Taking a lot of willpower not to just buy one right now but gently caress scalpers.
http://www.evga.com/products/prodlist.asp?family=All%20Graphics

The official eVGA shop tends to have some in stock for MSRP. Like right now it has the base GTX 670 for $399, though all the other 670 models are sold out.

Star War Sex Parrot
Oct 2, 2003

Newegg had a ton in stock just yesterday. I really don't think they're hard to find.

Star War Sex Parrot
Oct 2, 2003

poo poo, EVGA's is in stock right now on Newegg.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
The difference here between AMD and NVIDIA is that AMD's 7970 was pushed down a notch by NVIDIA's new cards, whereas NVIDIA cannibalised its own 680 sales with the 670. Still, it's good for those who waited a bit.

Josh Lyman
May 24, 2009


Factory Factory posted:

DX11.1 isn't going to be a huge release, mostly behind-the-scenes stuff for performance and API integration. It will include stereoscopic 3D support, though, so, hypothetically, every game will get S3D without fiddly vendor-specific implementations.
Speaking of which, has anyone tried 3D on the Nvidia cards with just red/blue glasses? I know they give them out for free at trade shows, but I was hoping some place locally might have them for like $3, and all the Blockbusters are closed :(.

Josh Lyman fucked around with this message at 14:45 on May 19, 2012

Aws
Dec 5, 2005
Anybody having issues with the 12.4 Catalyst? On my HD 5870 I've noticed two things. First, my CCC settings are completely ignored by games that previously didn't ignore them. I have 33 games installed and I tested on a fair chunk of them, and none of them have the settings applied. Second, my shadows are hosed in every game. If they're not flickering all the time then they do this.



Notice the striping. It's very visible on his face and neck. This isn't a lovely JPG; the BMP looks the same.

I've ruled out any hardware issues; everything runs as smoothly as it ever did, and I decided to Furmark while I was out for a few hours today. No issues.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That particular example is because of the way they do shadows in TW2 iirc. Happens on my GTX 580, too.

No idea what's up with the drivers not overriding correctly, that's such an off-and-on thing with both companies' control panels that I've abandoned the control panel entirely and just use nVidiaInspector to force things at a deeper level than the control panel.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Aws posted:

Anybody having issues with the 12.4 Catalyst? On my HD 5870 I've noticed two things. First, my CCC settings are completely ignored by games that previously didn't ignore them. I have 33 games installed and I tested on a fair chunk of them, and none of them have the settings applied.
Are you setting a profile for each game? If you don't then the default profile (which may not be your global settings) will apply.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM

Aws posted:

Anybody having issues with the 12.4 Catalyst? On my HD 5870 I've noticed two things. First, my CCC settings are completely ignored by games that previously didn't ignore them. I have 33 games installed and I tested on a fair chunk of them, and none of them have the settings applied. Second, my shadows are hosed in every game. If they're not flickering all the time then they do this.

*snip*

Notice the striping. It's very visible on his face and neck. This isn't a lovely JPG; the BMP looks the same.

I've ruled out any hardware issues; everything runs as smoothly as it ever did, and I decided to Furmark while I was out for a few hours today. No issues.

I showed this post to a friend of mine who works with Nvidia on driver development and testing, just to see what he said.
The hashing pattern on the TW2 screenshot is dithering done on the shadow map by the game to approximate soft edges. As for the shadow flickering, that could be a number of things and a short example video would be better for him to see what might be happening.

But if you get the artifacting in more than just TW2 it might be bad VRAM, since that usually results in fixed pattern artifacting.
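
To make the dithering explanation concrete, here's a minimal toy sketch (my own illustration in Python, not TW2's actual shader code): a fractional soft-shadow value gets turned into a per-pixel lit/unlit decision against a 4x4 Bayer threshold matrix, which produces exactly this kind of crosshatch until something downstream (adaptive AA, MSAA resolve, a blur pass) averages neighbouring pixels back together.

```python
# Toy illustration of ordered-dithered soft shadows (assumed technique,
# not TW2's real implementation).
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dithered_shadow(shadow_amount: float, x: int, y: int) -> int:
    """Return 1 (in shadow) or 0 (lit) for a pixel whose shadow-map lookup
    said it is `shadow_amount` (0.0..1.0) covered by shadow."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return 1 if shadow_amount > threshold else 0

# A 50%-shadowed penumbra comes out as a hatch pattern -- the striping on the
# character's face and neck -- until AA or a blur pass blends it away.
for y in range(4):
    print("".join("#" if dithered_shadow(0.5, x, y) else "." for x in range(16)))
```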

Aws
Dec 5, 2005

Agreed posted:

That particular example is because of the way they do shadows in TW2 iirc. Happens on my GTX 580, too.

No idea what's up with the drivers not overriding correctly, that's such an off-and-on thing with both companies' control panels that I've abandoned the control panel entirely and just use nVidiaInspector to force things at a deeper level than the control panel.
It generally worked consistently in the many years I've been running ATI/AMD cards. Some games respected the settings and some didn't. It's just strange that now it doesn't work in anything. I don't think it's TW2's rendering; I played it when it first came out and there were no issues then. It's in every single game, even ones that worked perfectly before. I've been playing Skyrim with no issues since release, and it's only with the 12.4 drivers that the shadows suddenly got hosed.

Alereon posted:

Are you setting a profile for each game? If you don't then the default profile (which may not be your global settings) will apply.
Nope. Just a single global profile.

Dotcom656 posted:

I showed this post to a friend of mine who works with Nvidia on driver development and testing, just to see what he said.
The hashing pattern on the TW2 screenshot is dithering done on the shadow map by the game to approximate soft edges. As for the shadow flickering, that could be a number of things and a short example video would be better for him to see what might be happening.

But if you get the artifacting in more than just TW2 it might be bad VRAM, since that usually results in fixed pattern artifacting.
You have some cool friends. The dithering is something that was usually fixable by setting Anti-Aliasing mode to Adaptive Multi-Sample from the default Multi-Sample in CCC. It's just that, as I said, that's completely ignored by everything now. Bad VRAM is kind of exciting though; I was putting off a 7970 because it's not financially responsible and I can't really afford it, but if my 5870 breaks, hey, it's a mandatory expense. A man's gotta have a GPU, doesn't he?

I'm going to downgrade to 12.3 to rule out bad VRAM.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Aws posted:

Nope. Just a single global profile.
Try saving a per-application profile and see if that makes a difference.

ijyt
Apr 10, 2012

Avoided blowing a load of money on a 680 and bought a second ASUS HD5850 for £80/$125. That should tide my hardware itch over long enough, at least until Haswell is released.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I thought it might be interesting to post a Geforce GTX 670 trip report from the perspective of a long-time ATI and then AMD user. I bought an EVGA GTX 670 2GB, non-superclocked version, which is basically a stock card with some minimal tweaks to the fan and the exhaust venting:

The first thing that struck me was that, out of the box, image quality is TERRIBLE. I noticed a lot of texture shimmering and thin lines were mangled badly by antialiasing. After spending a few minutes digging through the nVidia Control Panel to disable all their optimizations and Gamma Correction for Antialiasing, quality came up to what I was expecting. Coverage Sample AntiAliasing (CSAA) is loving amazing. 16X CSAA (4 geometry samples and 12 coverage samples) looks drat near flawless and isn't too much slower than 4X MSAA. I can play less demanding games in 32X CSAA (8 geometry samples and 24 coverage samples), which looks amazing. I can't wait for more demanding games to come out with support for TXAA.
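
For a rough sense of why 16X CSAA lands close to 4X MSAA in cost (the numbers below are my own back-of-envelope assumptions, not NVIDIA's published figures): both keep 4 full colour+depth samples per pixel, and the extra 12 coverage samples only store a few bits each.

```python
# Back-of-envelope framebuffer memory at 1080p; bytes-per-sample and
# bits-per-coverage-sample are rough assumptions for illustration only.
W, H = 1920, 1080
BYTES_COLOUR = 4          # RGBA8
BYTES_DEPTH = 4           # 24-bit depth + 8-bit stencil

def framebuffer_mib(stored_samples: int, coverage_only_samples: int = 0) -> float:
    full = stored_samples * (BYTES_COLOUR + BYTES_DEPTH)
    coverage = coverage_only_samples * 0.5   # assume ~4 bits per coverage sample
    return W * H * (full + coverage) / 2**20

print(f"4x MSAA : {framebuffer_mib(4):.0f} MiB")
print(f"16x CSAA: {framebuffer_mib(4, 12):.0f} MiB")   # only slightly larger
print(f"8x MSAA : {framebuffer_mib(8):.0f} MiB")       # what 8 full samples would cost
```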

Noise levels are a lot better than I was expecting, even when I maxed out the TDP. Definitely not something I'd notice with headphones, and I think it's actually quieter at idle than my Sapphire Radeon HD 4850 which had a reasonably quiet stock cooler. The blower had a bit of bearing whine when I first booted up after installing the card. I was worried that I might have to RMA it, but it stopped within about a minute and I haven't heard it since, so I think it's fine.

Overclocking is more difficult than it would seem at first due to Boost and the TDP cap. It's hard to test your overclock, as heavy load will hit the TDP cap before the boost cap, so it won't test the higher clockspeeds. This results in overclocks that pass torture tests just fine but fail under more moderate gaming loads, where there's TDP headroom for the card to boost higher. I've also found memory overclocking to have a larger performance impact than expected. Reviews are correct that it doesn't have much impact on average framerates, but it raises minimum framerates and makes valleys in framerate graphs shallower.
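
A toy model of why that happens (assumed numbers, not NVIDIA's actual boost algorithm): a power-hungry torture test hits the TDP cap well below the boost ceiling, so the top bins of your overclock never get exercised, while a lighter game has the headroom to run them.

```python
# Hypothetical model of boost-vs-TDP behaviour; all figures are assumptions.
BASE_CLOCK_MHZ = 915
BOOST_CEILING_MHZ = 1084      # base clock plus the offset the user dialed in
TDP_WATTS = 170

def effective_clock(load_watts_per_mhz: float) -> int:
    """Clock the card actually runs: the boost ceiling, unless the power that
    clock would draw exceeds the TDP cap, in which case it throttles down."""
    tdp_limited = int(TDP_WATTS / load_watts_per_mhz)
    return min(BOOST_CEILING_MHZ, max(BASE_CLOCK_MHZ, tdp_limited))

# Furmark-style load draws a lot of power per MHz, so it sits near base clock
# and never tests the overclocked top bins.
print("torture test:", effective_clock(0.185), "MHz")   # ~918 MHz
# A moderate game leaves TDP headroom, so the card runs the full (possibly
# unstable) boost ceiling -- and that's where the crash shows up.
print("typical game:", effective_clock(0.145), "MHz")   # 1084 MHz
```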

Metro 2033 is still a goddamned hog, though :(

Aws
Dec 5, 2005
Downgraded to 12.3. Issue is still there. I tried the per-application profile, and it doesn't change anything.

I'm stumped. There's not a whole lot for me to do. A year ago, I started writing down changes I make to anything on my computer (because I like to gently caress with things and I learned this is a really helpful thing to do) and this time it's really not my fault.

Oh well. I guess I'll wait and see what happens.

This is so exciting. Nothing has broken for so long; it takes me back to 2002 when my trusty 9700 would overheat in my lovely case and I couldn't afford anything better, so I had to keep the case open and blow a giant floor fan into my case. Ah, nostalgia.

Aws
Dec 5, 2005

Alereon posted:

I thought it might be interesting to post a Geforce GTX 670 trip report from the perspective of a long-time ATI and then AMD user.
As a longtime ATI/AMD user, how would you compare nVidia's control panel with AMD's?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Aws posted:

As a longtime ATI/AMD user, how would you compare nVidia's control panel with AMD's?
In terms of managing settings, especially per-game settings, nVidia's is far better. I have noticed more UI bugs, things like the control panel not knowing which setting I'm hovered over, linked settings not updating properly, and other weirdness in nVidia's control panel, but I am using a beta driver (301.34). AMD's panel scaling options seem better and more effective, even though they do require you to be in a non-native resolution to expose them. I haven't done much with the other settings, though the Digital Vibrance option in Desktop Color Settings was an easy and effective way to make my older secondary display look less washed out.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM
Anyone else having hard locking issues in Metro 2033 with a GTX 670? I'm running 301.34 drivers, and about 30 seconds after starting the game (or rather 10 seconds after all splash screens end) the game just hard locks. Everything freezes and I can't move my mouse cursor. I can bring up task manager and end the process and that's about it.

EDIT: Just pulled up Metro again (I didn't end the process this second time) and it's working now. Not sure what that's all about.

EDIT 2: Restarted the game to apply some graphics changes and it's still acting weird. Maybe it hates my extended desktop?

Dotcom656 fucked around with this message at 23:12 on May 19, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

In terms of managing settings, especially per-game settings, nVidia's is far better. I have noticed more UI bugs, things like the control panel not knowing which setting I'm hovered over, linked settings not updating properly, and other weirdness in nVidia's control panel, but I am using a beta driver (301.34). AMD's panel scaling options seem better and more effective, even though they do require you to be in a non-native resolution to expose them. I haven't done much with the other settings, though the Digital Vibrance option in Desktop Color Settings was an easy and effective way to make my older secondary display look less washed out.

You have to get nVidiaInspector. It allows for much more robust management of game profiles, including the ability to override certain flags and thus enable different types of AA, or allow SSAO, etc., in games that wouldn't support them with stock settings. It also gives you access to every nVidia AA mode, including supersampling AA as well as sparse-grid supersampling options that you can adjust to taste for the perfect balance of performance and incredible appearance.

On modern games on my GTX 580 I usually run, at 1080p, 8xCSAA + 2x or 4x Sparse Grid supersampling, and it is image quality heaven. But you can do a lot of things there, and while it does assume some knowledge on the part of the user, it gives you a great deal of power to tweak your games and adjust profiles... Very powerful tool; I get so much out of my card thanks to it, and I can only imagine with a 670/680 you'd get so much more.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM
Does anyone else with a GTX 670 get terrible performance with Diablo 3 at 2560x1440? I was thinking there was something wrong with my PC at first because Fraps wouldn't open at all. (It turns out you have to change your PC date to May 17th to make it work.)

But this coupled with Metro hard locking is making me wonder if something is broken on my end.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Dotcom656 posted:

Anyone else having hard locking issues in Metro 2033 with a GTX 670? I'm running 301.34 drivers, and about 30 seconds after starting the game (or rather 10 seconds after all splash screens end) the game just hard locks. Everything freezes and I can't move my mouse cursor. I can bring up task manager and end the process and that's about it.

EDIT: Just pulled up Metro again (I didn't end the process this second time) and it's working now. Not sure what that's all about.

EDIT 2: Restarted the game to apply some graphics changes and it's still acting weird. Maybe it hates my extended desktop?
What card do you have? That sounds like the issue the EVGA GTX 670 SuperClocked cards were recalled for. That's exactly what happens to me when I have the card overclocked too high.

Agreed posted:

You have to get nVidiaInspector. It allows for much more robust management of game profiles, including the ability to override certain flags and thus enable different types of AA, or allow SSAO, etc., in games that wouldn't support them with stock settings. It also gives you access to every nVidia AA mode, including supersampling AA as well as sparse-grid supersampling options that you can adjust to taste for the perfect balance of performance and incredible appearance.
Cool, thanks, I'll check it out.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM

Alereon posted:

What card do you have? That sounds like the issue the EVGA GTX 670 SuperClocked cards were recalled for. That's exactly what happens to me when I have the card overclocked too high.
Cool, thanks, I'll check it out.

MSI reference GTX 670 with a slight factory overclock.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127675
That's the exact card, if you want a specific link.

Star War Sex Parrot
Oct 2, 2003

Alereon, what settings did you fiddle with? I notice the texture shimmering on my 680 and wouldn't mind cleaning that up a bit.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

For one, set the basic setting to High Quality on the Performance<------->Quality slider to disable image-compromising optimizations. Also, consider manually using nVidiaInspector to use at least 2xSGSSAA (below the regular AA options, and not the same thing as transparency Multisampling, the one below it) - it really helps with shimmer, especially in deferred rendering engines, in my experience.

Kinkajou
Jan 6, 2004

So I'm thinking of buying a PC as my main gaming/emulator/media center device and I really like the look and size of the Alienware x51. Only thing I'm worried about is the upgrade potential for the GPU in a year or two. With UE4 right around the corner, I don't want to lock myself out by buying now. I don't know a whole lot about GPU trends, but should we expect midsized/midtier versions of the 680 technology in the next year?

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM

Kinkajou posted:

So I'm thinking of buying a PC as my main gaming/emulator/media center device and I really like the look and size of the Alienware x51. Only thing I'm worried about is the upgrade potential for the GPU in a year or two. With UE4 right around the corner, I don't want to lock myself out by buying now. I don't know a whole lot about GPU trends, but should we expect midsized/midtier versions of the 680 technology in the next year?

You can get a lot more bang for your buck by building your own PC, but to answer your question: the GTX 660 should be here in the next 3 months, and that's the mainstream performance card.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
N.B. you have to order the GTX 555 version of the X51 to get a 330W power brick, which can juuuuuust handle a non-overclocked GeForce 670, which has a 170W TDP/141W PowerTune target.
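
Rough budget arithmetic behind that "juuuuuust" (the CPU, board, and conversion figures below are my own assumptions, not Dell's published specs):

```python
# Hypothetical X51 power budget with a stock GTX 670; all non-GPU numbers assumed.
BRICK_WATTS = 330
GPU_TDP = 170            # GTX 670 board power limit (from the post above)
CPU_TDP = 77             # e.g. an Ivy Bridge quad-core (assumed)
BOARD_DRIVES_FANS = 40   # motherboard, RAM, drives, fans (rough estimate)
DC_DC_EFFICIENCY = 0.87  # assumed losses in the system's internal conversion

load_on_brick = (GPU_TDP + CPU_TDP + BOARD_DRIVES_FANS) / DC_DC_EFFICIENCY
print(f"estimated worst-case load: {load_on_brick:.0f}W of {BRICK_WATTS}W")
# ~330W: essentially zero headroom, which is why overclocking or a raised
# power target is off the table on the stock brick.
```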

You might prefer checking out Star War Sex Parrot's posts in the system building thread about the SilverStone Sugo-based mini-ITX box he built; it's a similar volume to a console, though not a similar shape the way an X51 is.

--

E: Hey guess what! Nvidia is repackaging the lovely Fermi cards as GeForce 600 series! Again!

GeForce GT 610
  • GF119 (1 SM/48 Fermi core)
  • Formerly known as GeForce GT 520
  • Similar specs to GeForce GT 620 (OEM variant)
  • Outclassed by Intel HD 4000
GeForce GT 620
  • Either GF108 or GF117, who knows? 2 SM/96 Fermi core
  • Formerly known as GeForce GT 530 (OEM variant)
  • Not related to GeForce GT 620 (OEM variant), which has half the cores
  • lovely 64-bit memory bus
GeForce GT 630
  • GF108 (2 SM/96 Fermi core), DDR3 and GDDR5 variants
  • Formerly known as GeForce GT 440
  • Not related to GeForce GT 630 (OEM variant), a GK107 Kepler-based card
:downsbravo:

E2: The GeForce GT 610 costs $60 shipped at Newegg :wtc: It must be on the same price-performance curve as the $110 Radeon 6450 with 2GB of VRAM.

Factory Factory fucked around with this message at 04:34 on May 20, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
What the gently caress is even the point of releasing this mess? To rip people off when they could be using Intel HD graphics for free?

Berk Berkly
Apr 9, 2009

by zen death robot

HalloKitty posted:

What the gently caress is even the point of releasing this mess? To rip people off when they could be using Intel HD graphics for free?

Yes?

This is a marketing change. Marketing isn't about telling consumers what their most prudent or efficient choice is. It's about offering as many temptations as possible to fish money out of them, if not actively persuading them in the absence of, or in the face of, more economically sound wisdom.

is that good
Apr 14, 2012
It's for people who want to let their old beige boxes play HD video, supposedly. Though more generally I'd say it's to clear out old stock.

Muslim Wookie
Jul 6, 2005
Bought a 670 last week, have yet to play a game with it.

Hopefully I'll get a chance tonight and I'll have a look at nVidiaInspector too. Metro 2033 sounds like the go-to test game?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Lord, Metro 2033. I discovered - hey, not only do we have quote linking, but we've got reply drafts and character counters? - I discovered that the big performance hog for a Radeon 6850 CF setup was depth of field, of all things, and that the game ran very smoothly once I turned that off.

I also discovered that my graphics overclock is stable for Furmark but not for Metro. Good lord, that game. :gonk:

Whatever your video card, it will have the poo poo kicked out of it by Metro 2033.

Kramjacks
Jul 5, 2007

Factory Factory posted:

Lord, Metro 2033. I discovered - hey, not only do we have quote linking, but we've got reply drafts and character counters? - I discovered that the big performance hog for a Radeon 6850 CF setup was depth of field, of all things, and that the game ran very smoothly once I turned that off.

I also discovered that my graphics overclock is stable for Furmark but not for Metro. Good lord, that game. :gonk:

Whatever your video card, it will have the poo poo kicked out of it by Metro 2033.
I wonder what kind of hardware requirements Metro: Last Light will have. Maybe it will be like going from Crysis to Crysis 2, where there was much better optimization.
https://www.youtube.com/watch?v=iCVREJyyZWA

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

I also discovered that my graphics overclock is stable for Furmark but not for Metro. Good lord, that game. :gonk:

Have you not done the fun stuff of going through and seeing whether it's stable for DX9, DX10, and DX11 at the same clocks? That's where it gets good! And by good of course I mean :suicide:

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
DX9: yes
DX10: yes
DX11 with advanced DoF: yes
DX11 without ADoF and decent framerates: nnnnnope.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That's half the fun of overclocking GPUs, since every game engine is basically a huge quagmire of hacks that skirt or just outright ignore some pretty significant standards, and drivers get released all the time specifically to address those hacked-in elements, sometimes grabbing ludicrous performance gains out of games. E.g. there was something wrong with Skyrim's "Gamebryo PLUS" shadows on nVidia's Fermi architecture (especially higher-end cards, where you'd have the horsepower to turn 'em up in the first place), and especially indoors for some reason - a new driver gave between 30% and 40% performance improvement. That's hilarious, isn't it? Pre-driver, you might as well be using a GTX 280; post-driver, oh, that's where my performance ought to be.

And since they're all special flowers, the way they ask your card to behave can be pretty dramatically different. For example, I'm stable at 920mhz in DX10 in Metro 2033, or at 925mhz in DX11 in Crysis 2 (930mhz in DX11 Crysis 2!), or 920mhz in DX10 or DX11 in S.T.A.L.K.E.R. CoP with Atmosfear 3 and Absolute Nature plus third-party shaders... I can crank it up to 960mhz for some hot texture transcoding action in RAGE without any crashing or artifacts (edit: so we're clear, I'm pretty sure memory bandwidth has a lot to do with texture transcoding speed as well, so don't ignore that if you like RAGE and are still playing it despite the sad lack of continued support from id), and ~950mhz or so for S.T.A.L.K.E.R. SHoC in DX9.

Lovely stuff, engines. Unreal 4 looks really cool, but I'm up idtech's rear end because I feel like they have a ton of creativity and they were (at least pre-layoffs) working really closely with FXAA development to try to bring crazy good image quality potentially even to current-gen consoles. Identifying the softness and figuring out how to do a "free" MSAA pass to resolve leftover aliasing... Cool stuff. But I'm sure development will continue regardless. MLAA's been coming along too; AMD/ATI may have been one-upped, but at least they didn't just sit around moping about it.

Agreed fucked around with this message at 11:57 on May 21, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

I also discovered that my graphics overclock is stable for Furmark but not for Metro. Good lord, that game. :gonk:
That's expected; when running Furmark the GPU spends all its time being throttled to stay under the TDP cap, so you're not actually testing the overclock much.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

3Dmark11 and UNIGINE Heaven are the real stability testing tools for overclocking cards because between the two of them, they actually manage to pretty much do everything in DX10/DX11 in ways that games MIGHT ACTUALLY USE (holy poo poo, a relevant 3Dmark?? Where did the good old days of pointless e-peen go? Don't worry, it's not perfect, it's just a hell of a lot more useful than it used to be).

I don't even bother with Furmark or OCCT or EVGA OC Scanner X - a bunch of total bollocks that will, as Alereon rightly points out, simply show you the power throttling safety features of your card unless you're using modified firmware (and hopefully after-market cooling, because otherwise you're going to cook your card or cards).

Fire up 3Dmark11 for a few runs on Performance mode for quickest results there - it's a bit of a chore since you have to do it manually, but if you can make it through the whole run ten times, you're *probably* stable.

Or, let UNIGINE Heaven 3.0 go on its merry loop and you'll find out real quick what kind of stability your overclock has. Heaven is my favorite: they've consistently sorta one-upped Futuremark for relevance despite it being a freeware product; there's no manual dicking around required - it auto-loops through a scene where every hard camera change is testing/showing off something new and DX11, so feel free to just let it run for a while, and if you come back to a driver crash you know to reduce clocks; and you can monitor it to see if you've got shader artifacts, geometry issues, etc., and help narrow down what isn't working right.

Of course then you go play fancy games and it turns out that you need to dial clocks back some more somewhere because stress test utilities are only useful to a point... It's just a further point than Furmark, I guess.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Yeah, well, I'm two-loops-of-Unigine-Heaven stable, too. Frickin' Metro.


movax
Aug 30, 2008

Linux 3.4 dropped today; its relevance to this thread is that it brings some support for newer GPUs.

quote:

1.2. GPU drivers
1.2.1. GPU: Early support of Nvidia GeForce 600 'Kepler'
Nvidia announced new Kepler GPUs (GeForce 600 Series) on 22 March, and that was the day the Nouveau team asked to get basic modesetting support (no 3D, etc) for it merged in the main kernel. A quote from a Nouveau developer: "Its quite amazing that nouveau can support a GPU on its launch day even if its just unaccelerated modesetting". External firmware and updated graphic software stack are required. Code: (commit)

The Nouveau driver has also been "unstaged" and now it's considered ready for widespread use.

1.2.2. GPU: Support for AMD Radeon 7xxx and Trinity APU series
The newest GPU and APUs from AMD (Radeon 7xxx and Trinity APU series) are supported in this version. Code: (commit)

1.2.3. GPU: Support of Intel Medfield graphics
This release adds experimental support for the GMA500 Medfield graphics. Medfield is a embedded architecture targeted for smartphones. Code: (commit)

LKML
Human Readable
