Wozbo
Jul 5, 2010
I think it's really going to be "Jaguar + &lt;foo&gt;", where foo might be instruction sets etc. that are more tailored to the logic we see in games, which would be a lot easier than "3570K + &lt;foo&gt;." Just a hunch.

roadhead
Dec 25, 2001

wipeout posted:

I wonder what priorities made them choose the Jaguar approach.

AMD falling all over themselves to do it cheaper - and having APUs available for prototype/dev kits immediately? Just wild speculation on my part.

movax
Aug 30, 2008

roadhead posted:

AMD falling all over themselves to do it cheaper - and having APUs available for prototype/dev kits immediately? Just wild speculation on my part.

I guess AMD would be a one stop shop for your CPU & GPU needs as well.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
At the very worst this is good for AMD and thus everyone else.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Space Gopher posted:

Well, for one, the current generation of consoles has a hard time running at native 1080p. The standard trick is to render the 3D scene at 720p (or sometimes even less!), scale it to 1080p, and then put 2D UI stuff over the top of that. There's plenty of room to play with GPGPU processing; neither current console supports it. And, in the rest of the system, more RAM means nicer textures, larger levels, and so forth. A faster CPU (and that GPGPU integration) means more sophisticated procedural animation and more complex gameplay. There's plenty of room for improvement.

I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game? They are going to have to do more to the next gen console line than stuff more HP under the hood. If the increased processing power doesn't translate to anything other than traditional gameplay with more spit and polish, I don't think the uptake is going to be all that swift.
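
For a rough sense of why the sub-native rendering trick Space Gopher describes above saves so much GPU time, the pixel counts alone tell the story. A quick back-of-the-envelope sketch (plain Python arithmetic, nothing engine-specific):

```python
# Rough pixel math behind rendering at 720p and upscaling to 1080p:
# per-pixel shading cost scales, to a first approximation, with pixels rendered.
native = 1920 * 1080    # 2,073,600 pixels at 1080p
internal = 1280 * 720   #   921,600 pixels at 720p
print(f"1080p has {native / internal:.2f}x the pixels of 720p")  # ~2.25x
# So rendering the 3D scene at 720p cuts fill/shading work by more than half,
# while the 2D UI can still be composited at full 1080p on top.
```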

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

bull3964 posted:

I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game?

Pretty much all the loving time on PC, and probably even consoles. Skyrim comes to mind.

SocketSeven
Dec 5, 2012
But when was the last time visuals have really improved on a game?

Even with high-quality texture packs and a kick rear end PC, Skyrim still leaves a lot to be desired. The biggest thing I can see Skyrim in particular using is proper fluid dynamics, so we could have actual flowing rivers and falling waterfalls. That would be truly revolutionary.

Everything we have now is about more polygons, more anti-aliasing tricks, reflections, HDR and crap like that, and with all AAA titles tied to the requirements of 7-year-old consoles (not to mention standard PC hardware specs instead of top-of-the-line), we don't see the ultra-high-poly models and lighting that we want.

For god's sake, we can still barely get games where you leave footprints or get WET when you fall in water. Plants are still sprites.

RAGE had an interesting idea with their megatexturing thing, but in practice it didn't work out.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

SocketSeven posted:

But when was the last time visuals have really improved on a game?

When they weren't forced to downscale the totality of their engine to work on archaic consoles with 350MB of RAM to work with between CPU and GPU?

You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around. :psyduck:

Shaocaholica
Oct 29, 2002

Fig. 5E
How is Nvidia Optimus supposed to work for apps that detect GPUs? Is there a chance an app won't find the discrete GPU because it's off?

I recently downloaded the trial of PhotoZoom Pro 5, which is supposed to be GPU-accelerated. I set it to use the discrete GPU in the Nvidia settings as well. When I run the app, it does an optimization step, which I'm assuming is compiling OpenCL for the target GPU.

Well, when I run the app on an image, the GPU load on the Nvidia GPU (620M) is zero. I'm wondering if it can't find the 620M for whatever reason.
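
One way to check whether the app even has a chance of seeing the 620M is to list what the OpenCL runtime itself exposes. A minimal sketch using pyopencl (my choice of tool, not anything PhotoZoom uses); if the GeForce never shows up here while Optimus has it powered down, no control panel setting will help the app find it:

```python
# List every OpenCL platform and device the runtime currently exposes.
import pyopencl as cl  # assumes pyopencl is installed

for platform in cl.get_platforms():
    print("Platform:", platform.name)
    for dev in platform.get_devices():
        mib = dev.global_mem_size // (1024 * 1024)
        print(f"  Device: {dev.name} ({dev.vendor}), {mib} MiB global memory")
```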

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Skyrim is a pretty awesome example of just what fans can do graphically for a game when it isn't bottlenecked by console hardware, let alone what the game developers themselves could do.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Jan posted:

Pretty much all the loving time on PC, and probably even consoles. Skyrim comes to mind.

Most commercials for games involve elaborate pre-rendered scenes of scripted events. Real gameplay and graphics hardly EVER make it into any sort of marketing materials.

About the only game in 2012 where I remember graphics being any part of the marketing was Far Cry 3, touting its lush environments.

quote:

You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around. :psyduck:

It's not that so much as "Are the vast majority of mainstream gamers going to drop $500+ for a system and accessories for improved graphics?"

I get that PC gamers are frustrated at the bottleneck that consoles tend to place on games and that SO much more could be done with modern hardware. I just don't know that it's a hugely marketable feature, nor something that's going to greatly increase profits for the game studios.

bull3964 fucked around with this message at 19:43 on Jan 23, 2013

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Jan posted:

When they weren't forced to downscale the totality of their engine to work on archaic consoles with 350MB of RAM to work with between CPU and GPU?

You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around. :psyduck:

Yeah, that's pretty much it. High-end game development is so expensive now that the only people with a reason to sink that kind of cash into single-platform development, rather than maximize their target audience, are console manufacturers looking to grow market share through exclusives. And multiplatform development means an end result defined by the weakest link. The upshot is a market where technological advance has been hampered by a console generation lasting so long, with developers having to trim things back to fit, sometimes unsuccessfully, as seen in the Skyrim DLC.

Shaocaholica
Oct 29, 2002

Fig. 5E

Shaocaholica posted:

How is Nvidia Optimus supposed to work for apps that detect GPUs? Is there a chance an app won't find the discrete GPU because it's off?

I recently downloaded the trial of PhotoZoom Pro 5, which is supposed to be GPU-accelerated. I set it to use the discrete GPU in the Nvidia settings as well. When I run the app, it does an optimization step, which I'm assuming is compiling OpenCL for the target GPU.

Well, when I run the app on an image, the GPU load on the Nvidia GPU (620M) is zero. I'm wondering if it can't find the 620M for whatever reason.

OK, so maybe this is Optimus trying to be too smart. I have my app set up to use Optimus, but when I start the app the Optimus systray notifier says there are no apps using the discrete GPU.

Do apps need to register themselves somehow in order for Optimus to assign the discrete GPU to them? Maybe the app isn't coded for Optimus compatibility?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I'm terribad at Optimus because I have literally never used laptop switching graphics except once for a week with an Asus laptop I returned because they weren't working... BUT...

I believe that Optimus keeps track of which GPU to use via the driver itself. If you right-click on an icon, it should present the option to run on a particular GPU. Absent that, the Nvidia control panel will let you add the executable to the list of "always run with dGPU" programs.

delta534
Sep 2, 2011

Factory Factory posted:

I'm terribad at Optimus because I have literally never used laptop switching graphics except once for a week with an Asus laptop I returned because they weren't working... BUT...

I believe that Optimus keeps track of which GPU to use via the driver itself. If you right-click on an icon, it should present the option to run on a particular GPU. Absent that, the Nvidia control panel will let you add the executable to the list of "always run with dGPU" programs.

It sounds like he has done that, but I think the issue is a flaw in Optimus. I know switching does not yet trigger on OpenGL and OpenCL calls, and since the program does not do any rendering, I don't think forcing it will work; Optimus is trying to be smart and save battery life. I also think the Intel HD 4000's OpenCL performance is close enough to (or better than) the 620M's that it's judged better to run it on the HD 4000.
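
If the vendor does ship a manual GPU picker, under the hood it presumably amounts to choosing the OpenCL device explicitly instead of taking whatever the runtime lists first. A hedged sketch of that idea in pyopencl (the vendor-matching string and fallback logic are my own assumptions, not anything the app actually does):

```python
import pyopencl as cl

def pick_device(preferred="NVIDIA"):
    """Return the first device whose name/vendor matches, else the first device found."""
    fallback = None
    for platform in cl.get_platforms():
        for dev in platform.get_devices():
            fallback = fallback or dev
            if preferred.lower() in (dev.name + " " + dev.vendor).lower():
                return dev
    return fallback

dev = pick_device()
ctx = cl.Context(devices=[dev])  # kernels built on this context target only this device
print("Compiling/running OpenCL on:", dev.name)
```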

Shaocaholica
Oct 29, 2002

Fig. 5E
OpenCL on a 620M would be similar to HD4000? What about the dedicated memory of the 620M(1G)?

Anyway, the company rep says that they will have a new build of their app soon which will have a manual picker for the GPU to use and to compile OpenCL against.

Edit:

Seems like the 620M should be considerably faster than the HD4000 (3517U)

http://clbenchmark.com/compare.jsp?config_0=13283286&config_1=13671906

Shaocaholica fucked around with this message at 00:26 on Jan 24, 2013

delta534
Sep 2, 2011

Shaocaholica posted:

OpenCL on a 620M would be similar to HD4000? What about the dedicated memory of the 620M(1G)?

Anyway, the company rep says that they will have a new build of their app soon which will have a manual picker for the GPU to use and to compile OpenCL against.

Edit:

Seems like the 620M should be considerably faster than the HD4000 (3517U)

http://clbenchmark.com/compare.jsp?config_0=13283286&config_1=13671906



That's a comparison against just the CPU, but the CPU being an ultra-low-voltage version could put the 620M ahead.

Here is the 620M vs. the HD 4000; I'm not sure if it's the ULV version or not.
http://clbenchmark.com/compare.jsp?config_0=13283286&config_1=11977159

Right now, dedicated DDR3 memory doesn't make as much of a difference anymore, given the large L3 cache the HD 4000 has access to, and main memory is fairly fast DDR3 as well.

Shaocaholica
Oct 29, 2002

Fig. 5E
Oh interesting, thanks. Still, I think the ULV part may be a big factor like you said. I'll test both manually and report.

Edit: Well, I ran my own tests using clbenchmark and the results are mixed. The 620M is way better in one test, while the ULV HD 4000 is way better in another. The rest are mixed but very close. Very interesting, considering that the 620M will smoke the HD 4000 in anything that's a game.

Shaocaholica fucked around with this message at 02:33 on Jan 24, 2013

Wozbo
Jul 5, 2010
Walp guys, the "leaked" info was fake:

http://x-surface.tumblr.com/post/41282771026/x-surface-dont-believe-everything-you-read

Shaocaholica
Oct 29, 2002

Fig. 5E

That reminds me that I haven't been watching the news on next gen consoles for a while now. Are there any leaks that are confirmed or any actual statements? Are we going to finally get 8GB+ of main memory and force all game devs to go 64bit?

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Shaocaholica posted:

That reminds me that I haven't been watching the news on next gen consoles for a while now. Are there any leaks that are confirmed or any actual statements? Are we going to finally get 8GB+ of main memory and force all game devs to go 64bit?

The closest thing to 'factual' information we've had was the design document from 2010 that leaked back in June, and the Durango dev kit that showed up on eBay a couple of months later with specs "similar to a high-end PC". That said, it seems we're at the point where next-gen specs can be safely speculated on, given currently available PC hardware and what's reasonable to expect in a ~$400 console, unless Sony pulls another Sony and has some over-engineered Cell processor-type technology waiting in the wings.

One thing to note about Wozbo's post - while I'm in no way endorsing the journalistic chops of most gaming sites, as far as the supposed specs of the next-gen Xbox go, the faked rumor matched the majority of the previously reported specs, only embellishing the CPU's clock speed. The software and X-Surface parts were obviously complete bullshit on the hoaxer's part, but you can't completely discredit the (mostly reasonable) specs just because some guy made up a rumor about an Xbox tablet.

Though, again, given the state of the majority of games 'journalism', it's pretty advisable to take all of this with a grain of salt.

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~
Finally picked up a 7870 Myst (Tahiti LE), coming from a GTX 560.

Upsides:
+The performance is a straight upgrade over the 7870: about 10% faster at stock, and it can match the 7950 Boost with a little overclocking.
+Got it for $235, the same price as the 7870s, so it's an excellent value. Plus it came with FC3 and Sleeping Dogs.
+Good variety of video outputs: DVI, HDMI and two DisplayPort. Also came with a DisplayPort-to-HDMI adapter.
+It can be CrossFired with 7950s. I don't plan on doing that because of the power usage, but I guess it does leave an option open for the future.

Downsides:
-The card is HUGE, probably the biggest I've ever had. I had to rearrange the drives/PSU in my case to get it to fit.
-Uses more power. I don't really care about that since my PSU can handle it.
-The big one: the loving fan is loud. It does its job well (I don't even break 70C running FurMark), and making a custom fan profile in Afterburner fixes the issue somewhat.
-Apparently voltage-locked, but I've also heard it's just an issue of software not recognizing the card, so who knows.

Overall I'm happy with it. The regular 7870s make more sense, though, if you want something quieter/smaller and fairly close in performance.

Alpha Mayo fucked around with this message at 09:07 on Jan 27, 2013

teh z0rg
Nov 17, 2012
Where are the GTX780s? Where are they?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

teh z0rg posted:

Where are the GTX780s? Where are they?
The GeForce Titan is coming in March for $899 and is based on the GK110 GPU that powers the Tesla K20 series. Beyond that, we don't really know anything about upcoming cards, as far as I know.

Animal
Apr 8, 2003

I am playing around with Adaptive VSync on my 670 (1440p). How do you guys go about it? I am thinking of forcing Adaptive on in the Nvidia CP and turning VSync off in the games, so as not to cause any kind of conflict.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration?

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Endymion FRS MK1 posted:

Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration?

That depends on the way a game engine handles its updates. If it does this intelligently, then VSync won't affect mouse input. If it does it badly, you're SOL and nothing you choose will do anything about it.
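
To illustrate the "intelligent" case: engines that decouple input sampling and simulation from the render/present step don't pick up extra input latency when the buffer swap blocks on VSync. A toy fixed-timestep loop (plain Python with stub functions; a sketch of the general technique, not how any particular engine does it):

```python
import time

TICK = 1.0 / 120.0          # fixed input/simulation rate, independent of refresh

def poll_input():           # stub: read raw mouse deltas here
    return 0.0

def update(dt, mouse):      # stub: advance game state by one fixed step
    pass

def render(alpha):          # stub: in a real engine the buffer swap blocks on VSync
    time.sleep(1.0 / 60.0)  # pretend we're stuck waiting for a 60 Hz display

accumulator = 0.0
previous = time.monotonic()
for _ in range(120):                 # a couple of seconds of "frames"
    now = time.monotonic()
    accumulator += now - previous
    previous = now
    while accumulator >= TICK:       # input and simulation catch up at 120 Hz,
        update(TICK, poll_input())   # so a slow or blocked present never delays sampling
        accumulator -= TICK
    render(accumulator / TICK)       # interpolation factor between simulation states
```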

uhhhhahhhhohahhh
Oct 9, 2012

Endymion FRS MK1 posted:

Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration?

Yes, change the Maximum pre-rendered frames setting in the nvidia control panel from 3 to 1 or 0. I just have mine on 'Use the 3D application setting' with Adaptive vSync and I've not noticed an issue.

I used to run D3DOverrider to force triple buffering and always had that setting on 1 to stop mouse lag.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

bull3964 posted:

I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game? They are going to have to do more to the next gen console line than stuff more HP under the hood. If the increased processing power doesn't translate to anything other than traditional gameplay with more spit and polish, I don't think the uptake is going to be all that swift.

Playing games over the past few years, I don't ask myself "could the graphics be better in this game?" but rather things like "geez, how goddamn tedious was SWTOR?", "does D3 have to run like poo poo on a 2500K + HD 5850 despite how mediocre the visuals are?" or "why the gently caress are there so many stupid intro screens?"

Palladium fucked around with this message at 14:58 on Jan 28, 2013

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


What's with the delay of the next gen ATI/Nvidia cards? Shouldn't they be out by now or should we have some kind of update?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Tab8715 posted:

What's with the delay of the next gen ATI/Nvidia cards? Shouldn't they be out by now or should we have some kind of update?

The 680 didn't come out till March 2012

zer0spunk
Nov 6, 2000

devil never even lived

Don Lapre posted:

The 680 didn't come out till March 2012

I want to replace my 680, but not with a $900 card that is basically a 690 if the "titan" stuff is true. I thought we moved past the ludicrous "ultra" line Nvidia?

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

uhhhhahhhhohahhh posted:

Yes, change the Maximum pre-rendered frames setting in the nvidia control panel from 3 to 1 or 0. I just have mine on 'Use the 3D application setting' with Adaptive vSync and I've not noticed an issue.

I used to run D3DOverrider to force triple buffering and always had that setting on 1 to stop mouse lag.

Forgot to mention, I have a 7950.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

zer0spunk posted:

I want to replace my 680, but not with a $900 card that is basically a 690 if the "titan" stuff is true. I thought we moved past the ludicrous "ultra" line Nvidia?

They aren't going to replace the architecture every year.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Endymion FRS MK1 posted:

Forgot to mention, I have a 7950.

RadeonPro has some special VSync settings, namely an equivalent to Adaptive VSync. Give it a try.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

zer0spunk posted:

I want to replace my 680, but not with a $900 card that is basically a 690 if the "titan" stuff is true. I thought we moved past the ludicrous "ultra" line Nvidia?

May I ask why? I have a 680 and a 2560x1600 monitor and I don't have any problems.

zer0spunk
Nov 6, 2000

devil never even lived

Don Lapre posted:

They aren't going to replace the architecture every year.

GeForce GTX 480 - March 26, 2010 - GF100
GeForce GTX 580 - November 9, 2010 - GF110
GeForce GTX 680 - March 22, 2012 - GK104

The Titan thing falls in line with the 480/580 release schedule.

I'm hoping whatever the real 8xxx line from AMD turns out to be gets them to fast-track things, but who knows.

mayodreams posted:

May I ask why? I have a 680 and a 2560x1600 monitor and I don't have any problems.

For that exact reason: I'm on a 2560x1600 panel as well and it gets taxed in certain situations, but nothing worth going to SLI for. Hence I'm hoping the next-gen single-card solution is the last card I'll need to buy for a few years. Plus: smaller fab, less power consumption, and everything that goes with it.

Edit: and hopefully less gimped CUDA-wise, because the Tesla stuff is out of my price range.

zer0spunk fucked around with this message at 18:46 on Jan 28, 2013

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


mayodreams posted:

May I ask why? I have a 680 and a 2560x1600 monitor and I don't have any problems.

The current video cards make my room way too goddamn warm.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

zer0spunk posted:


For that reason, I'm on a 2560x1600 panel as well and it gets taxed in certain situations, but nothing worth going to SLI for. Hence I'm hoping the next gen single card solution is the last card I'll need to buy for a few years. Plus, smaller fab, less power consumption, and everything that goes with it.

Well, from what I've read, Big Kepler isn't that great for gaming. Yes, it has more CUDA cores (at least in Quadro/Tesla dressing), but that does not equate to gaming performance. I doubt the clocks will be comparable to the GK104's, and it certainly won't run cooler if it has more transistors, which I'm sure it will.

GK104 (GTX 680): 1006 MHz base, 1058 MHz average boost, 1110 MHz max; 195W TDP
GK110 (K20): 745 MHz; 225W TDP

Edit: This is also the first time Nvidia has not started a new architecture with the big monolithic die, so you can't compare the other releases to this one, because the 680 was the cut-down part this time.

mayodreams fucked around with this message at 19:10 on Jan 28, 2013
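
As a back-of-the-envelope check on the "more cores, much lower clocks" point: peak FP32 throughput is roughly 2 FLOPs per CUDA core per clock. Using the clocks quoted above and the publicly listed core counts (1536 for GK104/GTX 680, 2496 for the K20's GK110; treat those as my assumptions), the raw gap is smaller than the core count suggests, and of course says nothing about actual gaming performance:

```python
def peak_fp32_tflops(cores, mhz):
    # 2 FLOPs per CUDA core per clock (one fused multiply-add)
    return 2 * cores * mhz * 1e6 / 1e12

gk104 = peak_fp32_tflops(1536, 1006)  # ~3.09 TFLOPS (GTX 680 base clock)
gk110 = peak_fp32_tflops(2496, 745)   # ~3.72 TFLOPS (K20 clock quoted above)
print(f"GK110/GK104 raw FP32 ratio: {gk110 / gk104:.2f}x")  # ~1.20x despite ~1.6x the cores
```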

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Tab8715 posted:

The current video cards make my room way too goddamn warm.

My 2 260s used to keep my room warm in the winter (unbearable in the summer). Sometimes I wonder if my 670 is even on.
