One Eye Open
Sep 19, 2006
Am I awake?

Endymion FRS MK1 posted:

The magic of AMD's new driver team.

It probably broke something else.

On my laptop, the AMD Vision Engine control centre that came with 13.1 didn't recognise my discrete GPU and somehow borked the (non-AMD :psyduck:) wireless drivers. Uninstalling it separately and reinstalling the 12.10 version fixed it though. I have a feeling it may be HP's (the laptop's manufacturer) fault though for having a weird setup - the latest drivers on their site for it are for 11.7.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
Catalyst 13.1 has really fixed a range of issues for me in Win 8, and performance seems better/more consistent across the board too with the games I've tried (Skyrim, Borderlands 2, and even Far Cry 3 seems a bit improved). Not bad!

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

One Eye Open posted:

On my laptop, the AMD Vision Engine control centre that came with 13.1 didn't recognise my discrete GPU and somehow borked the (non-AMD :psyduck:) wireless drivers. Uninstalling it separately and reinstalling the 12.10 version fixed it though. I have a feeling it may be HP's (the laptop's manufacturer) fault though for having a weird setup - the latest drivers on their site for it are for 11.7.

Unfortunately, it looks like that's a thing. AnandTech tested six Enduro laptops they had on hand, and while all of them could get the drivers installed somehow, half couldn't do it from the CCC updater tool and one couldn't do it from a downloaded installer.

Of course, some installs required modifying the driver :rolleye: :sigh:

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Wait, people actually used the CCC updater? I always hear about that breaking things. Uninstalling via Add/Remove Programs, then running the new driver's .exe is the way to go.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Endymion FRS MK1 posted:

Wait, people actually used the CCC updater? I always hear about that breaking things. Uninstalling via Add/Remove Programs, then running the new driver's .exe is the way to go.

You mean: Uninstall using Add/Remove programs, wait for the uninstaller to inevitably crash, use driver sweeper to clean up what's left, reboot, then run the new installer executable.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Up until I uninstalled the 12.11 betas for 13.1, I never had the installer crash on me. Once AMD fixes their uninstaller program for Windows 8 (from what I can tell it works fine on W7, but in 8 it deletes power plan settings and screws something else up majorly :psyduck:) I'll just use that.

SocketSeven
Dec 5, 2012
So after getting my CPU overclocked and stable, I've now got my EVGA GTX 660s installed, SLI'd and running.

However, I started getting some really weird artifacts during 3dmark testing. All monitors would still display, the system would be stable, but everything had a pink hue. I was able to get things back to normal by using keyboard shortcuts to set the system to use a single display, then setting it back to extended desktop mode. I was thinking it was a RAM problem, but then I remembered my power management issues with my previous EVGA 570. The power management system allowed it to underclock the card down to like 16MHz, and then scale it up or down according to load. Never an issue with gaming, but for some reason puttering around on the desktop would cause the display drivers to crash and restart dozens of times an hour depending on what I was doing. Thankfully this never brought on a full BSOD or anything.

In my current setup, turning off power management (which EVGA calls 'enabling K-boost') raises my idle power usage by over 100 watts according to my Kill-A-Watt. Turning K-boost off and letting the drivers scale the power load up and down no longer seems to make the graphics driver crash outright whenever I start Firefox, like my 570 did, but it clearly has issues when rapidly going between high and low power states (like between 3dmark tests).

After setting K-boost to on in EVGA's precision X utility, everything seems stable, except for the whole wasting hundreds of watts of power at idle and increased card temps.
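
For scale, here's a rough back-of-the-envelope figure for what that extra idle draw costs over a year; the 100 W comes from the Kill-A-Watt reading above, and the electricity rate is just an assumed $0.12/kWh, so adjust for your own utility.

code:
# Rough annual cost of the extra idle draw with K-boost on.
# Assumptions: ~100 W extra (the Kill-A-Watt figure above), the box idling
# around the clock, and electricity at $0.12/kWh -- adjust for your rate.
extra_watts = 100
hours_per_year = 24 * 365
rate_per_kwh = 0.12

extra_kwh = extra_watts * hours_per_year / 1000
print(f"{extra_kwh:.0f} kWh/year, roughly ${extra_kwh * rate_per_kwh:.0f}/year")
# ~876 kWh/year, roughly $105/year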

I'd like to be able to disable K-boost and let the cards adaptively underclock themselves and save my poor electric bill, but I'm lost on where to even begin fiddling with this stuff because EVGA's documentation appears to have been written by their marketing department. :psyduck:

Should I not even bother, and just let my PC be a space heater? Or is there somewhere I should be researching how to tweak things so this crap doesn't happen?

craig588
Nov 19, 2005

by Nyc_Tattoo

SocketSeven posted:

After setting K-boost to on in EVGA's precision X utility, everything seems stable, except for the whole wasting hundreds of watts of power at idle and increased card temps.

I'd like to be able to disable K-boost and let the cards adaptively underclock themselves and save my poor electric bill, but I'm lost on where to even begin fiddling with this stuff because EVGA's documentation appears to have been written by their marketing department. :psyduck:

Should I not even bother, and just let my PC be a space heater? Or is there somewhere I should be researching how to tweak things so this crap doesn't happen?

Mod your BIOS so it has (effectively) no power limits. :smug: My card used to drop down to sub-900MHz levels depending on the load; now I've never seen it drop below 1200MHz, it's rare for it to even drop away from 1300MHz, and I keep full dynamic clock and voltage support.
Really, unless you have awful stock power limits you can probably ignore it though.

Have you tried each card by itself? I don't think I've ever heard of the dynamic clocking causing any sort of artifacts at all; it sounds like there might be a hardware issue.

Oh, idea: in EVGA Precision, use the adjust voltage button and max it all the way. That's still a perfectly safe voltage because it's what your card jumps to in 3D mode until it hits power limits and has to lower voltage and clock speed to stay within range. All the setting does is set a minimum voltage, so you'll lose a lot more clock speed: the cards won't have the option of lowering voltage, so they'll only be able to drop clocks. If the problem goes away like that, then you definitely have a bad card and need to test them individually to find out which one to RMA.

Another idea: What power supply do you have? If you're near the limit it could be reacting poorly to the rapidly changing load.

craig588 fucked around with this message at 14:22 on Jan 21, 2013

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online

craig588 posted:

Another idea: What power supply do you have? If you're near the limit it could be reacting poorly to the rapidly changing load.

I'm going to agree with this. From what you're describing, it sounds like your power supply can't handle the changes in load and craps out, causing the voltage going to your video cards to drop and resulting in a crash.

SocketSeven
Dec 5, 2012
I've got an 850 watt Antec with 4 12V rails. I have dedicated a 12V rail to each graphics card, one to the motherboard, and the last one to 2 hard drives, an SSD, and a DVD-writer.

I don't think a lack of power is my issue, since my entire setup never pulls more than 600 watts, and that includes all the monitors, the PC itself, and an Onkyo 5.1 surround sound home theater tuner. I'm measuring this with a Kill-A-Watt plugged into the UPS that everything is hooked up to.
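
For what it's worth, a quick sanity check on the headroom (the monitor/receiver draw and PSU efficiency below are assumed figures, not measurements):

code:
# Rough estimate of the DC load actually hitting the 850 W PSU.
# Assumptions (not measured): monitors + receiver draw ~150 W at the wall,
# and the PSU is ~85% efficient at this load.
wall_watts_total = 600      # Kill-A-Watt reading for everything on the UPS
assumed_peripherals = 150   # monitors + Onkyo receiver, assumed
psu_efficiency = 0.85       # assumed

pc_wall_watts = wall_watts_total - assumed_peripherals
dc_load = pc_wall_watts * psu_efficiency  # what the PSU actually has to deliver
print(f"~{dc_load:.0f} W DC load on an 850 W unit")
# ~383 W -- plenty of total headroom, though a single overloaded 12V rail or
# poor transient response could still misbehave.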

I think it's part of EVGA's or Nvidia's scheme to dynamically underclock and overclock the cards. I NEVER had any luck with it working right when I had my GTX 570, and ultimately just disabled it entirely with some registry hack and bit the bullet of wasting power.

I'm going to do some testing right now, and see if I can replicate these issues for you all to see.

The Nvidia display control panel is set to defaults: adaptive power management, multi-display performance mode, and card #1 is powering 3 displays (of varying resolutions and sizes because I broke the bank on the PC build; 2 are on DVI, one is on HDMI).

SLI is configured to maximize 3D performance, and PhysX is set to auto-select.

In Precision X, sync is set to off, as both cards have different stock clocks (#1 has a boost clock of 1100MHz or so, and #2 has a boost clock of 1050MHz).

K-boost is set to off, and adjusting the voltages doesn't seem to do anything, I'm assuming because the clocks and volts are being adaptively managed.

As I'm sitting here typing this up, card #1 is underclocked to 810MHz and 912mV; temp is 40C, load is 0-2%.
Card #2 is clocking in at 324MHz, 850mV, and a temp of 29C, with a load of 0-1%.

Sounds great, right? I'm saving the environment and, more importantly, money on my power bill. I can putter around all day on my desktop with it set up like this. The cores and voltages happily adjust themselves and everything seems great. Playing videos works, my games seemed to work; nothing at all seemed wrong.

Except that after running 3dmark, my display looks like this:

[screenshot of the artifacted display]

As horrible as this looks, the system is still completely stable. The problem was remedied by using the Windows+P shortcut to switch to single monitor mode, then back to extended desktop.

Changing the power management setting in the Nvidia control panel between adaptive and maximum performance does nothing to stop this from happening.
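
One way to pin this down would be to log what the clocks are actually doing around a 3dmark run. A minimal sketch, assuming nvidia-smi (which ships with the driver) is on the PATH; some fields like power.draw may just read "[Not Supported]" on these cards:

code:
# Poll nvidia-smi once a second and append the readings to a CSV.
# Run this in the background, kick off 3dmark, and look at what the clocks
# do around the transition that triggers the artifacts.
import subprocess
import time

QUERY = "timestamp,index,clocks.gr,clocks.mem,temperature.gpu,utilization.gpu,power.draw"

with open("gpu_log.csv", "w") as log:
    while True:
        result = subprocess.run(
            ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
            capture_output=True, text=True,
        )
        log.write(result.stdout)
        log.flush()
        time.sleep(1)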

Now, let's enable K-Boost.

The first obvious change is that both graphics cards are no longer adjusting their clock speeds or voltages so dramatically. They run at full boost mode all the time. Both cards appear to be running OVER their stated boost speeds in Precision X, by about 75MHz, and both are running at 1175mV.
The second significant change is that I am pulling 40-60 watts more power at idle. Both cards also now idle at 42-43C.

However, 3dmark completes with no visual problems whatsoever.

Switching the Nvidia control panel between adaptive and prefer maximum performance gains a minor boost in 3dmark scores. Enabling Sync raises it a bit more.

The only conclusion I can draw is that I have to suck up using extra power at idle, and that Nvidia or EVGA cannot engineer a system to handle adaptive clock changes worth a poo poo.

SocketSeven fucked around with this message at 17:34 on Jan 21, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo

SocketSeven posted:

In Precision X, sync is set to off, as both cards have different stock clocks (#1 has a boost clock of 1100MHz or so, and #2 has a boost clock of 1050MHz).
That could be a problem. Try syncing them to the slowest card's speed.

quote:

K-boost is set to off, and adjusting the voltages doesn't seem to do anything, I'm assuming because the clocks and volts are being adaptively managed.
Something's wrong there. I just confirmed it pushes the voltage up even at idle. All the setting does is set a minimum dynamic voltage; the (indicated) voltage should never drop below what it's set to.

quote:

Both cards appear to be running OVER their stated boost speeds in Precision X, by about 75MHz

Rated boost speeds are kind of a lie. There's a whole range of possibilities depending on temperature and load.

You should never, ever have to use the K-boost setting to be stable at stock speeds. It's a really great way to bypass all of the dynamic clock changes when someone wants to benchmark without putting serious effort into BIOS editing, but it's terrible for pretty much everything else. Unfortunately it's really looking like one of your cards is bad.

craig588 fucked around with this message at 17:53 on Jan 21, 2013

SocketSeven
Dec 5, 2012
This wouldn't happen to have something to do with the fact that one of the cards is "superclocked" from the factory, and the other is not, would it?

Sync seems to adjust card speeds just fine, but since the clock speed setting is an offset, it just offsets both of them by the same amount, rather than setting them to the same value.

I'm going to try a complete driver re-install. The system didn't seem to need one when I swapped out the 570 for the 660, or when I added the second one either. However, since I've clearly got issues, I'm going to give it a shot.

Is there anything in my system bios I should be poking at, since I'll be rebooting a bunch anyway?

SocketSeven
Dec 5, 2012
So I pulled out ALL the Nvidia drivers and the Intel integrated GPU drivers, uninstalled Precision X, then reinstalled both Precision X and the Nvidia drivers from EVGA's website (they're out of date but whatever, EVGA says they work).

poo poo works fine now. I'm even getting better 3Dmark scores.

I am simply a moron of the highest caliber, who believes we live in a magical future where unified drivers actually mean they'll work with anything.
:suicide:

craig588
Nov 19, 2005

by Nyc_Tattoo
I still have such paranoia about that that I reformat my computer any time I change a video card or motherboard.

SocketSeven
Dec 5, 2012
Having grown up with DOS, Windows 3.1, 95, 98, and XP, I treated uninstalling everything related to any hardware change as standard operating procedure.

Windows 7 is so good at just working that I had forgotten the rules.

Always a fresh install of drivers and related software. ALWAYS.

Oh well. Having a permanent digital record of being an idiot is better than an RMA any day. RMAs cost money, and hopefully someone can learn from my bad example.

Thanks so much for talking me through this. I would have driven myself up a wall messing with other stuff or just ignored it entirely. You've saved me a lot of energy and frustration.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
News is spreading that VGLeaks has posted what they claim to be final Xbox 720 specs, featuring a CPU with eight 1.6GHz Jaguar cores (the descendant of the Bobcat cores used in the E-series low-power APUs) and Radeon HD 8770 graphics. I'm rather skeptical of this because it seems like giving up on per-thread CPU performance and relying totally on many slow cores is a proven-wrong approach, but we shall see. Similar rumors are spreading about the PlayStation 4, including that it is a fully-integrated APU based on the Radeon HD 7870. A 7870 wins a matchup against an 8770, but by how much will depend on clock speeds, power, and efficiency. The 7870 has 67% more shaders and up to twice the memory bandwidth, but we'll have to see what the actual deployed configuration is.
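
For some rough context on that comparison: memory bandwidth is just bus width times effective data rate, and the 7870's retail configuration is known, while the 8770 side here is only back-solved from the 67% figure rather than a confirmed spec. A quick sketch:

code:
# Memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

# Known retail HD 7870: 256-bit GDDR5 at 4.8 Gbps effective.
print(f"HD 7870: {bandwidth_gbs(256, 4.8):.1f} GB/s")  # 153.6 GB/s

# Shader comparison: 1280 is the 7870's known count; 768 is simply the number
# that makes the quoted "67% more" claim work out, not a confirmed 8770 spec.
hd7870_shaders, assumed_8770_shaders = 1280, 768
print(f"{hd7870_shaders / assumed_8770_shaders - 1:.0%} more shaders")  # 67%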

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Alereon posted:

News is spreading that VGLeaks has posted what they claim to be final Xbox 720 specs, featuring a CPU with eight 1.6GHz Jaguar cores (the descendant of the Bobcat cores used in the E-series low-power APUs) and Radeon HD 8770 graphics. I'm rather skeptical of this because it seems like giving up on per-thread CPU performance and relying totally on many slow cores is a proven-wrong approach, but we shall see. Similar rumors are spreading about the PlayStation 4, including that it is a fully-integrated APU based on the Radeon HD 7870. A 7870 wins a matchup against an 8770, but by how much will depend on clock speeds, power, and efficiency. The 7870 has 67% more shaders and up to twice the memory bandwidth, but we'll have to see what the actual deployed configuration is.

I'd be skeptical too. 8 weak cores? This sounds like the exact opposite of what you'd want in a games console. How will backwards compatibility be handled? ...and so on.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Digital Foundry also says their "trusted sources" confirm the PS4 and Xbox 720 are Jaguar-based, but I have no idea how legit they are. If it's true, they better have implemented amazing Turbo or I can't see this going well.

HalloKitty posted:

I'd be skeptical too. 8 weak cores? This sounds like the exact opposite of what you'd want in a games console. How will backwards compatibility be handled? ...and so on.
Have there ever been any statements confirming backwards compatibility? My assumption would be that they would port the most recent/popular games to the new platforms and then just give you a free copy if you owned it before, not that they would try to make them play games from the last generation, but I could be wrong.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Although, maybe with a "moar cores!" approach game developers will be forced to multithread things and we end up with PC ports that are actually able to take advantage of quad core (plus hyperthreading?) systems to the fullest, ushering in a new age of extremely efficient PC gaming?

Probably not, but I'm optimistic :v:

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.
Nah, you'd be surprised at how much the consoles influence game development. There's a reason most PC games these days are called "PC ports".

If anything, having both consoles use essentially the same architecture will make PC games from the next console gen far better. The PS3's Cell processor was an awesome piece of engineering, but a very specialised one which pretty much required hand-coded effort to fully harness its power. These specs, on the other hand, are much closer to traditional PC multicore development. So odds are that the barrier between PS3 and 360 exclusives will be a lot thinner now, and by extension PC versions would be much easier to come up with.

But yeah, architecture aside, PC development is way more complicated, even if drivers are supposed to hide all the ugly PC complexities. And, more importantly, the PC market is still much smaller than that of the consoles, which is really what will dictate the amount of effort developers are willing to invest in the end.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Well, for the sake of argument, let's say the eight-core thing IS true. Each 1.6 GHz Jaguar core, while very much lacking compared to, say, an Ivy Bridge CPU, isn't exactly nothing.

Let's crib from AnandTech's recent look at a first-gen Core 2 Duo in 2013:

[chart: AnandTech single-threaded CPU benchmark comparison]

The E-350 there at the bottom is Bobcat. It's about 2/3 the single-thread performance of a Core 2 Duo in within-cache situations, and actually faster per clock than Bulldozer.

It's tough to estimate the single-threaded performance of an Xbox 360 in a meaningful way. It's a 3.2 GHz triple-core system with 1-input-4-output SIMD vector units, optimized for game logic and 3D graphics tasks. It's restricted to in-order execution but has two-thread-per-core symmetric multithreading (like Hyperthreading). 1 MB of L2 cache is shared between all three cores - about 2/3 of Bobcat's per-core L2. So... y'know, this is the kind of core that sounds an awful lot like a higher-clocked, SIMD-flavored Intel Atom.

For the record, the Xbox 360 GPU uses an eDRAM buffer, much like Intel's upcoming Haswell GT3e. I guess there's nothing new under the sun. Anyhoo, here's the whitepaper I'm referencing in case anyone else feels they can do a better job at this than I can.

So a 1.6 GHz Jaguar core is likely not fast, per se, but it's at least competitive and probably a huge bump up from an Xbox 360 core. AMD claims 10% faster clocks than Bobcat and 15% better IPC, plus the cache goes from 512 KB per core dedicated to 2 MB per four cores shared. You get a 2-wide instruction pipeline in an out-of-order design, and that likely beats a 2-thread SMT in-order core pretty handily. Plus taking the CPU off of graphics duty means that you get a 4-pipe ALU and two-pipe AVX-enabled FPU dedicated solely to logic or what have you, to go with your GPGPU-enabled Radeon cores. You can then throw a lot of cores in and task out a lot of different threads - assuming a lot of hard sync work is done in R&D and plugged into the SDK and HDK so it's not another PS3, theoretically superpowerful if it weren't so drat hard to code for.
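
Taking AMD's claims above at face value, the per-core math works out like this (napkin math, not a benchmark):

code:
# Napkin math from AMD's claimed Jaguar-over-Bobcat gains quoted above:
# 10% higher clocks and 15% better IPC.
clock_gain = 1.10
ipc_gain = 1.15
print(f"Per-core gain with the clock bump: {clock_gain * ipc_gain - 1:.1%}")  # ~26.5%

# At the rumored 1.6 GHz -- the same clock as the E-350 -- only the IPC part
# applies, so each console core would be roughly 15% faster than a Bobcat core,
# and the design leans on having eight of them rather than on per-core speed.
print(f"Per-core gain at an unchanged 1.6 GHz: {ipc_gain - 1:.0%}")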

So then you'd end up with a system that has obvious limits in single-thread tasks, but which has a lot to throw at a wide variety of "entertainment hub" tasks - core logic on a pair of cores, AI and sound on two more, GPU control on a fifth, Kinect on a sixth, OS and network stack and streaming to a simple receiver device like a set-top box or a tablet used as a controller on whatever's left. And you get all of this with dirt-cheap pre-made cores. No need to design and fab a big fat custom CPU, and no need to design Haswell- or Krait-grade fine-grained power domains to shut off unneeded cores; these cores are lightweight enough to be handled with coarse, per-core control. That makes the VRM cheaper, too. And it keeps cooling manageable and quiet. And hopefully avoids RRoDs.

But at the end of the day, yes, you come up against problems with per-core performance. Latency spikes may be an issue. Really complex AI? Right out. But then, who plays StarCraft on a console? That's a game built for a different kind of system, not a couch device with a controller. Large-scale multiplayer? That may remain the exclusive domain of the PC. And Microsoft would be okay with that - they sell PCs and PC accessories, too. Video encode/decode? Stick in a cheap transcoder chip. Hell, throw in four, one for each player. Why task the CPU with that at all?

So what else does a many-core system do? Well, at a fundamental level, it optimizes the system towards the kinds of tasks and games that tablets and smartphones handle with aplomb. An iPad 3 is compute-competitive with an Xbox 360, and as tablet games have become a big thing, indie games have spread from the PC to other gaming markets. You don't necessarily need a lot of compute beef to have a lot of fun, so a crucial question for the next generation of consoles is "Can we compete with a $200 Google Nexus 7 or a $330 iPad Mini?" Extra compute can only go so far; it needs to be less expensive up-front than $300 with no persistent storage or $500 for a giant black space heater with a dinky hard drive. Those price/feature combos compete with computers, not tablets. We're decently into the age of good-enough compute for consumer devices; why not so with consoles, as well?

So with all these games out on the market being designed for phones and tablets with many weak processors, you can provide their developers with a similar hardware ecosystem to work with. This makes it easier to port their games over to a game console, only as much of a chore as it is to port an Xbox 360 game to a PC - different and with differing capabilities, but with core similarities that let you avoid re-inventing the wheel for each port, especially with cross-developed engines like UE3 or Unity. And yet things are still way better than the last generation by a goodly amount, in terms of graphics power and multitasking - the former being the console's biggest task, and the latter being its new second life.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Alereon posted:

Have there ever been any statements confirming backwards compatibility? My assumption would be that they would port the most recent/popular games to the new platforms and then just give you a free copy if you owned it before, not that they would try to make them play games from the last generation, but I could be wrong.

None whatsoever, considering they haven't even said a word about the next-gen consoles' existence, let alone any broader details. I'd expect if anything we'll see something resembling the 360's BC capability - software-based emulation of some really popular titles, but nothing more. Sony's philosophy of 'all playstation games playable on all playstations' died with the 40 GB PS3 and the realities of cost-cutting.

I'm really curious to know if Eurogamer's leaked PS4 specs are accurate - using the same CPU in both the Xbox and PS4 is good for gamers in general, but there could be quite a performance gulf down the road if the GPU specs are accurate for both systems. Also very curious how the 4 GB GDDR5 vs. 8 GB DDR3 memory tradeoff will fare.

Josh Lyman
May 24, 2009


Endymion FRS MK1 posted:

Although, maybe with a "moar cores!" approach game developers will be forced to multithread things and we end up with PC ports that are actually able to take advantage of quad core (plus hyperthreading?) systems to the fullest, ushering in a new age of extremely efficient PC gaming?

Probably not, but I'm optimistic :v:
Isn't the issue with PC gaming more to do with different hardware and not being able to optimize for one GPU, as opposed to CPU power?

Yaos
Feb 22, 2003

She is a cat of significant gravy.

Josh Lyman posted:

Isn't the issue with PC gaming more to do with different hardware and not being able to optimize for one GPU, as opposed to CPU power?
They have to work around possible bugs that can come up with different configurations, in addition to having to go through third-party APIs. In a 2011 interview about Rage (the game), John Carmack mentioned they had to talk to Nvidia, ATI, and Intel to get Rage working correctly on the PC. They did fix the issues, but it's just another hurdle.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Given the current gen console lifespan, I don't know how Sony or Microsoft could get away with NOT providing backward compatibility.

Let's face it, no matter how good things look on paper, not many people are saying "I feel so bound by the hardware limitations of the current generation consoles!" (aside from PC gamers, that is.)

The leap from the Xbox to the Xbox 360 and from the PS2 to the PS3 was much bigger than the leap we are going to see from the current gen to the next gen. A new $400 console (at launch) is a much easier sell when you can tell people "That will also replace your aging, noisy Xbox 360 Elite that somehow hasn't managed to RROD yet."

Ultimately, I would think BC would best be solved by some sort of game streaming down the road. If both were using some sort of Nvidia chipset, I could easily see that happening, but since they are AMD, not so much.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Why not? AMD has its own game-streaming plans, and GCN hardware has a standalone video encode/decode engine, the same as Kepler and Sandy/Ivy Bridge.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Factory Factory posted:

Why not? AMD has its own game-streaming plans, and GCN hardware has a standalone video encode/decode engine, the same as Kepler and Sandy/Ivy Bridge.

I actually didn't know that AMD had their own streaming plans.

Red_Mage
Jul 23, 2007
I SHOULD BE FUCKING PERMABANNED BUT IN THE MEANTIME ASK ME ABOUT MY FAILED KICKSTARTER AND RUNNING OFF WITH THE MONEY

bull3964 posted:

Let's face it, no matter how good things look on paper, not many people are saying "I feel so bound by the hardware limitations of the current generation consoles!" (aside from PC gamers, that is.)

IDK, the PS3 and 360 are starting to show their age. Games haven't been doing much "revolutionary" simply because there is only so much performance you can eke out of a PS3 or 360 before the framerate goes to poo poo. Some dev mentioned in an interview about Black Ops 2 that the disc only has something like 4 megs of free space left and they've packed the textures as well as they can (while still maintaining AAA fidelity and a 30fps floor). The 360's obvious visible shortcoming lately is using DVDs as its storage media, leading to one of the console's biggest releases (Halo 4) requiring a disc swap or an installed HDD to play the whole game. The PS3 only fares slightly better, with many games requiring a (slow) install to actually be played. They both start to choke on modern media formats in HD, due to the lack of dedicated decoders for them, and while the 360 has at least updated its UI to play nice with the Kinect and actually feel modern in that respect, the PS3's XMB seems to actually get worse with every update.

While the original Just Cause and something like Saints Row 3 look like they were developed for two completely different systems, the fact that the limited system RAM (and the update structure) meant that something like Minecraft had to be significantly cut down is something just about any 12-year-old can point out is not modern. Playing with the Wii U has been interesting because it is clear that Nintendo has some great plans, and that a new batch of consoles has a lot of potential, but the benefits of that 1 gig of RAM for the game just aren't going to show for another year or two.

Spatial
Nov 15, 2007

bull3964 posted:

not many people are saying "I feel so bound by the hardware limitations of the current generation consoles!" (aside from PC gamers, that is.)
You mean "Aside from every developer ever".

Tezzeract
Dec 25, 2007

Think I took a wrong turn...

bull3964 posted:

Given the current gen console lifespan, I don't know how Sony or Microsoft could get away with NOT providing backward compatibility.

Let's face it, no matter how good things look on paper, not many people are saying "I feel so bound by the hardware limitations of the current generation consoles!" (aside from PC gamers, that is.)

The leap from the Xbox to the Xbox 360 and from the PS2 to the PS3 was much bigger than the leap we are going to see from the current gen to the next gen. A new $400 console (at launch) is a much easier sell when you can tell people "That will also replace your aging, noisy Xbox 360 Elite that somehow hasn't managed to RROD yet."

Ultimately, I would think BC would best be solved by some sort of game streaming down the road. If both were using some sort of Nvidia chipset, I could easily see that happening, but since they are AMD, not so much.

It'll be hard for Sony to do backward compatibility because they're weaning off of Cell processors. Microsoft could have an easier time. But yeah streaming is one way to handle backward compatibility.

Though the funny thing is that most console gamers don't seem to care too much about BC. Sure, it's a way to beef up the library, but console owners bought the system to play "current-gen" games. And they're also more than happy to get HD remakes.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Spatial posted:

You mean "Aside from every developer ever".

Devs, yeah, but there's only so much graphical information that can be seen on a 40" 1080p screen at 8 feet.

Smoothing out the framerate is a definite plus. However, I'm just pointing out that last gen jumped from SD to HD and this gen is going to stay on HD so one parameter isn't changing as significantly. It's going to take a lot for people to say "We couldn't play this on the current generation of consoles?"


Tezzeract posted:

Though the funny thing is that most console gamers don't seem to care too much about BC. Sure, it's a way to beef up the library, but console owners bought the system to play "current-gen" games. And they're also more than happy to get HD remakes.

That depends on the segment of the market you are sampling. Some want the latest and greatest; some only have the time to work through a game every other month or so and have a big backlog of current-gen stuff they still want to play. Emphasis on DLC and other out-of-game items (Skylander collections, for example) means that some people aren't going to want to give up the old so easily.

I just wonder if maybe Microsoft will be able to die shrink the current Xbox SoC and just include it wholesale in the next gen console. It was confirmed that the current slim console doesn't have a newer SoC than the iteration before it, so there's likely room to go smaller, cooler, and cheaper yet.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

One Eye Open posted:

On my laptop, the AMD Vision Engine control centre that came with 13.1 didn't recognise my discrete GPU and somehow borked the (non-AMD :psyduck:) wireless drivers. Uninstalling it separately and reinstalling the 12.10 version fixed it though. I have a feeling it may be HP's (the laptop's manufacturer) fault though for having a weird setup - the latest drivers on their site for it are for 11.7.

Okay, here's the deal on that:

Via AnandTech, 13.1 is for Enduro laptops only. It doesn't include drivers for older Dynamic Switchable Graphics laptops. Ivy Bridge or Trinity - use 13.1. Sandy Bridge or Llano - use the previous driver or a modded driver.

One Eye Open
Sep 19, 2006
Am I awake?

Factory Factory posted:

Okay, here's the deal on that:

Via AnandTech, 13.1 is for Enduro laptops only. It doesn't include drivers for older Dynamic Switchable Graphics laptops. Ivy Bridge or Trinity - use 13.1. Sandy Bridge or Llano - use the previous driver or a modded driver.

Thanks for that - shame I can't use Leshcat drivers, as my laptop is AMD (Phenom II + 4 series)-AMD (6 series). I think I'll use my frankendriver a bit longer.

One Eye Open fucked around with this message at 18:40 on Jan 22, 2013

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Alereon posted:

Digital Foundry also says their "trusted sources" confirm the PS4 and Xbox 720 are Jaguar-based, but I have no idea how legit they are. If it's true, they better have implemented amazing Turbo or I can't see this going well.
Have there ever been any statements confirming backwards compatibility? My assumption would be that they would port most recent/popular games to the new platforms then just give you a free copy if you owned it before, not that they would try to make them play games from the last generation, but I could be wron.

I just figured it would be foolish in the extreme, given the massive install base of 360 games.

Xbox to 360 wasn't so crucial, as the Xbox reached fewer people in the long run.

Tezzeract posted:

It'll be hard for Sony to do backward compatibility because they're weaning off of Cell processors. Microsoft could have an easier time. But yeah streaming is one way to handle backward compatibility.

Though the funny thing is that most console gamers don't seem to care too much about BC. Sure, it's a way to beef up the library, but console owners bought the system to play "current-gen" games. And they're also more than happy to get HD remakes.

Thing is, we've had the 360/PS3 generation for so long that people still think it is "current" even though we're talking about machines that date from 2005. But people are still wowed by the graphics. Indeed, GTAV is soon to hit these very ancient machines. No matter how amazing the newer consoles' graphics are, I think we've gotten past the point where people think the 3D is primitive and cheesy (see 3D PlayStation games with their warped geometry and grainy everything), and people will probably still find 360/PS3 graphics of the last couple of years acceptable, and thus would probably like to play their games for a bit longer without having multiple machines.
It's not like going from SNES to the N64.

But I could be wrong; consumers may lap any old thing up. I was personally hoping for BC because 360s are notoriously unreliable (I have two: one died, was repaired, then died again, and the other died out of warranty), and I don't want to spend money just to buy an obsolete console in a shinier box with a smaller-process chipset to go alongside a new Xbox at the same time.

The old leaked document did suggest it would have a 360 subsystem in the box, but it seems a bit naïve to believe it will - clearly the costs would mount up quickly. I guess we just wait for E3.

HalloKitty fucked around with this message at 19:01 on Jan 22, 2013

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Tezzeract posted:

It'll be hard for Sony to do backward compatibility because they're weaning off of Cell processors. Microsoft could have an easier time. But yeah streaming is one way to handle backward compatibility.

Though the funny thing is that most console gamers don't seem to care too much about BC. Sure, it's a way to beef up the library, but console owners bought the system to play "current-gen" games. And they're also more than happy to get HD remakes.

Microsoft and Sony are both going from Power to x86, and the 360's GPU has some funky extra features that wouldn't be fun to emulate. On the 360, a little bit of very fast RAM (just enough for a 1080p framebuffer) is integrated directly into the GPU, which allows for nearly-free antialiasing and a few other tricks.
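
For a sense of scale on that framebuffer point, the 360's eDRAM is commonly cited as 10 MB, and a bare 32-bit 1080p color buffer only just squeezes in (ignoring depth and multisampling, which is where the tiling tricks come in):

code:
# Bare 1080p color buffer at 4 bytes per pixel vs. the commonly cited 10 MB
# of eDRAM on the 360 (no depth buffer or MSAA counted here).
width, height, bytes_per_pixel = 1920, 1080, 4
framebuffer_mib = width * height * bytes_per_pixel / 1024 ** 2
print(f"1080p color buffer: {framebuffer_mib:.1f} MiB of ~10 MB eDRAM")
# ~7.9 MiB -- it fits, but only barely, and only without depth or AA.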

Streaming probably isn't going to happen. It requires a lot of infrastructure, it makes MS's/Sony's system performance depend on uncontrollable outside factors, and it gets into a lot of thorny legal issues. Instead, look for software-based backwards compatibility using high-level emulation. Microsoft already took this approach with their last major architectural transition, and Sony made some use of it on the PS3 before they dropped backwards compatibility entirely.

It is going to be a selling point, though. Yes, now that we're firmly in the 360/PS3 generation, people don't give a crap about being able to play Xbox/PS2 games. But, when a console first comes out, it's a feature people want. The hardcore gamer types are going to line up on launch day regardless, but for someone who's on the fence, "you can trade in the old one for some store credit and get a discount, but still play all your old games" is a draw.

bull3964 posted:

Devs, yeah, but there's only so much graphical information that can be seen on a 40" 1080p screen at 8 feet.

Smoothing out the framerate is a definite plus. However, I'm just pointing out that last gen jumped from SD to HD and this gen is going to stay on HD so one parameter isn't changing as significantly. It's going to take a lot for people to say "We couldn't play this on the current generation of consoles?"

Well, for one, the current generation of consoles has a hard time running at native 1080p. The standard trick is to render the 3D scene at 720p (or sometimes even less!), scale it to 1080p, and then put 2D UI stuff over the top of that. There's plenty of room to play with GPGPU processing; neither current console supports it. And, in the rest of the system, more RAM means nicer textures, larger levels, and so forth. A faster CPU (and that GPGPU integration) means more sophisticated procedural animation and more complex gameplay. There's plenty of room for improvement.
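
Putting numbers on that upscaling trick (plain arithmetic on the resolutions mentioned above):

code:
# Pixel counts behind the "render at 720p, scale to 1080p" trick.
pixels_720p = 1280 * 720      # 921,600
pixels_1080p = 1920 * 1080    # 2,073,600
print(f"Native 1080p is {pixels_1080p / pixels_720p:.2f}x the pixels of 720p")
# 2.25x -- which is why sub-native rendering plus a hardware scaler is the
# standard way current consoles produce a "1080p" output.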

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

Endymion FRS MK1 posted:

Although, maybe with a "moar cores!" approach game developers will be forced to multithread things and we end up with PC ports that are actually able to take advantage of quad core (plus hyperthreading?) systems to the fullest, ushering in a new age of extremely efficient PC gaming?

Probably not, but I'm optimistic :v:

This is what I'm reading (and hoping) is the point of this.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
At the same time, an i5-3570K could probably do all the work of eight Jaguar cores even without hyperthreading, just by juggling threads.

Maxwell Adams
Oct 21, 2000

T E E F S

Endymion FRS MK1 posted:

Although, maybe with a "moar cores!" approach game developers will be forced to multithread things and we end up with PC ports that are actually able to take advantage of quad core (plus hyperthreading?) systems to the fullest, ushering in a new age of extremely efficient PC gaming?

Probably not, but I'm optimistic :v:

Not only that, but in order to use the 8 gigs of RAM the XBox 420 is rumored to have, the binaries will have to be 64-bit. We'll be getting highly multithreaded, 64-bit games optimized for x86 architecture.
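
The 64-bit part is simple address-space arithmetic (the 8 GB figure is just the rumor above, not a confirmed spec):

code:
# A 32-bit pointer can only address 2**32 bytes, so 8 GB of RAM forces
# 64-bit binaries.
addressable_32bit_gib = 2 ** 32 / 2 ** 30
print(f"32-bit address space: {addressable_32bit_gib:.0f} GiB")  # 4 GiB < 8 GB
print(f"64-bit address space: {2 ** 64 / 2 ** 30:.3g} GiB")      # effectively unlimited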

Yaos
Feb 22, 2003

She is a cat of significant gravy.

Factory Factory posted:

At the same time, an i5-3570K could probably do all the work of eight Jaguar cores even without hyperthreading, just by juggling threads.
A bit off-topic, but this made me remember Haswell and GT3 graphics. Having good entry-level graphics in the processor blows my mind. I know AMD has something similar, but they will go bankrupt. Maybe Nvidia will buy them up and we'll get to have Nvidia x86 processors.

Now slightly on topic, since we're talking about consoles: Gamasutra has an article on retail software sales for consoles and handhelds. It excludes PC sales, and I assume tablet/smartphone game sales. The summary is that sales peaked at 11 billion USD in 2008 and are now only 500 million USD above 2005 levels; the author does not attribute this to the end of the console cycle, as the slow sales were still hitting during the 3DS and Vita launches.
http://www.gamasutra.com/view/news/184899/US_retail_software_in_2012_What_you_need_to_know.php#.UP9SJCc0V8E

With Valve wanting to upset the console market by getting a bunch of OEMs to build their own versions of a Steambox PC, things could get interesting. Nobody really knows what it is they want to do other than get Steam into the living room, and I don't think they were referring to it as a console, so I guess we'll find out as time goes on, maybe at E3?

GRINDCORE MEGGIDO
Feb 28, 1985


Factory Factory posted:

At the same time, an i5-3570K could probably do all the work of eight Jaguar cores even without hyperthreading, just by juggling threads.

I wonder what priorities made them choose the Jaguar approach.
