Good Dumplings
Mar 30, 2011

Excuse my worthless shitposting because all I can ever hope to accomplish in life is to rot away the braincells of strangers on the internet with my irredeemable brainworms.

leftist heap posted:

I really struggle to see how the 120 number isn't a total sham.

I'd bet money they're counting season passes as "titles", lol

leftist heap
Feb 28, 2013

Fun Shoe

Good Dumplings posted:

I'd bet money they're counting season passes as "titles", lol

Someone somewhere has a spreadsheet that lists every possible storefront item in 2020 and they just did a count on it.

Stux
Nov 17, 2006

Gobbeldygook posted:

It's going to lean heavily on games that already run on Linux+Vulkan, so all the games on this page that fit that criterion are strong candidates

https://www.pcgamingwiki.com/wiki/List_of_Vulkan_games

I wouldn't bet on them hitting 120 games this year, but I also don't think it's implausible that they pull it off.

just not how it works. at all. its basically the same as a console platform and needs specific ports.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

I would love this on a T-shirt or a mug

SCheeseman
Apr 23, 2003

Stux posted:

just not how it works. at all. its basically the same as a console platform and needs specific ports.

Is there confirmation from devs that this is the case? What needs to be changed, exactly?

Stux
Nov 17, 2006

SCheeseman posted:

Is there confirmation from devs that this is the case? What needs to be changed, exactly?

i mean you could probably technically do a very quick and dirty port using only high level stuff, the same way you could on an xbox because its running windows and has directx apis available to use, but its going to give you awful performance. ps3 and ps4 run freebsd and the ps3 had opengl but it didnt mean you could just take unix ports with opengl and throw them on there and call it good.

the stadia specs arent exactly brute force high level style like you would expect if it was simply acting like a linux pc, its a 2.7ghz cpu with what a vega 56 or something, and its "supposed" to be able to do 4k60. thats not happening without platform specific optimisations and most games will require that to run at all on a very console-like cpu spec.

SCheeseman
Apr 23, 2003

Stux posted:

the stadia specs arent exactly brute force high level style like you would expect if it was simply acting like a linux pc, its a 2.7ghz cpu with what a vega 56 or something, and its "supposed" to be able to do 4k60. thats not happening without platform specific optimisations and most games will require that to run at all on a very console-like cpu spec.

As it stands the performance that games are running at seems to be in line with what you'd expect from a PC with those same specs.

Wheany
Mar 17, 2006

Spinyahahahahahahahahahahahaha!

Doctor Rope

Holy poo poo

Stux
Nov 17, 2006

SCheeseman posted:

As it stands the performance that games are running at seems to be in line with what you'd expect from a PC with those same specs.

please both track down a current cpu that runs 2.7ghz and then install and boot destiny 2 and borderlands 3 on it

SCheeseman
Apr 23, 2003

In order to get 60fps I'd probably have to dramatically cut back geometric detail to reduce draw calls and as it turns out that's basically what they did.

60fps isn't terribly hard to achieve with games that have already been optimized for the supremely lovely Jaguar cores in current generation consoles. I have a couple of PCs with an i5 2400 and a GTX 1050ti I set up for my nieces; the IPC of those CPUs probably isn't far off from a 2.7GHz Ryzen (not to mention slower RAM etc) and 60fps isn't too hard to achieve on those PCs playing modern titles just by turning down graphics settings. If there's a bottleneck, it's usually the GPU.

VideoGames
Aug 18, 2003
Whoever wrote the reddit post that gave us the new title, thank you. It is perfect.

It was tough between that and Hostile Apostle's post where they said "They're not gonna let the service wither and die."

limaCAT
Dec 22, 2007

il pistone e male
Slippery Tilde

VideoGames posted:

Whoever wrote the reddit post that gave us the new title, thank you. It is perfect.

It was tough between that and Hostile Apostle's post where they said "They're not gonna let the service wither and die."

Apostle is technically right, because Google first has to give Harrison a golden parachute after making Stadia flail around like a scared chicken screaming "PIVOT TO CONSUMER PIVOT TO CONSUMER" for some years.

American McGay
Feb 28, 2010

by sebmojo
I think something has to have been at least somewhat alive to then wither.

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"
Someone in the VR thread linked a picture of Kojima trying out a VR headset, and down in the comments was this little random gem:


quote:

Exciting, look forward to what creations come from the studio with VR, my excitement comes from the potential of VR games via @GoogleStadia !

Give us a linked treadmill VR headset and Death Stranding and we can all feel the struggle going West for real this time

When you gotta dream, dream big :allears:. I think they'd be very disappointed if they actually knew what any kind of latency does to someone in VR though.

Stux
Nov 17, 2006

SCheeseman posted:

In order to get 60fps I'd probably have to dramatically cut back geometric detail to reduce draw calls and as it turns out that's basically what they did.

60fps isn't terribly hard to achieve with games that have already been optimized for the supremely lovely Jaguar cores in current generation consoles. I have a couple of PCs with an i5 2400 and a GTX 1050ti I set up for my nieces; the IPC of those CPUs probably isn't far off from a 2.7GHz Ryzen (not to mention slower RAM etc) and 60fps isn't too hard to achieve on those PCs playing modern titles just by turning down graphics settings. If there's a bottleneck, it's usually the GPU.

an i5 2400 is below minimum spec for borderlands 3, metro exodus and rdr2 which all run on stadia. the only one its ok for is destiny 2, which would be naturally the least cpu reliant out of the four. and its not a ryzen, its an amd gpu but its an intel cpu.

also like... games being optimised for 7x1.7ghz cores on a console really doesnt have the level of knock on effect youre implying. theyre pretty heavily reliant on gpgpu to supplement the cpu which is why they get away with it in the first place, and thats absolutely not something we see mirrored in the PC versions of the same games. its only just now becoming more common for games on pc to actually push past using 4 cores and thats not because of consoles, ps3 and 360 both had 6 cores/threads available and it didnt pass over to PC games the entire gen. also the main platform being optimised for is the ps4. running freebsd. and its own proprietary low level apis. even on the xbox its not running like a pc, the implementation of directx is naturally lower level and requires hardware specific optimisation. stadia running on 2.7ghz cores isnt going to be able to just run a straight linux port and there would be absolutely no reason for them to even be specifying a specific hardware target instead of scaling to each title if they werent getting games ported to target that hardware spec.

Antigravitas
Dec 8, 2019

Die Rettung fuer die Landwirte:

I hate this kind of marketing wank where everyone pats each other on the back with grandiose quotes.

Especially here, since Vulkan 1.2 is effectively just promoting extensions that already exist into the core of the spec. You can see the Vulkan work in radv commits. Most of them are just renaming extensions to remove their prefix; all the actual work was done months ago and exposed to applications.
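
To make "promoting into core" concrete, here's a minimal C sketch against the public Vulkan headers. VK_KHR_timeline_semaphore was adopted into core 1.2 unchanged, so the application-visible difference really is just the dropped suffix; the helper function is hypothetical, but the entry-point names are the real API.

code:
/* Vulkan 1.2 "promotion" in practice: the extension entry point and the
 * core entry point share a signature and behaviour; only the KHR suffix
 * differs. Fetch whichever one the device's API version provides. */
#include <vulkan/vulkan.h>

PFN_vkWaitSemaphores get_wait_semaphores(VkDevice dev, uint32_t api_version)
{
    const char *name = (api_version >= VK_API_VERSION_1_2)
        ? "vkWaitSemaphores"       /* core name since 1.2 */
        : "vkWaitSemaphoresKHR";   /* VK_KHR_timeline_semaphore name */
    return (PFN_vkWaitSemaphores)vkGetDeviceProcAddr(dev, name);
}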

WaltherFeng
May 15, 2013

50 thousand people used to live here. Now, it's the Mushroom Kingdom.
If Kojima did a VR game it would probably break your brain harder than the ending of MGS2, in a good way.

SCheeseman
Apr 23, 2003

Stux posted:

an i5 2400 is below minimum spec for borderlands 3, metro exodus and rdr2 which all run on stadia. the only one its ok for is destiny 2, which would be naturally the least cpu reliant out of the four. and its not a ryzen, its an amd gpu but its an intel cpu.

also like... games being optimised for 7x1.7ghz cores on a console really doesnt have the level of knock on effect youre implying. theyre pretty heavily reliant on gpgpu to supplement the cpu which is why they get away with it in the first place, and thats absolutely not something we see mirrored in the PC versions of the same games. its only just now becoming more common for games on pc to actually push past using 4 cores and thats not because of consoles, ps3 and 360 both had 6 cores/threads available and it didnt pass over to PC games the entire gen. also the main platform being optimised for is the ps4. running freebsd. and its own proprietary low level apis. even on the xbox its not running like a pc, the implementation of directx is naturally lower level and requires hardware specific optimisation. stadia running on 2.7ghz cores isnt going to be able to just run a straight linux port and there would be absolutely no reason for them to even be specifying a specific hardware target instead of scaling to each title if they werent getting games ported to target that hardware spec.

An i5 2400 may be below minimum specs for all those games, but paired with a good enough graphics card they'll still run in excess of 40fps the majority of the time. If the i5 is constrained by anything it's by the number of threads available due to SMT being disabled; i7s from that generation perform much better, running at 60fps+ most of the time (at 3.4GHz instead of 3.2 admittedly, but this is an architecture from almost a decade ago with a bunch of performance-killing security mitigations in place). From most reports Metro Exodus struggles to hit a consistent 60fps on Stadia anyway (edit: was wrong, it's 30fps lol)

I think you're underestimating how fast modern CPU cores are (regardless of vendor) at 2.7GHz in comparison to a 1.7GHz Jaguar core; even at the same clocks the Jaguar is going to have around half the performance. Not an expert, but I'd imagine GPGPU is more useful on consoles due to the shared memory architecture; Stadia probably has separate pools of memory just like most PCs do. There's no fancy SRAM or whatever, the API is just Vulkan, the CPU is bog standard, so what is there to optimize that exclusively exists on Stadia's platform and architecture?
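
For what it's worth, the back-of-the-envelope arithmetic behind that claim, using the thread's own numbers (the 2x IPC ratio is the "around half the performance" assumption above, not a measurement):

code:
/* Rough per-core throughput comparison: the thread's 1.7GHz Jaguar vs
 * the 2.7GHz Stadia CPU, under an assumed ~2x IPC advantage for a
 * modern core. All three inputs are assumptions, not benchmarks. */
#include <stdio.h>

int main(void)
{
    const double jaguar_ghz = 1.7;  /* console Jaguar clock cited above */
    const double stadia_ghz = 2.7;  /* Google's stated clock */
    const double ipc_ratio  = 2.0;  /* assumed: Jaguar has ~half the IPC */

    printf("estimated per-core speedup: %.1fx\n",
           (stadia_ghz / jaguar_ghz) * ipc_ratio);  /* prints ~3.2x */
    return 0;
}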

SCheeseman fucked around with this message at 12:32 on Jan 17, 2020

Ineffiable
Feb 16, 2008

Some say that his politics are terrifying, and that he once punched a horse to the ground...


Here are some better numbers.

If you sort this by ps4 only and oldest release date first:

https://store.playstation.com/en-us/grid/STORE-MSF77008-PS4ALLGAMESCATEG/1?direction=asc&platform=ps4&sort=release_date

You'll get about 8 pages at 25 items a page (these are games, not dlc or other content), making about 200 releases through the end of 2014 (yes, I'm aware there is some 2013 stuff, so this is the ps4's life up to 2014).

Stux
Nov 17, 2006

SCheeseman posted:

An i5 2400 may be below minimum specs for all those games, but paired with a good enough graphics card they'll still run in excess of 40fps the majority of the time. If the i5 is constrained by anything it's by the number of threads available due to SMT being disabled; i7s from that generation perform much better, running at 60fps+ most of the time (at 3.4GHz instead of 3.2 admittedly, but this is an architecture from almost a decade ago with a bunch of performance-killing security mitigations in place). From most reports Metro Exodus struggles to hit a consistent 60fps on Stadia anyway (edit: was wrong, it's 30fps lol)

I think you're underestimating how fast modern CPU cores are (regardless of vendor) at 2.7GHz in comparison to a 1.7GHz Jaguar core; even at the same clocks the Jaguar is going to have around half the performance. Not an expert, but I'd imagine GPGPU is more useful on consoles due to the shared memory architecture; Stadia probably has separate pools of memory just like most PCs do. There's no fancy SRAM or whatever, the API is just Vulkan, the CPU is bog standard, so what is there to optimize that exclusively exists on Stadia's platform and architecture?

no they wont, i specifically picked those out because i know they wont. you can go find footage of 2400s trying to run rdr 2 and hitting 100% utilisation and becoming literally unplayable or unwatchable during cutscenes. the cpu will hit such a limit a cutscene will lock up and become seconds per frame. and its pushing an extra 700mhz. the i7-2600 turbos to 3.8ghz. with 8 threads. the security patches have no impact on games or nearly any general computing, the main casualty is xeons running a lot of storage management.

i think youre misunderstanding how the consoles operate because youre basing everything on it being 1.7ghz cores. consoles for the last decade and a half have benefitted from a lot more parallelisation than pc. it took a long time on pc to move to dual core utilisation to quad core to now where we're seeing things actually not be limited and scaling over almost an arbitrary amount of cores and even then its still limited to relatively few titles. gpgpu is useful anywhere, the difference here is that it benefits a lot from specific targeting for optimisation and pc often can afford to throw raw power at a problem as it is designed entirely around that. its just the continuation of parallelisation which is naturally more useful and more suited to consoles where power consumption, thermal constraints and size are bigger considerations and where you have the luxury of expecting developers to heavily optimise to a set configuration. the ps4 especially was very very purposefully designed around gpgpu, its literally integral to why the console functions and why one of the biggest issues at launch was that the ps4 despite having a slightly lower cpu clock than the base xbone at 1.6ghz was able to vastly outpace it at what would be traditionally considered cpu heavy loads in a pc environment. everyone remembers sucker punch yelling endlessly about 140k physics particles per frame in infamous but it wasnt bluster, the ps4 still handles large physics loads like that exceptionally well. driveclub may have had many many many problems, but its ability to push out scenes with volumetrics and particles and complex dynamic weather systems that shouldve killed a 1.6ghz cpu wasnt one of them. also its not about how fast modern cpu cores are vs jaguar, because jaguar wasnt fast when it came out. its a mobile apu. the fastest desktop one they ever made capped out at 2.2ghz and was slow for the time.

the cpu isnt bog standard, theres no off the shelf intel part that runs at 2.7ghz per core (assume it either doesnt turbo or 2.7ghz is the turbo), and google and intel have been explicit that it is a custom part. the gpu as well has been confirmed as a custom part based on vega. even the OS is only "based on" debian linux and is a custom version and they have their own custom sdk. even if it wasnt the simple fact that its a Single Hardware Target means you can optimise more, by definition. but very specifically it would be required to get anything sane out of that cpu, even if its an abnormal number of 2.7ghz cores that alone would require per platform optimisation as its rare a game can take advantage of something like that if its not being optimised for, i dunno, a console. with a bunch of parallelisation. and a relatively low clock cpu. but a 2.7ghz anything isnt enough without optimisation. there is nothing on pc anywhere close to that slow, and you can say all you want that its better than a jaguar at 1.7ghz, but the jaguars at 1.7ghz are currently still running games that a 2.7ghz cpu trying to run something on pc wouldnt even boot. and the biggest question still remains: why would they care about being so explicit about a certain hardware target when this is supposed to be cloud computing. if its just running a regular linux port why would it matter, you could just throw it on anything. it would be bigger benefit to not have one hardware target to do the thing people on reddit are sure theyll do where they just constantly throw in new hardware as they go. why have a custom cpu and gpu on a custom os with its own special sdk.

univbee
Jun 3, 2004

Good Dumplings posted:

I'd bet money they're counting season passes as "titles", lol

I'll bet every time they give away a code for Borderlands Golden Keys on Twitter counts as one, for that matter.

SCheeseman
Apr 23, 2003

I can barely read your posts so I'm splitting it up to respond to specific points like a pedant, sorry.

Stux posted:

no they wont, i specifically picked those out because i know they wont. you can go find footage of 2400s trying to run rdr 2 and hitting 100% utilisation and becoming literally unplayable or unwatchable during cutscenes. the cpu will hit such a limit a cutscene will lock up and become seconds per frame. and its pushing an extra 700mhz. the i7-2600 turbos to 3.8ghz. with 8 threads. the security patches have no impact on games or nearly any general computing, the main casualty is xeons running a lot of storage management.
This is because in order to test performance people usually run at an uncapped framerate; locking it to 30 or 60 will usually clear those up, though admittedly some games just have hosed frame pacing.
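
For illustration, a minimal sketch of the cap being described, with simulate_and_render() as a hypothetical stand-in for the game's per-frame work: capped, every frame that finishes early sleeps off its slack instead of immediately starting the next one, which is why a CPU that can usually make the deadline stops reading 100%.

code:
/* Frame cap sketch: run the frame, then sleep whatever is left of the
 * per-frame budget. Uncapped testing skips the sleep, so the CPU runs
 * flat out and utilisation reads 100% even when a cap would hold. */
#define _POSIX_C_SOURCE 199309L
#include <time.h>

static void simulate_and_render(void) { /* hypothetical frame work */ }

void run_capped(double target_fps)
{
    const long frame_ns = (long)(1e9 / target_fps);
    struct timespec start, end, slack;

    for (;;) {
        clock_gettime(CLOCK_MONOTONIC, &start);
        simulate_and_render();
        clock_gettime(CLOCK_MONOTONIC, &end);

        long spent = (end.tv_sec - start.tv_sec) * 1000000000L
                   + (end.tv_nsec - start.tv_nsec);
        if (spent < frame_ns) {          /* early: sleep off the slack */
            slack.tv_sec  = 0;
            slack.tv_nsec = frame_ns - spent;
            nanosleep(&slack, NULL);
        }                                /* late frames just carry on */
    }
}

int main(void) { run_capped(60.0); }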

Stux posted:

i think youre misunderstanding how the consoles operate because youre basing everything on it being 1.7ghz cores. consoles for the last decade and a half have benefitted from a lot more parallelisation than pc. it took a long time on pc to move to dual core utilisation to quad core to now where we're seeing things actually not be limited and scaling over almost an arbitrary amount of cores and even then its still limited to relatively few titles. gpgpu is useful anywhere, the difference here is that it benefits a lot from specific targeting for optimisation and pc often can afford to throw raw power at a problem as it is designed entirely around that. its just the continuation of parallelisation which is naturally more useful and more suited to consoles where power consumption, thermal constraints and size are bigger considerations and where you have the luxury of expecting developers to heavily optimise to a set configuration. the ps4 especially was very very purposefully designed around gpgpu, its literally integral to why the console functions and why one of the biggest issues at launch was that the ps4 despite having a slightly lower cpu clock than the base xbone at 1.6ghz was able to vastly outpace it at what would be traditionally considered cpu heavy loads in a pc environment. everyone remembers sucker punch yelling endlessly about 140k physics particles per frame in infamous but it wasnt bluster, the ps4 still handles large physics loads like that exceptionally well. driveclub may have had many many many problems, but its ability to push out scenes with volumetrics and particles and complex dynamic weather systems that shouldve killed a 1.6ghz cpu wasnt one of them. also its not about how fast modern cpu cores are vs jaguar, because jaguar wasnt fast when it came out. its a mobile apu. the fastest desktop one they ever made capped out at 2.2ghz and was slow for the time.
Consoles have always had lovely CPUs and made up for that difference in graphics performance. GPUs becoming more general purpose plus the constraints from poor single-threaded CPU performance has pushed engine developers to better take advantage of GPUs for compute and multithread as much as possible. Most of those optimizations carry over to PC though; the major differences to developers are mostly in its memory architecture: both the GPU and CPU can access the same pool of memory with little overhead. I'm not sure if Stadia does this or not. Probably not.

Stux posted:

the cpu isnt bog standard, theres no off the shelf intel part that runs at 2.7ghz per core (assume it either doesnt turbo or 2.7ghz is the turbo), and google and intel have been explicit that it is a custom part. the gpu as well has been confirmed as a custom part based on vega. even the OS is only "based on" debian linux and is a custom version and they have their own custom sdk. even if it wasnt the simple fact that its a Single Hardware Target means you can optimise more, by definition. but very specifically it would be required to get anything sane out of that cpu, even if its an abnormal number of 2.7ghz cores that alone would require per platform optimisation as its rare a game can take advantage of something like that if its not being optimised for, i dunno, a console. with a bunch of parallelisation. and a relatively low clock cpu. but a 2.7ghz anything isnt enough without optimisation. there is nothing on pc anywhere close to that slow, and you can say all you want that its better than a jaguar at 1.7ghz, but the jaguars at 1.7ghz are currently still running games that a 2.7ghz cpu trying to run something on pc wouldnt even boot. and the biggest question still remains: why would they care about being so explicit about a certain hardware target when this is supposed to be cloud computing. if its just running a regular linux port why would it matter, you could just throw it on anything. it would be bigger benefit to not have one hardware target to do the thing people on reddit are sure theyll do where they just constantly throw in new hardware as they go. why have a custom cpu and gpu on a custom os with its own special sdk.

Just a guess: It's a Xeon with turbo boost disabled, as performance has to remain consistent across instances and there are plenty of Xeons with 2.7GHz base clocks. It probably has a Vega 56 connected over PCIe, though maybe they put it on the same chiplet, who knows. I doubt there are any fancy new instructions, embedded RAM or anything like that which would make it any different to target. They're using Linux and Vulkan because AMD GPU driver support on mainline Linux is excellent these days, and they've rolled their own SDK because all platforms have SDKs.

"Custom version of Debian" is likely just debian with kernel patches for performance, maybe some of those are custom but more likely they're what's made available by AMD and implemented in cutting edge distributions like PopOS.

SCheeseman fucked around with this message at 15:00 on Jan 17, 2020

univbee
Jun 3, 2004

In the spirit of being a complainy-pants and playing Devil's Advocate for Stadia, I actually timed how long my PS4 took to patch two games that didn't patch while it was in rest mode for some reason (Overwatch and Fortnite). Both were on the most recent version prior to this one. Overwatch's patch was like a gig and Fortnite's was I think somewhere in the 200-500 meg range? Anyway, downloading the patches themselves took no time, but the "Copying..." process (actually applying the update) took a total of around 40 minutes for both games, which was annoying as gently caress.

This weirdly means that the Xbox One is usually faster to update games for me, because even if they're bigger downloads, my gigabit connection can usually grab the extra data way faster than doing the same on the PS4.

It's also annoying when a game decides to do its own in-UI thing for patching, like NBA 2K20 and Street Fighter V, since these aren't automatically done, you have to explicitly launch the game for these updates to happen.

The main other annoyance is that for games with a season pass (where you get future DLC for free), you still have to take action to get this free DLC, it's not automatic. Like, whenever a new character is released for MK11, there's a patch, and also there's an unlock DLC (which is free for me) that I have to explicitly download.

If I want to bring MK11 to a friend's house who has a PS4 but not that game, it's an annoying process, since even with a disc the game has over 14 gigs of updates now. This would actually be slightly easier with an XB1 since there's a bit more flexibility for external storage (you can have two externals at once, as well as duplicates of games, and it doesn't completely lose its poo poo if you remove an external drive that isn't currently in use), so I can get a 256 gig USB key and copy the game+DLC assets there for travel. I still have to sign into my account for the DLC to be useable, though. This is technically doable on PS4 as well but it's much more of a chore because only one external is allowed and data can only exist on one drive at one time, so it becomes a crazy juggling act, especially if the friend is also using external storage. Regardless, it's generally quicker and easier to just bring your own console.


Now again, THE CLOUD actually solving these problems is all conditional on the cloud version actually being updated properly, and this is a problem to a certain degree across the board including on Stadia.

Stadia was using a stale image of NBA 2K20, forcing a lengthy update on every startup before it was finally fixed.
Stadia's Borderlands 3 is using the launch build and doesn't have any of the events or QoL updates from the other versions, which are pretty significant.

On Geforce Now the Witcher 3 has that DLC gotcha I mentioned recently. There are probably plenty of other stale games, too, but I don't know any other specifics.

On PSNow I know of a few instances:

No longer an issue since it's no longer on the service, but GTA 5 intentionally used a stale image for streaming so you couldn't do GTA Online, only the Single Player. If you downloaded it to a PS4 you were fine, though.

Borderlands 1 (PS3 version) isn't updated so you can't export characters from it to use on the PS4 version. It's also super-easy to corrupt your save because the game is almost constantly auto-saving and if you lose connectivity while this is happening your save gets hosed.

Elder Scrolls Online is similarly not up-to-date if you try and stream it, which since it's an always-online MMO means the streaming version is literally unusable, you never get past the first screen. You're fine if you download it locally.


It actually sounds like plans for the new consoles are going to yield better results and be a much better solution to the install/update problem:

1. They are making it so you can download game assets more intelligently and selectively. For example, it will likely be possible to only install Single Player, or only install Multi-Player, if you only want to do one of those things, and also more intelligent installation for language assets, which would especially be a godsend in PAL territories where some game versions are stupid-large because there are like 8 different audio languages of the game built into the SKU. This was famously an issue with the first Watch Dogs, where on PC you could reduce the game's size from 22 gigs to 15 gigs just by deleting whatever languages' audio files you weren't using (each language had like 1.5 gigs of files), and the menu would even intelligently adjust for this. On consoles you can see this a little with Halo: MCC, which now lets you selectively install game elements.

2. They should be leveraging resources better to download and patch considerably faster than before. Likewise, more intelligent data layouts should reduce the need to do full redownloads of the game client, which was annoyingly frequent this gen (Overwatch did it a few times, as did Forza Horizon 3, Halo MCC, a fat stack of early XB1 games, Division 2, Fallout 76...)

Stux
Nov 17, 2006

SCheeseman posted:

I can barely read your posts so I'm splitting it up to respond to specific points like a pedant, sorry.

This is because in order to test performance people usually run at an uncapped framerate; locking it to 30 or 60 will usually clear those up, though admittedly some games just have hosed frame pacing.

that... doesnt make the frame rate tank. it doesnt make cpu utilisation hit 100% as fps drops below 1. that isnt frame pacing. thats the cpu getting overloaded and not being able to push out a frame fast enough.

quote:

Consoles have always had lovely CPUs and made up for that difference in graphics performance. GPUs becoming more general purpose plus the constraints from poor single-threaded CPU performance has pushed engine developers to better take advantage of GPUs for compute and multithread as much as possible. Most of those optimizations carry over to PC though; the major differences to developers are mostly in its memory architecture: both the GPU and CPU can access the same pool of memory with little overhead. I'm not sure if Stadia does this or not. Probably not.

yes the ps3 with its bonkers extremely powerful for the time cell cpu backed by a 7800 which had been discontinued before it launched had a bad cpu made up for in graphics performance. the 360 with its 6 thread 3.2ghz cpu had poor cpu performance. consoles were already doing multithreading at the cpu level 15 years ago for the 360/ps3, with both consoles having and heavily utilising 6 cores/threads. the ps3 cell cpu in particular is insanely parallelised almost to a fault: it is literally architecturally designed entirely around the concept of coprocessing to the point that it was ridiculous to actually use properly with an entire controller cpu controlling 6 separate coprocessors. you are literally as wrong as it is possible to be here. even going back to the ps2 doing things in parallel was extremely important because it has a ridiculous number of different coprocessors specifically for doing different things in parallel before the idea of multiple general purpose cores on one die was viable. these optimisations do not carry over to pc. this is demonstrated pretty easily by the fact that since 2005 console games have been utilising 6 cpu threads for processing, while on pc during the same time period it was considered excessive to buy a quad core for gaming because games wouldnt use it properly. this held over to the start of this generation and is the entire reason intel was so dominant, because their focus on per core power as opposed to core count by AMD was specifically better suited to PC games due to their lackluster multicore scaling. this is literally only changing now, as we speak. its still extremely common for games to struggle to utilise a high number of cores/threads even though the ps4 and xbox one both use 7 cores + gpgpu in every game. yes stadia also does that, with hbm2. no that isnt the most major difference its part of a package of differences, see above. you have no idea what you are talking about. genuinely stop arguing if you literally have no idea what you are saying.

quote:

Just a guess: It's a Xeon with turbo boost disabled, as performance has to remain consistent across instances and there are plenty of Xeons with 2.7GHz base clocks. It probably has a Vega 56 connected over PCIe, though maybe they put it on the same chiplet, who knows. I doubt there are any fancy new instructions, embedded RAM or anything like that which would make it any different to target. They're using Linux and Vulkan because AMD GPU driver support on mainline Linux is excellent these days, and they've rolled their own SDK because all platforms have SDKs.

ok so its a xeon with turbo boost disabled. so its 2.7ghz. whats your point. thats very slow. it categorically doesnt have a pci-e vega 56. its been confirmed by both google and amd as a custom chip and its memory bandwidth is too high to be a vega 56 while its CU count is too low to be a 64. because its neither, because its a custom chip because theyve said its a custom chip. like what are we even doing here what are you actually talking about. do you have any clue what youre talking about at all? theyre using their own custom OS based on debian, but its not just straight debian. this is like saying because ps4 uses a custom version of freeBSD all ps4 games should run on freeBSD. stadia does have unified memory and it isnt a secret. you can google this.

univbee
Jun 3, 2004

EDIT Oh never mind, I think this is just the ability to stream from your actual console that's now available in all territories. loving laffo that this was region-locked at all, when people were doing this on their PS4 in 2013.

univbee fucked around with this message at 15:36 on Jan 17, 2020

SCheeseman
Apr 23, 2003

God this is tedious.

I asserted that games ported to Stadia are probably straight PC ports. I think your argument is that the games couldn't possibly be running as well as they are without hardware-specific optimizations, because of Stadia's relatively poor single-threaded performance due to its low clock rate?

But the games don't run well. The newer AAA titles often struggle to hit their targets of either 30 or 60 and have the PC equivalent of low-medium graphics settings, about in line with what you'd expect from a PC with a 2.7GHz 4c/8t (or however many they assign to it, I assume it differs per game) CPU paired with a Vega 56. Not a common setup admittedly so it's hard to give any proof, though I guess someone could underclock an i7 and do some benchmarks to find out.

As for specifics about Stadia's specs, I did Google it. I didn't find any solid information on it having a shared memory architecture; the official word from Stadia seems to be that it has HBM2 memory (listed under GPU specs) and 16GB "total" memory (listed under a separate memory section):

e: This article makes a compelling case for it being an 8GB/8GB split.

SCheeseman fucked around with this message at 16:42 on Jan 17, 2020

Ineffiable
Feb 16, 2008

Some say that his politics are terrifying, and that he once punched a horse to the ground...


univbee posted:

In the spirit of being a complainy-pants and playing Devil's Advocate for Stadia, I actually timed how long my PS4 took to patch two games that didn't patch while it was in rest mode for some reason (Overwatch and Fortnite). Both were on the most recent version prior to this one. Overwatch's patch was like a gig and Fortnite's was I think somewhere in the 200-500 meg range? Anyway, downloading the patches themselves took no time, but the "Copying..." process (actually applying the update) took a total of around 40 minutes for both games, which was annoying as gently caress.

This weirdly means that the Xbox One is usually faster to update games for me, because even if they're bigger downloads, my gigabit connection can usually grab the extra data way faster than doing the same on the PS4.

It's also annoying when a game decides to do its own in-UI thing for patching, like NBA 2K20 and Street Fighter V, since these aren't automatically done, you have to explicitly launch the game for these updates to happen.

The main other annoyance is that for games with a season pass (where you get future DLC for free), you still have to take action to get this free DLC, it's not automatic. Like, whenever a new character is released for MK11, there's a patch, and also there's an unlock DLC (which is free for me) that I have to explicitly download.

If I want to bring MK11 to a friend's house who has a PS4 but not that game, it's an annoying process, since even with a disc the game has over 14 gigs of updates now. This would actually be slightly easier with an XB1 since there's a bit more flexibility for external storage (you can have two externals at once, as well as duplicates of games, and it doesn't completely lose its poo poo if you remove an external drive that isn't currently in use), so I can get a 256 gig USB key and copy the game+DLC assets there for travel. I still have to sign into my account for the DLC to be useable, though. This is technically doable on PS4 as well but it's much more of a chore because only one external is allowed and data can only exist on one drive at one time, so it becomes a crazy juggling act, especially if the friend is also using external storage. Regardless, it's generally quicker and easier to just bring your own console.


Now again, THE CLOUD actually solving these problems is all conditional on the cloud version actually being updated properly, and this is a problem to a certain degree across the board including on Stadia.

Stadia was using a stale image of NBA 2K20, forcing a lengthy update on every startup before it was finally fixed.
Stadia's Borderlands 3 is using the launch build and doesn't have any of the events or QoL updates from the other versions, which are pretty significant.

On Geforce Now the Witcher 3 has that DLC gotcha I mentioned recently. There are probably plenty of other stale games, too, but I don't know any other specifics.

On PSNow I know of a few instances:

No longer an issue since it's no longer on the service, but GTA 5 intentionally used a stale image for streaming so you couldn't do GTA Online, only the Single Player. If you downloaded it to a PS4 you were fine, though.

Borderlands 1 (PS3 version) isn't updated so you can't export characters from it to use on the PS4 version. It's also super-easy to corrupt your save because the game is almost constantly auto-saving and if you lose connectivity while this is happening your save gets hosed.

Elder Scrolls Online is similarly not up-to-date if you try and stream it, which since it's an always-online MMO means the streaming version is literally unusable, you never get past the first screen. You're fine if you download it locally.


It actually sounds like plans for the new consoles are going to yield better results and be a much better solution to the install/update problem:

1. They are making it so you can download game assets more intelligently and selectively. For example, it will likely be possible to only install Single Player, or only install Multi-Player, if you only want to do one of those things, and also more intelligent installation for language assets, which would especially be a godsend in PAL territories where some game versions are stupid-large because there are like 8 different audio languages of the game built into the SKU. This was famously an issue with the first Watch Dogs, where on PC you could reduce the game's size from 22 gigs to 15 gigs just by deleting whatever languages' audio files you weren't using (each language had like 1.5 gigs of files), and the menu would even intelligently adjust for this. On consoles you can see this a little with Halo: MCC, which now lets you selectively install game elements.

2. They should be leveraging resources better to download and patch considerably faster than before. Likewise, more intelligent data layouts should reduce the need to do full redownloads of the game client, which was annoyingly frequent this gen (Overwatch did it a few times, as did Forza Horizon 3, Halo MCC, a fat stack of early XB1 games, Division 2, Fallout 76...)

Univbee, I think this post is a pretty good thought piece. We have to remember that soon, Stadia isn't going to be competing with the ps4 and Xbox one. Soon it'll be the ps5 and Xbox series x, and if they have improvements across the board like intelligent installs and ultra-fast ssds, as well as graphics more impressive than what Stadia offers, then it only looks worse for Stadia as it's left behind in the dust. If this is what stadia is doing with 10 tf, what's it gonna look like next year against consoles that get increased performance for the same power?

And we've said it so many times. The most popular games are either free to play or a popular online shooter (siege, overwatch and call of duty as examples). It seems very unlikely that Stadia will get any of those free to play games, and it's going to take a long time for popular online shooters to have an impact since the audience numbers just might not be good enough for quite a while.

Neddy Seagoon
Oct 12, 2012

"Hi Everybody!"

Ineffiable posted:

And we've said it so many times. The most popular games are either free to play or a popular online shooter (siege, overwatch and call of duty as examples). It seems very unlikely that Stadia will get any of those free to play games, and it's going to take a long time for popular online shooters to have an impact since the audience numbers just might not be good enough for quite a while.

They're a non-starter anyway; they'd be siloed into their own playerbase, unable to play with the other platforms, going by Destiny 2.

Chalks
Sep 30, 2009

Neddy Seagoon posted:

They're a non-starter anyway; they'd be siloed into their own playerbase, unable to play with the other platforms, going by Destiny 2.

Is it a technical requirement that they do this or are they standing by some misguided idea that minimal latency between players on the server side is so important that it outweighs the players actually existing?

boo_radley
Dec 30, 2005

Politeness costs nothing
The Chrome team is working—very possibly in cooperation with Valve—to bring Steam to Chromebooks.

Ringo Star Get
Sep 18, 2006

JUST FUCKING TAKE OFF ALREADY, SHIT
Usually before bed, for a laugh, I’ll check out the Stadia Reddit. I see A LOT of similarities between how MLM/Essential Oils/Cutco Knives people talk and how Stadia people talk.

Fried Watermelon
Dec 29, 2008


WaltherFeng posted:

If Kojima did a VR game it would probably break your brain harder than the ending of MGS2, in a good way.

Kojima's VR game keeps telling me to take off the VR helmet but I'm not wearing it is this a glitch?

Rotten Red Rod
Mar 5, 2002

Neddy Seagoon posted:

They're a non-starter anyway; they'd be siloed into their own playerbase, unable to play with the other platforms, going by Destiny 2.

Chalks posted:

Is it a technical requirement that they do this or are they standing by some misguided idea that minimal latency between players on the server side is so important that it outweighs the players actually existing?

I don't think there's any reason to expect crossplay won't happen for games ported to Stadia. It just happens that all the games on Stadia so far are ones that do not feature crossplay - Destiny 2 has crossSAVE on all platforms, but not crossplay. That's on Bungie, not Google.


Oooh, neat! I'd love to play Stardew Valley on my Chromebook.

Rotten Red Rod fucked around with this message at 17:07 on Jan 17, 2020

Barudak
May 7, 2007


I, uh, that seems very Google to stab another department in the back

Stux
Nov 17, 2006

SCheeseman posted:

God this is tedious.

I asserted that games ported to Stadia are probably straight PC ports. I think your argument is that the games couldn't possibly be running as well as they are without hardware-specific optimizations, because of Stadia's relatively poor single-threaded performance due to its low clock rate?

But the games don't run well. The newer AAA titles often struggle to hit their targets of either 30 or 60 and have the PC equivalent of low-medium graphics settings, about in line with what you'd expect from a PC with a 2.7GHz 4c/8t (or however many they assign to it, I assume it differs per game) CPU paired with a Vega 56. Not a common setup admittedly so it's hard to give any proof, though I guess someone could underclock an i7 and do some benchmarks to find out.

As for specifics about Stadia's specs, I did Google it. I didn't find any solid information on it having a shared memory architecture; the official word from Stadia seems to be that it has HBM2 memory (listed under GPU specs) and 16GB "total" memory (listed under a separate memory section):


it literally says it, right there, in the picture you yourself have posted. you cant really be this much of a dipshit.

Stux
Nov 17, 2006

going to start saying "god this is tedious" when ive been thoroughly owned about not having a clue about what im talking about but unfortunately that never happens so ill never say it

SCheeseman
Apr 23, 2003

Where? It says the GPU has access to HBM2 memory. In another column it states there is 16GB RAM "total" with "up to" 484GB/s transfer speed, which is the same memory speed as a discrete Vega 64 with 8GB of HBM2.

"Total" implies a sum. "Up to" implies either variable speeds (unlikely) or is the speed of the HBM2, with the system memory being slower. There is nothing in that image that implies shared memory.

e: The conversation got tedious because it descended into pointless nitpicking, making comparisons with old consoles and other bullshit that doesn't really have anything to do with what I was originally talking about.
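
One way to settle the shared-vs-split question on the actual hardware would be to just ask Vulkan: the standard memory-property query lists every heap and its flags. A sketch assuming only the public Vulkan API (the interpretation in the closing comment is this thread's speculation, not anything official):

code:
/* Enumerate a device's memory heaps. A unified-memory part typically
 * reports one DEVICE_LOCAL heap whose types are also host-visible; a
 * discrete-style setup reports separate device-local and host heaps.
 * Assumes an already-selected VkPhysicalDevice. */
#include <stdio.h>
#include <vulkan/vulkan.h>

void print_memory_layout(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryProperties mem;
    vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

    for (uint32_t i = 0; i < mem.memoryHeapCount; i++) {
        int local = (mem.memoryHeaps[i].flags &
                     VK_MEMORY_HEAP_DEVICE_LOCAL_BIT) != 0;
        printf("heap %u: %llu MiB%s\n", i,
               (unsigned long long)(mem.memoryHeaps[i].size >> 20),
               local ? " (device-local)" : "");
    }
    /* Two ~8 GiB heaps would back the 8GB/8GB-split reading of the
     * "16GB total" spec; a single 16 GiB device-local heap would not. */
}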

SCheeseman fucked around with this message at 17:25 on Jan 17, 2020

leftist heap
Feb 28, 2013

Fun Shoe

Rotten Red Rod posted:

I don't think there's any reason to expect crossplay won't happen for games ported to Stadia. It just happens that all the games on Stadia so far are ones that do not feature crossplay - Destiny 2 has crossSAVE on all platforms, but not crossplay. That's on Bungie, not Google.

The logistics of crossplay with Stadia players in Google's cloud infrastructure are probably not straightforward, if my experience with network poo poo in GCP is any indication

Gobbeldygook
May 13, 2009
Hates Native American people and tries to justify their genocides.

Put this racist on ignore immediately!

Ringo Star Get posted:

Usually before bed, for a laugh, I’ll check out the Stadia Reddit. I see A LOT of similarities between how MLM/Essential Oils/Cutco Knives people talk and how Stadia people talk.
Also bitcoiners and penny stock investors. There's a definite family resemblance.

Barudak posted:

I, uh, that seems very Google to stab another department in the back
There will probably be zero overlap between games that run on Stadia and games that run on ChromeOS. Now if Google Cloud rolled out a consumer-oriented service for running your own personal gaming cloud that would be a beautiful betrayal.

univbee
Jun 3, 2004

leftist heap posted:

The logistics of crossplay with Stadia players in Google's cloud infrastructure are probably not straightforward, if my experience with network poo poo in GCP is any indication

Cloud play's been a thing and a hot button issue for long enough that Google really should have designed Stadia with it in mind.
