Warmachine
Jan 30, 2012



This is a general programming question, since it has been a sore spot with regard to a certain game I like. People ask for multi-threading and 64-bit all the time with Rimworld, and I tend to brush off the multi-threading request because that's a can of worms I can't currently wrap my head around, and one I see as unnecessary compared to the constant headache of running out of addressable memory. I understand it isn't as simple as flipping a switch on the compiler, but what I don't understand is WHERE in the process it starts to become an issue. So my question boils down to two things:

1) Why, in TYOOL 20XX, years after the release of Windows 7, would someone decide not to compile a game natively for 64-bit, even if the initial design requirements don't predict memory usage exceeding the limits of 32-bit addressing?

2) What and where are the hurdles in taking a finished product from 32-bit to 64-bit addressing?
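For context on why the addressing part bothers me (my own toy example, nothing to do with Rimworld's or Unity's actual code): a 32-bit process on Windows gets roughly 2 GiB of usable address space by default, and at best around 4 GiB with the large-address-aware flag on a 64-bit OS. A quick sketch like this makes the ceiling visible:

```cpp
// Toy demo, not from any game: shows where the 32-bit address-space ceiling
// bites. Built as x86 this usually stops after roughly 2 GiB on Windows
// (closer to 4 GiB with /LARGEADDRESSAWARE on a 64-bit OS); built as x64 it
// only stops at the safety cap below, or when commit runs out.
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    std::printf("pointer width: %zu bits\n", sizeof(void*) * 8);

    std::vector<void*> blocks;
    const std::size_t chunk = 256u * 1024 * 1024;   // 256 MiB per allocation
    while (blocks.size() < 64) {                     // cap at 16 GiB for safety
        void* p = std::malloc(chunk);
        if (!p) break;                               // address space exhausted
        blocks.push_back(p);
    }
    std::printf("got %zu x 256 MiB before malloc gave up\n", blocks.size());

    for (void* p : blocks) std::free(p);
    return 0;
}
```

The exact same source built as x64 sails past that limit, which is exactly the headache I'm talking about.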


Warmachine
Jan 30, 2012



mastermind2004 posted:

1) At least until recently (I don't know if it's still true or not), China still had a significant number of Windows XP boxes, which would not support 64-bit.

2) Ensuring that there aren't cases where type conversion causes loss of data is an important part of the conversion. If you don't use explicitly sized types, things you expect to be the same in 32-bit can be different in 64-bit, such as size_t versus int. You also have to deal with a change in the size of pointers and other types, so if you were doing anything clever with pointers to do something funky like serialization (or marshaling data or anything), that could all break. There's also the hurdle of convincing people that it's worth investing any amount of time in something that is already complete and well tested. Just because you change the build target to x64 and it compiles doesn't mean it is going to run, so you would need to go through whatever QA or cert process applies on whatever platforms you are shipping on.

So the shift in architecture, all else being equal, can result in something that was addressed in 32-bit space being in a different location in 64-bit space, and things that reference the 32-bit location won't be able to find the 64-bit location?
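To make sure I'm picturing the size_t/pointer part right, here's a generic sketch I put together (made-up struct and names, not anything from a real engine) of the kind of raw-byte serialization and pointer-stuffing that stops lining up once pointers grow from 4 to 8 bytes:

```cpp
// Minimal sketch of assumptions that break in a 32-bit -> 64-bit port.
// The struct's size and layout depend on pointer width, so raw-byte
// serialization written by an x86 build won't read back in an x64 build,
// and stuffing a pointer into a 32-bit integer silently truncates it once
// addresses land above 4 GiB.
#include <cstdint>
#include <cstdio>

struct SaveRecord {
    int32_t  id;
    char*    name;       // 4 bytes on x86, 8 on x64, so the layout shifts
    uint32_t nameAsInt;  // "clever" pointer-stuffed-in-an-int trick
};

int main() {
    SaveRecord rec{42, const_cast<char*>("colonist"), 0};
    rec.nameAsInt = static_cast<uint32_t>(
        reinterpret_cast<uintptr_t>(rec.name));  // drops the high 32 bits on x64

    // Anything that wrote the raw bytes of this struct to disk from an x86
    // build (12 bytes) won't map back onto the x64 layout (24 bytes).
    std::printf("sizeof(SaveRecord) = %zu bytes\n", sizeof(SaveRecord));

    bool survived = reinterpret_cast<uintptr_t>(rec.name) == rec.nameAsInt;
    std::printf("pointer round-trips through the 32-bit field: %s\n",
                survived ? "yes (the address happened to fit)"
                         : "no (truncated)");
    return 0;
}
```

Built for x86 the struct is 12 bytes and the round trip always works; built for x64 it's 24 bytes, and whether the truncation check fails depends on where the loader happens to put things.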

Warmachine
Jan 30, 2012



leper khan posted:

My understanding is that margins are improving dramatically with the move away from boxes. And the timing on cash flow has almost certainly gotten better as well.

The "other factors" like this are the most likely cause of sticky prices. I can't vouch for the sensitivity of demand in the industry, but if you can make more money without changing the prices your customers see, that is a great way to avoid upsetting the apple cart. Speculating using what I know about economics (and specifically the economics that I'm sure the people setting the prices studied), "price goes up, demand goes down." Buuuut if overhead costs fall as demand rises (more consumers entering the market), the supply curve slides rightward instead, capturing the new sales without moving the price.

Seems as good an explanation as any.
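Putting toy numbers on that, since it's easier for me to think about with the curves written out (made-up linear forms, not actual industry data):

```latex
% Toy linear model: demand Q_d = a - bP, supply Q_s = c + dP,
% equilibrium where Q_d = Q_s.
\[
P^{*} = \frac{a - c}{b + d}, \qquad Q^{*} = a - bP^{*}.
\]
% If new consumers shift demand out by \Delta (a -> a + \Delta) and falling
% overhead shifts supply out by the same \Delta (c -> c + \Delta):
\[
P^{*\prime} = \frac{(a + \Delta) - (c + \Delta)}{b + d} = P^{*}, \qquad
Q^{*\prime} = (a + \Delta) - bP^{*} = Q^{*} + \Delta,
\]
% i.e. the sticker price doesn't move, but units sold (and revenue) go up.
```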

Warmachine
Jan 30, 2012



Chewbot posted:

The enforced pacing of TBS is probably the #1 reason any of what we've attempted works at all. If you weren't basically "on rails," the pacing wouldn't carry nearly the same weight we were able to squeeze out of it. To make that happen, you've figured it exactly right: we designed all the gameplay around the story, which is definitely NOT what most developers do. You have to be so confident that your story is going to carry the game that you're willing to potentially sacrifice the gameplay. We weren't so much confident about our story as we were incredibly naive about how hard it would be, and there was also a fair bit of luck that it all kinda came together. :D

This was what I wanted to ask about, as a consumer of games who takes a "games as narrative" perspective when hunting for story-based games. I've often thought that the writing problem could be partially solved by writing the story first, then figuring out what gameplay elements you need to make the story come alive, and developing the actual game from that road map. The obvious hurdle is accounting for what is technically possible, but I feel like if you are trying to sell a game on story, a lack of innovation on the gameplay front is probably more acceptable (at least to me) than a horrible ludonarrative disconnect, pressing F to pay respects.

The second thing that came to mind was Alpha Protocol, since it's one of my all-time favorite story games for how natural the conversations feel. I never felt constrained by the story when playing, and always felt that I did in fact have agency even though the narrative was always going to end up in the same places. Real or imagined, I felt like my choices in handling conversations and characters led to different outcomes, even if every playthrough ends with storming the Grey Box. If pressed, I'd say it's because the short length of the game lends itself better to actually portraying branching outcomes, a luxury that long-form AAA epics don't have when trying to cram tens of hours of gameplay into the package. What are your thoughts on sacrificing length of gameplay for breadth of storytelling?

Warmachine
Jan 30, 2012



Discendo Vox posted:

lovingly rendered condom normal maps

The same cannot be said for the cigarette butts.


Warmachine
Jan 30, 2012



This is a specifically PC-related question, but it really applies to any platform that can have variable hardware configurations: what goes into picking the hardware requirements for a game? Is it just "these are the best and worst devices the QA team had access to," or is there something more?
