SupSuper
Apr 8, 2009

At the Heart of the city is an Alien horror, so vile and so powerful that not even death can claim it.


Warmachine posted:

This is a general programming question, since it has been a sore spot with regards to a certain game I like. People ask for multi-threading/64-bit all of the time with Rimworld, and I tend to brush off the whole multi-threading thing because that is a can of worms I can't currently wrap my head around, and see as unnecessary compared to the constant headache of running out of addressable memory. I understand it isn't as simple as flipping a switch on the compiler, but what I don't understand is WHERE in the process it starts to become an issue. So my question boils down to two things:

1) Why in TYOOL 20XX since the release of Windows 7 would someone decide not to compile a game natively to 64-bit even if initial design requirements don't predict addressing requirements to exceed the limits of 32-bit addressing.

2) What and where are the hurdles in taking a finished product from 32-bit to 64-bit addressing?
Middleware, probably. To use Rimworld as an example, it uses the Unity engine, so they are bound by whatever Unity supports (it does offer 64-bit builds, but has very poor multithreading support). And any native libraries and plugins they use also have to support 64-bit. And now they have to support two separate builds.


mastermind2004
Sep 14, 2007



Warmachine posted:

This is a general programming question, since it has been a sore spot with regards to a certain game I like. People ask for multi-threading/64-bit all of the time with Rimworld, and I tend to brush off the whole multi-threading thing because that is a can of worms I can't currently wrap my head around, and see as unnecessary compared to the constant headache of running out of addressable memory. I understand it isn't as simple as flipping a switch on the compiler, but what I don't understand is WHERE in the process it starts to become an issue. So my question boils down to two things:

1) Why in TYOOL 20XX since the release of Windows 7 would someone decide not to compile a game natively to 64-bit even if initial design requirements don't predict addressing requirements to exceed the limits of 32-bit addressing.

2) What and where are the hurdles in taking a finished product from 32-bit to 64-bit addressing?

1) At least until recently (don't know if it's still true or not), China still had a significant amount of Windows XP boxes, which would not support 64 bit.

2) Ensuring that there aren't cases where type conversion causes loss of data is important in the conversion. If you don't use explicitly sized types, things that you expect to be the same in 32 bit can be different in 64 bit, such as size_t versus int. You also have to deal with a change in the size of pointers and other types, so if you were doing anything clever with pointers to do something funky like serialization (or marshaling data or anything), that could all break. Also, the hurdle of convincing people that it is worth the investment of any amount of time on something that is already complete and well tested. Just because you change the build target to x64 and it compiles doesn't mean it is going to run, so you would need to go through whatever QA or cert process on whatever platforms you are shipping on.

Warmachine
Jan 30, 2012

I've got a bad feeling about this.


mastermind2004 posted:

1) At least until recently (don't know if it's still true or not), China still had a significant amount of Windows XP boxes, which would not support 64 bit.

2) Ensuring that there aren't cases where type conversion causes loss of data is important in the conversion. If you don't use explicitly sized types, things that you expect to be the same in 32 bit can be different in 64 bit, such as size_t versus int. You also have to deal with a change in the size of pointers and other types, so if you were doing anything clever with pointers to do something funky like serialization (or marshaling data or anything), that could all break. Also, the hurdle of convincing people that it is worth the investment of any amount of time on something that is already complete and well tested. Just because you change the build target to x64 and it compiles doesn't mean it is going to run, so you would need to go through whatever QA or cert process on whatever platforms you are shipping on.

So the shift in architecture, all else held equal, can result in something that was addressed in 32 bit space being in a different location in 64 bit space, and things that reference the 32 bit location won't be able to find the 64 bit location?

Cocoa Crispies
Jul 20, 2001

Vehicular Manslaughter!



Pillbug

Warmachine posted:

So the shift in architecture, all else held equal, can result in something that was addressed in 32 bit space being in a different location in 64 bit space, and things that reference the 32 bit location won't be able to find the 64 bit location?

Yeah, but figuring out exactly where in memory is the linker's job.

From personal experience literally yesterday, I was trying to get Zandronum, a Doom 1 & 2 runtime, to work on 32-bit ARM (armv6l, on a Raspberry Pi Zero W). This involved:

  1. Getting the source
  2. Getting libraries with binaries available (a one-liner in Raspbian Linux)
  3. Letting the build process discover what libraries had to be downloaded and compiled as source (because Raspbian doesn't distribute them in the way Zandronum wants)
  4. Letting the build process start compiling it, i.e. turning C & C++ files into object files
  5. Discovering that p_spec.cpp assumes that char types are 8 bits and signed, as they are on i386 and x86-64, but on armv6l they're unsigned.
  6. Fixing that file by hand.
  7. Starting the build again
  8. Compiling succeeds, build links it. Linking is how one or more object files (which are basically CPU bytecode plus references to memory locations that bytecode needs) get turned into an executable, which is broken into sections for CPU bytecode, sections for constants (like chunks of text or hard-coded values), sections for global & static variables (variables that only need a single place in memory), and instructions on which parts of memory those sections map to.

Porting something between architectures (and i386 and x86-64 are different architectures) means lots of fixing bugs like that type problem I had. I lucked out in that I only had to change one variable declaration. Worst case scenario would be rewriting a huge chunk of Zandronum, rewriting libraries it depends on, and that's only even remotely workable because it's running on a fairly common open-source stack. If it was doing weird poo poo inside a closed-source engine and didn't work, I'd've just locked the computer and played Splatoon instead.

bob dobbs is dead
Oct 8, 2017

embrace your inner slack

What bug/issue trackers do gamedevs use? Do they have special issue trackers like they have perforce and poo poo for vcs?

mastermind2004
Sep 14, 2007



We use JIRA, and I think it's relatively common.

leper khan
Dec 28, 2010
Honest to god thinks Half Life 2 is a bad game. But at least he likes Monster Hunter.

bob dobbs is dead posted:

What bug/issue trackers do gamedevs use? Do they have special issue trackers like they have perforce and poo poo for vcs?

We use JIRA, and I really wish we didn't use perforce for code.

BULBASAUR
Apr 6, 2009



Soiled Meat

Depends on the studio, but we also use JIRA. I've been places that used TTP, Hansoft, perforce, and others.

SilentW
Apr 3, 2009

my It dept hgere is fucking clwonshoes, and as someone hwo used to do IT for 9 years it pains me to see them fbe so terriuble

Fallen Rib

The studio I'm currently at uses Jira, but I've used TestTrack Pro as well. There seems to have been an overall push towards Jira recently because it's free and integrates with Confluence.

Studio
Jan 15, 2008

Did I say a lot?
I meant TONS of anime titties!!


Jira isn't free?


ShadowHawk
Jun 25, 2000

The company has no assets of any significant value.


Nap Ghost

Studio posted:

Jira isn't free?
It's free until you hit a certain number of users.
