mastermind2004
Sep 14, 2007

I'm one of the people who will also be answering. I've been a gameplay programmer for almost 10 years now, with a couple of years at EA, and I've been at Robot Entertainment since.

mastermind2004
Sep 14, 2007

JPrime posted:

I've done software development (C#, web/Windows-based apps) for the last 15 years or so. How does game dev compare to "typical" software dev (think web-based app talking to SQL, along those lines)? I'm completely self-taught, so I wonder how much formal training helps in that industry (algorithms, complex maths, etc.).
I will preface this all by saying that I have not worked a "typical" software job, other than a little bit in college, so my knowledge of "typical" dev is pretty limited.

The biggest difference between the two is probably how focused you have to be on performance in games. If you're aiming for 60 FPS, you have around 16 ms per frame to execute everything; if you're doing a VR title, that drops to around 11 ms (since VR runs at 90 FPS). There's also less concern with things being perfectly correct, and more focus on them being fun, or being mostly correct and fast. Code bases in games also tend to be pretty huge. OMDU has over 8,000 files in our solution, with some of those files being 10-15k lines of code. I don't happen to have a line counter handy, but I wouldn't be surprised if we're somewhere north of 500k lines of code, and that's probably shrunk from the peak on this project.
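To put the budget in concrete terms, here's a minimal C++ sketch (purely illustrative, not from any real engine) that measures a frame's work against its time budget:

```cpp
// Minimal sketch: checking each frame against its time budget.
// 60 FPS -> ~16.7 ms per frame; 90 FPS (VR) -> ~11.1 ms per frame.
#include <chrono>
#include <cstdio>

int main() {
    using Clock = std::chrono::steady_clock;
    constexpr double kTargetFps = 60.0;                 // 90.0 for a VR title
    constexpr double kBudgetMs  = 1000.0 / kTargetFps;  // ~16.7 ms

    for (int frame = 0; frame < 3; ++frame) {
        auto start = Clock::now();
        // ... run gameplay, physics, and rendering here ...
        auto end = Clock::now();
        double elapsedMs =
            std::chrono::duration<double, std::milli>(end - start).count();
        if (elapsedMs > kBudgetMs)
            std::printf("frame %d blew its %.2f ms budget (%.2f ms)\n",
                        frame, kBudgetMs, elapsedMs);
    }
    return 0;
}
```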

Different types of games bring their own sets of challenges as well. An RTS, for example, normally uses a distributed deterministic simulation (i.e., every client runs the entire game in lockstep with the others), so you need to make sure the code you write behaves identically on all of the clients; otherwise the game state desynchronizes and you get the dreaded "desync, game over". If you're building an FPS/TPS, you generally have dedicated servers, which means separating client logic from server logic and ensuring that everything works correctly both in local/singleplayer and in dedicated-server multiplayer (and potentially also on a "listen" server, where one of the clients acts as the server).
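As a rough illustration of the lockstep-determinism point (a made-up sketch, not OMDU code): every client advances the simulation with the same seeded RNG and the same inputs, so any machine-local source of variation shows up as a desync.

```cpp
// Hypothetical lockstep snippet: a tiny deterministic PRNG (xorshift32, seed
// must be nonzero) shared by all clients. Anything non-deterministic here
// (rand(), the system clock, iterating an unordered container) could make
// clients diverge.
#include <cstdint>

struct SimRng {
    std::uint32_t state;
    explicit SimRng(std::uint32_t seed) : state(seed) {}
    std::uint32_t Next() {
        state ^= state << 13;
        state ^= state >> 17;
        state ^= state << 5;
        return state;
    }
};

// Each client runs the same tick with the same seed; checksums are compared
// across clients to detect the dreaded desync.
std::uint32_t SimulateTick(SimRng& rng, std::uint32_t checksum) {
    std::uint32_t roll = rng.Next() % 100;  // identical roll on every client
    return checksum ^ roll;
}
```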

I feel like this is the sort of topic that I could probably spend hours thinking of additional differences to talk about, so I'm going to leave it here for now.

As for being self-taught, I'm pretty heavily self-taught as well. The main things that are helpful for game dev but relatively uncommon to pick up otherwise are trig and vector math in general (both 2D and 3D are relevant). Algorithm-wise, I haven't hit too many problems where I wished I had taken an algorithms class, although brushing up on that is on my to-do list. I will say the Fisher-Yates shuffle is a really handy algorithm for games; it's something I picked up from a co-worker.
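For reference, a minimal Fisher-Yates shuffle in C++ looks something like this (illustrative, not code from any particular project):

```cpp
// Unbiased in-place shuffle in O(n): working from the back of the array,
// swap each position with a uniformly random earlier (or equal) position.
#include <cstddef>
#include <random>
#include <utility>
#include <vector>

template <typename T>
void FisherYatesShuffle(std::vector<T>& items, std::mt19937& rng) {
    for (std::size_t i = items.size(); i > 1; --i) {
        std::uniform_int_distribution<std::size_t> pick(0, i - 1);
        std::swap(items[i - 1], items[pick(rng)]);
    }
}

// Usage: std::mt19937 rng{std::random_device{}()}; FisherYatesShuffle(deck, rng);
```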

mastermind2004
Sep 14, 2007

The sample size for that survey is pretty useless. I remember the Game Developer magazine surveys seeming a lot more reasonable, and the US salaries in those were generally significantly higher than what that link shows. Everything I've heard about the UK is that game programmers there make a lot less than game programmers in the US/Canada, which probably skews their results pretty significantly. Really, a global average salary isn't going to be very informative given the wide range of salaries across different countries. Even within the US, there's a pretty huge range of salaries based on location.

mastermind2004
Sep 14, 2007

Warmachine posted:

This is a general programming question, since it has been a sore spot with regard to a certain game I like. People ask for multi-threading/64-bit all the time with Rimworld, and I tend to brush off the whole multi-threading thing because that's a can of worms I can't currently wrap my head around, and because I see it as unnecessary compared to the constant headache of running out of addressable memory. I understand it isn't as simple as flipping a switch on the compiler, but what I don't understand is WHERE in the process it starts to become an issue. So my question boils down to two things:

1) Why in TYOOL 20XX, after the release of Windows 7, would someone decide not to compile a game natively to 64-bit, even if initial design requirements don't predict addressing needs that exceed the limits of 32-bit addressing?

2) What and where are the hurdles in taking a finished product from 32-bit to 64-bit addressing?

1) At least until recently (I don't know if it's still true or not), China still had a significant number of Windows XP boxes, which can't run 64-bit builds.

2) Ensuring there aren't cases where type conversion causes loss of data is important in the conversion. If you don't use explicitly sized types, things that you expect to be the same size in 32-bit can differ in 64-bit, such as size_t versus int. You also have to deal with the change in the size of pointers and other types, so if you were doing anything clever with pointers for something like serialization (or marshaling data, or anything along those lines), that could all break. There's also the hurdle of convincing people that it's worth investing any amount of time in something that is already complete and well tested. Just because you change the build target to x64 and it compiles doesn't mean it's going to run, so you'd need to go through whatever QA or cert process applies on whatever platforms you're shipping on.
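As a made-up illustration of the first two points (not code from any real project), the size_t truncation and pointer-size assumptions look roughly like this:

```cpp
// Hypothetical examples of code that compiles for x64 but no longer behaves.
#include <cstdint>
#include <cstring>
#include <vector>

void ConversionPitfalls(const std::vector<char>& buffer) {
    // 1) size_t vs int: on x64, size_t is 8 bytes but int is still 4, so
    //    stuffing a size into an int silently truncates past ~2 billion.
    int size32 = static_cast<int>(buffer.size());
    (void)size32;

    // 2) Pointers grow from 4 to 8 bytes, so "clever" serialization that wrote
    //    a pointer into a fixed 4-byte slot only captures part of it on x64.
    const void* ptr = buffer.data();
    std::uint32_t slot = 0;
    std::memcpy(&slot, &ptr, sizeof(slot));  // copies only 4 of ptr's 8 bytes
    (void)slot;
}
```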

mastermind2004
Sep 14, 2007

We use JIRA, and I think it's relatively common.

mastermind2004
Sep 14, 2007

ChickenWing posted:

machine learning-integrated loading bars
For fully accurate loading bars, upgrade to a Titan X.

mastermind2004
Sep 14, 2007

Goreld posted:

You can blame Namco for the lack of these. Also a fine example of how stupid software patents are.
That patent has actually expired now, so developers are free to do it; at this point it's probably more about the development-time and load-time tradeoffs of having a mini-game than anything else. I would personally rather spend the effort on reducing loading times than on developing a mini-game to cover them.

mastermind2004
Sep 14, 2007

It's also a divisor of most of the common display refresh rates (60, 90, 120, 240 Hz), so your frames should generally line up nicely with v-sync on the display at 30 FPS: each rendered frame can simply be held for a whole number of refresh intervals (2 at 60 Hz, 3 at 90 Hz, and so on), whereas other frame rates don't divide in as nicely.
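The arithmetic, as a tiny sketch:

```cpp
// How many refresh intervals each rendered frame is held for at a 30 FPS cap.
// A remainder of 0 means frames line up evenly with v-sync.
#include <cstdio>

int main() {
    const int fps = 30;
    const int rates[] = {60, 90, 120, 240};
    for (int hz : rates) {
        std::printf("%3d Hz: %d refreshes per frame, remainder %d\n",
                    hz, hz / fps, hz % fps);
    }
    return 0;
}
```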

mastermind2004
Sep 14, 2007

djkillingspree posted:

on a d&d game one of the testers wrote up a bug that was essentially "this spell that does 20d10 damage NEVER does more than 150 damage or so, the RNG is broken" and we had to give an impromptu statistics lesson to the QA team lol
You do have to actually pay attention to the RNG implementation you're using. On one project I worked on, the random implementation just routed through to the C runtime RNG, which had a known bias towards low numbers, and that caused the game to roll epic drops way more often than it was supposed to. We switched it out for a Mersenne twister and started seeing much better results.
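A rough sketch of that kind of swap (not the project's actual code), going from a CRT rand() roll to a Mersenne twister with a proper distribution:

```cpp
#include <cstdlib>
#include <random>

// Old path: rand() plus modulo. On top of any weakness in the underlying
// generator, the % skews results toward the low end of the range whenever
// RAND_MAX + 1 isn't an exact multiple of `sides`.
int RollOld(int sides) {
    return std::rand() % sides + 1;
}

// Replacement: std::mt19937 with uniform_int_distribution, which maps the
// generator's output onto [1, sides] without that skew.
int RollNew(std::mt19937& rng, int sides) {
    std::uniform_int_distribution<int> dist(1, sides);
    return dist(rng);
}
```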
