|
TheAgent posted:the guys who run crytek are loving awful, fyi I've heard some pretty crazy stories from people who interviewed there, including questions about whether the applicant had an SO (ie employees who have anyone waiting for them at home can't be counted on for proper crunch) and coding "tests" that look suspiciously like actual development work. Hearsay, of course, but they certainly don't have the best of reputations. As for the "delayed" payments - I don't know about German labor regulations, but in Denmark if you accept that as an employee, you're considered to be giving the employer a loan, and forfeit your privileged creditor position in case of bankruptcy (including access to a fund that will cover some of the missed salary almost immediately so you don't have to wait for the legal machine, and get paid even if liquidation doesn't cover it). Basically, you screw yourself over really hard if you don't play hardball when salaries are missed. I don't think it is necessarily wrong for a struggling company to ask the employees to help out - but if employees are to extend credit when no one else wants to, they need to be given more than "inconvenience" pay. That could be a very high interest rate and/or the opportunity to become shareholders of the company. Responsible management will do layoffs and/or negotiate salary reductions rather than take employees hostage. I think it should be a criminal offense to withhold information about an expected inability to make payroll, with outright deception being an aggravating circumstance.
|
![]() |
|
![]()
|
# May 25, 2025 06:50 |
|
Ol Cactus Dick posted:Sure are a lot of Lead and Senior openings for a project that has been going for 5 years. Senior generally just means 5+ years of experience for most IT/FinTech/GameDev jobs. It's pretty normal for companies to have more senior than junior/vanilla/associate openings. Take a look at Blizzard (https://careers.blizzard.com/en-us/openings) or Atlassian (https://www.atlassian.com/company/careers/all-jobs). What I do find weird is the '3D Vehicle Artist' position. That's an oddly specific way to put it. And the second-from-top responsibility bullet point is hilarious: CIG Job Listing posted:Responsibilities Huh?
|
![]() |
|
Whether the loan is a minor one for low interest or not (it's impossible to know the size, interest, etc. - and it's silly to speculate really), Ortwin should know better than to state this: "Foundry 42 and its parent company Cloud Imperium Games UK Ltd. have elected to partner with Coutts". Banks generally don't allow you to use the word "partner" about your business arrangement, and I'm pretty sure whatever contract(s) govern the loan explicitly forbid this. In practice, they probably won't care about this, but in principle making a public statement about partnership could be used as legal ammunition to void an agreement, demand compensation, and/or demand a public retraction of the statement. It's a standard clause in most business contracts, so I'm baffled that an experienced financial officer would use this phrasing. As mentioned above, it's silly to speculate about this loan. Even very minor business arrangements with banks can have legal conditions that are basically "we own you and everything you produce until you pay us". But the partnership claim used in the rebuttal? No-go. He's basically claiming a much more intimate and interdependent business relationship than actually exists, and Coutts could (but almost certainly won't bother to) give him legal hell for it. (I am a game developer working in a bank as an entry-level director, not a lawyer, though)
|
![]() |
|
Valatar posted:So, Derek or anyone else with dev experience, help me out here. The biggest hurdle for big world games seems to consistently be the fact that making content takes forever and is consumed in hours. What kind of cash would it take to pay enough artists and designers to keep putting out content fast enough to keep the ravenous poopsockers from blowing through it all in a weekend? In Star Citizen terms, say a new unique space station or outpost every week, a non-cookie-cutter interior with a few local shops that aren't chains and have their own signage and appearance mixed in with some reused assets to cover generic stuff like docking bays. Or a planet, mostly procedural for wilderness areas but with some hand-built stuff sprinkled over it. That is an impossible question to answer, considering differences in how well the team functions, how productive the artists are, how much artistic direction is needed/desired, whether they have a good pipeline and are experienced in using it, etc. I know at one point Japanese studios had a strong tendency to just add more bodies to projects to increase production bandwidth, and some of the Final Fantasy games had a ratio of artists to programmers which could be considered extreme when compared to western game development. In general each new hire adds overhead. It requires exceptionally good management to have a project in a state where it can scale effectively with additional manpower. With the right management and production staff 140 million can deliver insane amounts of content. Witcher 3, including marketing, had an 81 million dollar budget. For various reasons, sustaining a high content output in a running MMO has proven to be a challenge that no one has tackled well yet. It's far more complicated in a live environment than just scaling the number of content people: complications like technical debt, matching ongoing development costs with cash inflow, employee retention and training, etc.
|
![]() |
|
Toops posted:Well it's loving hilarious. This is my personal fav: To be fair, I've seen quite a bit of game code, and it's generally of a very "different" kind than you'd find in the financial or health sectors. It's better than the gunk often found in industrial software (please never remove mechanical safety fallbacks from anything dangerous to humans), but it's very obvious that decades of "once it's shipped it doesn't matter what the code looks like" have had an effect on the culture at many studios. This is compounded by game development having a very high number of junior developers compared to other industries, and many of them being self-taught to boot. Most programmers get tired of crunch and crappy pay after 5 years or so, and head off for better hours and pay. The ones that stay behind tend to be very passionate and/or incapable of navigating a normal business/office environment. They're sometimes really skilled too, but many eventually succumb to cynicism and just tinker away on whatever code they enjoy, only buckling down for actual delivery crunch when forced to by managers and/or impending studio doom. When the big studios/publishers started reusing engines it got a bit better, but the recent move to using 3rd party engines, ie UE4, Unity, CryEngine/Lumberyard, has shifted momentum back towards "poo poo-that-ships". TL;DR - Crappy code is not a CIG thing. Many awesome games have insanely lovely code. Graphics and gameplay may be polished and healthy - but the code is a Stimperial mess. (just like graphics drivers).
|
![]() |
|
Bootcha posted:Liquor pairing chat: You left out the most important pairing! Akvavit: Herring, eel, smørrebrød
|
![]() |
|
A public metric tracking bugs above a certain severity or priority is not very useful for measuring actual progress on bugs. When using such a metric, project managers will invariably start closing and/or shifting bug priorities to move the metric in whatever direction the level of management above them wants. It's like yanking the iceberg below the surface to make it look smaller (or the reverse if that's what management wants, which can also happen). Not having any idea of how many "almost-critical" bugs there are, how many bugs have been shifted up/down, how many are new, etc. muddles the picture so much that it is of no real value. It will show whatever number or trend project management wants it to show.
|
![]() |
|
Flared Basic Bitch posted:Hooooly poo poo. Total bugs reported taken by itself is in no way an indicator of tester competency. In fact it's wide open to abuse if testers are naively rewarded for it. I'm pretty sure there's not a software development house anywhere that still does this, so probably MoMA is talking out his rear end. It's still very widespread in India, and I know of several major (EU) financial institutions where testers are, primarily, rewarded for the number of defects found. I've also seen AAA game companies use it, but in a pretty cool way to be fair, with a scheduled competition for QA to find the highest number of new (non-duplicate, not already known) bugs. That kind of testing yields a lot of noise and is insanely annoying for programmers, designers and analysts, but it does have the upside that the desperate hunt for bugs makes testers do some very imaginative and thorough testing, occasionally yielding a bug report that is extremely valuable. However, I am pretty sure the signal-to-noise ratio has a strong correlation with the skill and experience of the testers. It follows that giving players with no knowledge of good QA and testing practices an incentive to report a high quantity is probably not a very good idea. A manual screening would be much better, as there are always some players who are exceptionally strong at bug hunting and reporting.
|
![]() |
|
reverend crabhands posted:That's really cool. That's because they are not the same. Even though the exact behavior of integer arithmetic overflow is implementation-specific, square and square2 will still produce the same result (assuming a runtime error is not simply the result of overflow). square3 will overflow on the addition operation where n*k > MAXINT. square2 and square will overflow on n*n (> MAXINT). Edit: The compiler still does some pretty arcane optimization, but the difference in how overflow will affect the outcome should be preserved. PederP fucked around with this message at 12:32 on Oct 21, 2017 |
![]() |
|
SSDs do actually help with build times - especially on workstations with lots of cores.
|
![]() |
|
tooterfish posted:You're ruining everything! I'll never forget how the transition from 5400 RPM hard drives to SSDs demolished my coffee break time.
|
![]() |
|
AutismVaccine posted:drat guys, did you buy your Intel SSD already or can we buy it in bulk? Remember, SSDs are faster the bigger they are. So make sure to get the most expensive model!
|
![]() |
|
Running game servers for a non-turn-based game on cloud services designed for web services and sites is not feasible. Especially not for an MMO or a shooter/space sim. While getting the servers hosted and (physically) maintained by a third party is a great idea, you do not want them running in someone else's infrastructure. You want full control of the servers and the network infrastructure. Game servers in the cloud are a cost-cutting measure, with a significant impact on stability and performance. It is not an advantage.
|
![]() |
|
The cloud is still struggling to be viable for many enterprise-level business systems. Various cloud vendors are trying hard to get their foot into the financial industry, and fighting for every inch. MMO servers are more complicated to develop and maintain than anything in the financial industry. Amazon may be able to provide a solution for Lumberyard match-based games with straightforward requirements - but I am beyond skeptical they can provide anything useful for an MMO. The notion that they would be able to provide something which is better than a bespoke, self-managed solution is ludicrous. Using Amazon will *add* complexity to the already herculean task it is to keep an MMO running. The ability to "spin up servers as needed" is vastly overrated, and plain websites regularly struggle to scale properly in the cloud. You can't "spin up" your way out of the bottlenecks which are typically the issue when player activity peaks and stuff starts breaking - there are always subsystems which can't simply scale by adding more instances on more hardware. It's cutting costs at the expense of a worse product and a lovely time for the engineers. So it will be popular, and it will make games bad.
|
![]() |
|
Viktor posted:Interesting post from a software developer I do respect: What's really annoying is how so many developers and project managers abuse the word. Refactoring is when you rewrite the code without changing the behavior. It's restructuring code to improve it internally. It is *not* a rewrite with actual functional changes, like performance optimizations, entirely new features, bug fixes, etc. I guess the usage stems from a desire to avoid using the word REWRITE, which is a somewhat offensive word to the people with spreadsheets, budgets and deadlines. Refactor sounds like juggling around existing stuff in a smartypants way to create new value. Rewrite sounds like throwing away existing work, because someone was an idiot. From a marketing point of view, refactor is also a magic word that makes it sound like the code wizards are removing gunk from the engines and making everything better. Fixing technical debt is not refactoring. It's less like a trip to the machine shop and more along the lines of a minimum viable amputation, chemotherapy and an extended stay at an asylum. But if companies were transparent about technical debt and what it costs to fix, it would make them look incompetent.
|
![]() |
|
Jobbo_Fett posted:Good kind of lootbox: Cosmetics, temporary or one-time boosts (xp, currency) Slightly bad kind of lootbox: Cosmetics, temporary or one-time boosts (xp, currency) - and earned through in-game activity. Bad kind of lootbox: Sold for $$$ Incredibly bad kind of lootbox: Sold for $$$ in a full-price game - no matter what it contains
|
![]() |
|
To be fair, hiring is tough for everyone in tech, finance and medical/pharma. Top talent is getting increasingly expensive, outsourcing/offshoring ditto (and has turned out to not be the panacea many claimed it would be), and the wage gap which has always existed between game development and other software development is exacerbating the situation. It's not unique to CIG that filling positions is difficult. There are probably quite a lot of applicants to these positions, but most are likely either severely underqualified or demand a wage in excess of what CIG is willing to pay. The time when geeky whiz kids could be roped in with pizza and cheap benefits to offset the subpar wages and insane hours is coming to an end. This is a good thing, as a lot of tech millionaires have built that wealth on the broken lives and bodies of underpaid tech workers and middle management.
|
![]() |
|
Beer4TheBeerGod posted:Hey programmer Goons, my brother in law is teaching himself programming and I was wondering if you had any book recommendations. Maybe something like that Clean Code book that was discussed earlier? No, if he is still learning to actually program, it won't do him much good. He needs a good foundation before principles and best practices are useful. The world does not need more cargo cult programmers. Depending on what his ambitions are, what platform he wants to work on and what prior experience he has, he should pick a language and get books relevant to that language. Entry-level computer science books are also good: data structures, algorithms, computer and operating system architecture, etc. If not covered by his initial language choice (good choices: C#, Rust, Java; bad choices: Python, JavaScript), I strongly recommend he learns C++ and a functional language, which could be F#, ML or Erlang. When learning to program, people write lovely code. That's to be expected and part of the learning process. (There's also a larger discussion about principles and patterns not being equally relevant/useful in all settings, but I don't want this to turn into a blog, so I better stop).
|
![]() |
|
Beer4TheBeerGod posted:He's getting a lot of info online, so I was thinking more about programmer mindset or best practices, stuff a self-taught guy might not pick up. He cannot apply these principles in a meaningful way while learning. He needs to experiment to learn, and he might pick up unnecessary restraint from such books. It really isn't a very good use of limited time and mental bandwidth to spend time on clean code, SOLID, test-driven development, documentation practices, etc. When programmers use these without the requisite experience and knowledge, it has a negative impact on productivity and quality. His time is much better spent writing code and learning from his mistakes. Also, someone mentioned web, and well, working with "modern web development" is quite likely to teach bad habits. I guess it depends on his ultimate goal to what degree it makes sense to spend time on code quality and design. Either way, learning the fundamentals will do more for his ability to write good code than any book on principles and patterns. It will also help him distinguish the cargo cult bullshit from actual sound advice. If it's just a hobby, it's fine to write dubious code, as no one else has to maintain it. But if he wants to make it his profession, there is a shitload of stuff to learn if he doesn't want to be a mediocre cargo cultist. It's perfectly viable to cash big paychecks and ship software without being particularly competent. Code can be surprisingly idiotic and still be shipped for a profit.
|
![]() |
|
Sarsapariller posted:My CS degree program burned me so hard that after I graduated I went into networking for the first three years of my career. Once I eased into actual coding I realized that virtually nothing they taught after the first year was really applicable in real world environments. It's a fascinating subject and I have since learned to love it, but I'm not about to pretend that the average developer needs to know most of that. There's a lot of code purists out there and they have a lot of good lessons to teach, but the fundamental truth is that programming is a trade skill like any other. Knowing more about the mathematical underpinnings will always help but it isn't where I'd tell someone to start today. Indeed, in many ways programming is a craft. You need to learn by doing, having good mentors in an actual production environment is better than listening to someone in an auditorium talk about code, and so forth. But it is still a craft requiring a fairly extensive theoretical foundation. Actually, I think there is a problem with neophyte programmers not getting *enough* exposure to theory and foundations, rather than the opposite. When I see the kind of questions asked in various gamedev and programming related reddits and discords, they often betray a lack of patience, an overreliance on libraries/frameworks/patterns and a desire to learn tools over concepts. Software development doesn't just go awry (something like 25% of all large-scale projects experience serious or catastrophic issues) because of bad upper and middle management. Incompetent technical leadership and/or incompetence at the individual level is also a problem to be acknowledged. Even when it doesn't directly sink a project, it still has a negative impact. A cargo cultish approach to patterns and principles is one manifestation of such incompetence. Example: Dependency Injection.
Not only is it being mindlessly applied to almost every single architecture, it's sometimes even an obvious singleton pattern under another name (because it's a DI framework, duh, so it's DI, not singleton). An eye-opener for me was when I was talking to some fellow developers during a break and they casually mentioned that a good programmer should be able to use at least 2-3 languages, to which I retorted that a good programmer should be able to use pretty much any language (with due time to become accustomed to syntax, libraries, etc.). I was shocked when they found that completely unreasonable. I've since realized many programmers just want to be able to do their job 'good enough' and play with new toys once in a while. They are either too stupid or too lazy to actually want to be good at their craft. They'll grudgingly learn new frameworks/languages/platforms if they have to, in order to stay employed, but they have no passion or interest in the craft or the science. As for CS degrees, quality varies tremendously around the world, from barely useful to extremely good. However, a strong grasp of algorithms and data structures will be applicable for virtually everyone. The same goes for hardware and OS fundamentals. Multi-threading has relevance to almost all applications. And the list goes on. And I haven't even gone into the lack of proper engineering practices, like considering and evaluating alternatives when picking a technology/tool/library, basing decisions on data rather than gut feeling, etc. Programming is an odd discipline: the skill floor is low enough that you can make motivated people with almost no education pretty productive by putting them through a few weeks of training. But it is also a discipline where most people need years of actual work experience to not be terrible - and being a top-tier programmer (and not just good at office politics) is extremely demanding on intellect, motivation and/or effort.
Hence why so many programmers escape into management first chance they get. TL;DR - programming is easy to be almost good enough at, but hard to be good at. If you're a Star Citizen dev it doesn't really matter, because you're stuck in the programming equivalent of the Stimpire and your pain will double every second. Get a refund.
|
![]() |
|
Beer4TheBeerGod posted:Why is Python a "bad" choice? It's a great language for someone who wants to get something done right now (and might be a good choice in specific situations). It's not a good language for learning programming, as its syntax is an outlier (transitioning between C#, Java, C++, JavaScript, etc. is much easier), it trades expressive power for quick results, and it has a tendency to push a magpie mentality (although not as badly as JavaScript). Also I am a language snob who thinks "real" programmers should start with "real" languages, and not training wheels. Edit: I'll just be an even bigger snob and state my point more directly: It's a good language for stupid people trying to learn programming. If you are not stupid, C++ is a much better choice. C#, Rust, Java and even JavaScript also have merits as entry-level languages. But the ecosystem surrounding JavaScript will infect all but the most iron-willed of novices with terrible habits and/or insanity. PederP fucked around with this message at 09:29 on Nov 26, 2017 |
![]() |
|
Anticheese posted:Learning C++ in high school was a miserable experience. C# and Java in tertiary was much nicer. Learning Python on my own initiative in the workplace and getting my job done a lot faster was even better. Yeah, C++ is not a good choice for high school level courses. And Python as a productivity tool when you already know what you're doing is perfectly fine. But at university level, or for a self-teacher who wants to be a good programmer, I don't think there is any escaping C++. Not that there is any reason to avoid it - it's a perfectly good language to work with, even if it is not the best choice for all tasks. I am surprised at all the negative experiences so many of you had with it. I had a jolly good time learning it (mostly on my own) back in the day, and so did the people around me. Perhaps it's easier for a bad teacher to make a mess of the course with C++ than with many other languages. Edit: Nalin speaks wisdom.
|
![]() |
|
This is good. Keep at it.
|
![]() |
|
Nyast posted:Doing a quick test in my debugger, precision at this distance ( assuming viewer is at origin ) is at the order of 50 cm to 1 meter. So yeah, 64-bits precision wouldn't be enough. If you want a precision of the order of the millimeter, you could only handle up to 1/1000th of this distance. Well, they shouldn't be relying on numerical accuracy at all to make the space game work. Trying to fit a room in a space station into the same coordinate system as planets and suns is silly. Scales need to be converted on the fly, vastly differently sized objects should have separate render passes, there's a ton of optimization opportunities when dealing with large to astronomical distances and low velocities, etc. But I guess that's not feasible when working within the limitations of a CryEngine mod and aiming for full Fidelity™.
|
![]() |
|
An amusing thing about the whole subsumption business is the wildly optimistic expectations many seem to have for it. It's just a variation of an old-school behavior tree, but explicitly without a world model. So these NPCs have no memories, do not have any abstract processing of the world (or even their basic surroundings), etc. It's all lizard-brain. Behavior trees in combination with scripting have been at the core of game AI for decades. There is nothing revolutionary or innovative about it. But CIG certainly hasn't dispelled the backer notion that subsumption is cutting-edge AI. I'd be fine with them stating they had an ambition to create cutting-edge AI, but they're talking about it like they have this special subsumption sauce that's gonna make all the difference. It's complete nonsense and deceptive marketing.
|
![]() |
|
Toops posted:And if anyone wants to start poo poo and try and tell me promise objects are cool, you're gonna get the blog post from hell. Yeah, or that "web workers" are much better than just having a normal threading / task library. Can't wait for WebAssembly to pave the way for proper alternatives when working with browser applications. Hopefully the horrible DOM will fade into obscurity, replaced by proper UI libraries and toolkits built directly on WebGL. Sadly, the odds of at least one prevailing UI library being a declarative, spaghetti and boilerplate-spewing hellbeast are pretty high. Still kind of funny that the browser is turning into an OS.
|
![]() |
|
Ramadu posted:So I'm actually trying to learn c# in a class (intro to ito first semester) and the teacher is the worst I've ever come across and I'm basically trying to learn it myself and I'm doing ok I think. We just finished up arrays and my code actually worked but lol he said it was insufficiently put into modules/methods. It had an input output and loop calculations what more did I need. Your teacher is bad. You shouldn't worry too much about organizing your code while learning the basics. Structuring your modules/classes and choosing abstractions is a tool for you, the programmer, to keep track of everything. The run-time doesn't care about your coding practices, and the user doesn't care about the code. It always annoys me when people go all dogmatic about interfaces, patterns, SOLID practices, etc. Programming languages are a crutch because our minds prefer abstractions to memorizing a couple million lines of assembly. Yeah, there are best practices for a reason and a common vocabulary is good, but in the end what matters is that you're happy with your own personal crutch. I've seen really awesome programs with code that to me had insanely annoying and weird structure and abstractions*. But if the performance is good, the people who need to maintain it can do so, and they're productive, it's all good. When you go work with other people, you can't be too idiosyncratic and messy in your coding practices, unless you're the boss and can just force everyone else to accept your crazy mindscape. So if you make a career of programming, you'll run into people with differing ideas about how to code. Instead of arguing about what's "right", either find abstractions the team as a whole can be comfortable with, or just accept that you'll have to work with the weird mental constructs of others, and get good at deciphering them. An example: Many people will say having duplicate code is a bad thing. E.g. two methods with the exact same code in two different classes.
And it does add a maintenance cost, because it is very likely that if you need to change one method, you need to make the exact same change in the other method. However, abstracting the shared code into one piece of code (e.g. via inheritance or composition) can have a performance penalty. Maybe it'll make you dislike the code, making you less productive. So as a general rule, yes, duplicate code is bad. But there can be reasons it's not, and you should never accept dogma on how to structure your code. The flipside is that the more you can challenge your own suboptimal preferences, and be happy to work with (generally) better abstractions, the better code you'll write. But if you don't internalize the reasoning behind following some pattern or practice, there's a good chance it won't benefit you as much as it should, or perhaps even be a net detriment to the process and end result. * probably a bit like my sentence structure is to people better at English PederP fucked around with this message at 13:33 on Nov 27, 2017 |
![]() |
|
tuo posted:Also to everyone learning an OOP language like C#, Java etc.: never code against an implementation. always code against an interface. also: always rub. No, that's only one out of three (ie always rub). There are several valid reasons to not always use interfaces in C#, mostly performance related, but also because it's not always the best abstraction for the programmer. If you're happy wrapping everything in an interface, great. But it doesn't magically make your code bad if you don't.
|
![]() |
|
Aramoro posted:Choose 1. There is nothing contradictory about those two statements. The run-time cares about what the interpreter/compiler/JIT/whatever produces. Some abstractions are indeed more performant - often the ones that are more difficult to maintain/extend/scale. There are times when duplicating code is a good thing and there are times when it's bad. How and why is a combination of performance/maintainability/personal preference/team culture. Coding dogma is bad. Being informed about the pros and cons of various practices, and self-critical of the abstractions we choose, is good.
|
![]() |
|
Aramoro posted:lol. So Runtime doesn't care about your coding practices unless it does. Pro-tip Let me rephrase it then - it doesn't care about whether the code is maintainable, readable, extensible, scalable, etc. My point is that the functional characteristics of code and the "soft" characteristics of code are separate. And when someone tells you something is categorically bad due to something "soft", you need to be critical. If it's got a functional issue (ie it's slow, eats up memory, etc.) then that's something you need to weigh against the "soft" benefits. Sometimes it's worth it, sometimes it's not. But when a teacher says the student writes bad code because of "modules" in a loving array learning session, then that's very likely to be a dumb and irrelevant observation.
|
![]() |
|
Rantista posted:It doesn't, just tested. 68.6 seconds for sorted vs 69.1 seconds for unsorted. 1% difference isn't a lot faster :-) I guess your cpu has either lovely or majestic branch prediction.
|
![]() |
|
Combat Theory posted:I really enjoyed the musings on programming and different languages, but the last 165 posts have been 50% programming talk and all that while beers request was long answered. Can we like... Dial it down a bit again? We're just the round girls of this thread, showing off numbers and ourselves, while everyone waits for Derek and FTR to get back in the ring for the epic conclusion of the Rumble in the Verse.
|
![]() |
|
XK posted:The portion of code jumped into/out of is too short to really show off the effects of branch prediction. Hence the extremely small difference ![]() Also, don't have a conditional like that in a hot path loop. You all getting this? Not every day you get to see hot paths like these up close.
|
![]() |
|
Aramoro posted:Exactly what I'm saying, trying to optimise a high level programming language so that it works faster at runtime is a very difficult task and I'd have serious questions about anyone who deviated from the agreed patterns in the name of such an optimisation. Yeah, but that doesn't mean there aren't perfectly reasonable examples of deviating from "nice code design" to get better performance. C# interfaces are a good example. So is vectorizing data to maximize cache coherency (ie array of structs vs multiple arrays). One also needs to consider that while compilers and run-times can be smart, they can also do some unexpectedly dumb poo poo at times (or can't make assumptions that you can), where looking at the ASM/IL output and getting dirty with the platform/runtime details is unavoidable. The people who hammer "clean code is always better than fast code" into young programmers' heads over and over are part of the reason everything is still so fricking slow in this day and age. Of course code shouldn't be ugly and unmaintainable, but sometimes performance trumps principles.
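To make the vectorizing point concrete, here is a minimal sketch of the two layouts (the particle fields are hypothetical, invented for illustration - not from any real codebase):

```cpp
#include <cstddef>
#include <vector>

// Array-of-structs: the "nice OO" layout. A hot loop that only updates
// positions still drags vx/vy/health through the cache, because all five
// fields share the same cache lines.
struct ParticleAoS { float x, y, vx, vy, health; };

void update_positions_aos(std::vector<ParticleAoS>& ps, float dt) {
    for (ParticleAoS& p : ps) {
        p.x += p.vx * dt;
        p.y += p.vy * dt;
    }
}

// Struct-of-arrays: each field is contiguous, so a position-only pass
// streams tightly packed floats and wastes no bandwidth on health.
struct ParticlesSoA {
    std::vector<float> x, y, vx, vy, health;
};

void update_positions_soa(ParticlesSoA& p, float dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i) {
        p.x[i] += p.vx[i] * dt;
        p.y[i] += p.vy[i] * dt;
    }
}
```

The SoA form is also what auto-vectorizers like: contiguous same-type data maps straight onto SIMD loads. Whether the compiler actually takes that bait is exactly the kind of thing you only find out by reading the generated assembly.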
|
![]() |
|
Scrum is such a textbook example of cargo cult methodology that it's almost comical. Agile is what everyone claims to be, but very often it's old-fashioned milestone-based development dressed up in sprints, story points, retrospectives and soul-crushingly misunderstood standups. There is just enough alibi for vague, optimistic planning, and sprints plus standups are a perfect vehicle for guilt-tripping developers into crunch mode. Agile is great when it is followed in letter and spirit, but too often the latter is missing. It's a fantastic window dressing for bad management, though. "You don't like how things are done here? Don't you like agile? That's what everyone uses, punk. What's the alternative? Waterfall? Anarchy?". CIG is agile. They have Chris at the helm, bravely making visionary and essential changes in direction and focus, keeping an eye on aesthetics, making sure development doesn't get mired up in details. This is good for Star Citizen.
|
![]() |
|
Pantsbird posted:But doesn't Agile have sprint retrospectives to gauge velocity? The Scrum way of doing Agile has "story" points instead of time units. By estimating in abstract story points, a velocity can be calculated and the expected progress gauged (in spite of those darn programmers and their inability to accurately estimate tasks). I think CIG has the "story" part down, but I am not so sure about the rest. They probably dumped it when they went Agile 2.0, which replaces "story" points with "dream" points.
|
![]() |
|
Golli posted:What's the term for people who buy tons of GW figures, maybe paint them, will theorycraft endlessly, but never actually play more than a couple of hours for any number of reasons? I don't think it's the same phenomenon. There is a difference between the act of buying the artifacts of a hobby (it could also be sports gear that isn't used much, or yarn that's never knitted into anything) and the hobby of dreaming up a virtual world. Most of these people are not dreaming of being world-class athletes, knitting haute couture, etc. It's more on par with someone purchasing a (limited run) parcel of land on a not-yet-built-but-totally-feasible artificial island, and then dreaming up how he's going to be rich, have a beautiful doting wife, and have so much fun with the other landowners. Because this island will be wanted by everyone, and the land is limited. So, haha on you haters. It's extreme escapism. This happens a lot in online communities, but by gating access behind purchases, a sense of exclusivity and community is fostered. This creates an echo chamber and the beginnings of sunk cost, feeding the escapism and the desire to buy even deeper into the dream. It's similar to a cult, but not entirely the same. There are a lot of perfectly normal backers. People who put up some money, and check back once in a while on progress. They either have some sunk cost delusion or don't mind the risk of losing the money, so they aren't seeking a refund, but they aren't particularly tied to the community. The sad cases are the ones who start talking about not wanting to spend money on lesser games, that this will be the game to end the need for any other entertainment, etc. To me that's a symptom of being so heavily psychologically invested in the escapism that they're easy to milk for more money. The subscription-for-PTU can be seen as a threat to the "inner-circle" dreams of these backers. So they have to either reevaluate their dreams or cough up the money.
As it's a relatively minor fee (but summing up to a very nice chunk of money over time), most will probably give in. PederP fucked around with this message at 15:27 on Nov 28, 2017 |
![]() |
|
Hav posted:Code chat: Would heroin help my coding? In general, I know you haven't seen any of my stuff. It might help those having to maintain your code? For anyone making programming a career, you really should know that having to maintain/fix/extend other people's awful code is very common and very soul-crushing. That's why companies sometimes highlight "greenfield" (ie not maintenance) work as a pro when hiring. When developing a game, the cycle is generally short enough to keep the majority of programmers onboard from start to finish. If you have to bring in too much fresh blood, they'll run into trouble working with the codebase (game code tends to be messier than other software). That's yet another reason Star Citizen is doomed. Not only are they working from multiple physical locations, outsourcing development, dealing with massive feature creep, amorphous requirements, etc., but they've by now had lots of programmers leave and join. It takes some seriously expensive talent (which I highly doubt they're willing/able to afford) to salvage another person's broken code, and with backers watching and playing, they can't scrap entire subsystems and start over. The more stuff they add, the more stuff is going to break. Even if they got a cash injection of 500 million dollars, it wouldn't matter. They need to start over.
|
![]() |
|
Iafeth posted:Isn't it the main reason to play games like WarFrame? If you are not enjoying grinding for a frame or weapon - what's the reason to pay for them in the first place? Why do people not at the cutting edge of progression raid in MMOs (while complaining it's a boring grind)?
|
![]() |
|
![]()
|
# ¿ May 25, 2025 06:50 |
|
Beet Wagon posted:How many environment artists does CIG have now? That's concept / splash art, not environment art. An environment artist generally produces the raw material used to populate levels/zones/planets/dreams. So they're 3D artists; sometimes texturing is a separate role that still falls under environment art - it depends on the studio and/or the individual's skill set.
|
![]() |