ExcessBLarg!
Sep 1, 2001

Ephphatha posted:

Guess this is the most appropriate place to ask, there was some talk in one of the other threads about some articles detailing why over-using globals in code is an anti-pattern but I've never managed to find anything more comprehensive than a vague blog post.
I don't think use of globals was ever considered a "pattern" as much as a necessity in early programming languages and often reflective of hardware constraints.

For example, on an embedded system there's hardware registers for various things like GPIO pins, a PWM controller, etc. Those are sometimes treated as global variables in C-language code because you only have one (or at least a limited number) of those hardware components anyways.

Also, on some CPUs and microcontrollers, certain memory is "easier" and faster to access than others, like the direct/zero page on a 6502. Frequently-accessed variables will be defined as global and linked to that page as locating them there reduces cycle counts and significantly increases the speed of execution of the overall program.

Using global variables in general purpose, modern application software is generally not a great idea. Most higher-level languages make object-instance variables or some other non-local scoping just as convenient to use as globals, and in C it's long been best practice to use structs for "anything of which you can have more than one".

Folks have already pointed out many reasons why globals are challenging: unclear ownership, access contention, incompatibility with dependency injection frameworks, being limited to "just one of something", etc. What is considered an anti-pattern is to replace global variables with a "god object" that contains fields for everything you'd otherwise make global, without any particular organization or breakdown into smaller objects. Such objects suffer most of the same problems as globals, although you could have more than one instance of them.

As for globals, they do occasionally have their place. Things like "verbosity of log output" for a logging framework might be a global variable, and that's a fairly appropriate usage. I'll also use them in short scripts or other simple programs where defining a more complicated object structure is unnecessary.


ExcessBLarg!
Sep 1, 2001
I've been using Chromeboxes and Chromebooks as my primary desktops for nearly a decade. I can/could do much of my development locally on them in developer mode (or maybe Crostini but I haven't tried that), but all my professional development these days is on a "fast enough" cycle server on EC2.

And yes, I use VIM in a terminal for everything.

Oh my Chromebooks are lovely. Usually whatever FHD (but often TN) model they have at Microcenter for $400 or less.

prom candy posted:

Does anyone use a remote development environment?
Oh here's the OP. Historically most of my development has been remote via a terminal. My first machine was an Apple II and later DOS PC, but after that it was a SunOS shell, then Linux (both local and remote) for decades now.

I no longer use a glass terminal for development but I have in the past as a hobbyist. 9600 baud is too slow these days though.

Canine Blues Arooo posted:

If you are doing something where you need horsepower, put a 5950X and boatload of memory on a local machine and let it fly. You can make that real for under $1500.
For personal use? Sure! As a salaried but remote professional, though, I'm not hosting a big, power-hungry work machine at home. I already have enough low-power hardware at home for product testing, and it takes up more space than I'd like. Even if my responsibilities were solely development, the answer would still be nope.

ExcessBLarg! fucked around with this message at 16:09 on Mar 20, 2022

ExcessBLarg!
Sep 1, 2001
I need to evaluate Crostini to see if it's a sufficient replacement for Crouton since the latter has been deprecated. My main use case that might cause problems is that I occasionally use SSH port forwarding from a terminal (not the Secure Shell extension) and need to run tun-based VPNs (but not OpenVPN), though the latter I could probably get away with keeping in the VM.

ExcessBLarg!
Sep 1, 2001

Canine Blues Arooo posted:

This is very interesting to me - myself and my friends who've worked in this space have basically always wanted the fattest, biggest box we could get. I think I said this previously in this thread, but I was borderline pissed when my current job handed me a MacBook when what I really wanted was a powerful desktop. I'm not that familiar with the perspective of solving for space or mobility vs solving for 'needs to do all the things the fastest'. I can at least understand here why they would distribute mobile hardware as a primary device, since a sizable portion of this thread seems to prefer that.
A workmate of mine once expensed the largest, beefiest laptop he could find because he needed a machine that was portable and one that could compile the "entire" codebase in minutes or whatever. That machine was a nightmare. It was big, heavy, hot, loud, and routinely broke down. Since he could do all his development on it, he ended up developing only on it and transitioning to other machines wasn't as fluid as you'd hope. I still don't know how he ever traveled with it. One day, his itself-a-laptop-sized power brick shorted out and took the entire machine with it. That was not a productive day.

This was back in 2014 and we had just transitioned to Google Apps or whatever, so I went with the exact opposite approach and asked to expense a $400 Samsung Chromebook as an experiment, to see if I could be productive with it while traveling. Given that it was like one-sixth the cost of the machines they typically purchased, my boss said "sure, whatever". It was a 13" FHD model with a 32-bit ARM processor and 4 GB of RAM. It was probably a little too underpowered, but I could pull up Gmail, a terminal, and even run Perl/Python/Ruby locally in a pinch. Prior to that, any laptop I owned (ThinkPads mostly) only had a couple hours of battery life, but this thing ran for 8-10 hours on a charge, which was insane to me. It was effectively an ultrabook in everything but performance and cost. But like I said, I was used to remote development, so performance wasn't a concern of mine anyway. For the next few years I had to travel frequently, and having such a small machine meant I could go for 2-3 days with everything in a backpack and no suitcase: take any flight (including last-minute changes), board last, and never worry about checked luggage or finding overhead space. No way I could've done that with even a ThinkPad, let alone something larger.

So how did productivity work out? My workmate used Eclipse and, somehow, Eclipse uses every bit of CPU and RAM you throw at it no matter what, and has for decades now. He always complained his machine felt slow even though it was incredibly fast. For my part, remote shells have rarely been an issue. Sure, there are times when you have to debug something literally halfway across the world and the 300 ms latency sucks. But generally speaking AWS latency has only gotten better, so I don't notice.

ExcessBLarg! fucked around with this message at 14:01 on Mar 21, 2022

ExcessBLarg!
Sep 1, 2001

BigPaddy posted:

A number of years ago there was a whole to do about why I asked the Indian developers non yes or no questions all the time.
So, I understand what you're talking about, but why not just transition to asking all of your candidates non-yes/no questions? Or am I misunderstanding and that's what you did?

ExcessBLarg!
Sep 1, 2001

Pollyanna posted:

Say I’ve got this pseudocode:
At risk of bikeshedding, why not write it as this:
code:
func isMatch(elements, match):
  for e in elements:
    if e == match:
      return true
  return false

func isBaz(fooElements, barElements):
  return isMatch(fooElements, matchingElement) &&
         isMatch(barElements, otherMatchingElement)
It's shorter, less code duplication, does the loop optimization with an early return instead of a break, and relies on operator short-circuit evaluation to avoid testing bar. I suppose for type reasons you might need separate isFooMatch and isBarMatch functions, but it should still be easier to follow. I mean, personally, I'm OK with loop breaks and early returns from functions, it just depends on the context.

So then, getting back to this:

Pollyanna posted:

How do you deal with stylistic change requests in code reviews?
When I review code, I avoid commenting on the style as long as it's internally consistent and defensible, even if it's not the way I'd personally write it. It's not my job to rewrite the code, just to review it.

But if the style is objectively bad, or if it could potentially introduce downstream issues that the author may not have anticipated, I'll call it out and perhaps offer suggestions. As the senior reviewer, I'd expect such suggestions to be considered and responded to--either by adopting the suggested approach (or some third one), or by defending the current one.

Edit: To be clear, I actually think the PR-suggested version is bad style. While it's optimized, I agree that the logic is harder to follow, especially the early "return isFoo" in the bar-checking loop. If I saw that in a code base I'd be asking my "why not write it this way?" question as above (which is no less optimized), but much more forcefully.

ExcessBLarg! fucked around with this message at 19:12 on Apr 21, 2022

ExcessBLarg!
Sep 1, 2001

Protocol7 posted:

I don't think it's racist in my case after working with nearly half a dozen companies and many more dozen individuals to make assumptions about offshore developers.
It is xenophobic though, unless (i) you're already familiar with a specific off-shore team known to be yes-men, or (ii) you do a quick understanding check for all new teams you interact with, not just the offshore ones.

I get how anecdotal experience works, but racism/xenophobia sucks. Imagine that you're part of a really-talented off-shore team but you're continually stuck with lovely contracts because everyone thinks you're poo poo. Let's be better than that.

ExcessBLarg!
Sep 1, 2001

Canine Blues Arooo posted:

but an outsourcing manager can't spend their time trying to figure out which studios and which individuals within those studios will be able to turn around good work.
Isn't this literally the job of an outsourcing manager to figure out though?

Canine Blues Arooo posted:

If I was the one making a call on this, I'd probably suggest we just hedge our bets, hire a well known entity in NA or EU, and call it a day.
I totally get going with a well-known entity or one with good references. And yes, it may well be worth going with a more expensive bid if you believe it's likely to result in better work. But where all other factors are equal, there's bad dev shops in the NA and EU, and good ones elsewhere too.

Look, I get the anecdotes about outsourcing. I'm just saying that we should try to be as impartial as possible when evaluating folks from overseas. Or just admit to being xenophobic.

ExcessBLarg!
Sep 1, 2001

cum jabbar posted:

I also think anyone doing business software should be able to talk relational databases at least a little (normalization, SQL injection) but apparently they aren't as widely used as I thought
SQL databases are pretty widely used in business software. It's just that there are plenty of development roles that don't touch SQL databases, or where programmers won't have access to one as a matter of security policy.

If you're interviewing someone that has a history with SQL I think it's fair to evaluate them on it. Whether it's relevant has more to do with what your team actually does.

ExcessBLarg!
Sep 1, 2001
My problem with architects is that even if they participate in the initial implementation, they immediately move on to the next big new project, so they're never around long enough to see what the long-term issues of a system are. Because of that, they keep repeating the same maintainability mistakes over and over.


ExcessBLarg!
Sep 1, 2001

prom candy posted:

I personally don't like terms like architect or engineer. I'm a software developer, or computer programmer if you want to kick it old school.
"Engineer" is fine particularly if you have an engineering degree. If you come from a strictly compsci background though maybe you prefer something else.
