  • Locked thread
Slanderer
May 6, 2007

Rocko Bonaparte posted:

This isn't about embedded programming per-se, but I have a question about analog-digital converters for regular PCs. Specifically, what are some good ones out there? I see a few and can't really make up my mind. I was hoping to find something in the short term that takes a signal from the range of 0-5V, samples it very rapidly, like > 10Hz, and has > 8bit precision. Has anybody worked with anything like this before for situations like this?

Basically anything will do that, because 10Hz is comically slow, and 8 bits is, like, the minimum you'd ever see.

Realistically, it depends on what you want to do with it. If you just want to make some measurements once, then basically anything is fine, but if you need to integrate it with existing software (which means it needs decent driver support), it gets more expensive.

Stuff from National Instruments is basically the default, at least here at work, but they're somewhat pricey for what they actually do.


Slanderer
May 6, 2007

armorer posted:

Pardon my ignorance, but what is the issue with dual headers? Is it basically that you can't easily plug the whole board into a breadboard? I made a little custom adapter to hook up the dual headers on my Raspberry Pi to a breadboard (although Adafruit sells one under the name "Cobbler"). I suppose it was a bit of a pain, and would be a larger pain on a board with more pins.

No, that's basically it. When I had the time back in school (and couldn't spend extra money), I made similar adapters for dual-row headers on LCD boards and AVR programmers using a bit of protoboard and headers pilfered from the electronics lab / ECE dept. / robot club. The only problem is that it's still kinda inelegant (your adapter covers up a lot of space on the breadboard), and homemade ones never have the proper spacing which means you gotta spend time bending leads until they fit.

Slanderer
May 6, 2007
The arduino libraries somehow manage to be both devoid of features and entirely unoptimized. (Also their I2C libraries will always be poo poo, forever). Their build process is totally hosed, too.

Just sayin.

Slanderer
May 6, 2007

armorer posted:

Arduino was not designed for anyone in this thread. It was designed to get people who have never done anything with embedded electronics in their entire lives started making things. It is the Duplo block of the construction world. Nobody in their right mind is going to try building a shed out of Duplo blocks, but they serve to get young kids building little "houses". Kids that are into it will graduate to Legos and then maybe move on to actually studying engineering. Sure you will run into people who have made their Arduino tweet when they flush the toilet, and think that somehow qualifies them as an embedded programmer, but those people are just stupid. There are stupid people in every area of life.

Let me tell you about a project called Ardupilot...

:sigh:

Slanderer
May 6, 2007

armorer posted:

If that project lets someone (without the know-how to make such a thing otherwise) spend some money on a glorified quadcopter "puzzle" to assemble, then it will do ok. Maybe that person will have fun with it and decide to learn how the hell it actually works, although probably they won't. How does that make it "bad" though? Robotics professionals don't scoff at toy robots, they buy them for their kids to get them interested in what they do. I just looked up Ardupilot, and $400 gets you a copter with 1kg lift. Some indie movie director is going to strap a goPro to that and use it to film interesting scenes. Would that director build a quadcopter otherwise? Hell no.

That is why the Arduino (and some of the myriad followup products / kickstarters) is booming right now. It is accessible to people who don't know Ohm's law, are just trying to teach themselves C, and basically have no idea what the hell they are doing.

You guys look at it and see a (negative connotation) toy that can't handle the stuff you do, these other people look at it and see a (positive connotation) toy that they can use to do fun stuff.

Actually, my complaint was about people taking Arduino way too far, and not saying, "Ok, let's start naming our files *.c and *.cpp, let's use our own build process, and let's actually commit to doing a good job". As it stands, that project has been so overextended that it's an incomprehensible mess a lot of the time, and it's not rare to encounter a completely untraceable bug that causes poo poo to fall out of the sky. Due to the way they integrated a lot of libs from arduino, tracing poo poo is hard as balls.

I admittedly still use the Arduino at times to prototype something extremely quickly (because it's quicker than digging out an Atmega, a programmer, a breadboard, and all the components I need). I threw something together in a day last year using a graphical LCD shield and an isolator that counted high-speed pulses and kept a histogram of pulse duration (I was debugging an issue with some power switchover HW here at work). It's just annoying to see people accumulating a tower of crap on top of Arduino.

Slanderer
May 6, 2007

Base Emitter posted:

ARMs are almost cheap enough now to replace 8-bit micros, at least in low-volume hobby applications where CPU cost comes out in the wash. :neckbeard:

They are, but the issue comes down to libraries. ARM-based chips are inevitably way more complex, it seems, and stuffed full of peripherals. In my experience, this means that you basically need premade driver sourcefiles for the stuff you need. A lot of "basic" stuff can still require a lot of code to set up and configure registers. This wouldn't be a problem if the free libraries didn't inevitably have a whole bunch of bugs, and the manuals for the chips didn't have lots of errors.

Once you get everything set up and your drivers working, it's just as easy as working with an 8bit uC, but getting to that point is often harder.

Slanderer
May 6, 2007
Anyone have any good resources on doing ARM debugging? I'm trying to figure out an error that's popping up at work, and my knowledge of using the debugger is mostly from trial and error.

Also, if anyone here is familiar with the cortex M3: I'm getting an ARM Bus fault, which is being logged as a Precise data bus error.

I have no loving clue what that even means.

(for reference: http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.dui0552a/Cihcfefj.html)

Thoughts?

Slanderer
May 6, 2007

JawnV6 posted:

Just from reading the link, the BFAR register (separate from BFSR) should hold the data address that caused the fault and the return address on the stack frame points to the instruction that caused it. The BFARVALID bit in that register should also be set. Imprecise would mean that a bus error happened, but due to some higher priority process the debug information got trashed and the proc can't be sure what to tell you. Not sure if that helps, but boy howdy I've read/written some fault documentation and that's what it's saying to me.

Knowing those two addresses should get you pointed in the right direction, figure out what instruction/code is causing it and how it got the goofy address that caused a fault.

That seems to make sense. I think my confusion comes from my ignorance of the architecture, since I can't think of an example of something that could cause this fault to occur. I know it's occurring at an inline function that appends null-terminated strings to a packet, which is called countless times, and which properly checks for overflows in case it is passed a string without null termination.

Worst comes to worst, I'll ask the King of the low level programmers here, since he would probably know exactly what was up in 20 seconds or less.
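For anyone hitting the same thing later: a rough sketch of decoding a captured BFSR value (the fault status byte JawnV6 described). The bit names come from ARM's Cortex-M3 documentation; the function and its name are just illustration, since a real fault handler would read the live registers at 0xE000ED28 (CFSR) and 0xE000ED38 (BFAR) instead of taking a value as a parameter.

```c
#include <stdint.h>

/* Bit positions in the Bus Fault Status Register (BFSR) -- the byte at
   0xE000ED29, part of SCB->CFSR on the Cortex-M3. In a real fault
   handler you'd read these from hardware; here they're plain values so
   the decode logic can be shown on its own. */
#define BFSR_PRECISERR   (1u << 1)  /* precise data bus error */
#define BFSR_IMPRECISERR (1u << 2)  /* imprecise data bus error */
#define BFSR_BFARVALID   (1u << 7)  /* BFAR holds the faulting address */

/* Returns 1 when the fault was precise and BFAR can be trusted, i.e.
   BFAR points at the exact data address that faulted. */
int bfar_usable(uint32_t bfsr)
{
    return ((bfsr & BFSR_PRECISERR) && (bfsr & BFSR_BFARVALID)) ? 1 : 0;
}
```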

Slanderer
May 6, 2007
In the near future, I'm going to be doing work on an embedded Linux system we're developing here (actually, part of a larger system). However, I'm trying to get a good development environment set up, even though we're many months off (since we're doing various training and prototyping activities). The toolchain is Linux based, so I have a VM running Ubuntu from which I can code, build, and debug.

Our current dev boards (just a new starter kit from TI that uses the same line of processor that we're using) are set up for debugging via ethernet, which complicates things. Specifically, in order to get an internet connection for my VM, I need to use wired ethernet (since then I can go through the correct work proxy servers, whereas the wireless needs authentication). However, in order for me to debug the board, I need to assign a static IP to the virtual network adapter in my VM and directly connect to it via ethernet. I got a USB ethernet adapter that works in Ubuntu out of the box, but I don't actually know if it's possible to configure... Basically, I need all outgoing data to the static IP of my dev board to go through the USB-ethernet (skipping the proxies), and then all incoming traffic addressed to the static IP of the USB-ethernet adapter to come in.

The problem is that I have no idea how any of this poo poo works, on a fundamental level. However, I figure someone here must have gone through some elaborate process to setup a similar dev environment (and had similar problems), so I'm eager to know if any solutions exist.
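In case it helps anyone searching later: the usual Linux approach here is a host route. Give the USB adapter a static address on the board's subnet, then add a /32 route so only traffic destined for the board uses that interface; everything else keeps going out the proxied NIC. Interface names and addresses below are made-up placeholders:

```shell
# Assuming the USB-ethernet adapter shows up as eth1 and the dev board
# has the static IP 192.168.7.2 (check `ip link` for the real name)
sudo ip addr add 192.168.7.1/24 dev eth1    # static IP for the adapter itself
sudo ip route add 192.168.7.2/32 dev eth1   # board traffic bypasses the proxied NIC
```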

Slanderer
May 6, 2007

Martytoof posted:

Just wondering how you guys are handling your fragile baby dragons.

Put it in some sort of case (I think I either bought or made a case for one at some point), actually observe proper ESD protocols (some component on them will get knocked the gently caress out from basically nothing), and always use external power, when possible.

Slanderer
May 6, 2007

Captain Capacitor posted:

Why is everything about the Beaglebone so terribly documented, or at least terribly assumptive.

Is TI still horrible at the documentation for those things? I've been wanting to get a Beaglebone Black recently, since my original Beaglebone disappeared a long while back. I remember the documentation for the hardware being incomplete at the time, and not all of the basic drivers were done, which was cool.

Slanderer
May 6, 2007
All this talk is making me regret having to drop digital design back in school....

Slanderer
May 6, 2007

sund posted:

Hey guys, glad I found the thread. I'm using Atmel's SAM3 & SAM4 chips at work right now and it'll be great to have a resource to figure out what's going on in the real world. Haven't touched microcontrollers since school so I'm still getting up to speed.

Anyways, I'm using Atmel Studio as an IDE for now, but others use Eclipse on the same project, so I can't get too locked into one particular IDE. I've cooked up a makefile to pull in a bunch of ASF to replace the half baked drivers and libraries that were being used before and it's already a lot better. On the other hand, I've also already noticed an issue mentioned in the thread when porting over -- no timeouts in the I2C library! Has anyone tried filing improvement requests with Atmel? How responsive are they? Are there any projects out there that aim to provide proper platform independent midlevel peripheral libraries that I could port to the register interface of my chip or should I just be patching Atmel's and calling it a day? I see all the other interesting platforms out there and worry about getting too deep in Atmel's stuff...

I think part of the reason for the lack of timeouts is that I2C doesn't actually specify that feature, unlike SMBus. I think they expect you to either implement it yourself, or utilize the watchdog.
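A sketch of rolling that timeout yourself, assuming a polled driver. The flag-check function is a stand-in for whatever chip-specific status bit you're actually waiting on, and on real hardware you'd count timer ticks rather than bare loop iterations:

```c
#include <stdbool.h>
#include <stdint.h>

/* Stand-in for a chip-specific status check, e.g. "TWI byte-complete
   flag set". The typedef keeps the sketch hardware-agnostic. */
typedef bool (*flag_fn)(void);

/* Poll `done` until it reports completion or the budget runs out.
   Returns true on success, false on timeout -- the caller should then
   abort the transfer and reset the peripheral rather than hang. */
bool wait_with_timeout(flag_fn done, uint32_t max_iters)
{
    while (max_iters--) {
        if (done())
            return true;
    }
    return false;
}

/* Demo flag that becomes true on the third poll (for illustration) */
static int polls;
static bool done_after_three(void) { return ++polls >= 3; }
```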

Slanderer
May 6, 2007
Except don't make your own RTOS one piece at a time, because inevitably the result will be less full-featured and way buggier than even FreeRTOS

Slanderer
May 6, 2007
Uh, all that hardware doesn't make an RTOS unnecessary, and an MMU is not required when you only have one process (and only need a few global variables). The project I'm working on right now has several threads, which vastly simplifies scheduling.

And I don't know why you'd say a superloop is unnecessary. Unless you're doing something extremely boring, it's generally the simplest and most modifiable format. Sure, you could go balls-to-the-wall retarded with only state machines, but you'd probably only do that if you were an obsolete greybeard that is making an industrial controller that does something intimately uninteresting.
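For anyone unfamiliar with the term, a superloop is just a forever loop calling short, non-blocking task functions in turn. A toy sketch (all names invented; it takes an iteration count only so the example can terminate):

```c
#include <stdint.h>

/* Two trivial "tasks" -- in a real system each would do a small,
   non-blocking slice of work (poll a sensor, pump a UART, etc.). */
static uint32_t sensor_reads;
static uint32_t display_updates;

static void poll_sensor(void)    { sensor_reads++; }
static void update_display(void) { display_updates++; }

/* The superloop itself: call every task round-robin. A real system
   would loop forever; the count exists only for demonstration. */
void superloop(uint32_t iterations)
{
    while (iterations--) {
        poll_sensor();
        update_display();
        /* a wait-for-interrupt instruction could go here to save power */
    }
}
```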

Slanderer
May 6, 2007

JawnV6 posted:

Has anyone worked with the MPR121 capacitive touch sensor chip?

I'm building something based on it. I have the arduino library for it and converted those calls to a real platform. It's working great just with aluminum tape. Now I'm having to tune the sensitivity to put a dielectric between it and the actual touch point on the device. I'm going to dive into the datasheet shortly, but any pointers on working with this or other capacitive sensors would be much appreciated.

If you'd like, I could try to dig up some app notes from either Atmel or Cypress that I had a while back that got into circuit layout for different types of capacitive sensors. Mostly stuff about the correct geometry and the effect of dielectric layers, but there was one specifically demonstrating prototyping different sensors using only copper tape on the underside of thin polycarbonate/lexan/something, and another that demonstrated drawing a capacitive "slider" sensor (one of the ones made of chevrons) on a piece of paper using a regular pencil, and then demonstrating that it loving worked.

Slanderer
May 6, 2007

JawnV6 posted:

Holy crap, yeah that would be perfect!

fake edit: this? http://www.atmel.com/Images/doc10752.pdf It's specific to one of their chips but includes a lot of general purpose information about the theory.

I don't have my files at work, but that might be it. I thought the document had a different title (it might have been an older version), but those diagrams seem familiar. Really good stuff in there regardless (the geometry diagrams, the stuff on making LED illuminated cap touch buttons).

I just realized that the demo of someone drawing a sensor on paper was a video, from Atmel I think. I can't find the video now, but I might be able to dig it up from my old bookmarks.

Here is the one on prototyping sensors:

http://www.cypress.com/?docID=3322

Slanderer
May 6, 2007
Be real careful with those AVR Dragons---they are really easy to kill. Presumably it's just ESD or something, but a lot of people have had those randomly die. A lot don't, but still consider getting/making a case for it.

Personally, I've always been just fine with the AVR ISP mkii.

Also, in case anyone is looking for a cheapo controller board, Adafruit started selling these a few weeks ago:

http://www.adafruit.com/products/1500

It's just an Attiny85 with some LEDs, a regulator, a reset button and a USB connector. They come in 3.3 or 5V versions, and they have bulk packs of them too. Nothing fancy, but better than using a full dev board (or making my own) for a single gadget that doesn't need to do anything complicated. The only downsides are that 1/3 of the flash is taken up by the USB bootloader, and it doesn't have a USB serial port.

Slanderer
May 6, 2007

JawnV6 posted:

We have a full machine shop, should be able to get someone to make one.

It looked like this supported fewer chips? If I end up frying the Dragon I'll order one.

It does, but I recall some of Atmel's documentation may be incorrect. The same with the Dragon--I think they may have patched it at some point to support more chips (with larger memories??). It's been a while...

Slanderer
May 6, 2007

Bad Munki posted:

Yeah, that was exactly how it went for me, too. I was looking for something for a project I was mulling over, and thought, "Well, I'm nowhere near needing them, but some day I might, and it's only $12!" *click*

And I came across a pretty featureful arduino library for that chipset that supported a number of different network designs, including mesh. Depending on your platform, you shouldn't have too much trouble.

Jesus, I did the exact same thing months back. I just found them in a box of similarly cool components I've yet to do anything with.

Normally I feel bad for wasting money on stuff that I can't see myself using anytime soon, but they were just so drat cheap.

Slanderer
May 6, 2007

ImDifferent posted:

Little microcontroller-based CTF hacking competition:

http://www.matasano.com/matasano-square-microcontroller-ctf/

MSP430-based, no less.

This did not mix well with alcohol, FYI.

Slanderer
May 6, 2007
Does anyone know of any good open-source embedded software that uses C++ effectively? I'm starting the design of some low-level software, and it occurred to me that outside of UI elements I haven't seen C++ used very effectively in embedded SW. Mostly I've seen instances where it is used, but to such a limited extent that C could have been more easily used instead.

Slanderer
May 6, 2007

movax posted:

Why do you need LEDs, you have a debugger and a scope to probe a test-point / some other pin if you need it? :angel:

- hw guy

Our hardware guys think actual testpoints are a luxury that we do not deserve (or if they do, they hide them underneath daughterboards / on the underside of boards mounted to test fixtures)

Slanderer
May 6, 2007

Delta-Wye posted:

Vcc/gnd swap? :smith:

I've done that with Atmegas before, and they just got slightly warm. After connecting them properly, they still worked fine.

Slanderer
May 6, 2007

Arachnamus posted:

I'm looking to put together a prototype wireless HDMI system and I'm wondering what sort of hardware I should be looking at.

HDMI (particularly 1.4) is an absolute shedload of data (around 10Gbit/s) but I don't need to actually process it into frames, just enough power to be able to compress it, probably encrypt it, and then transmit across a ~50m distance, then reverse the process on the other end.

One of the only requirements is latency, we're talking tens of milliseconds maximum, though any device used would need to support (and ideally come with) an HDMI port that can be used for input or output.

I typically work with full-scale servers so I have absolutely no idea on the scale of modern microprocessors vs the task at hand. Any advice would be great, thanks.

Is this just for "fun", or for work stuff? Because this sounds terrifically difficult. Very few wireless HDMI products seem to boast a 50m range, and even those that do imply that it is probably a theoretical limit that you will never reach if there is a wall in the way / a wall nearby to reflect off and add multipath / any other devices in the same frequency band. My gut tells me that there may be technical issues to achieving that sort of range reliably while following FCC regulations, depending on what band you're using.

Or, dumb idea: if there is any chance your wireless receiver is stationary, you could use a commercial solution and swap out the built-in antennas with directional ones. That still isn't kosher from a regulatory standpoint either, but it's certainly a lot easier.

Slanderer
May 6, 2007

Arachnamus posted:

It's a speculative work thing. The 50m range is as you say a theoretical line-of-sight limit, it's by no means the most important part of the spec, compared to the latency. A more realistic range is 5 to 10m indoor, 20m outdoor.

Ah, that makes it a bit better, then. It looks like professional grade transmitters also advertise that sort of range and latency:
http://www.idxtek.com/products/cw-1

Slanderer
May 6, 2007

armorer posted:

You could look into AVR as well, since you already have one in your Arduino.

It's super easy to get started with any AVR device, which is why they are great. If you don't want an Atmega or Attiny, then the XMEGA line is pretty cool (for an 8bit micro with a ton of peripherals bolted onto it)

Slanderer
May 6, 2007
Time for a very specific question:

I'm working on a project with 2 microcontrollers (STM32F400 and STM32F300 series) and like 4 independent power sources. However, I currently don't have the ability to load code via the bootloaders, so it's JTAG all the way for now. However, since most of the time I'm not using a debugger for debugging (I'm making optimized builds that don't debug well anyway), I need to remove all 4 power sources from the device (which takes a minute or two) every time I load new code onto one of the microcontrollers.

I'm using IAR embedded workbench with a J-Link debugger (pretty standard). Anyone know if I can set it up to start the CPU automatically after downloading? If I were doing this from the command line utility it would be trivial...
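For comparison, the command-line version I had in mind looks roughly like this hypothetical J-Link Commander script (file and firmware names made up; run with something like JLinkExe -CommandFile load.jlink):

```text
r                      // reset and halt the CPU
loadfile firmware.hex  // program the image into flash
r                      // reset again so execution starts clean
g                      // go -- let the CPU run with no debugger attached
q                      // quit J-Link Commander
```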

Slanderer
May 6, 2007

Mr. Powers posted:

I've never seen an option for that. If you disable run to main, it will just stop at your reset vector. You may be able use a C-SPY macro file to do this, but I'm not familiar with the macros beyond displaying some info on log breakpoints. There's probably a resume directive and you can just put it in your debugger .mac file.

I looked at the macros and tried issuing a reset at different times (after the flashloader, at least in theory) but it wasn't working out...

Then I figured out that I can semi-reliably reset the CPU by one of the following:
disconnecting from the jtag port
connecting to the jtag port
disconnecting my debugger from USB
reconnecting my debugger to USB

It's not consistent (reset lines are weird idk), but it's good enough for me.

Slanderer
May 6, 2007

sailormoon posted:

I'm looking for a device to learn embedded programming on and the OP seems to be out of date. Would the Beaglebone Black be the best place to start? It seems like it would be great to do bigger projects on given its specs too.

The Beaglebone Black is great for learning embedded linux systems, but any knowledge learned there wouldn't really carry over to an arduino, for instance. Smaller embedded processors are programmed without the aid of an operating system (usually), and have direct i/o access to all of the hardware. You also generally have a lot less memory, and programs are written differently as a result. So, depending on what you want to do, a Beaglebone Black may or may not be right for you, even if it is much more powerful.

Slanderer
May 6, 2007

No Gravitas posted:

Is there something more powerful than the Beaglebone that is still pretty cheap?

The MSP430 kit was good for basic stuff if you don't want too much trouble. Not sure if they still have it, and for sure it isn't 4.30$ + free shipping anymore.

I personally like PIC and anything that Microchip makes, but it isn't exactly newbie friendly. Keep in mind, I build MCU programmer devices for fun.

There are probably dev boards with faster ARM processors on them, but I haven't worked with any in the same price range as the Beaglebone. This list is a year outdated, but it may have some options:
http://www.linux.com/news/embedded-mobile/mobile-linux/732197-top-10-open-source-linux-boards-under-200

For smaller stuff, I still really like the Attiny series. They are dead-simple to work with, their peripherals are really easy to get setup, and they are cheap.

Slanderer
May 6, 2007

reading posted:

What kind of programmer/debugger do you use to load code and debug on the ATTiny? And I assume you have to use AVRStudio?

Any sort of AVR programmer--the standard official one is like $30, but there are a ton of clones which are cheaper. In some ways the clones are better, since they're easier to use with AVRDUDE (the open source programming utility). WinAVR used to be the go-to package for programming AVRs without AVRStudio, but that might be abandoned now? Regardless, AVRStudio still works fine if you're on Windows, and it might support clone programmers via AVRDUDE now.

If you can't tell, it's been a few years since I've worked with them.
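If it helps, a typical AVRDUDE invocation for flashing an ATtiny85 through a USBtinyISP-style clone looks something like this (the programmer and part names here are examples; `avrdude -c ?` and `avrdude -p ?` list the valid ones):

```shell
# Write main.hex (Intel hex format, hence the trailing :i) to flash
avrdude -c usbtiny -p attiny85 -U flash:w:main.hex:i
```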

Slanderer
May 6, 2007

RICHUNCLEPENNYBAGS posted:

Hey, this is not exactly embedded programming, but do any of you guys have any experience doing stuff in Windows with USB? I'm interested in trying to make a Guncon3 driver for Windows as a personal project and I'd like to know if you have resources to recommend. Like I'd guess the first thing I want to do is just be able to get the raw output of the thing so I can try to work out what signals it actually sends? I just have this idea and not much sense of how to go about it.

I make ASP.NET applications for a living so this is a bit of a departure from stuff I normally work on.


Wireshark on linux might support USB capture, so give that a try? There might be capture software for Windows too. There are also inline USB protocol analyzers and data loggers, but those can be expensive.

Slanderer
May 6, 2007

movax posted:

Generally CPLDs are much much smaller / less powerful / less complex than full-blown FPGAs -- at one point the architectural difference was that a CPLD was just a 'sea of gates' that you could connect in various shapes, whereas FPGAs were based on a modular block based architecture, where LUTs of various size are used to form logical functions.

It's also important to note that CPLDs are non-volatile, whereas FPGAs need to be reconfigured from external memory at reset. I've seen a CPLD (or maybe it was a PLA, but the distinction isn't critical) used as glue logic for a hardware power source manager (where using discrete components would be too big). In that situation, it is important to have the logic available as soon as power is present (even before the regulators can stabilize) so that the hardware can be set up properly and allow the rest of the system to boot up.

Slanderer fucked around with this message at 21:20 on Aug 24, 2015

Slanderer
May 6, 2007
I don't really remember much about the 68k from school, but the Atmega8/168/328 only seems to have surpassed PICs in the hobbyist space once people started using them for Arduino derivatives (or at least people moving on from using Arduinos). The benefits I can think of off the top of my head:

1. Free compiler/dev environment from Atmel (which is pretty decent)...but also good GCC support with AVR Libc
2. Available in DIP packages
3. You only need the uC, a linear regulator and a programming header to get started
4. Cheap programmer (and now even cheaper unofficial programmers that are compatible with the open source toolchain)
5. Low power---you can run a dev board from a USB port
6. Atmegas are cheap as poo poo

One downside is that Atmel does not give away free samples, which is a big deal for broke college kids. For that reason, I'm kinda surprised that there isn't more stuff being done with TI's MSP430 line---unlike other companies, TI has always been good about giving samples of DIP packaged parts, whereas others exclude them specifically because of hobbyists. I did a lot of stuff with Cypress PSOC's when I was in school because I was able to get free samples of breadboardable parts that did all kinds of cool stuff.

I probably still have thousands of unused free samples of everything from motor controllers to 1 GHz ADCs to this day

Slanderer
May 6, 2007
Does anyone know anything about USB virtual com port drivers? Specifically, I'm trying to figure out how to keep my device from being assigned a different COM port for every physical USB port on my laptop. Is this a limitation of using the default windows usb serial driver? Or can I modify my .inf file to change this behavior? I'd like it to work more like the USB-Serial adapters I use, where (I think) the 1st device connected (regardless of which USB port) is mapped to the same ContainerID (and the same COM port), and then a 2nd connected device will be mapped to a second ContainerID (and its associated COM port).

Would I need to make a custom USB driver for this capability?

Slanderer
May 6, 2007

Yeah, I knew that adding a serial number to my device would let me use the UniqueID device capability, so that my device instance could persist across USB ports. However, for our application, I don't want new devices of this type to enumerate new COM ports in the future---while it's merely annoying for engineers like me who interact with multiple devices, it's unworkable for production test station computers (which would see tons of new devices connected per day).

This would presumably be solvable with additional software that directly messed with the registry (or alternatively, I think there is something you can add to the registry to make windows treat every instance of a device with a certain VID+PID as the same device, but that seemed like a weird hack).

It's frustrating since lots of USB-Serial drivers work the way I want, but I don't know if they were only able to do that by writing their own driver. The MSDN documentation on all of this is straight bullshit and mostly not helpful for doing anything practical.

Slanderer
May 6, 2007

meatpotato posted:

As an added note, be aware that Windows will store information about every unique USB device plugged into it in the registry. If you're running an assembly line or a test station it'll lead to the computer becoming unusable sooner than later.

Maybe this isn't an issue anymore, but it's worth being aware of.

I hadn't even thought of that. It's hard to say how many devices it would take to cause an issue, but this might be worth making a note of. Thanks.

Slanderer
May 6, 2007
Does anyone have any good references/reading on data versioning in an embedded systems context? I need to manage a nonvolatile data store that is more mutable than the ones I've used in the past, as it is tied in with our system for passing data structures between threads (and processors).

We have a system in place, but it relies heavily on engineers knowing all the side effects of modifying any of the data structure definitions shared between multiple CPUs, and also how to write conversion functions for upgrading the nonvolatile memory store from an older version (which, in turn, requires fun stuff like manually grabbing data offsets via the debugger). This system is still way early in prototype, but already I am fed up with it and I'm hoping to be able to point the engineers responsible in a more productive direction. Ideally, we would like a robust system that makes it hard for us to accidentally break something without generating a ton of errors at compile-time. However, I've only had limited experience with this, and none of the references I can find are geared to embedded systems (or at least to C/C++ software).
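To make the compile-time part concrete, here's the sort of thing I mean (a hedged sketch, not our actual system; all names and sizes invented): put a version/size header in front of the stored payload and use static assertions as tripwires, so changing the layout without bumping the version breaks the build instead of silently shifting offsets.

```c
#include <stdint.h>

/* Every image written to the NV store starts with this header, so code
   reading an old image can tell what it has before touching the payload. */
#define NV_MAGIC   0x4E56u  /* "NV" */
#define NV_VERSION 3u

typedef struct {
    uint16_t magic;
    uint16_t version;
    uint32_t payload_size;  /* lets newer readers skip older payloads */
} nv_header_t;

typedef struct {
    nv_header_t hdr;
    uint32_t    calibration;  /* example payload fields */
    uint16_t    flags;
    uint16_t    reserved;     /* explicit padding -- no hidden holes */
} nv_store_t;

/* Compile-time tripwires: change the layout without bumping NV_VERSION
   (and updating these) and the build fails, instead of an engineer
   chasing a wrong offset through the debugger later. */
_Static_assert(sizeof(nv_header_t) == 8,  "header layout changed");
_Static_assert(sizeof(nv_store_t)  == 16, "payload layout changed: bump NV_VERSION");
```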


Slanderer
May 6, 2007

sliderule posted:

Is my only option to disable the interrupt while I update the ring buffer? Is there any way to make this atomic on ARM?

I don't see how. Is there any particular reason you don't want to disable the ISR? This is roughly similar to an RTOS disabling ISRs briefly to allow atomic access to its internals, and the problem of needing to safely access a message is particularly relevant.
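Actually, thinking about it more: if it really is a single producer (the ISR) and a single consumer, you can often sidestep disabling interrupts with a classic SPSC ring buffer, since each index is only ever written by one side and aligned 32-bit stores are atomic on ARM. A sketch (single-core, no cache assumed; all names invented):

```c
#include <stdint.h>

/* Single-producer/single-consumer ring buffer: the ISR only writes
   `head`, the main loop only writes `tail`, and both are aligned 32-bit
   values whose loads/stores are naturally atomic on ARM. Power-of-two
   size keeps the index math to a mask. */
#define RB_SIZE 16u  /* must be a power of two */

typedef struct {
    volatile uint32_t head;  /* written only by the producer (ISR)    */
    volatile uint32_t tail;  /* written only by the consumer (thread) */
    uint8_t buf[RB_SIZE];
} ringbuf_t;

int rb_put(ringbuf_t *rb, uint8_t byte)   /* producer side */
{
    if (rb->head - rb->tail == RB_SIZE)
        return 0;                          /* full */
    rb->buf[rb->head & (RB_SIZE - 1u)] = byte;
    rb->head++;                            /* publish after the write */
    return 1;
}

int rb_get(ringbuf_t *rb, uint8_t *out)   /* consumer side */
{
    if (rb->head == rb->tail)
        return 0;                          /* empty */
    *out = rb->buf[rb->tail & (RB_SIZE - 1u)];
    rb->tail++;
    return 1;
}
```

The moment you have two producers (two ISRs, or an ISR plus a thread, writing the same buffer), this stops being safe and you're back to disabling interrupts or using LDREX/STREX.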
