Dodgy Wallet Analogies: Memory Management in Objective-C and Embedded Systems
I used to carry a bulky wallet - a thick, zip-up affair that held everything: cash, cards, transport passes, business cards, receipts, coins, and various “just in case” items that accumulated over time.
Today, my wallet is sleek and minimal: exactly one ID card, one payment card, and a few business cards. We all know what happened - everything migrated to bits and bytes. I no longer panic if my physical wallet goes missing; the important stuff is managed digitally and backed up in the cloud. My current wallet even attaches directly to my phone, taking up virtually no extra space.
This transformation mirrors exactly what happened to computer resource management. In the mid-1980s, when computers had limited memory and glacially slow processors, there was little incentive for sophisticated resource management. Applications were typically written in C, where memory allocation happened through paired malloc and free calls - you requested memory from the operating system, then manually released it back. The programmer’s challenge was ensuring these operations were always properly paired; otherwise, the computer would gradually slow to a crawl as available memory dwindled. We simply accepted that computers needed regular reboots to stay functional.
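The pairing looked roughly like this (a minimal C-style sketch; copy_name and its caller are invented for illustration):

```c
#include <stdlib.h>
#include <string.h>

/* Allocate a buffer and hand ownership of it to the caller. */
char *copy_name(const char *name) {
    char *buffer = malloc(strlen(name) + 1);   /* request memory from the OS */
    if (buffer != NULL) {
        strcpy(buffer, name);
    }
    return buffer;
}

void greet(void) {
    char *name = copy_name("Ada");
    /* ... use name ... */
    free(name);                                /* every malloc needs a matching free */
}
```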
The evolution from stuffed wallets and sluggish computers to sleek, efficient systems happened as programming languages and methodologies evolved alongside Moore’s Law: faster processors, abundant memory, and crucially, lower tolerance for poor programming practices. Object-oriented languages like C++ and Java (gaining popularity from the mid-1990s onward) introduced the ability to bundle code units directly with the data they managed, relieving programmers from tracking memory separately. Java went further by popularising garbage collection - which we’ll explore later - freeing programmers from memory management entirely.
Enter Objective-C
(Note: Please read my previous article for some context.)
Apple adopted Objective-C when it acquired NeXT and its next-generation MacOS - a purer form of object-oriented programming rooted in Alan Kay’s Smalltalk philosophy. Objective-C featured an intriguing memory management system that fell between the “wild west” approach of malloc & free and Java’s comprehensive garbage collection.
The foundation of all memory resource management (like all effective resource management) lies in understanding ownership. If one or more entities own a resource, it persists. Once no-one owns it, it should be discarded. This mirrors my monthly wallet cleanup ritual: discarding everything useless, except those business cards I just know I might need someday…
You don’t need to track specific owners; a simple counter (the “retain count”) attached to each object can be incremented when ownership is claimed and decremented when relinquished. When the counter reaches zero, the object is freed. In Objective-C, this looks like:
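(A minimal sketch, assuming an NXObject class with the new, retain, and release methods described in the note below.)

```objc
NXObject *object = [NXObject new];   // retain count: 1 - we claim ownership
[object retain];                     // retain count: 2 - a second owner appears
[object release];                    // retain count: 1 - one owner lets go
[object release];                    // retain count: 0 - the object is freed
```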
(Note: In Objective-C, square brackets indicate method calls on objects. “NXObject” represents an object type with available methods like new, retain, and release.)
As long as objects maintain their own ownership counters, memory resource management becomes implicit. But Apple’s MacOS implementation introduced another concept that refined this pattern further.
Autorelease Pools
We create an autorelease pool and “drain” it every ten seconds as events are processed:
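(A sketch of such a loop: NXAutoreleasePool, its new and drain methods, and the processNextEvent() helper are assumptions for illustration.)

```objc
for (;;) {
    NXAutoreleasePool *pool = [NXAutoreleasePool new];  // a new pool to collect autoreleased objects
    for (int i = 0; i < 10; i++) {
        processNextEvent();    // handlers create objects and autorelease them into the pool
        sleep(1);              // ...so roughly every ten seconds...
    }
    [pool drain];              // ...the pool releases everything delegated to it
}
```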
(Note: Despite its verbosity, Objective-C is remarkably readable, making this code structure clear even to newcomers.)
But the magic happens because our event-processing code, rather than retaining objects itself, delegates responsibility for them to the pool:
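(Again a sketch: NXEvent, processEvent:, and handleNextEvent are placeholder names.)

```objc
- (void)handleNextEvent {
    // Create the event, then immediately hand ownership to the active pool:
    NXEvent *event = [[NXEvent new] autorelease];
    [self processEvent:event];
    // No explicit [event release] here - the next drain will free it.
}
```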
When this code runs within the event loop, each pool drainage releases associated memory automatically. The code no longer manages ownership directly - it delegates responsibility to the pool. It’s like handing your wallet to a trusted friend before you go out for a drink, asking them to discard it after the evening ends.
Autorelease pools can be nested, with parent pools owning child pools. Draining a parent pool also drains all owned child pools. Modern Objective-C expresses this elegantly:
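(A sketch using the @autoreleasepool block syntax; the work inside the blocks is purely illustrative.)

```objc
@autoreleasepool {                        // parent pool
    @autoreleasepool {                    // child pool, owned by the parent
        NXObject *scratch = [[NXObject new] autorelease];
        // ... use scratch ...
    }                                     // child pool drains here, freeing scratch
    // objects autoreleased out here survive until the parent block ends
}                                         // parent pool drains here
```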
This creates a hierarchy of responsibility, like entrusting your wallet to your friend’s friend. Let’s hope they’re trustworthy!
Implementing an Autorelease Pool
An autorelease pool is essentially a managed collection of objects. Despite potential nesting, pools are typically implemented as linked lists:
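(A sketch of what such an interface might declare: the _tail, NXPoolMarker, addObject:, and drain names are assumptions based on the description and diagram below.)

```objc
@interface NXAutoreleasePool : NXObject {
    NXObject *_tail;                   // most recent entry in the single shared chain
}
+ (NXAutoreleasePool *)new;            // creating a pool pushes an NXPoolMarker onto the chain
- (void)addObject:(NXObject *)object;  // takes ownership of object by linking it into the chain
- (void)drain;                         // releases objects back to the matching NXPoolMarker
@end
```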
This implementation maintains a single shared list of managed objects. Pool creation inserts a marker (NXPoolMarker) into the list; draining releases all objects back to that marker.
```mermaid
flowchart TD
    subgraph "Autorelease Pool Structure"
        Pool[NXAutoreleasePool] --> |"_tail"| Obj1[NXObject A]
        Obj1 --> |"_next"| Obj2[NXObject B]
        Obj2 --> |"_next"| Obj3[NXObject C]
        Obj3 --> |"_next"| Marker[NXPoolMarker]
        Marker --> |"_next"| PrevObj[Previous Objects...]
    end
```
Automatic Reference Counting & Garbage Collection
As Objective-C became the preferred MacOS development language, Java emerged with built-in garbage collection. Though not compiled to native machine code, Java’s runtime includes implicit resource management. Functionally, it resembles autorelease pools: regular “sweeps” identify and release orphaned resources.
It’s like having a friend who automatically organises your wallet, keeping only what’s needed. For programmers, this eliminates resource management concerns entirely. Apple responded with Objective-C 2.0 and ARC (Automatic Reference Counting), deeply embedding memory management into the runtime. The compiler categorises memory references as either reference-counted or not, and manages retain counts automatically as references are assigned.
The practical result is that manual memory management becomes unnecessary. [aObject retain], [aObject release], and [aObject autorelease] become no-ops (operations which do nothing) in ARC-enabled code. The runtime automatically manages autorelease pools, adding objects during variable assignment and decrementing counts during deallocation.
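Conceptually, ARC-era code just assigns references and lets the compiler insert the bookkeeping; the snippet below is only a sketch of that idea:

```objc
NXObject *a = [NXObject new];   // the compiler notes that a owns the object
NXObject *b = a;                // assignment: the compiler retains on b's behalf
a = nil;                        // the compiler releases a's reference
// when b goes out of scope, the compiler inserts the final release
```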
On the other hand, the primary cost of garbage-collected languages is the sweeping phase: releasing unused resources takes processor time, because every allocated resource must be examined. That cost is why game developers, whose performance is often measured in frames per second, traditionally favoured C and C++ with their manual memory management.
Much more could be said about garbage collection, but programming language priorities have shifted. “Memory safety” (preventing crashes from runtime bugs) and “type safety” (preventing errors from mixing incompatible data types when you write the code) now far outweigh raw performance concerns.
Considerations for Embedded Systems
What about embedded systems? These platforms are constrained across multiple dimensions: processing speed, memory capacity, physical size, and power consumption. Look around and you’ll find dozens in your watch, phone, refrigerator, and toaster. Embedded processor specifications read like stepping back in time: single cores, 16-bit or 32-bit architectures, sub-1MB RAM limitations…

In developing Objective-C for modern embedded systems like the Raspberry Pi Pico, I wanted to address several resource management challenges that desktop programmers rarely consider:
- Implement automated memory management suitable for constrained environments
- Provide memory usage tracking and leak detection capabilities
- Establish hard limits on memory consumption to prevent system failures
These constraints require a different approach to memory management. While desktop systems can rely on virtual memory and generous real memory allocations, embedded systems demand precision and predictability with their meager resources.
Memory Regions: Bringing Structure to Constraints
Whilst the usual malloc and free functions work on embedded systems, they’re typically provided by the runtime environment rather than a full operating system kernel. However, we could explore some improvements here. Apple originally implemented NSZone classes to create dedicated memory regions for resources, though they eventually abandoned them due to lack of interest (or maybe it just didn’t make sense?). For embedded systems, this concept could be useful. A reimplemented memory region manager might look like this:
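(A sketch built from the methods discussed below; the real NXZone header may differ.)

```objc
@interface NXZone : NXObject

// Create a dedicated memory region of the given size, managed independently.
+ (NXZone *)zoneWithSize:(size_t)size;

// Paired allocation and release within the zone's boundaries.
- (void *)allocWithSize:(size_t)size;
- (void)free:(void *)pointer;

// Release the whole region, reporting any unpaired allocations (leak detection).
- (void)dealloc;

@end
```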
(Note: This is the definition of methods you can call for a Class or Object. It’s pretty much the documentation for how to use the code, which you refer to when developing your application.)
The zoneWithSize: method creates a dedicated memory region that Objective-C can manage independently. This enables creating temporary managed memory regions for specific operations.

The allocWithSize: and free: methods work as pairs within the zone, providing controlled allocation within our defined boundaries. The dealloc method serves dual purposes: releasing the entire memory region and reporting any unpaired allocations, effectively detecting memory leaks. This resembles how modern development tools like valgrind work, but embedded directly in our runtime.
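Hypothetical usage might look like this (the region size and the release-triggers-dealloc pattern are assumptions):

```objc
NXZone *zone = [NXZone zoneWithSize:16 * 1024];  // a dedicated 16 KB region for one task
void *buffer = [zone allocWithSize:256];         // allocation bounded by the zone
// ... use buffer ...
[zone free:buffer];                              // pair every allocWithSize: with a free:
[zone release];                                  // dealloc frees the region and would
                                                 // report any unpaired allocations
```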
The actual memory management algorithms within zones could fill an article themselves. While implementing custom memory management is generally inadvisable (and usually unnecessary), embedded constraints can sometimes justify the complexity. The real NXZone implementation includes additional methods for memory usage reporting, free space tracking, and debugging dumps of all managed resources.
Optimised Autorelease Pools
For embedded systems, even the autorelease pool implementation requires optimisation. Rather than maintaining separate linked lists, the implementation stores objects as part of a chain within the objects themselves:
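(A sketch of that intrusive chain: the _next and _tail names mirror the earlier diagram, and the declarations are illustrative rather than the real headers.)

```objc
@interface NXObject {
@public
    NXObject *_next;                 // intrusive link to the previously autoreleased object
}
- (void)release;
@end

@interface NXAutoreleasePool : NXObject {
    NXObject *_tail;                 // the most recently autoreleased object
}
- (void)drain;
@end

@implementation NXAutoreleasePool
- (void)drain {
    NXObject *object = _tail;            // the pool only needs to remember the tail
    while (object != nil) {
        NXObject *next = object->_next;  // save the link before releasing the object
        [object release];
        object = next;
    }
    _tail = nil;                         // the chain is now empty
}
@end
```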
This approach eliminates additional memory allocation for chain management—each object carries its own “next” pointer, with the pool itself pointing to the last managed object (or the “tail”). When draining the pool, we simply follow the chain and release each object. For embedded systems where every byte matters, this optimisation reduces both memory management overhead and processing requirements.
Ownership is Everything
The evolution from bulky wallets to sleek digital payments mirrors the journey of memory management in computing. We’ve progressed from manual malloc/free pairing in C, through Objective-C’s reference counting and autorelease pools, to modern garbage collection and automatic reference counting. Embedded systems are a little behind the power curve, so autorelease pools might be a good step towards better memory management.
I asked Claude what key insights came out of this article, as I’m kinda terrible at summarising my own work. Here are the key takeaways:
- Ownership is Everything: Whether managing wallet contents or memory objects, clear ownership rules prevent resource leaks and conflicts.
- Automation Reduces Errors: Just as digital payments eliminated the need to count change, automated memory management eliminates entire classes of programming bugs.
- Context Matters: Desktop systems can afford the luxury of garbage collection overhead, while embedded systems require more precise control and predictable behaviour.
- Tools Evolve with Constraints: Each computing environment—from mainframes to smartphones to microcontrollers—develops memory management strategies suited to its specific limitations and requirements.
My wallet analogy breaks down eventually (as all analogies do), but it illustrates a fundamental principle: good resource management isn’t about the specific technique, it’s about understanding your constraints, defining clear ownership rules, and choosing tools appropriate to your environment.
For embedded systems, this means combining the elegance of object-oriented design with the precision demanded by hardware constraints. The result could be code that’s both maintainable and efficient, much like carrying exactly what you need and having a trusted friend for the rest.