Starting with the basics - Part II
In the previous post, I covered some of the core concepts behind operating systems in general. To better understand how these concepts fit together, it’s useful to see them in action.
Exploring the User Space in macOS
Before diving into how a SwiftUI application runs on an Apple operating system, it is useful to become familiar with some of the main entities that constantly run on a Mac (simply because they are easier to observe there than on an iPhone or iPad).
When you turn your computer on, it has no concept of files, memory, operating system or applications. However, as it boots up, more and more services start working in concert to provide the logical support for those concepts. The exact steps differ between Apple Silicon-based Macs and Intel-based Macs, but the general approach is the same: the System on a Chip (on Apple Silicon there is no discrete CPU or GPU; they are integrated, together with other components, into a single package) executes the instructions it finds in the Boot Read-Only Memory (known as the Boot ROM; if you remember the old days of custom jailbroken Android ROMs, this is the concept). Generally, one of the first operations is to verify the signature of the bootloader (as a security mechanism) and, if the signature is valid, to start it. The bootloader, in turn, executes a series of exchanges to gradually verify and load various other pieces of low-level software (firmware) from every connected piece of hardware that is relevant to the base boot process (such as security chips, RAM and integrated storage). Towards the end of this process, macOS finally begins to start up. The very first macOS process to execute in user space is the system manager, launchd.
You can easily check this if you open the Terminal app on your Mac. Thanks to its BSD roots, macOS supports BSD commands, many of which are very similar to Linux commands. You can run the ps -eaf (process status) command in your terminal. The output is ordered nicely, so you can even get a rough idea of which processes start first. You would see many lines similar to the ones below:
UID PID PPID C STIME TTY TIME CMD
0 1 0 ... ... ... ... /sbin/launchd
The first row in the output is the header. There, UID represents the User ID (0 is root, the user with full rights in the OS), PID represents the Process ID, and PPID is the PID of the process’ parent. CMD represents the command issued to start the process. This shows that launchd, the system process manager for macOS, is one of the very first processes that start on your Mac. It is started directly by the kernel_task process, which is the only process with PID 0 (which is why the PPID of launchd is 0), and it starts as a privileged process (as root). It does so because it needs to perform privileged operations on behalf of the whole system, operations that are not allowed from a non-root level (which is also why you are sometimes asked to input an administrator password for very specific tasks).
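If you would like to see these identifiers from inside your own code, you can ask the kernel for them directly. Below is a minimal Swift sketch (a standalone command-line snippet of my own, not tied to any particular app) that prints the current process’ PID and its parent’s PID using the BSD getpid() and getppid() calls.

import Darwin

// Every process can query the kernel for its own identifiers.
let myPID = getpid()        // the PID the OS assigned to this process
let parentPID = getppid()   // the PID of the process that spawned it

print("Running as PID \(myPID), started by PPID \(parentPID)")
// If you compile and run this from a zsh session in Terminal, the PPID printed
// here will typically be the PID of that -zsh process, mirroring the ps -eaf
// output shown above.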
If you analyze the full output of the command, you will notice that many of the processes running on the system (particularly in the first part of the output) are started by launchd (they have PPID 1). Eventually, after the group of processes that were started with PPID 1, you will start seeing processes that were started with another PPID. These are typically processes that other applications started. For example, you would likely see the Terminal process and its children:
... 1468 1 ... ... ... 0:00.33 /System/Applications/Utilities/Terminal.app/Contents/MacOS/Terminal
... 1469 1468 ... ... ... 0:00.01 login -pf <...>
... 1471 1469 ... ... ... 0:00.02 -zsh
... 1477 1471 ... ... ... 0:00.01 ps -eaf
It’s useful to know that the operating system assigns PIDs to newly launched processes in consecutive order. The OS keeps a special PID counter, which is incremented every time a new process is started. In the example above, you can see that, between -zsh (PID 1471) and ps -eaf (PID 1477), there are 5 missing PIDs. This indicates that, in the meantime, 5 other processes likely executed and finished their execution.
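You can observe this counter in action from Swift, too. Here is a small sketch (a standalone Foundation snippet; the /bin/ls path is chosen purely as an example) that launches the same executable twice and prints the PIDs the OS hands out, which will typically be close together and increasing.

import Foundation

// Launch the same executable twice and compare the PIDs the OS assigns.
for _ in 1...2 {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/bin/ls")
    process.standardOutput = FileHandle.nullDevice   // discard the listing itself
    try process.run()
    process.waitUntilExit()
    print("ls ran with PID \(process.processIdentifier)")
}
// Any other process that starts on the system in between consumes a value from
// the same counter, which is why you may see gaps between the two PIDs.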
It is common, when troubleshooting applications on a computer, to check the list of running processes and then walk back up the PID -> PPID chain, to identify relationships between processes. For example, ps -eaf was started by -zsh (the Unix Z shell), which was started by the login command, which was itself issued by the Terminal application.
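If you ever want to automate that walk, the kernel exposes the same parent information programmatically. The sketch below is my own illustration (it relies on the BSD sysctl interface rather than a dedicated Apple framework): it starts from the current process and climbs the parent chain until it reaches launchd (PID 1) and, finally, PID 0.

import Darwin

// Ask the kernel for the kinfo_proc record of an arbitrary PID and
// read the parent PID out of it.
func parentPID(of pid: pid_t) -> pid_t? {
    var mib: [Int32] = [CTL_KERN, KERN_PROC, KERN_PROC_PID, pid]
    var info = kinfo_proc()
    var size = MemoryLayout<kinfo_proc>.stride
    guard sysctl(&mib, UInt32(mib.count), &info, &size, nil, 0) == 0, size > 0 else {
        return nil
    }
    return info.kp_eproc.e_ppid
}

// Walk from this process up to launchd (PID 1) / kernel_task (PID 0).
var current = getpid()
while current > 0, let parent = parentPID(of: current) {
    print("PID \(current) was started by PPID \(parent)")
    current = parent
}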
If you open any application (for example, the Music app), you would see a set of dedicated entries (you can use ps -eaf | grep -i music). For reference, you use the pipe symbol (“|”) to take the output of one command and pass it as the input of another command. You use grep to filter, -i is the flag that makes the filtering case-insensitive and, in this case, music is the argument. In other words, the command lists the processes that are currently running on your Mac, but only shows you the ones that match the filter. This is very similar to a Swift filter on a collection (let b = try a.filter { $0.contains(try Regex("(?i)music")) }):
... 4357 1 ... ... ... 0:02.02 /System/Applications/Music.app/Contents/MacOS/Music
... 4358 1 ... ... ... 0:00.66 /System/Applications/Music.app/Contents/XPCServices/VisualizerService_x86.xpc/Contents/MacOS/VisualizerService_x86
... 4359 1 ... ... ... 0:00.05 /System/Applications/Music.app/Contents/XPCServices/VisualizerService.xpc/Contents/MacOS/VisualizerService
The PID for the Music application is 4357. We can confirm it was indeed started by the launchd daemon, because its PPID is 1 (which is the PID of launchd). Additionally, right after launchd started Music, it also started two other processes: VisualizerService_x86 (PID 4358) and VisualizerService (PID 4359).
This is another very good example of how Operating System design philosophies, together with Kernel functionality, can influence the way we write applications. When running in an Operating System, each application is sandboxed. It receives its own area of memory and, as we’ve already seen, its own OS Process with a dedicated Process ID. This means that, in general, one application cannot directly (willingly or accidentally) access another application’s internal memory space. However, there are cases where you need one application to have multiple isolated components. They remain sandboxed from one another, but still function as one application and, just as importantly, they are bundled together. They are installed, updated and uninstalled together, and the separation into isolated components is not relevant to the end user.
In this case, the Music application, which is the main music playback application on the Mac, also comes bundled with two music visualizer components (which you can bring to the foreground when the Music application is in focus, by going to the Window menu and clicking on Visualizer). This is a common application design pattern, where developers break down complex applications into separate components, which can then be managed independently by the OS System Manager (launchd in this case). The Music process starts as the main executable of the Music app. The two visualizers are implemented as XPC Services, and they each receive their own OS process. Besides increased security, this also ensures that, if a visualizer encounters issues (visualizers are more prone to crashes, due to potential decoding issues or memory problems), it can crash without bringing down the Music app with it. Each visualizer can also be managed independently by launchd, which can restart it when needed.
When it needs to, the Music process can communicate with the Visualizer XPC Service via XPC messages (which are wrapped over Mach Kernel IPC abstractions). Essentially, as it streams the music information, the Music process bundles a copy (or some subset) of the binary data into a Codable struct and sends it to the Visualizer. The Visualizer then processes the data it receives and generates the corresponding animations.
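To make the idea more concrete, here is a minimal sketch of what the client side of such a connection could look like, using the public NSXPCConnection API. Everything in it is hypothetical: the AudioFrame payload, the VisualizerServiceProtocol and the service name are placeholders of my own (Apple’s actual Music/Visualizer interface is private), and in this sketch the Codable struct is simply encoded to Data before being handed to the proxy.

import Foundation

// Hypothetical payload the main app might send to its visualizer service.
struct AudioFrame: Codable {
    let samples: [Float]
    let timestamp: TimeInterval
}

// Hypothetical protocol the bundled XPC service would expose.
@objc protocol VisualizerServiceProtocol {
    func render(frameData: Data)
}

// In the main application: connect to the bundled XPC service...
let connection = NSXPCConnection(serviceName: "com.example.MusicPlayer.VisualizerService")
connection.remoteObjectInterface = NSXPCInterface(with: VisualizerServiceProtocol.self)
connection.resume()

// ...then serialize a frame and send it across the process boundary.
let frame = AudioFrame(samples: [0.1, 0.5, 0.3], timestamp: Date().timeIntervalSince1970)
if let payload = try? JSONEncoder().encode(frame),
   let proxy = connection.remoteObjectProxy as? VisualizerServiceProtocol {
    proxy.render(frameData: payload)
}

On the service side, a matching implementation would sit behind an NSXPCListener; the service process is started on demand, which is exactly why the visualizers show up with their own PIDs in the ps output above.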
Of the numerous processes and subsystems that start up with the Operating System, many work in concert to support the underlying functionality required for application Graphical User Interfaces. One of those is the WindowServer process (the SpringBoard process plays a similar role in iOS). After launchd starts this process, others follow shortly, such as the loginwindow process (which starts the login procedure) and then, some time later, the UserEventAgent process for macOS’s higher-level system events. Shortly after, Apple’s WindowManager process (the one behind the Stage Manager feature) starts up. The WindowManager process is responsible for arranging windows in the various workspaces (desktop windows).
As described in Apple’s manual (in the Terminal app, run the command man WindowServer), the WindowServer process is in charge of “window management, content compositing, and event routing”. It is a process that runs on multiple threads. All of these services are important, and no application would function without them, but the event routing service is perhaps the least obvious, so it merits a dedicated discussion.
Device drivers react to events in real time. To put this in perspective, the screen of an iPad could potentially send one event every 8.33 milliseconds (if its refresh rate is 120 Hz). An Apple Pencil has a polling rate of 240 Hz (sending one event every 4.16 ms). A mouse could potentially send an event once every 0.4 ms (if the controller’s polling rate is 2,500 Hz). To prevent applications from being flooded with events, the WindowServer also coalesces and buffers these events. Applications can still access the entire array of events, for high-fidelity use cases (such as drawing applications), by leveraging the coalescedTouches(for:) API. Generally, however, they retrieve the last touch event recorded at the beginning of the V-Sync cycle (more on this later).
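As a quick illustration of that API, here is a minimal UIKit sketch (a hypothetical DrawingView of my own, not an Apple sample) that asks the delivered event for all of the coalesced touch samples instead of only the single touch dispatched for the current frame.

import UIKit

class DrawingView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        // The delivered touch is the most recent sample for this frame;
        // high-fidelity apps can also ask for every intermediate sample
        // that was coalesced since the previous frame.
        let samples = event?.coalescedTouches(for: touch) ?? [touch]
        for sample in samples {
            let point = sample.location(in: self)
            // Append `point` to the current stroke here.
            print("sample at \(point), timestamp \(sample.timestamp)")
        }
    }
}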
Another important part of the ecosystem consists of the device drivers. On macOS, you can see the state of the I/O Registry (the registry of drivers and the devices they manage) by opening the terminal and running the command ioreg. If you would like to research further, you can check the main classes (such as IORegistryEntry or IOSurfaceRoot) and build a more detailed understanding, based on the elements that run on your computer.
Finally, you can also check the connected Human Interface Devices (HID) your OS recognizes at any given time, by running the command hidutil list. Apple explains how to analyze this information in their “Discovering HID Devices in the Terminal” guide.
The Activity Monitor is another particularly useful tool for exploration and troubleshooting. You can call it from the Launchpad as a generic Task Manager (similar to Windows’ Task Manager) or from the Menu Bar (<Application Name> -> Services -> Activity Monitor), as a Development Instrument. Depending on the way you call the Activity Monitor, the interface will likely look slightly different.
If you run the Activity Monitor as a Task Manager, you can also sample any running process. As shown in the screenshot below, you would select the process you’re interested in (WindowServer in this case).
Then, using the System Diagnostics Menu, you can sample the process. When doing so, the OS takes a snapshot of the running threads and calls - as well as a Stack Snapshot (similar to what you see when reaching a breakpoint when debugging your application).
Once the sampling process is complete, Activity Monitor opens up a dedicated Sample window. You can then choose from multiple Display Options and analyze the process call graph, cross-referencing the calls with Apple’s documentation to determine what each call does. Notice how, for example, 2220 represents the number of times the thread/call was found during the sample period; it does not represent a Process ID.
You can later use these tools to further explore how other processes and applications work (to some extent). You would not see implementation details and, in many cases, the method calls are intentionally not documented. However, with time, you would become proficient at interpreting call graphs and stack traces.
Using Instruments to explore the User Space on an iPhone
You can also explore the main processes running on iPhone and iPad devices, as well as the way the Operating System interacts with the applications you create. It is, however, a slightly more involved process, since you need to connect the devices to a Mac, then use Xcode’s Instruments suite.
For example, you could use the Logging instrument to capture the log entries created by various processes running on the phone, grouped by the subsystem (framework) that creates them. To make the initial experience easier, it is a good idea to switch the iPhone to Airplane Mode, to minimize the number of messages the Logging instrument would intercept. As you gain more knowledge and become more comfortable with the toolset, this step becomes unnecessary.
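The subsystem and category you see on the captured entries are the same ones an application declares when it logs through the unified logging system. As a point of reference, here is a minimal Swift sketch (the subsystem and category strings are placeholders of my own) that emits entries the Logging instrument, or the Console app, would pick up.

import os

// A logger scoped to a subsystem and category; these are the fields
// the Logging instrument groups captured entries by.
let logger = Logger(subsystem: "com.example.MyApp", category: "startup")

logger.info("Application did finish launching")
logger.debug("Cache warmed with \(42) entries")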
There are other Instruments you may want to use, such as:
Runloops instrument, to explore the execution details of running processes’ various input processing threads, known as runloops
Virtual Memory Trace instrument, to explore how the various structures in memory change as your application runs
GPU and/or CPU Profiler instruments, to analyze various activities performed by the GPU and CPU sides of the SoC, respectively
Although you can add multiple instruments to a single recording session and capture data from all processes, the bandwidth you can use is limited. Sometimes, events are dropped due to various rate limiters.
Therefore, it helps to create a simple test scenario (e.g. press a specific button in your application), then run the test scenario multiple times: with one instrument at a time, but for all processes, or with all instruments, but on one specific process at a time.
The purpose of this section is to show some of the main processes running on macOS and iOS, to give you a general idea of where to look, given your specific needs.
To get started, in an Xcode project, go to Xcode -> Open Developer Tool -> Instruments. Then, select Logging as the initial instrument. You can add multiple other instruments. In fact, you can select a different set of instruments for each recording, to make the process easier to follow.
The Instruments application's User Interface is designed to streamline the viewing, organizing, and filtering of captured events while serving as a centralized repository for recording sessions across multiple test cases. The screenshot below demonstrates a scenario where logs were captured across various test cases, which are displayed in the Recorded Sessions section.
The Instruments UI also features a central Toolbar, similar to Xcode, allowing you to select your target device and specify which process to monitor during recording sessions. You can configure the recorder to capture all processes system-wide or focus on a specific application, system process, or app extension. The toolbar also displays the currently running applications on the connected device, functioning similarly to the ps -eaf command in the macOS Terminal.
As shown in the screenshot below, you have several monitoring options to choose from. Like macOS, iOS assigns process IDs (PIDs) sequentially. In other words, lower PIDs indicate processes that started closer to system boot time. Note that certain processes cannot be monitored through Instruments due to security restrictions.
As an example, we could analyze the processes that are involved in starting the Music application on an iPhone running iOS 26, when pressing its icon on the Dock (the area at the bottom of the Home Screen on an iOS device).
To make it easier to follow, switch the device to Airplane Mode. Since Airplane Mode turns off all wireless services, make sure the device is paired with your Mac via a cable. Also make sure the device is unlocked and that True Tone is off (again, to minimize the number of events the logger would capture). Finally, switch the Recording Mode to Deferred, so that the logger doesn’t drop events.
In the Instruments Application, start recording, then touch the Music Icon and, finally, stop the recording. The faster you execute this chain of events (start recording, start the music app, stop recording), the shorter the session would be and the easier it becomes to parse.
While your mouse cursor hovers over the timeline, you can zoom in and out, using the option (⎇) key + the mouse wheel up or down. Alternatively, you could use cmd (⌘) and +/-, to zoom in and out, respectively. Zooming in and out is particularly useful when, besides the logger, you have other instruments in the recording, because you can more clearly find correlations between events across processes and various CPU/GPU/Memory operations. In the screenshot below, the highlighted area represents an area of interest. It starts around the moment the Music Icon was touched and it ends around the moment the Music application starts up.
According to the log timestamps, the entire process, from the moment the touch was registered until the application fully loaded, took under 200 milliseconds (or a little under 12 frames at 60 Hz).
Timestamp | Process | Subsystem | Event (Observations) | Message |
---|---|---|---|---|
00:01.769.459 | backboardd | com.apple.Multitouch | TouchEvent | Dispatching event with 1 children, _eventMask=0x23 _childEventMask=0x3 Cancel=0 Touching=1 inRange=1 (deviceID 0x…) |
00:01.769.934 | backboardd | com.apple.BackBoard | TouchEvents | Touch entered (insideExclusive) <BKSHitTestRegion: 0x77e7bfb80; {{0, 0}, {428, 926}}; exclusive: {{0, 0}, {428, 926}}> |
00:01.783.911 | SpringBoard | com.apple.UIKit | EventDispatch | Evaluating dispatch of UIEvent: 0xd0cf42c00; type: 0; subtype: 0; backing type: 11; shouldSend: 1; ignoreInteractionEvents: 0, systemGestureStateChange: 0 |
00:01.784.005 | SpringBoard | com.apple.UIKit | EventDispatch ( send to application window) | Sending UIEvent type: 0; subtype: 0; to window: <_UISystemGestureWindow: 0xd0ce22300>; contextId: 0xa0e094e4 |
00:01.793.924 | SpringBoard | com.apple.SpringBoard | Icon (register tap on Icon) | Allowing tap for icon view 'com.apple.Music' |
00:01.795.476 | SpringBoard | com.apple.UIKit | EventDispatch ( send to application window) | Sending UIEvent type: 0; subtype: 0; to window: <SBHomeScreenWindow: 0xd0b129c00>; contextId: 0xe51eda8f |
00:01.799.194 | SpringBoard | com.apple.SpringBoard | Icon ( process Touch) | SBIconView touches began with event: <UITouchesEvent: 0xd0c20f200> timestamp: 6200.01 touches: {(<UITouch: 0xd097501c0> type: Direct; phase: Began; is pointer: NO; tap count: 1; … location in window : {373, 859.33333333333326}; … |
00:01.869.166 | SpringBoard | com.apple.SpringBoard | Icon (Start Launching) | Launching application com.apple.Music from icon <private>, location: SBIconLocationDock |
00:01.878.367 | SpringBoard | com.apple.runningboard | General (Send Launch Request to running board) | Sending launch request: <RBSLaunchRequest| app<com.apple.Music(…)>; "FBApplicationProcess"> |
00:01.881.191 | runningboardd | com.apple.runningboard | Job (Start application) | Creating and launching job for: app<com.apple.Music(…)> |
00:01.889.972 | launchd | - | - | Starting Music App |
00:01.894.277 | runningboardd | com.apple.runningboard | Assertion (Record PID) | Added pid 1098 to RBSAssertionManagerStore; count 21; size 4096 |
00:01.894.651 | SpringBoard | com.apple.FrontBoard | Process (Register PID with Window Server) | [app<com.apple.Music>:1098] Bootstrap success! |
00:01.938.164 | Music | com.apple.amp.Music | Application (App Started) | Welcome to MusicX! |
00:01.949.997 | Music | com.apple.amp.mediaremote | MediaRemote (Initialization) | MediaRemote server initializing |
00:01.951.237 | SpringBoard | com.apple.FrontBoard | ProcessScene ( App is now In Focus) | [0xd0a890300:(FBSceneManager):sceneID:com.apple.Music-default] Scene lifecycle state did change: Foreground |
00:01.964.835 | mediaremoted | com.apple.amp.mediaremote | NowPlaying | Set: 【 LOCL (iPhone) ❯ com.apple.Music (1098) Music ❯ Music 】 setting playbackQueueCapabilities from <request> to <private> |
Finally, the Instruments application supports Input Filters, to help quickly locate specific log entries. The filtering mechanism accepts two input fields, which can be combined for more precise results.
First, you can use the filter search box with the syntax <filter category>:<filter value> to isolate specific log entries. For example, process:SpringBoard would show SpringBoard-related logs and nothing else. You can also search for any text string across all categories, simply by typing it out, without a category identifier. For example, the string Welcome to MusicX would identify the event that signals the start of the Music application.
Secondly, although not clearly marked as such, you can press the Input Filter button to filter results by thread. Notice that you can specify the two filters independently. Contradicting filters (runningboardd in the thread filter and process:Music in the search box) are valid selections, but they would display an empty list of results. The screenshot below showcases an example of these filters.
This section introduced you to some of the tools that will help you better understand the vast ecosystem Apple has put together over the years. These techniques do not replace perusing Apple’s documentation or their WWDC sessions. In fact, you will find references to many of them, scattered throughout this book. Instead, together with the Xcode Debugger, they will help you verify your assumptions and put the information in a more practical context.
To be Continued…