Mon Apr 14 2025

The Subjective Charms of Objective-C

The programming language once seemed like it might be a universal form of communication. It no longer does.

Gottfried Leibniz, who invented calculus, actuarial tables, and the mechanical calculator, among other contributions, felt that his work was unfinished. From childhood, the 17th-century polymath had yearned to devise what he called a characteristica universalis: a language that would perfectly represent all scientific truths and make discovering new ideas as simple as writing grammatically correct sentences. This "alphabet of human thought" was meant to eliminate confusion and fallacy, and Leibniz pursued it until his death.

Today, an echo of that ambition lives on in programming languages. They do not encompass the entirety of the physical and philosophical universe, but they resemble Leibniz's dream in letting us manipulate the ones and zeros of a computer's internal state; binary notation was another of his inventions. Computer scientists who dare to create new languages are chasing their own characteristica universalis: a system so expressive that it leaves no room for error, rendering comments and unit tests unnecessary.

The expressiveness of a programming language is as much a matter of personal taste as of information theory. My fondness for particular languages was shaped by the first one I learned on my own: Objective-C. To call Objective-C divinely expressive would be like saying Shakespeare is best appreciated in Pig Latin. The language is, at best, polarizing. Often criticized for its verbosity and its odd brackets, it is used almost exclusively for Mac and iPhone applications, and it would have faded into obscurity in the 90s if not for an unexpected twist of history. Still, during my years as a software engineer in San Francisco in the early 2010s, I found myself debating its more cumbersome design choices more than once, in bars and in the comment sections of tech sites.

Objective-C entered my life at a crucial moment. As a student about to graduate, I had discovered my interest in computer science too late to specialize in it. I watched teenagers outpace me in software-engineering classes, and my university offered no mobile-development courses, so I found a niche: I learned Objective-C that summer, guided by a series of cowboy-themed books from Big Nerd Ranch. When I wrote my first code and saw it light up the pixels on a device's screen, I fell in love. Objective-C seemed to give me the power to express myself without limits; I believed I could create anything I imagined.

Objective-C, however, emerged during the frenetic early days of object-oriented programming, and by many measures it should not have survived that era. In the 80s, software projects had grown too large for a single person, or even a single team, to build alone. To ease collaboration, the computer scientist Alan Kay had developed object-oriented programming, a paradigm that organizes code into reusable "objects" that interact by sending one another "messages." In 1983, Tom Love and Brad Cox, engineers at International Telephone & Telegraph, combined this paradigm with the readable syntax of the C language to create Objective-C.

The early years of my time with the language were fascinating. I admired how its objects and messages took on a sentence-like structure, punctuated by brackets: [self.timer increaseByNumberOfSeconds:60]. Over time, though, Objective-C's verbosity seeped into my own point of view. How else could an engineer tell a computer what to do, if not by spelling everything out? Yet Objective-C was verbose to a fault: as a codebase grew, its structure became increasingly confusing and error-prone.
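
To make that sentence-like quality concrete, here is a minimal sketch of the bracketed message syntax. The EggTimer class and its increaseByNumberOfSeconds: method are hypothetical, invented only to show how a call reads; they are not drawn from any real project.

    #import <Foundation/Foundation.h>

    // EggTimer is a hypothetical class, defined here only to demonstrate the syntax.
    @interface EggTimer : NSObject
    @property (nonatomic) NSInteger secondsRemaining;
    - (void)increaseByNumberOfSeconds:(NSInteger)seconds;
    @end

    @implementation EggTimer
    - (void)increaseByNumberOfSeconds:(NSInteger)seconds {
        // The named argument makes each call read almost like a clause of English.
        self.secondsRemaining += seconds;
    }
    @end

    int main(void) {
        @autoreleasepool {
            EggTimer *timer = [[EggTimer alloc] init]; // "allocate an EggTimer, then initialize it"
            [timer increaseByNumberOfSeconds:60];      // "timer, increase by a number of seconds: sixty"
            NSLog(@"%ld seconds remaining", (long)timer.secondsRemaining);
        }
        return 0;
    }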

In 2014, Apple announced Swift, a new language designed to solve the problems that iPhone and Mac developers had run into with Objective-C. Swift did away with the features they most loathed in its predecessor: the annoying brackets and the need for prefixes like "NS." Despite my growing frustration with Objective-C, I wasn't excited about learning Swift; I knew my time as a software engineer was coming to an end.
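
For a sense of what those complaints looked like in practice, here is a rough sketch of the NS-prefixed, bracket-nested Foundation style that Swift streamlined. The strings are arbitrary examples, not code from any real project.

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Framework classes carry the historical "NS" (NeXTSTEP) prefix,
            // and every method call nests inside yet another pair of brackets.
            NSArray *names = [NSArray arrayWithObjects:@"Ada", @"Grace", nil];
            NSString *greeting = [NSString stringWithFormat:@"Hello, %@!",
                                  [names componentsJoinedByString:@" and "]];
            NSLog(@"%@", greeting);
        }
        return 0;
    }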

Ultimately, my search for my own characteristica universalis went on, even though it was clear that it would remain forever out of reach, as it had for Leibniz. Before I left my job, a recent computer-science graduate joined my team; he had spent a summer learning Swift. He spoke of the language with enthusiasm, seeing in it a divine form of expression, clean and effective.