Last week, while sitting in a “tutorial” session at OSCON 2008, I had occasion to ponder the relationships between the progression of technology, my own progression as a technologist, and the way that organizational constructs coalesce in response to social and technological pressures. The combination of a momentary disconnection from the tumult of my workaday existence and subjection to a rapid-fire treatment of myriad technologies must have provoked such a contemplative state: one of equal parts wonder and terror.
The last three years have made for an interesting chapter in my odyssey as a software developer. I began a Skunk Works-style project as a solo programmer and have since witnessed it grow and evolve in fashions both amusing and bemusing. A relatively amorphous ball of code gradually stratified into a general-purpose infrastructure and a collection of mission-specific plug-ins on top of it. My own role has in parallel shifted away from cutting code and toward cogitating on matters of architecture, project mechanics, and community engagement. Thus my daily existence on this endeavor has in many ways become that of a generalist rather than a specialist.
This transition has brought both advantages and challenges. Donning the mantle of the generalist has rendered it possible for me to bring a project of non-trivial complexity to fruition expeditiously. Greater command of the big picture, however, has naturally come at the expense of intimate knowledge of every detail. The right colleagues, carefully vetted and then entrusted with increasing responsibility, make taking a more hands-off role palatable, but I am still left with the significant challenge of finding the right balance.
This conundrum feels to me like a microcosm of the larger software landscape. Today’s collection of technologies, tools, applications, languages, and frameworks boggles the mind. Thorough mastery of two programming languages marks a developer as a rare individual, and those who command three fluently are nearly non-existent. Comprehensive knowledge of a particular database platform entails such an undertaking that expert knowledge of two seems unlikely. The litany of frameworks for any given problem domain often makes remaining au courant a Herculean task. The proliferation of sub-specialties in Computer Science has long since made comprehensive knowledge of the field an impossibility.
Yet all is not lost. A software developer may endeavor to manage the complexity in a variety of ways:
- learn several programming languages from different families and extract general principles
- develop a core of CS knowledge centered on data structures, algorithms and protocols
- take the time to understand computer system architecture
- acquire deep knowledge of a few specialties while building a mental road map of the profession as a whole
- obtain a high degree of proficiency in a core set of tools such as a text editor, a version control tool, and a defect tracking system
This works to a degree, but it can only take one so far. A major challenge for software professionals lies in dealing with the ecology of a system that includes a whole planet of people. Technology decisions often go a certain way not because of superior design and implementation but because one or more market elephants already went that way. Sometimes the success of a given technology rests too heavily on the shoulders of a small number of individuals, and either the weight of the world crushes them or life generates sufficient distraction. Sometimes politics, government, and consumer advocacy groups drive the ebb and flow of technologies, yielding boons for some and irrelevance for others.
At a higher level of abstraction, the challenges of knowledge management become more exasperating still. Integrating the tools and expertise of an entire enterprise, or sometimes even just a sub-domain thereof, requires an architect to move firmly in the direction of being a generalist. The breadth of information that one must digest and leverage leaves no room for being a specialist.
While the generalist struggles to discern the specks on the ground from his vantage point in the sky, the specialist unavoidably suffers a myopia that prevents him from seeing the full context in which he works. Bridging that divide requires one thing above all else: trust. The generalist must be able to trust the specialist to build an implementation that adheres robustly to the spec. The specialist must trust the generalist not to lead him astray. The generalist must abstract his knowledge enough to see the big picture, yet inevitably finds himself plagued by the sentiment that he may be too detached from reality. The specialist must possess sufficient arcane knowledge to wield his tools proficiently, but a nagging feeling of being too far down in the weeds is his constant companion.
The progression of technology both improves and exacerbates the situation. Greater computing resources afford us high-level languages and virtual machines that offer a higher degree of expressiveness and blissful ignorance of the messy low-level details. Were we inclined to use these technologies merely to do old things more elegantly, life would become easier. Naturally, however, we (mostly) use such faculties to build ever more elaborate abstractions of labyrinthine complexity that ultimately leak, often in befuddling ways. Thus we create more sub-domains to be inhabited by specialists and spread the brain cycles of generalists thinner still.
We’ve seen various backlashes as a consequence. In the race to provide more features, security has taken a back seat, and people have started to notice. The typical home user finds the unreliability of his Wintel box maddening. Corporate IT managers bemoan the chaotic landscape of the modern enterprise and its utter resistance to sane configuration management. Yet the choking growth of technology continues, yielding bloat, non-orthogonality, brittleness, and incomprehensibility in spades.
As we accelerate toward a technological singularity, one might ask whether it is more likely to promote bewildering anarchy, stultifying despotism, or a state of enlightened transcendence. A proliferation of constantly evolving technologies is difficult for sclerotic bureaucracies to manage. Large-scale problems that require capital-intensive infrastructures result in centralized power. Disruptive technologies create asymmetric threats that hurt lumbering giants disproportionately. A pervasive surveillance apparatus risks handing a lock on the game to a government that goes off the rails and ceases to serve its populace. A robust collection of open and hackable software libraries allows individuals to create their own realities. A desire for tractable configuration management pushes organizations toward monoculture. Monocultures expose organizations to huge security risks and stifle innovation. These competing forces leave would-be seers with much to ponder.
For a species hard-wired to find beauty in simplicity, we have a rather poor track record of crafting simple but effective systems. The exigencies of the world nearly always favor a “good enough” approach for any given problem. These sub-optimal solutions, which accrete steadily over the eons, produce the existence that we know all too well. This is not a phenomenon unique to engineering. One need only witness the laws of a democracy, a pile of inconsistent and overlapping hacks accumulated over decades or centuries, to know this to be true.
— AWG