Finally, from so little sleeping and so much reading,
his brain dried up and he went completely out of his mind.
One question that I regularly encounter amid projects deploying new content technologies is "why is this always so hard?" A variation on this theme is "why is XML so hard?" Most recently, a blog post entitled "Has XML failed publishing?" sparked a stream of interesting exchanges delving into the different sources and aspects of implementation challenges. Considering these questions puts me, naturally enough, in mind of Don Quixote, Cervantes' knight-errant whose overindulgence in reading emboldened him with a sweeping vision of how the world might be and set him on an unbalanced mission to see that vision realized. The leap from one topic to the other might seem like a bit of a stretch, but I have found it a useful way to approach the initial questions, with all their gritty practicality.
So why is the implementation of content technologies, on almost any scale, so hard? And why is XML, a standard that was intentionally designed to be simple, so hard to deploy in most publishing environments? The answer is, paradoxically, quite simple. Content technologies, and the XML family of standards that these technologies invariably leverage, are all about integration and the blunt truth is that integration is always hard - even excruciatingly hard. So very much like Don Quixote, people who champion the deployment of content technologies into publishing environments are really championing a new, more integrated way of doing business. And a bit like Don Quixote, these champions usually don't fully appreciate how different this integrated way of doing business is from what is routinely done today. Indeed, these champions discover, as they push forward, that there are really many different layers and types of integration that are involved - many more than they would have ever guessed before setting out.
What makes integration hard is the fact that it entails cobbling together disparate parts to produce a unity that is, as the saying goes, greater than the sum of those parts. It is about framing a vision of a new level of performance and assembling all of the necessary parts to make that vision a reality.
Where publishing is concerned, integration entails unifying all aspects of the publishing process into a seamless, and streamlined, whole. From a business perspective, this calls for building an environment that, following a lean manufacturing regime, can be closely monitored and continuously improved. From an organizational perspective, this calls for breaking down the barriers between institutional silos and disciplinary fiefdoms. From a technology perspective, this calls for introducing open standards into the mix and enforcing a strict separation of the content resources being managed from the tools being used. And from the all-important perspective of the customers of published material, this calls for the ability to produce a new order of products - products that will be compellingly effective in any format that the customer deems appropriate at a given point in time. Now what really makes the integration of the publishing process so hard is that all of these steps must be taken together if a successful transition to a future-friendly posture is to be made.
It should not therefore surprise us when we hear that a given initiative to introduce modern content technologies into a publishing process has encountered challenges. In fact, the reverse - that of hearing tales of effortless change - should be what arouses the strongest suspicion.
And the real reason why content technology projects seem to suffer so many setbacks is that they rarely, if ever, proceed with a sufficiently broad scope of integration. This is in fact why introducing localized enhancements in one part of the publishing workflow, say around improving how eBooks are produced, will immediately raise questions about how other parts of the overall process might be modified in order to work around specific production problems. And this is why, if the integration agenda is followed back to the content sources, there is such an impact on authors and editors who can find themselves confronted with a working environment that is fundamentally different, and more multifaceted, than what they have been familiar with. It turns out, then, that integration is very difficult to achieve in a piecemeal fashion.
But piecemeal implementation is what, in almost all circumstances, we must attempt. Champions of new content technologies rarely, if ever, get approval to undertake anything more sweeping than a series of self-justifying localized improvements. This then leads us into the very Quixotic role of continuously pronouncing a grand strategic vision while proceeding on a much more mundane level of small-scale investments. So the analogy established between champions of integrated content technologies and Don Quixote may be more appropriate than might have first appeared. When we consider Don Quixote heading into battle outfitted with mismatched tools and recycled artifacts, working as it were on a shoestring budget, then the analogy crosses a line and becomes a little too familiar - perhaps uncomfortably so.
Sadly, the applicability of the analogy goes even further. Don Quixote, as we all know, has more than a few rough encounters with reality. And so will the champion of open content technologies, and frequently these buffets come from an unexpected corner - the Information Technology (IT) groups. It turns out that the participants in publishing processes are not the only ones who find the implications of integration vexing. The IT groups, to whom many organizations turn for assistance when introducing new technologies, usually find the implications of integration no less problematic. In fact, they frequently find the central tenet of integration - the obligation to carefully and completely separate content resources from the application behaviour that might be applied to those resources - to be a bit too much to accept. Or at least it is too different from the data-driven business applications that they are accustomed to building and supporting.
In the types of business applications that IT groups are familiar with, there have historically been ways to separate data from behaviour, but these are far, far less radical than those associated with content technologies and publishing processes. And the data resources involved in these business applications are far, far more closely aligned with the processing behaviour than is typically the case in publishing scenarios. Business data, in fact, is typically selected, represented and stored precisely because it is directly amenable to the processing behaviour that is envisioned. Publishing scenarios, in sharp contrast, typically reverse this relationship and routinely present the processing behaviour with data patterns that fall outside what is expected, or for which no behaviour has been defined at all.
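The contrast can be made concrete with a minimal sketch. The content below is a hypothetical fragment (the element names are invented for illustration, not drawn from any real schema), and the point is architectural: the rendering behaviour lives entirely outside the content, in a dispatch table, with a fallback rule so that an element nobody anticipated passes through rather than halting the process - the opposite of a business application, where a record that does not match the schema is simply rejected.

```python
import xml.etree.ElementTree as ET

# A fragment of publishing content. The <aside> element is "unexpected":
# no rendering rule has been defined for it.
doc = ET.fromstring(
    "<chapter><title>Integration</title>"
    "<para>Content outlives <em>any</em> one tool.</para>"
    "<aside>An element the pipeline never anticipated.</aside></chapter>"
)

def render(el):
    # Recursively render children first, preserving interleaved text.
    inner = (el.text or "") + "".join(render(c) + (c.tail or "") for c in el)
    # Behaviour is kept wholly separate from the content: a table of
    # rendering rules keyed by element name.
    rules = {
        "chapter": lambda s: s,
        "title": lambda s: s.upper() + "\n",
        "para": lambda s: s + "\n",
        "em": lambda s: "*" + s + "*",
    }
    # Fallback rule: pass unknown elements through rather than failing,
    # so new structural patterns do not take the whole pipeline down.
    return rules.get(el.tag, lambda s: s + "\n")(inner)

print(render(doc))
```

Swapping in a different rule table yields a different output format from the same untouched content, which is the separation of content from tooling that the essay describes.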
This difference of perspective is more than just theoretically interesting. When a content technology project gets rolling, one of the most common outcomes is for the initiative to be sideswiped by the IT group. The representatives of this group appear one day and declare that the available enterprise software tools and practices are more than sufficient to handle the real requirements once these are digested properly (see my earlier post on Fear of Content). And our hapless champion of the content technologies, such as may in fact be called for in order to genuinely address the needs of an integrated publishing process, is unceremoniously thrown to the ground.
Now one of the ironies at play in this story is the fact that this situation is entirely predictable. The truth of the matter is that content technologies are, at their very root, intended to facilitate radical integration. So what is different about these technologies, and the associated techniques, should not really come as such a surprise, or affront, to the IT groups. If we follow the history of XML all the way back to the late 1960s, we find a number of integration problems being confronted. At the time, Charles Goldfarb and his colleagues at IBM were exploring ways to move legal working documents between virtual machines on a mainframe in a way that would not take down the mainframe. The community of typesetting specialists was at the same time arriving at the recognition that refining and proliferating typesetting languages was going to quickly undermine the viability of the entire publishing industry. And the aerospace and defense industry was simultaneously realizing that the publishing and documentation management costs associated with the emerging generation of integrated weapon systems were simply not sustainable without a more intelligent application of computing technology to the problem of interchangeable electronic documents. And so it was that open markup standards grew in ways that were explicitly intended to enable the movement of content across all manner of boundaries and to facilitate the introduction of progressively more ambitious automation to achieve higher and higher levels of quality, timeliness and cost-effectiveness.
So integration is implicit in the tools and standards that make up content technologies. To introduce content technologies is to introduce the wildcard of integration into an environment. Sometimes this comes as a surprise. And sometimes, when an organization is settled in its ways and is comfortable dealing with a specific set of technology problems, it comes as something of a shock.
So we can see, rather too clearly, that the reason why content technology projects are hard - and why, more tersely, XML is hard - is that their introduction amounts to opening a Pandora's box of integration challenges. And, for quite real historical reasons, integration is not something that most organizations have much experience pursuing. Nor, for that matter, do they have a natural taste for the turbulence that comes with it. This is true on the levels of both business activities and technology considerations.
None of this is really a matter of pointing fingers or laying blame. Integration, and by that I mean genuine and deep integration, is profoundly difficult. That very few have ventured down this path, and that fewer still have ever returned successfully, should come as no surprise. That people will express trepidation about embarking on this type of journey should also come as no surprise. Whether they have prior experience to draw upon or not, their fears have substance and need to be respected.
The catch in all this however, and the reason for this post, is that organizations and their people do not have any choice now but to prepare for exactly this type of journey. Whether the organizations are commercial publishers or enterprises that need to deliver useful information to their clients, the shifts in the marketplace have made it impossible for them to be satisfied with their old ways of doing things. And the need to drive useful and relevant content to the latest mobile device is only the most visible example of the change. More serious is the increasingly intense competitive pressure that is being applied to publishing processes. And the only response to this building pressure is to pursue a program of radical integration in a manner that has already been seen in the manufacturing and logistics sector. The time has come for publishing to catch up and to be counted as part of the 21st Century. It is time for publishing to embrace content technologies and the possibilities for radical integration, and improvement, that come with them.
Perhaps the jarring experiences and hard lessons associated with past integration efforts will be recognized as valuable now that so many organizations need to start down this path. As with the adventures of Don Quixote, the relentless pursuit of a vision can prove its value by helping to illuminate reality more clearly and by edging us slowly in the right direction.