

April 02, 2013

Listed below are links to weblogs that reference The Joy of Reuse:

Comments


Scott Abel

As usual, great content. Thanks for sharing, Joe!

Don Day

"SGML dirty tricks" were used in DITA out of necessity, Joe, because the then-new XML spec was not quite complete enough to deserve the X for "eXtensible." In fact, out of the highly popularized X-troika (XSL, XLink and XPointer) of follow-on standards, only XSLT (for transform) was reliably functional at the time, so it was explored deeply for whatever it could do to provide, in a standards-compatible way, some of the richness of SGML in lieu of the other nascent standards. Using @spec (later changed to @class) as an architectural form identifier tied to an absolute base class enabled the eXtensibility that we felt was missing in XML, and an XSLT trick for copying XML "snippets" became the basis for the missing SGML conref feature (which met the need for general entities, which we could see were unusable for the distributed page model implicit in XML for the Web, and which didn't exist in XML Schema).
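The "copy a snippet" trick Don describes, which became conref, can be illustrated with a small stand-alone sketch. This is hypothetical code, not taken from any DITA tooling: the `resolve_conrefs` helper, the element names, and the restriction to same-document `#id` references are all assumptions made for the example.

```python
import copy
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<topic>
  <p id="warning">Disconnect power before servicing.</p>
  <p conref="#warning"/>
</topic>""")

def resolve_conrefs(root):
    """Emulate DITA conref by copying the referenced element's content
    into the referencing element -- roughly the snippet-copying idea,
    limited here to same-document #id references."""
    targets = {el.get("id"): el for el in root.iter() if el.get("id")}
    for el in root.iter():
        ref = (el.get("conref") or "").lstrip("#")
        if ref and ref in targets:
            src = targets[ref]
            el.text = src.text                    # pull in the text content
            el.extend(copy.deepcopy(list(src)))   # and any child elements
            del el.attrib["conref"]

resolve_conrefs(doc)
# Both <p> elements now carry the same warning text.
```

Unlike general entities, the reference is resolved against the live document tree rather than expanded at parse time, which is why the approach survives the distributed page model Don mentions.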

Thus the earliest talking points about the architecture that I recall Michael Priestley expressing were that DITA provided reuse of content (via conref and topicref), reuse of design (by referencing prior classes and their content models as the basis for new elements and their content) and reuse of processing (the ability of XSLT templates and CSS properties to trigger on the base class if nothing else, ensuring always-reasonable default processing of newly-specialized elements). Dave Schell, the development manager who sponsored the DITA design activity, would often say, "and DITA is all about reuse." Dirty tricks indeed!
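The "reuse of processing" point, matching on the base class token when nothing more specific is known, can be sketched roughly as follows. This is a hypothetical illustration, not DITA Open Toolkit code: the `HANDLERS` table and `render` function are inventions for the example; only the shape of the @class value ("- topic/p mydomain/special-p ", base class first) follows the DITA convention.

```python
import xml.etree.ElementTree as ET

# Hypothetical handler table keyed by DITA class tokens (module/element).
# A real stylesheet would have many of these; here only the base topic
# module is covered, to show the fallback behaviour.
HANDLERS = {
    "topic/p": lambda el: f"<p>{el.text or ''}</p>",
}

def render(el):
    """Walk the @class tokens from most specific to most general and use
    the first handler found, so an unknown specialization still gets
    reasonable default processing via its base class."""
    tokens = (el.get("class") or "").strip("- ").split()
    for token in reversed(tokens):   # most specific token first
        if token in HANDLERS:
            return HANDLERS[token](el)
    return el.text or ""

el = ET.fromstring('<special-p class="- topic/p mydomain/special-p ">hello</special-p>')
print(render(el))  # prints "<p>hello</p>" via the topic/p fallback
```

The point of the design is visible in the last line: `special-p` has no handler of its own, yet it renders sensibly because the base class token is always present to trigger on.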

Joe Gollner

Hi Don

I would add one important point here and that is that in my vernacular the epithet "dirty tricks" is a compliment. It's the sort of thing that "old hands", or as the Germans delightfully put it "old rabbits", do in order to get things done and to save a little energy and grief doing so.

I can't resist one story. It was 2004, and I received a visit from an enthusiastic sales resource from an XML product vendor. The question this individual repeated, to the point of being annoying, was "how much DITA experience does your team have?" I had to smile. As it was only 2004, there was not much DITA experience around simply because there were not many DITA implementations around. But the question conveyed some interesting things about my innocent inquisitor. To this individual, DITA was something akin to manna from heaven as it seemed to supply a powerful religious sign that would finally bring unity and momentum to a fractured content management industry that had grown tired of wrestling with so much complexity.

Now having been around awhile, I took a look at DITA and recognized almost everything I saw in it - from the delightful reuse of the term conref through to the echoes of HyTime architectural forms. All this I took as exceedingly good signs even if the precise implementation was different than what we were in the habit of doing in the good old days of SGML.

I recognized all these items precisely because in the second-generation defense standards I had worked on under the CALS (Continuous Acquisition and Lifecycle Support) umbrella we had similarly pursued reuse to almost every possible conclusion. And our implementations worked remarkably well, I must say.

In fact, one of our strict rules was constraining the natural tendency of technical resources to gravitate towards self-defeating sophistication by setting exceptionally harsh tests for the standards we were creating and their associated reference implementations. One such constraining test was that validation and resolution of all content artifacts must be something we could do on a 286 computer, running only DOS and a conformant public domain SGML parser. An admittedly harsh test, but it had the desired effect. Our final reference implementation addressed a very wide range of publishing requirements and did so with essentially a teacup full of application code. By leveraging ideas such as information typing and publishing overrides, we managed to distill literally hundreds of pre-existing DTDs into one that had fewer elements and attributes than HTML, and to compress literally dozens of complex FOSI stylesheets into one articulated process leading to one modular (and multi-output) stylesheet. So in DITA I recognized a fellow traveller, albeit ten years after the work we had been doing.

And the leap into the nascent world of XML recommendations and tools definitely did complicate matters for those seeking to carry the lessons from that earlier era forward. I can definitely sympathize. In some of my more cantankerous moods, I have been heard to say that in several regards the new galaxy of XML development tactics did not do us many favours when it came to tackling the hard problems of handling content well. Elsewhere I have explored the details and implications of this phenomenon, which I dubbed "XML in the Wilderness".

So in response to your comment, Don, I would say that my reference to SGML dirty tricks is meant as a compliment to, and a major plus for, DITA. From very early on in DITA's public life as an OASIS standard, I used the argument that DITA's standing in a longer markup history is a key reason to accord it serious attention. These arguments I found myself using in environments like aerospace and defense projects where DITA did not originally receive a particularly warm reception.

Now it is true that there is another implication behind my connecting DITA back to this earlier generation of standards and implementation efforts. It does tend to deflate those people, like that strange sales resource who came to my office in 2004 essentially looking to cash in on this "new" thing, who see DITA as a revelation and as a deliverance from an earlier era of enshrouding darkness. These are the same people who, given to a form of standards hagiography, quickly descend into attacks on everything different than DITA as "proprietary XML" and other forms of heresy. As you may already know, I have been inclined to treat these people rather roughly - for their own good of course.

Don Day

Agreed, I was hearing your "SGML dirty tricks" as I hear Roger Whittaker singing "Dirty Old Town" with his interpretation of sometimes-too-close familiarity (not a bad thing!).

Your post makes me wonder, though: what is needed in XML to take reuse and contextual adaptiveness (to camp onto the newest content trend) to any higher levels, short of looking for other application-level tricks to emulate such function? I'm pleased that several vendors have been able to create XProc-based processing implementations, for example, but that only applies on the back end. The newest challenges in content architectures (which all seem to be about applying content intelligence) aren't any easier without still relying on the emulation of SGML tricks to improve how we match markup design to new requirements. And increasingly, we've got the challenge of how to incorporate the best of HTML5 and related advances in Web architectures as part of the design portfolio that we are obliged to work with.

You may sense that I'm posing that question with my OASIS DITA Technical Committee hat on... if DITA 2.0 is to be a worthy successor to DITA 1.x, what might we do differently besides falling back on these SGML dirty tricks, reliable as they may be? It would be nice if the XML standard could give us some new genetic material to work with!

Joe Gollner

You raise a very good point here, Don, and one that is going to have me tossing and turning for several nights (so thanks for that). In my paper on the Emergence of Intelligent Content, I painted a very optimistic picture of the opportunities that lie before us now that we have several threads potentially coming together - XML returning from the wilderness, social media models for collaboration becoming common practice, DITA bringing a welcome return of attention to core "content challenges", the proliferation in devices driving a renewed interest in multi-channel publishing, and the next generation in Web standards and architectures.

The optimism part is easy although I think that there was merit in formally declaring it. The hard work, as you point out, is creating something that is well and truly a creature of the future. I believe that there are some core architectural strategies, also reused from the past, that can help us but they will not really change the fact that hard work lies ahead. Substantial rewards though are also out there and that in itself might tell us a little something about the path to get there....
