


December 05, 2010



Joe Gollner

Get Smart: Architecting Content for Maximum Value (eContent Article)

It sounds simple enough: We'll architect our content to maximize its value to the business. Everyone can get behind that. Of course, it's not quite as simple as it sounds. This would explain why very few organizations actually walk the talk and direct consistent attention, and investment, toward architecting their content with a view to maximizing its value. It's not that most organizations don't see the need or that they don't want to do it. It's just that it has proven remarkably difficult.

There are many reasons architecting content is so hard. One is that the tools and standards that allow us to work with the real complexities of content have only now begun to mature. Another is that the mainstream technology infrastructure has only recently become fully capable of leveraging richly designed content. For example, the revolution sparked by social media and full-featured mobile devices has opened engaging new channels through which to deliver content. With these new channels enjoying high levels of media, and therefore management, attention, many publishing units have been tasked with addressing them as a matter of urgent priority. Finally, it is only with more than 20 years of experience that the community of content practitioners has built up the store of lessons learned that can guide how we architect our content and the associated technologies. The good news is that these pieces are now in place, and organizations can tackle the question of how best to architect their content assets so as to maximize the value returned.

Content, almost by definition, is what we are trying to communicate. It is the way we deliver useful information. It has always been a challenge to design content that can play this role efficiently and effectively. It was a challenge when all we had to worry about was producing high-quality printed documents. And while we still need to produce good hard copy today, we also need to address a seemingly endless array of new and changing channels. Organizations have come to the realization that we really do need to architect our content differently for today's marketplace.

If I were to summarize in a single word how we need to design our content, I would suggest that it must become fundamentally more "intelligent." We need content that, once it has been prepared, can interact with smart automation to respond dynamically to the many different channels and contexts where it may be used. Content that has been cast into a single physical form, no matter how high its quality, will be seriously limited in its potential value if someone has to intervene manually each time a new demand surfaces. One of the harshest lessons that we have learned over the years is that automation is essential if we want our content investments to scale up to the level of usage where real benefits are generated. And as we have also learned, there is no magic in automation. If we want automation to handle wide-ranging customer requirements effectively, then we need to invest in how the content itself is designed.
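The separation described above, between content that is prepared once and automation that renders it per channel, can be sketched in a few lines. The content structure and renderers below are invented for illustration; the point is simply that a new delivery channel means adding one renderer rather than re-authoring the content:

```python
# Minimal sketch of single-source publishing: one semantically
# structured piece of content, rendered automatically to several
# delivery channels. All names and structures here are illustrative.

content = {
    "title": "Replacing the Fuel Filter",
    "steps": [
        "Depressurize the fuel line.",
        "Remove the old filter.",
        "Install the new filter and check for leaks.",
    ],
}

def to_html(item):
    """Render the structured content as an HTML fragment."""
    steps = "".join(f"<li>{s}</li>" for s in item["steps"])
    return f"<h1>{item['title']}</h1><ol>{steps}</ol>"

def to_plain_text(item):
    """Render the same content for a plain-text channel."""
    lines = [item["title"]]
    lines += [f"{i}. {s}" for i, s in enumerate(item["steps"], 1)]
    return "\n".join(lines)

# Adding a mobile or voice channel would mean one more renderer;
# the content itself, and its intelligence, stays untouched.
print(to_html(content))
print(to_plain_text(content))
```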

It is not simply a matter of embracing a bucket of open standards, as that, by itself, leads nowhere. Nor is it simply a question of purchasing a given technology, no matter how completely it claims to streamline all the tasks revolving around content. What is called for is a pragmatic approach to investing in the design of intelligent content and in the smart automation that leverages that intelligence. More important still is the need to balance these investments with a continuous stream of innovations that satisfy customers' needs. Unless we engage real customers doing real things, all of our investments in intelligent content and smart automation will drift off track. In fact, I would argue that content is only intelligent and automation is only smart when they have been grounded in actual business improvements that validate those investments.

When we survey all the changes happening in the field of publishing, it is difficult not to feel excited. Experience has shown that tremendous things can be achieved when we architect content to be intelligent and leverage that intelligence to realize escalating benefits now and in the future.

This article appeared in the December 2010 issue of eContent Magazine, posted on 22 November 2010. I could not resist adjusting some of the wording (see inline italicized text) to return it to its original sense.

Joe Gollner

Case Study: Design of a New Aircraft (ASIS&T Bulletin Article)

The business problem. In setting out to design a completely new aircraft, an airplane manufacturer realized that they were faced with both an opportunity and a challenge. The global marketplace for aircraft was changing rapidly, and radically new design concepts were needed. This business environment meant that the very latest in design technologies and manufacturing techniques would be required. One of the key obstacles to be overcome, however, lay in the fact that the content sources for the current aircraft designs, and the engineering standards that all new designs needed to address, existed in a number of different formats, ranging from proprietary databases and arcane desktop publishing files to custom data structures with their own unique, dedicated compilers. These sources were shared across many aircraft fleets and encompassed both military and civilian variants. Some of the content sources were even shared with competitors. If the aircraft manufacturer wanted to embrace full-scale innovation in the design of their next-generation aircraft, they would need to dramatically increase the level of intelligence exhibited in this bewildering volume of content sources.

Goals and objectives. What was needed was an intelligent content strategy that would establish the authoritative source for all content assets and that would set out a sustainable approach to managing these sources so that they could be used by a massive array of consuming applications.

The solution. The intelligent content strategy needed to accommodate what was termed a multi-dimensional content architecture where content assets would be managed in a way that would simultaneously support many different standards. This goal was accomplished by deploying an extensibility framework based on the Darwin Information Typing Architecture (DITA).

Once in DITA, the content sources would be variously pulled into the three-dimensional design modeling environments, the part selection applications, the product data management systems, and the manufacturing control tools. In all cases, these environments, applications, systems and tools would be operated by different suppliers working in various locations around the world and using software products provided by many different vendors. A sophisticated content-sharing architecture was established where content was dynamically accessed, modified, augmented and monitored across this global network of collaborators. Driving the sophistication of the architecture were considerations such as security, with the entire program operating under strict export controls, and performance, as necessitated by the fact that the design and manufacturing tasks needed to be coordinated on a near real-time basis.
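One way to picture how a single DITA source can serve military and civilian variants alike is DITA-style conditional processing, where elements carry attributes such as `audience` and a filter decides what each consumer sees. The sketch below is a toy version of that idea using Python's standard library; the topic text is invented, and a production pipeline would instead use the DITA Open Toolkit with DITAVAL filter files:

```python
import xml.etree.ElementTree as ET

# A simplified DITA-like topic. The audience attribute is one of
# DITA's real conditional-processing attributes; the content itself
# is invented for illustration.
TOPIC = """
<topic id="fuel-filter">
  <title>Fuel Filter Maintenance</title>
  <body>
    <p>Inspect the filter housing for cracks.</p>
    <p audience="military">Record the inspection in the fleet logbook.</p>
    <p audience="civilian">Note the inspection on the service invoice.</p>
  </body>
</topic>
"""

def filter_topic(xml_text, audience):
    """Keep unconditional elements plus those marked for this audience."""
    root = ET.fromstring(xml_text)
    body = root.find("body")
    for p in list(body):
        marked = p.get("audience")
        if marked is not None and marked != audience:
            body.remove(p)
    return root

# The same source yields different deliverables per consumer.
military = filter_topic(TOPIC, "military")
print([p.text for p in military.find("body")])
```

The design point is that the variant logic lives in the content's metadata, not in per-fleet copies of the text, which is what lets one authoritative source feed many consuming applications.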

Project success. Leveraging the new level of content intelligence, this aircraft manufacturer was able to move ahead with design innovations while ensuring that the rich design knowledge available within historical repositories could still be leveraged. They were able not only to maintain the required levels of control and oversight, but to take them to an even higher level. It turned out that one of the key benefits associated with heightened content intelligence is the ability to apply very precise analytics to every step in the content lifecycle.

The types of aircraft that can be designed and manufactured using an intelligent content strategy are fundamentally superior to anything that has come before. The aircraft being produced are safer, more maintainable and much more economical to operate. And future aircraft design projects will have the benefit of starting from a far more intelligent content repository of historical knowledge, and engineering and regulatory guidance.

Challenges. Finding the authoritative source for any given element of content was far harder than was expected, and once identified, the authoritative content sources were found to exist in a wide range of proprietary formats. Establishing reliable and cost-effective ways to extract the content sources from these legacy formats, and to enrich them with the necessary intelligence, proved to be a challenge. A number of technologies and techniques were introduced to overcome these obstacles. Authors and editors were also going to need specialized tools to handle these complex structures efficiently and effectively. Addressing the authoring challenges relied mostly on the iterative refinement of an authoring version of the content schema so that the markup choices being presented to authors could be kept manageable.

At the end of the project, one of the lead developers working on the solution confessed something to the client: “I have to tell you that many parts of this project were really difficult.” A senior technical representative from the client organization did not hesitate with his answer. “That’s OK, we thought it was impossible.”

I contributed this case study to the article in the Dec 2010 / Jan 2011 Bulletin for the American Society for Information Science & Technology (ASIS&T). I have made a number of additions and revisions (see inline italicized text) in order to enhance the copy retained here. In addition to this case study, I did have a hand in some of the other parts of the article.

Tom George

This is an exceptional post on the topic of content strategy. I see why you got the nomination.
