Archive for May, 2008


What Can We Learn About Motivation from a Tornado?

May 28, 2008

I spent the last four days scheduling volunteers for an organization supporting the disaster relief after last week’s tornado in Colorado. When I came home each evening, I would find another installment of a lively discussion about motivation on the mailing list for the Society for Technical Communication’s (STC) Management Special Interest Group (SIG). It wasn’t until the fourth day that I realized that my “day job” as a volunteer was also a lesson on motivation.

So, what did I learn?

  • For the right cause, people will do pretty much anything.

    For example, I found out at 6pm on Friday night that I needed to assemble a crew of 10 to unload a supply truck at 5am the next morning. Hard work at an insane hour, but I got commitments from everyone I needed within 3 hours. And, for every activity I needed to schedule, we had more volunteers than we needed. This response didn’t happen because I have some magical skill as a motivator; it happened because the need was critical.

    Most business objectives are not as clearly “right” as feeding disaster victims, but if your objectives make sense, are clearly communicated, and can be seen as productive for the organization, you’ve got a much better chance of having motivated people.

  • A strong group helps keep people motivated.

    Nearly everyone who volunteered came as part of a group, and not just groups like the Red Cross and Salvation Army that exist specifically for disaster relief. Churches and other community groups were a big part of the effort. While there were a few stalwart folks who volunteered independently, they were in the minority. The reason is pretty clear: if you’re a member of a strong group, you have a motive to serve the group as well as a motive to serve the group’s objectives. The two reinforce each other.

    Given today’s highly outsourced, geographically diverse projects, most of which operate in an environment where downsizing is the flavor of the decade, it’s difficult to build a strong team. But if you can pull it off, it becomes really difficult for team members to remain unmotivated. They either get with the program or leave.

All of that said, motivation is internal. You can identify clear objectives, communicate them vigorously, create a strong team, and build a supportive environment, but people have to drink the Kool-Aid themselves; you can’t do it for them.

I believe that most people will be motivated if you do these things, but some won’t; they may be in the wrong place, be going through problems outside of work, or simply be one of those folks who never gets motivated by anything. When that happens, you can coerce them and get some results, but almost always the best thing you can do is find them a better job fit or get them out of the environment.


Day 3: DocTrain Conference

May 9, 2008

Today was the last “official” day of the conference. Tomorrow (Friday) is a post-conference day of workshops. Unfortunately, I need to leave early Friday, so I’ll miss the workshops.

Today started with 3 keynote talks. The first was from Joshua Duhl of Quark and titled, Once Content is in XML, Now What? He introduced the idea of dynamic publishing. While the talk was mostly a pitch for Quark, it was interesting because Quark is not normally thought of as an XML company. I think it’s encouraging that companies that have mostly been in other parts of the technical documentation world are seriously working with XML.

The second keynote was Document Engineering in User Experience Design, by Robert Glushko of the University of California, Berkeley. Bob has a new book titled, Document Engineering: Analyzing and Designing Documents for Business Informatics and Web Services.

He spoke about Document Engineering as a methodology that synthesizes information and systems analysis with business process engineering. In an e-commerce environment, customer satisfaction depends on more than the user interface. We tend to think of the customer experience as a “front stage” function; that is, the immediately visible parts of a system. But in fact the “back stage” functions, the plumbing that makes everything happen, matter just as much to customer satisfaction.

Glushko says that we need designs that cut across back and front stage operations. He described the “Moment of Truth,” which is the point where a problem or other critical event becomes clear to the customer. Moments of Truth are often the intrusion of a back stage problem (supply chain problems or other failures in plumbing) onto the front stage. The design needs to encompass both the front and back stage to be fully successful.

The third keynote was Social Media 101, presented by Darren Barefoot of Capulet Communications. Darren was probably the most skillful presenter I saw this week in the use of presentation materials. He used very few words; large portions of the talk were accompanied by pictures that alluded to what he was discussing, but rarely, if ever, repeated what he was saying.

He discussed characteristics of social media, which include conversation, collaboration, sharing, broadening of scope, community, transparency, and authenticity. To me, the critical point is that all manner of web communications have become multi-way. There are few areas where a pure, one-way broadcast of information makes sense. For example, users are now part of the product support cycle, with input into every stage of it.

The second critical message is that resistance is not only futile, it’s counterproductive. Companies will gain more by giving up some control and taking full advantage of social media. This means empowering your most passionate users, giving users the tools they need to help each other, and going to the places your users go. If they use YouTube, you need to be there; if they use Twitter, go there.

The first regular session of the day I attended was DocBook vs DITA: Will the Real Standard Please Stand Up, presented by Teresa Mulvihill. Since I’m on the DocBook TC, I was particularly interested in this talk. In general, I think Mulvihill’s views are close to mine (see my article titled, Choosing an XML Schema: DocBook or DITA). However, as someone who has worked with customers on both DocBook and DITA, she sees more difference between the two tool chains than I’ve seen (I’ve done extensive work with DocBook, but much less with DITA, so I don’t have her perspective). She strongly favors DocBook in smaller, turnkey environments, a sentiment I share, though possibly not as strongly.

We were fortunate to have people who use both DocBook and DITA in the audience and had a good discussion after the formal talk.

The next talk was a short talk by David Ashton of SDL (the company that offers the TRADOS translation support tool), titled 24 Ways to Shut Down the Application and other Apocryphal Stories. He gave a nice high-level introduction to some of the trials and tribulations of translation, tying it into the publication process.

I then snuck into the second half of Joe Gollner’s last talk of the conference, Extreme Content Makeover: Migrating Content to DITA. While the talk focused on DITA, the basic principles would apply to any data conversion. In addition to the expected exhortations to plan carefully, Joe covered some less obvious, but important points, including:

  • Establish Control Collections, which means looking for groupings of files that have similar features and are therefore likely to need the same kinds of conversions.
  • Define the target end state, which goes beyond simply selecting a target schema; that’s just the bare minimum. You also need to consider linking, metadata, and other details of the target.
  • Prepare a conversion specification, which defines the end state, naming conventions, mappings, and so forth.
  • Establish a representative example set, which is a group of files that cover the range of the control collections and gives you a set of content for testing.
  • Build a conversion plan, which is a project plan for the conversion itself.

Overall, Gollner provided an excellent overview of what it takes to do a successful conversion, whether to DITA or to some other XML format.
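
To make the control collection and conversion specification ideas a bit more concrete, here is a minimal sketch, in Python, of how one might group legacy XML files by their structural features and check each group against a declarative element mapping. The directory name, element mappings, and helper names are my own hypothetical illustrations, not anything Gollner presented.

```python
import os
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical specification fragment: legacy element -> DITA element.
# A real conversion specification would also cover attributes, metadata,
# naming conventions, and linking.
CONVERSION_SPEC = {
    "chapter": "topic",
    "sect1": "topic",
    "para": "p",
    "itemizedlist": "ul",
    "listitem": "li",
}

def structural_signature(path):
    """Return the set of element names used in one legacy XML file."""
    tree = ET.parse(path)
    return frozenset(elem.tag for elem in tree.iter())

def build_control_collections(source_dir):
    """Group files that share the same structural features; each group is a
    'control collection' likely to need the same kind of conversion."""
    collections = defaultdict(list)
    for name in os.listdir(source_dir):
        if name.endswith(".xml"):
            path = os.path.join(source_dir, name)
            collections[structural_signature(path)].append(path)
    return collections

def unmapped_elements(signature):
    """Elements the specification does not cover yet; these gaps need to be
    resolved before converting the collection that uses them."""
    return sorted(signature - set(CONVERSION_SPEC))

if __name__ == "__main__":
    # "legacy_docs" is a made-up directory of legacy XML files.
    for signature, files in build_control_collections("legacy_docs").items():
        print(f"{len(files)} file(s), unmapped elements: "
              f"{unmapped_elements(signature) or 'none'}")
```

A representative example set would then be a handful of files drawn from each collection, used to exercise the mapping before running the full conversion.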

The last regular session I saw was Using Task Modeler to Streamline DITA Content Development, by Mark Wallis of IBM. This was a demonstration of IBM’s Task Modeler. In addition to demonstrating the software, Wallis described the methodology his team uses to structure content. They design “Task Support Clusters,” which provide conceptual, task, and reference information as a self-contained cluster of DITA topics. They use a minimalist approach, but try to keep each cluster independent, to the extent of avoiding links outside the cluster.

The software helps with the mechanics of setting up a cluster by helping authors visually build ditamaps, develop initial topics, build relation tables, and build a skeleton cluster. The interface looked straightforward, and I’ll be trying it out.
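
For readers who haven’t worked with ditamaps, here is a rough idea of what a skeleton task support cluster might look like. This is just a minimal Python sketch with made-up topic file names, not IBM’s Task Modeler output; a real map would also need the DITA map DOCTYPE declaration and richer metadata.

```python
import xml.etree.ElementTree as ET

def skeleton_cluster_map(cluster_title, topic_files):
    """Build a bare DITA map for a task support cluster: one topicref per
    topic plus a relationship table tying the cluster's topics together."""
    ditamap = ET.Element("map")
    ET.SubElement(ditamap, "title").text = cluster_title
    for href in topic_files:
        ET.SubElement(ditamap, "topicref", href=href)

    # The relationship table lets related links be generated from the map,
    # so the topics themselves stay free of hard-coded cross-references.
    relrow = ET.SubElement(ET.SubElement(ditamap, "reltable"), "relrow")
    for href in topic_files:
        relcell = ET.SubElement(relrow, "relcell")
        ET.SubElement(relcell, "topicref", href=href)
    return ditamap

if __name__ == "__main__":
    # Made-up topic files for a hypothetical "configure a printer" cluster.
    topics = ["c_printing_concepts.dita",
              "t_configure_printer.dita",
              "r_printer_settings.dita"]
    cluster = skeleton_cluster_map("Configuring a printer", topics)
    # Note: ElementTree does not write a DOCTYPE, which real DITA
    # processing would require.
    ET.ElementTree(cluster).write("configure_printer.ditamap",
                                  encoding="UTF-8", xml_declaration=True)
```

Generating links from the map rather than embedding them in topics is what keeps each cluster self-contained, which fits the minimalist approach Wallis described.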

The closing keynote was titled Living Multiple Lives: The New Technical Communicator, and was presented by Noz Urbina. He talked about findings from techdoc evaluations his company has done for several companies, but the main focus was on distilling those ideas into suggestions on “What to do on Monday?” In other words, what can participants do when they get back to work next week to improve their environments?

He did what a closing keynote speaker should do; he summed up the main points of the conference, which I take as the following:

  • Moving to XML can yield significant productivity gains. He cites the potential for 15-30% reductions in localization costs, and another 15-30% savings from reuse.
  • You don’t get these benefits by buying some tool; in fact, selecting tools should be one of the last things you do.
  • You need to understand your needs and use cases first.
  • The job of technical communicators is rapidly changing, mostly for the better. But, to take advantage of those changes, communicators need to break out of the “manual box,” establish a business-driven strategy, and work closely with other parts of their organizations.
  • Agile methodologies can be a good thing for communicators because they pull communicators into the process early on, and the methodology takes documentation seriously.

Overall, I found the conference useful, both through the talks and through interactions with other attendees. The number of attendees was manageable, and I had the good fortune to speak with many of them, including several of the speakers. The casual atmosphere encouraged discussion with speakers and attendees alike. The facility was excellent and the logistics were handled cleanly and efficiently. All in all, an excellent conference.


DocTrain: Day 2

May 8, 2008

Today was the first “official” day of the DocTrain conference. Yesterday was a “pre-conference” workshop day.

Things started with 2 keynotes. I was a bit late and only caught the second half of RJ Jacquez’s keynote: Bringing the Video Revolution to Technical Communication. This was primarily a demonstration of Adobe’s latest products, in particular, Adobe AIR. While it looks cool, I didn’t see enough to get a sense of how well it would fit in with an XML environment.

The second keynote was XML in the Wilderness by Joe Gollner. This was an interesting look at the history of XML and Content Management, which, if nothing else, has pushed me to check out Ted Nelson, Vannevar Bush, and Douglas Engelbart, three of the “fathers” of XML and content management. Gollner is an entertaining speaker who has obviously spent a lot of time thinking about SGML, XML, and Content Engineering. The main points of his talk echoed his workshop, with the addition of some nice historical points.

The first conference session I saw was Ann Rockley’s Component Content Management. She defines Component Content Management (CCM) as managing content on a granular level, with each component having its own life cycle. In her view, CCM is not well supported by most tools. Since terminology is still vague, this can cause problems, especially when an IT organization buys a CMS and assumes it will work as a CCM. The problem is that most Web CM, Enterprise CM (ECM), and Digital CM systems don’t deal with chunks of information that aren’t documents or webpages.

Rockley then spoke about evaluating CMS software, highlighting a new publication, developed in conjunction with CMS Review, titled XML Component Content Management Report 2008, which evaluates CCM systems. She also spoke about the importance of going beyond tools. In her view, which I heartily agree with, “Successful CCM is about:

  • Understanding your content
  • Understanding your user requirements
  • A solid reuse governance plan
  • Information Architecture (taxonomy, UI, workflow)”

While tools are important, they are not the most important question. Yet, tools considerations often dominate the discussion and distort evaluation.

The next session was Single Sourcing House, by Heidi Sandler of Siemens. She used the analogy of building a house, which, while occasionally strained, was apt for building a single-sourcing system. What most impressed me were some of her quantitative measures, which showed a significant (3-4x) improvement in productivity with the introduction of an XML-based single-sourcing system. The other item of note was a suggestion to keep a written history of decisions. Given how quickly things change in most organizations, this could be very useful.

I got a third dose of Joe Gollner in the next talk, which was titled, Putting Everything Back Together Again: Delivering Effective Information Products. This talk expanded on his earlier discussion. He made the point that while preparation and design are important, a more flexible planning approach may be a better way to deal with the “Uncontrolled growth” that characterizes content management these days.

Gollner presented a few case studies that served to illustrate the wide variety of content applications out there and to emphasize the point that old-style waterfall planning is not necessarily the best way to approach content management design.

The last talk I attended was Rahel Bailie’s Content Management Successes: Separating Fact from Fantasy. In a continuation of what is becoming a theme of this conference, she pointed out the common fallacy that leads people to think that “Tools are the engine,” when in fact, “Tools are the caboose.” I know from hard experience that organizations tend to select tools too early, and often without any idea of what their needs really are.

She also gave a cogent description of how to look at the blizzard of features offered by most tools. Her suggestion on evaluating features is to go beyond “what” questions, like “Do you have version control?”, to more open questions, like “How do you handle version control?”

The other point that stood out for me was a discussion of “governance,” which emphasized the importance of understanding who owns processes and the budget, and understanding the depth of support, or political will, behind decisions. She suggests being wary of support below the “C-level,” i.e., CIO, CTO, or CEO. And she suggests using the tools of audience analysis on the people governing projects.

Again, an interesting day.


DocTrain: Day 1

May 7, 2008

This week I’m in Vancouver at the DocTrain West Conference. I’ll be posting each day on the sessions I attended that day. I’ll cover the highlights and add comments.

Day one offered four pre-conference workshops. I chose the Content Engineering workshop, presented by Joe Gollner of Stilo International. Joe is an excellent speaker who kept us engaged for 3.5 hours. It’s impossible to summarize the full seminar in a blog entry, so I’ll hit the high points and add a few comments of my own.

The workshop was a comprehensive introduction to Content Engineering, which Gollner defines as the “application of rigorous engineering discipline to the design and deployment of content management and content processing systems.” He sees Content Engineering as a necessary means for controlling the explosion of both the volume and complexity of the content that organizations must deal with.

Gollner divides Content Engineering into two major activities: Content Management and Content Processing. While many CMS vendors see Content Management as the overriding discipline, and other activities, like Content Processing, as subordinate, Gollner sees Content Processing as an equal, and in many ways more complex, discipline. He also sees Content Processing as a weak link in many CMS offerings.

Following from an engineering approach, Gollner made some other important points:

  • Metadata and link information must be treated as “first class” content; no different from any other content.
  • This means that metadata and links must be “detachable” from any CMS; i.e., you must be able to export this information in usable, non-proprietary form, something that not all CMS’s support.
  • Technology components must be “loosely coupled,” which means that interfaces must depend on the exchange of validated content rather than on component-to-component interfaces like proprietary APIs.
  • Processing rules (a.k.a. business rules) must be treated like content and therefore expressed independently of any particular technology component.
  • In general, you must be able to export everything (content, links, metadata, processing rules, etc.) as processable content; the sketch after this list illustrates both this point and the loose coupling above.
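
To give a flavor of what loose coupling through validated content exchange can look like, here is a minimal sketch, assuming a hypothetical RELAX NG schema and export file, in which a consuming component accepts another component’s export only if it validates against the agreed, non-proprietary schema. It uses the lxml library purely for illustration; the point is the contract, not the particular tool.

```python
from lxml import etree

# The interface contract between components is an agreed schema, not a
# proprietary API. "content-exchange.rng" is a hypothetical RELAX NG schema
# describing the export payload (content plus its links and metadata).
SCHEMA = etree.RelaxNG(etree.parse("content-exchange.rng"))

def accept_export(payload_path):
    """Accept an export from another component only if it validates.
    Because the payload is plain, validated XML, nothing about the content,
    links, or metadata stays locked inside the producing system."""
    doc = etree.parse(payload_path)
    if not SCHEMA.validate(doc):
        raise ValueError(f"Rejected {payload_path}: {SCHEMA.error_log}")
    return doc

if __name__ == "__main__":
    # "cms-export.xml" is a made-up export file from some upstream system.
    doc = accept_export("cms-export.xml")
    print(f"Accepted export with {len(doc.findall('.//topic'))} topic(s)")
```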

He concluded with some general comments and a “Top Ten” list of guidelines. The general comments centered on the importance of recognizing that content is inherently complex and getting more complex all the time. Effectively processing content requires engineering discipline that covers the entire life cycle. Doing this well is an elusive goal.

In the Top Ten list, without a doubt the most important point was “Don’t invest in Content Management technology too early.” Gollner has seen many projects get “bogged down in molasses” by committing to CM technology too early. Instead, he suggests focusing on Content Architecture and Content Processing first. Having seen exactly the same thing happen, I heartily endorse this recommendation.

Another notable item in the Top Ten was: Take a “Customer Service” focus in delivering tangible benefits to real users. All too often, the people who should be receiving new features and benefits from Content Engineering are forgotten and see little or no direct benefit. It’s important to keep delivering benefits to real users and not just “cool toys” for internal users.

Overall, I found the session valuable and hope it is a harbinger of how the rest of the conference will go.