Common Questions: Bursting

Bursting, chunking and content reuse in Arbortext

This week’s topic: Bursting

We start talking about something we call “bursting” when we get questions like the following:

Can a writer or translator work just on paragraph chunks while an editor can see entire sections?

This question is nearly always followed by another:

Can this level of granularity be specified in ACM?

Before we can answer either question, we need to address some of the implicit assumptions here.

First, we’re assuming you’re using the component content management functionality of Arbortext Content Manager (ACM). This means you have created reusable chunks, or have implemented bursting rules so that ACM creates the chunks for you. Once you have the chunks in ACM, you can set permissions so that individual users can access only the parts they have permission to see.
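To make the granularity question concrete, here is a minimal sketch in plain Python. It is purely illustrative: the Chunk and Section structures, the role names, and the visible_chunks function are invented for this example and are not ACM’s API or its permission model. The point is only that once content exists as chunks, access can be granted per chunk, so a translator might see individual paragraphs while an editor sees the whole section.

    # Illustration only -- these structures and role names are hypothetical,
    # not ACM's API or its permission model.
    from dataclasses import dataclass, field

    @dataclass
    class Chunk:
        chunk_id: str
        text: str
        allowed_roles: set = field(default_factory=set)  # roles that may open this chunk

    @dataclass
    class Section:
        section_id: str
        chunks: list

    def visible_chunks(section: Section, role: str) -> list:
        """Return only the chunks the given role is permitted to see."""
        return [c for c in section.chunks if role in c.allowed_roles]

    intro = Section("sect-1", [
        Chunk("para-1", "Safety overview ...", {"editor", "translator"}),
        Chunk("para-2", "Internal review notes ...", {"editor"}),
    ])

    print([c.chunk_id for c in visible_chunks(intro, "translator")])  # ['para-1']
    print([c.chunk_id for c in visible_chunks(intro, "editor")])      # ['para-1', 'para-2']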

In Arbortext, we call this process of automatically chunking your documents into individual, reusable pieces “bursting.”

Arbortext Content Manager performs document bursting based on configuration files that are stored on the server. If no burst configuration is specified, ACM treats each document as a single document object. For some document types, such as the DITA document types, the burst configuration is used to import document objects rather than to burst a larger document into smaller objects; DITA documents are already modular and typically do not require bursting.
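As a conceptual sketch of what a burst configuration expresses, here is a short Python example. Everything in it is assumed for illustration: the rule set, the objref placeholder, and the object IDs are made up, and this is not Arbortext’s burst engine or its configuration syntax. It only shows the idea that rules name the elements that become separate, reusable objects, with the parent document keeping a reference to each one.

    # Conceptual sketch of bursting -- not Arbortext's burst engine or its
    # configuration syntax. A hypothetical rule set names the elements that
    # become separate, reusable objects; the parent keeps a reference to each.
    import xml.etree.ElementTree as ET

    def burst(xml_text: str, burst_at: set) -> dict:
        """Return object-id -> serialized chunk, plus the rewritten root document."""
        root = ET.fromstring(xml_text)
        objects = {}
        counter = 0

        def walk(elem):
            nonlocal counter
            for child in list(elem):      # snapshot; elem is rewritten as we go
                walk(child)               # burst nested elements first
                if child.tag in burst_at:
                    counter += 1
                    obj_id = f"obj-{counter:03d}"
                    objects[obj_id] = ET.tostring(child, encoding="unicode")
                    idx = list(elem).index(child)
                    elem.remove(child)
                    elem.insert(idx, ET.Element("objref", {"idref": obj_id}))

        walk(root)
        objects["root"] = ET.tostring(root, encoding="unicode")
        return objects

    doc = """<doc>
      <section><title>Install</title><para>Step one.</para></section>
      <section><title>Configure</title><para>Step two.</para></section>
    </doc>"""

    for obj_id, chunk in burst(doc, {"section", "para"}).items():
        print(obj_id, "->", chunk)

In a real deployment, the burst configuration on the server plays the role of the rule set above, and ACM stores the resulting objects so they can be reused and permissioned individually.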

Bursting rules are also used when you use the Documentum Adapter, the DB2 Adapter, or the Oracle Adapter with Arbortext Editor.

Note that much of the “work” involved in document bursting is researching your authors’ requirements and determining a burst configuration that meets their needs. It may be helpful to get some assistance defining your first burst configuration and then to have an advisor review your first solo bursting project.

As always, if you have a question that we haven’t answered or if you want more details, remember to send us your questions or add them to the comments!

Conference notes from: Content Management Strategies/DITA North America 2010

The CMS/DITA NA 2010 conference was full of lessons learned and pain, but there is light at the end of the tunnel. The successful projects were those that had properly socialized the impact of the project to all parts of their organization.

by Liz Fraley

For the last decade I have made a regular habit of speaking at, or at least attending, the CMS conference. I think I only missed one year. It gives me an interesting perspective because I can see trends over time, changes in presentation themes, the popularity of topics, and the changing interests of the attendees.

This was the first year that DITA seemed more of an assumption than an experimental technology. More companies are in years 3-5 of their implementations. In years past, the message has always been: Why choose DITA? What’s the value proposition? This year, the message was lessons learned. The biggest lesson was that you can’t treat a DITA implementation like a line item. The projects that were successful were those that had properly socialized the impact of the project to all parts of their organization.

For example, Catherine Lyman (NetApp) said she’d done a fantastic job socializing the impact and value of their DITA implementation all the way up the chain. Her CEO understood exactly the value behind the effort and the benefits the shift was bringing to NetApp’s business. However, she hadn’t socialized it to lateral departments, and every time she brought the project to a new group, she had to start from square one and begin the buy-in discussions over again. That slowed adoption across the company and, as a result, delayed the ROI she had projected. Her advice? Go to business and engineering groups early and be clear on the corporate drivers. Sell the whole organization on the benefits for their departments, and put customer-facing improvements first!

Successful projects place a high emphasis on collaboration and socialization. It was a story we heard over and over at the conference this year. Intel was starting over again — going back to square one — because they weren’t getting the system they needed to serve their business goals. They hadn’t originally defined their requirements well enough to evaluate the vendors; they had focused on tools first. As a result, they have worked out an enviable set of vendor questions, which they included in their slides for the attendees of CMS/DITA NA 2010.

HP talked about the importance of collaboration within your team and with other groups in the enterprise because reuse is a cultural issue. You need to build trust and structure so that you can measure and track effectiveness.

Actuate said that they were also back at the drawing board. They had overused FrameMaker’s tools to the point that it would no longer compile correctly and they’d get spurious content. They recommend moving to a robust, enterprise-level dynamic publishing system with a DITA-aware editor built for DITA from the ground up.

Rebekka Andersen, a professor from UC Davis, presented her research into why CMS adoptions fail. She followed a company from the early stages through their CMS evaluations and participated in the discussions every step of the way. In this case, the team decided not only against the vendor’s tool but also against a CMS in general, and the reason was not one anyone could have predicted. Her conclusion? The prevailing tool-focused approach to implementation. Don’t let the tool define you: it should be the other way around. Her advice? Understand that technology can’t solve the problem or save the day. Your focus should not be on the tools. Tools should be less than 10% of a project implementation; 50% of any implementation is change management and 40% is process management.

There was one presentation on using SharePoint as a CCMS. If you just see the slides, it comes across as a success story, but the reaction of the audience (and the talk track behind the slides) made it clear that it really wasn’t. Not from any perspective except that of the highly paid consultants doing all the SharePoint development ($$$$).

Me? I told success stories this year of companies that were 10+ years into their Arbortext/XML authoring implementations. You can find the slides and abstract here: Where Are They Now. After my presentation, Charlotte Robidoux (HP) said she was glad someone was telling success stories. The conference was full of lessons learned and pain, and it was good to see that there is light at the end of the tunnel.

To put it in the words of some of the longest-running Arbortext customers that I interviewed for my presentation:

“This is all doable because we went to XML and Arbortext in 2004”

“This is ‘bottled gold’ because it gives us a HUGE advantage over our competition”

Overall, it was a great conference. The returns really are there if you frame your project as something that has enterprise-level impact (which it does).