Friday, August 10, 2012

Content Inventories, Audits, and Analyses: All part of benchmarking



An interesting discussion recently arose in a content strategy forum about content benchmarking, and this seemed an opportune time to discuss the topic. (Thanks to Destry Wion for suggesting I turn this into a blog post.)

According to Wikipedia, benchmarking is “the process of comparing one’s business processes and performance metrics to industry bests or best practices from other industries. Dimensions typically measured are quality, time and cost.” Senior business executive F. John Reh defines benchmarking slightly differently: “benchmarking is the process of determining who is the very best, who sets the standard, and what that standard is.” Both of these definitions assume that benchmarking is a sort of competitive exercise. And perhaps, technically, they are right.

The idea of benchmarking in technology involves assessing the performance of an object relative to a set standard of performance, where acceptable levels are determined through heuristics or some agreed-upon industry standard.
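
As a toy illustration of that kind of benchmarking – with a made-up acceptance threshold standing in for the agreed-upon standard – a performance check might look like this:

```python
import timeit

# Hypothetical acceptance standard: the operation should average under 50 ms.
ACCEPTABLE_SECONDS = 0.050

# Measure the average runtime of a sample operation over ten runs.
elapsed = timeit.timeit("sorted(range(100_000))", number=10) / 10

verdict = "meets the standard" if elapsed <= ACCEPTABLE_SECONDS else "falls short"
print(f"average run: {elapsed:.4f}s ({verdict})")
```

The interesting part isn’t the measurement itself but the comparison against an agreed-upon level – which is exactly the question we’re about to ask of content.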

Which type of benchmarking would apply to content? Are we measuring content against the content of competitors, or against known standards? I’d wager that the answer is both. Content is part of a larger complex system, a system that extends outside of our departmental silos and corporate systems.

Content strategists are used to discussing the benchmarking of content in terms of inventories and audits, whereas I’ve thrown the term content benchmarking into the mix. All three activities could be rolled into benchmarking (or gap analysis, or any other framework that includes assessment) and considered the initial phase of content strategy. However, I think it’s important to demonstrate why all three should be identified as separate and distinct activities and not rolled into a single line of a deliverable.

I’m just wrapping up a project (the City of Vancouver website) where we took more than 26,000 HTML pages down to under 5,000 pages of content. It was definitely a rewrite project – no content went through a technical migration, where a technologist writes some sophisticated mapping scripts and then moves the content over in an “as is” state. (For several reasons, this was not possible: the content wasn’t written to what today we’d call “writing for the web” best practices; the vast majority of it wasn’t technically structured, so migration scripts couldn’t be run against it; of the 30,000-plus PDFs and other attachments, we needed to choose the most-used ones and hand-optimize their metadata for search; and so on.)

We needed a team of trained writers and content strategists to rewrite every page in the new CMS so that content could be integrated, converged, aggregated, and syndicated in ways that made the site perform the way we wanted it to – and meet the standards set out in the guidelines for increasing content findability, transparency, and ease of use.

Benchmarking sub-activities

So how did we figure this out? Through the separate activities that make up a content benchmark:

  • Content inventory – This is a “sizing” activity, an exercise in measurement. Not doing an inventory is like starting to bake when you don’t know what ingredients you have in the house. An inventory isn’t done against any industry standards; it is simply a measure of what you have on hand to work with. Unless you know what your starting point is, you can’t know what your best strategy will be to deal with the content. So an inventory – particularly when dealing with over 60,000 pages and files – is important.

  • Content audit – This is a largely quantitative measurement. Once you have your inventory ready, you can start to slice and dice the data about the content to figure out how to tackle it. One of the most common complaints about government websites is that they become very large landfills, where all the documents from the beginning of time – at least, the beginning of the web – get dumped. Every department thinks its content is critically important, and “everyone” looks for those committee meeting minutes from 2002. Doing an audit lets us see, at the very minimum, which pages and documents get the most visits. No one clicked on those 2002 minutes for five years? Then those pages go to the bottom of the pile to be dealt with. The pages that get 100,000 visits a year? Top of the heap.

  • Content analysis – This is the qualitative side of measuring content against known standards. We know how people navigate through sites (search and browse patterns), how people consume web content (sometimes called skip, skim, and scan), how people read web pages (the F-shaped pattern), how content has to be structured to meet accessibility standards and to work on mobile devices, how images should be optimized to avoid big data downloads (keeping in mind that Canadians pay some of the highest rates in the world for mobile data access), and so on. We developed a 35-point checklist against which to analyze the high-value content, so that we could determine which content could be copied over as is, which needed a light edit, and which needed to be completely rewritten. (A rough sketch of how these three activities fit together follows this list.)
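
To make the relationship between the three activities concrete, here is a minimal sketch of how their outputs might feed each other. Everything in it is hypothetical – the inventory.csv file, its column names, and the thresholds – but the shape is the point: the inventory supplies the rows, the audit ranks each page by traffic, and the analysis turns a checklist score into a treatment decision.

```python
import csv

# Hypothetical thresholds; a real project would set these with stakeholders.
STALE_YEARS = 5          # pages untouched this long sink to the bottom of the pile
HIGH_TRAFFIC = 100_000   # yearly visits that push a page to the top of the heap
CHECKLIST_POINTS = 35    # total points on the analysis checklist

def audit_priority(yearly_visits, years_since_last_visit):
    """Quantitative audit: rank a page by how much attention it deserves."""
    if yearly_visits >= HIGH_TRAFFIC:
        return "top of the heap"
    if years_since_last_visit >= STALE_YEARS:
        return "bottom of the pile"
    return "middle"

def analysis_disposition(checklist_score):
    """Qualitative analysis: turn a checklist score into a treatment decision."""
    ratio = checklist_score / CHECKLIST_POINTS
    if ratio >= 0.9:
        return "copy over as is"
    if ratio >= 0.6:
        return "light edit"
    return "complete rewrite"

# The inventory supplies one row per page; the visit data and checklist scores
# would come from analytics exports and reviewers, respectively.
with open("inventory.csv", newline="") as f:
    for row in csv.DictReader(f):
        priority = audit_priority(int(row["yearly_visits"]),
                                  int(row["years_since_last_visit"]))
        treatment = analysis_disposition(int(row["checklist_score"]))
        print(row["url"], "|", priority, "|", treatment)
```

Keeping the three inputs separate in the data mirrors keeping the three activities separate in the deliverable: each has a clear owner and a clear question it answers.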

Without all of those measurements, we couldn’t have demonstrated that the approach taken was the most effective and cost-efficient way of going about things. When you have only eight content strategists on staff (the original recommendation was three times that number), and just over a year to take 26,000 pages of content through the entire lifecycle – planning, structuring, writing, fact-checking, editing, approvals, publishing, curating, and so on – you don’t want any false starts. You have to come out of the gate with a strong start and keep up the momentum.

Transparency makes for happy clients

Another reason to keep the three activities separate is transparency. Transparency has a particular focus in government (increasing the transparency of City operations) that I have adapted and adopted for my own practice. If I can digress with an illustration: I heard a story about a woman who was upset that the house across the street was being torn down and a small apartment building put up in its place. City staff actually went to her house to talk with her about the process and, upon looking out of her front window, saw the huge billboard-size sign titled “Rezoning Application” that had been there for months.

For city staff, it was part of a well-established process: a developer makes an application, the community is consulted and gives input, the application is approved, and the project is implemented. But for the woman, the sign had no context, so she didn’t connect it with the demolition. For her, the process wasn’t transparent – she didn’t understand the steps involved. There was a sign, and then there was demolition. Similarly, without understanding the overall process, it’s easy for clients to misunderstand the amount of work that goes into a content strategy and how it ties into the greater project plan.

Transparency is an important aspect of City operations, and in industry, it’s an important aspect of client satisfaction. Transparency for clients can look very similar to transparency in government. Many specialized business processes can be misunderstood. You say “we’re going to do some benchmarking” or “we’re doing a content [activity name],” and it looks like a single exercise, perhaps something you go away and do in isolation. Making it transparent means naming the steps in between, and naming them in a way that explains what happens. By naming the parts, your client understands them better:

  • Inventory – Business people get that because parts inventories are done all the time.
  • Audit – Business people know what an audit is because accountants do that to financial records.
  • Analysis – Executive staff know what that is because they are expected to do analyses as part of creating strategies.

Put these activities together and you get the same result as calling the work by a single name with three steps to it – but probably a happier client, because you’ve been more transparent about the components of what you’re about to do and what the outcome of each step is. And because people – even experienced business people – always, always vastly underestimate the time, effort, cost, and diligence it takes to create content (let alone good content), it’s worth the small effort to add transparency to your content strategy processes.

