Auditing content systems through a narrative lens

Audrey Ross
March 10, 2025
Are your content guidelines keeping up with change? Assessing and refining your governance can strengthen collaboration, improve UX, and keep your project on track.

At the start of a recent project, I followed a common content design practice: I created a wonderfully comprehensive system of content guidelines and documentation — maybe you call it content governance or a content system. 

I worked with stakeholders and cross-functional team members to define content goals, expectations, roles, and timelines. These resources successfully guided us through a major redesign effort with minimal adjustments.

But despite that initial success, something wasn’t quite right anymore.

The pain wasn’t obvious at first. It started with small signs — processes being skipped, guidelines not quite fitting new content types, and stakeholders creating content without the guardrails we’d carefully put in place. The system that had worked so well through the redesign was slowly becoming less effective, not because it was poorly designed, but because the context around it had shifted.

When guidelines and systems start failing, it’s easy to fall into one of two traps to regain control and get things on track:

  1. Double down on enforcement: Schedule another meeting, remind everyone about the guidelines, and hope they’ll start using them again.
  2. Start from scratch: Throw everything out and rebuild, assuming the old system is fundamentally broken.

But neither of these approaches addresses the real issue. Guidelines don’t usually fail because people are ignoring them or because they were poorly designed. They fail because they haven’t evolved alongside the needs of the people using them.

Over time, even well-designed guidelines can become less effective as teams, processes, and needs evolve. Instead of assuming they’re broken, we need to assess how people are actually using them, whether they still serve their purpose, and how they can be adapted to meet current needs.

Instead of enforcing or abandoning guidelines, I wanted to understand what was happening in people’s day-to-day work, starting with an audit.


Setting up your audit 

I love a good audit. From my first time reading about content audits in the classic Content Strategy for the Web, the straightforward nature of columns, labels, and neat little boxes felt right. So, maybe I shouldn’t be so surprised that I returned to the audit framework to evaluate and update my content guidelines and governance. 

Of course, one of the first things to do when starting an audit is to decide what those columns and labels will be. What do you need to look for and document during the audit?

Our work had begun to involve more points of the user journey than the initial scope of the content guidelines covered. This expanded scope meant more people and channels to consider.

My original content guidelines were primarily meant to ground my own tasks and set up how I would collaborate with others, so stakeholders and team members knew what to expect from my work and how they would be involved.

In other words, the system focused on me, as a content designer. But the shifts in our work now forced me to look at the broader web of people who needed the guidelines and resources I was creating.

So, I decided to use the audit to map out the narratives of how work really happens and the real people involved. I started by creating an audit framework to tell a simple story. For each person or team, the story took shape through columns that connected: 

  • Who they are (the internal person/team)
  • What tools and platforms they use
  • Which guidelines they should reference
  • What they’re trying to create or manage

With a solid framework like this one, you can start by filling out what you know before digging into how well it works and opportunities to improve.
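If it helps to see the framework as data, here is a minimal sketch of one narrative row as a Python structure. The field names and example rows are illustrative assumptions, not part of the original framework; in practice this would simply be columns in a spreadsheet.

```python
from dataclasses import dataclass, asdict

# One row of the narrative audit framework described above.
# Field names and sample values are illustrative; adapt them
# to your own people, tools, and guidelines.
@dataclass
class NarrativeRow:
    who: str          # internal person or team
    tools: str        # tools and platforms they use
    guidelines: str   # guidelines they should reference
    output: str       # what they're trying to create or manage

rows = [
    NarrativeRow(
        who="Support team",
        tools="Zendesk, email templates",
        guidelines="Voice and tone guide",
        output="Customer replies",
    ),
    NarrativeRow(
        who="Product marketing",
        tools="CMS, Figma",
        guidelines="Web style guide",
        output="Landing pages",
    ),
]

# A spreadsheet is just these rows written out as columns.
for row in rows:
    print(asdict(row))
```

Each dataclass instance maps one-to-one onto a spreadsheet row, which keeps the "simple story" readable left to right: who, with what tools, following which guidelines, making what.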

Sketching your narratives

For a content audit, you might have a site crawler grab your URLs and page titles, or you might do the work manually. For a content system audit, there probably isn’t an app (yet) for you to surface all the people, tools, guidelines, and content items involved, but you likely have several places to look as you get started.

I have three favorite resources for making sure I’m covering the necessary ground for content system stories.

  1. Content ecosystem maps: These visually lay out everything involved in your product. They’re particularly valuable for checking what might be missing from your current scope. In my case, reviewing our ecosystem map revealed new touchpoints that had emerged as our product evolved. If you have your map in Mural or Miro, you may be able to export the content into a CSV file that can help you set up your audit.
  2. Service blueprints: These connect touchpoints, front stage, and backstage actions. They help ensure you consider the whole scope of your user journey and your design team’s responsibilities. In my experience, service blueprints have been invaluable for understanding how processes and guidelines are supposed to be used and applied across different stages of the user experience.
  3. RACI charts: These show how different roles are involved in each task. They’re essential for understanding how team members interact with processes and what they might be using (or not using) for their tasks. If you don’t have an existing framework, consider asking leadership or leading an effort to create one.
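If you do export an ecosystem map from Mural or Miro as a CSV, a few lines of scripting can sort the items into starter buckets for your audit sheet. This is a sketch under assumed column names (`item`, `type`); real exports vary by tool, so check your file's headers first.

```python
import csv
import io

# Simulated whiteboard export; the "item" and "type" columns
# are assumptions, since real exports differ between tools.
export = io.StringIO(
    "item,type\n"
    "Support team,actor\n"
    "Zendesk,tool\n"
    "Voice and tone guide,guideline\n"
)

# Sort exported items into starter buckets for the audit sheet.
buckets = {"actor": [], "tool": [], "guideline": []}
for record in csv.DictReader(export):
    buckets.setdefault(record["type"], []).append(record["item"])

print(buckets)
```

Even a rough pass like this gives you a checklist of people, tools, and guidelines to carry into the narrative columns, which is faster than retyping the map by hand.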

Logging what you find from these artifacts in the narrative framework can reveal changes affecting your guidelines.

In my case, a new department had been created to manage our product, introducing new stakeholders and channels, as well as new types of reports and presentations. But the people involved in those deliverables weren’t very involved in the initial governance conversations. That meant valuable content was appearing in new places without the guardrails we’d set.

At first, I wondered if I had missed something significant by not involving these stakeholders more deeply in the initial governance development. After reflecting, I realized the original scope had been accurate at the time; surfacing changes like this is exactly what the audit is for.

The key here is to start by writing down what should be happening based on the system you are evaluating. In the next section, you’ll look at what’s happening in reality and opportunities to improve. But for now, you are capturing how the system should be working.

Conducting your narrative-based audit

Now that you have the narratives of what should be happening, you can begin evaluating how everything is working and identifying areas for improvement.

To prepare for this step, you’ll want to add a few more columns to your sheet. Alongside the narrative columns described above, you’ll add columns for:

  • Product goal(s) impacted
  • Rating for how well it is moving that goal forward
  • Tool/guideline application rating (how well or consistently the guidelines are being used)
  • Tool/guideline knowledge (of the person meant to use them)

I like to have open spaces for notes on application and usability as well. This is a good time to double-check that you have the relevant product goals on hand. While quantitative metrics are always helpful, don’t discount more qualitative goals. When I first launched a recent evaluation, I was considering questions like:

  • Is it improving the user onboarding process?
  • Is it increasing internal alignment?
  • Is it improving delivery speed?
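The evaluation columns above can also be sketched as data. In this illustrative example, ratings use a 1 to 5 scale and the threshold values are my own assumptions; the useful pattern is flagging rows where guidelines are known but not applied, which suggests a fit problem rather than a training gap.

```python
# Narrative rows extended with the evaluation columns described
# above. Names, goals, and the 1-5 rating scale are illustrative.
audit = [
    {"who": "Support team", "goal": "Improve delivery speed",
     "goal_rating": 4, "application_rating": 2, "knowledge_rating": 3},
    {"who": "Product marketing", "goal": "Increase internal alignment",
     "goal_rating": 3, "application_rating": 4, "knowledge_rating": 4},
]

# Flag rows where people know the guidelines but rarely apply them,
# a sign the guidelines may not fit the work, not a training gap.
gaps = [
    row["who"] for row in audit
    if row["application_rating"] <= 2 and row["knowledge_rating"] >= 3
]
print(gaps)  # → ['Support team']
```

Distinguishing low-application rows from low-knowledge rows matters later, because the first points toward reshaping the guidelines while the second points toward training.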

To conduct my audit, I combined several approaches. You don’t have to do all of these, and there may be other methods that fit your context better.

  1. Individual interviews: These proved especially valuable with stakeholders who weren’t involved in previous rounds of development. The one-on-one conversations allowed for deeper exploration of their specific needs and challenges.
  2. Collaborative workshops: These created a collaborative environment, reinforcing that the process benefited everyone. As we worked together, larger themes emerged, and the sessions were especially effective in helping team members — who may not have initially seen these guidelines as relevant to their work — recognize their role in the broader content ecosystem.
  3. Content reviews: Looking at actual content helped identify where guidelines were being applied successfully and where they weren’t meeting needs. This can include examining both quantitative measures, like previously set key performance indicators (KPIs), and qualitative evaluations like consistent tone and messaging.

As I examined my narratives in light of our new context, I started to understand what was happening and the gaps in the current system.

Our initial governance approach was built around the purpose of our site and the broader user experience. As I evaluated the tools more closely, I realized I needed to tie the processes to more specific goals — and ones that other team members were more actively contributing to. For instance, a team member with a different function may not use the website often, but they may still contribute to a shared goal of improving service requests.

Adapting for new stories 

The best part of a narrative-based audit is writing new stories. What new roles, goals, or content types can you better support? The future is yours to help shape! 

Through this process, I identified a few key areas to update for more effective content. While in-depth guidelines were necessary for the website redesign team, the audit clarified that those guidelines were too much for other roles to reference alongside their other responsibilities. This led to developing a self-service menu of resources to support distributed, on-brand content creation. 

We also hosted additional team training sessions to address specific gaps in understanding, created some new templates where needed, and restructured documents for improved usability.

Embracing change through regular review

Of course, things will continue to change even after you make updates. I recommend a quarterly check-in, which usually provides enough time to see the impact of changes while staying responsive to shifts in your ecosystem, team structure, or product goals. 

You can also run smaller check-ins on specific implementations. For instance, if you’ve conducted training with your support team to address gaps in how they implement messaging guidelines, you might want to check in after a month to see if emails now follow the style more consistently and if that has impacted customer responses.

A note on AI

If you’re already incorporating AI tools into your workflows, they can and should be considered in this narrative-based evaluation. In fact, they may need more frequent review as the GenAI landscape shifts quickly. Because of the scope and specific needs of my recent work, I haven’t yet prioritized incorporating specific processes or guidance for AI. But I still experiment with new tools in my own work and stay on top of developments to make sure we’re taking advantage of opportunities as they arise.

Moving forward together

This narrative-based audit process has improved not just our content guidelines but also my view on the work. When something doesn’t turn out as expected, it can be easy to keep pushing it forward anyway or chalk it up to a loss. This framework has helped our team adopt a person-focused, living system to be fostered, grown, and shaped over time.

Wherever the opportunities come from – AI or otherwise – you’ll be more ready to identify and implement them by regularly evaluating your governance through the lens of real people’s workflows and needs. 

The context around us will keep changing. By iterating and adapting, we can create guidelines that help teams deliver better experiences. Rather than treating guidelines as fixed rules to enforce, we can see them as evolving tools — shaped by collaboration and continuous learning — to better align with how our team works.


Audrey Ross

Author

Audrey Ross is a content designer and leader with over a decade of experience helping organizations help their people through clarity and connection. Audrey has spent her career in nonprofits and civic tech, working on a variety of high-impact problem spaces, including international justice and public health.

Audrey lives outside of Washington, D.C., with her husband and two children. She can often be found making frequent TV and movie references and laughing at a good GIF reaction.

Headshot - Sean Tubridy

Illustrator

Sean Tubridy is the Creative Director at Button Events.

