Content Score: How To Assess Performance

Universally acknowledged truth: Good content serves as one of the most effective ways to attract and retain your target audience’s attention.

Research-confirmed truth: Half of all content produced goes unused amid the relentless onslaught of information consumers face.

So, how do you know if your brand’s content gets used and performs the job it’s designed to do?

Metrics help, but they don’t tell the whole story. To truly understand the effectiveness of your content, you need a scorecard.

A content scorecard shows how an asset scores against industry benchmarks or your company’s standard content performance. It marries quantitative and qualitative assessments. Quantitative evaluations are based on metrics such as views, engagement, and SEO rank. Qualitative evaluations are derived from criteria such as readability, accuracy, and voice consistency (more on that in a bit).

Let’s get to work to create a content scorecard template that you can adapt.

Build quantitative content scoring worksheets

To know what to measure, you must know what matters. What is the job of that piece of content?

For example, consider the purpose of a landing page. Since it’s rarely the final destination, it’s usually not a good sign if readers spend too long on it. On the other hand, a long time spent on a detailed article or white paper is a positive reflection of user engagement.

Be specific with your content asset’s goals. What should you measure based on its purpose? Consider these ideas:

  • Exposure — content views, impressions, backlinks
  • Engagement — time on page, clicks, ratings, comments
  • Conversion — purchases, registrations for gated content, return visits, clicks
  • Redistribution — shares, pins

With your quantitative criteria determined, you need to identify the benchmarks. What are you measuring against? Industry standards? Internal standards? A little of both?

A good starting point for general web user behavior is the Nielsen Norman Group. You can also look at social media and email marketing tools, which often provide engagement benchmarks by industry. And finally, your organization’s marketing history lets you measure how a content asset compares to what you have already published.

Below is a sample benchmark key. The left column identifies the metric, while the top row indicates the resulting score on a scale of 1 to 5. Each row lists the parameters for the metric to achieve the score in its column.

Sample Quantitative Content Score (1 to 5)*

Metric | Score 1 | Score 2 | Score 3 | Score 4 | Score 5
Page views/section total (internal benchmark) | <2% | 2–3% | 3–4% | 4–5% | >5%
Trend in page views (internal benchmark) | Decrease >50% | Decrease | Static | Increase | Increase >50%
Time spent/page (external benchmark) | <20 sec | 20–40 sec | 40–60 sec | 60–120 sec | >120 sec
Bounce rate (external benchmark) | >75% | 65–75% | 35–65% | 25–35% | <25%
Click-through rate (external benchmark) | <2% | 2–4% | 4–6% | 6–8% | >8%
Social media engagement | <0.5% | 0.5–1.5% | 1.5–2.5% | 2.5–3.5% | >3.5%

*Values are defined based on industry or company benchmarks.

Using a 1-to-5 scale makes it easier to analyze content with different goals and still identify the good, the bad, and the ugly. Your scorecard may differ depending on the selected benchmarks.

How to do it

You will create two quantitative worksheets.

  • Label the first as “quantitative benchmarks.” Create a chart (similar to the one above) identifying the key metrics and the ranges for each 1 to 5 score. Use this as your reference sheet.
  • Label the second worksheet as “quantitative analysis.” The first columns should include the content URL, topic, and type. Label the next columns based on your quantitative metrics (e.g., page views, clicks, engagement).
  • Add the details for each piece of content.
  • Calculate each score (1 to 5) using the benchmark reference sheet (see the sketch after this list).
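
If you track the worksheets in code rather than a spreadsheet, the benchmark lookup is easy to automate. Here’s a minimal Python sketch, assuming the illustrative thresholds from the sample benchmark key above (the metric names and bands are placeholders for your own benchmarks):

```python
# Minimal sketch: convert raw metric values into 1-5 scores using
# benchmark bands. All thresholds mirror the sample benchmark key above
# and are illustrative; swap in your industry or internal benchmarks.

def score_from_thresholds(value, bounds):
    """Return a 1-5 score based on which band the value falls into.

    `bounds` holds the four ascending cutoffs separating the five bands
    (e.g., [2, 3, 4, 5] for page views as a % of the section total).
    """
    return 1 + sum(value >= b for b in bounds)

# Ascending cutoffs from the sample benchmark key (hypothetical values).
# Metrics where lower is better (e.g., bounce rate) need reversed logic.
BENCHMARKS = {
    "page_views_pct": [2, 3, 4, 5],
    "time_on_page_sec": [20, 40, 60, 120],
    "click_through_pct": [2, 4, 6, 8],
    "social_engagement_pct": [0.5, 1.5, 2.5, 3.5],
}

def score_asset(metrics):
    """Score one asset's raw metrics against the benchmark key."""
    return {m: score_from_thresholds(v, BENCHMARKS[m]) for m, v in metrics.items()}

print(score_asset({"page_views_pct": 3.4, "time_on_page_sec": 95,
                   "click_through_pct": 1.2, "social_engagement_pct": 2.0}))
# -> {'page_views_pct': 3, 'time_on_page_sec': 4,
#     'click_through_pct': 1, 'social_engagement_pct': 3}
```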

It’s easy to look at your content’s metrics, shrug, and say, “Let’s get rid of everything that’s not getting eyeballs.” But if you do, you risk throwing out great content whose only fault is that it hasn’t been discovered.

That’s why you need to add a qualitative component to the scorecard.

Create qualitative success worksheets

You will use a different 5-point scale to score your content qualitatively to identify valuable but buried pieces. At this point in the content scorecard process, a content strategist or someone equally qualified on your team/agency analyzes the content based on the organization’s objectives.

TIP: Have the same person review all the content to avoid any variance in qualitative scoring standards.

Among the qualitative criteria I’ve used:

  • Consistency: Is the content consistent with the brand voice and style?
  • Clarity and accuracy: Is the content understandable, accurate, and current?
  • Discoverability: Does the layout of the information support key information flows?
  • Engagement: Does the content use the appropriate techniques to influence or engage visitors?
  • Relevance: Does the content meet the needs of all intended user types?
  • Authenticity: Is the content authentic and original? This evaluation is increasingly important in light of the recent surge in generative AI use.

To standardize the assessment, use yes-no questions. One point is earned for every yes. Add the points, divide by the total number of questions, and multiply by five to put each category’s score on the same 1-to-5 scale.

Here’s how that would work for the clarity and accuracy category and the discoverability category.

Clarity and accuracy: 

  • Is the content understandable to all user types? Yes.
  • Does it use appropriate language? Yes.
  • Is the content labeled clearly? Yes.
  • Do images, video, and audio meet technical standards so they are clear? No.

Score: 3.75 (3/4 × 5)

Discoverability: 

Score: 1 (1/5 × 5)
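
In code, that category math is a one-liner: yes answers divided by total questions, scaled to the 5-point range. A minimal sketch, reusing the clarity and accuracy questions above (question wording abbreviated):

```python
def category_score(answers):
    """Average yes/no answers for one category onto the 1-5 scale."""
    return round(sum(answers.values()) / len(answers) * 5, 2)

clarity_and_accuracy = {
    "Understandable to all user types?": True,
    "Uses appropriate language?": True,
    "Labeled clearly?": True,
    "Media meet technical standards?": False,
}
print(category_score(clarity_and_accuracy))  # -> 3.75
```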

TIP: When calculating the relevance category, base the questions on the information the reviewer can access. For example, if the reviewer knows the audience, the question “Is it relevant to the interests of the viewers?” is valid. If the reviewer doesn’t know the audience, that question won’t work. But almost any reviewer can answer whether the content is current, so that would be a valid question to include.

In some cases, tools can be used to assess attributes like accessibility, readability, or originality. Translate those results into a 1-to-5 scale.

How to document it

  • Label the worksheet as “qualitative questions.”
  • Add three columns for content URL, topic, and type.
  • Add columns to the right for each category and its questions.
  • Add a final column and insert the formula (yes answers divided by total questions, multiplied by 5) to calculate each category’s score.
  • Create a new worksheet labeled “qualitative analysis” to make it easier to see the results. Include columns for the content URL and each category’s average score (a sketch of this structure follows).
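
If you prefer code to spreadsheets, the same two-sheet structure translates directly to a dataframe. A minimal pandas sketch, with hypothetical URLs and scores standing in for your data:

```python
import pandas as pd

# "Qualitative questions" worksheet, collapsed to category scores.
# URLs and scores are hypothetical placeholders.
qualitative = pd.DataFrame({
    "url": ["/blog/content-a", "/blog/content-b"],
    "consistency": [5, 1],
    "clarity_accuracy": [4, 2],
    "discoverability": [3, 3],
    "engagement": [4, 2],
    "relevance": [3, 3],
})

# "Qualitative analysis" view: content URL plus the average score
# across all categories.
categories = qualitative.columns.drop("url")
qualitative["avg_qualitative"] = qualitative[categories].mean(axis=1).round(1)
print(qualitative[["url", "avg_qualitative"]])
# -> content-a averages 3.8, content-b averages 2.2
```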

Put it all together in a content scorecard

With your quantitative and qualitative measurements determined, you can now create your scorecard spreadsheet.

Based on the earlier example (minus the content URLs), here’s what it would look like.

Qualitative Scores

Category | Content A | Content B | Content C | Content D | Content E
Consistency | 5 | 1 | 2 | 3 | 1
Clarity and accuracy | 4 | 2 | 3 | 2 | 2
Discoverability | 3 | 3 | 3 | 3 | 3
Engagement | 4 | 2 | 4 | 2 | 2
Relevance | 3 | 3 | 5 | 3 | 3
Average qualitative score | 3.8 | 2.2 | 3.4 | 2.6 | 2.2

Quantitative Scores

Category | Content A | Content B | Content C | Content D | Content E
Exposure | 3.2 | 1.2 | 3.0 | 3.2 | 2.8
Engagement | 1.8 | 2.2 | 2.0 | 2.5 | 2.0
Conversion | 2.2 | 3.2 | 2.8 | 1.5 | 3.0
Average quantitative score | 2.4 | 2.2 | 2.6 | 2.4 | 2.6
Average qualitative score | 3.8 | 2.2 | 3.4 | 2.6 | 2.2
Recommended action | Review and improve | Remove and avoid | Reconsider distribution plan | Reconsider distribution plan | Review and improve

Each asset now has an average qualitative score and an average quantitative score (each is the sum of its category scores divided by the number of categories).

You can now use the side-by-side comparison of each content asset’s scores to determine what to do next (see the sketch after this list):

  • Qualitative score higher than a quantitative score: Analyze your distribution plan. Consider alternative times, channels, or formats for this otherwise “good” content.
  • Quantitative score higher than a qualitative score: Review the content to identify ways to improve it. Could it be improved with a rewrite? What about the addition of data-backed research?
  • Low quantitative and qualitative scores: Remove this content from circulation and update your content plan to avoid this type of content.
  • High quantitative and qualitative scores: Promote and reuse this content as much as feasible. Update your content plan to replicate this type of content in the future.
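
That decision matrix is simple enough to automate as a first pass. A minimal sketch, assuming both averages sit on the 1-to-5 scale and using 3 as an illustrative cutoff for “good” (tune the threshold to your own benchmarks):

```python
def recommended_action(quant, qual, threshold=3.0):
    """Map an asset's average scores to a next step from the list above."""
    if quant >= threshold and qual >= threshold:
        return "Promote and reuse"
    if quant < threshold and qual < threshold:
        return "Remove and avoid"
    if qual > quant:
        return "Reconsider distribution plan"  # good content, weak reach
    return "Review and improve"                # reach without quality

print(recommended_action(quant=2.6, qual=3.4))  # -> Reconsider distribution plan
```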

Of course, you may find discrepancies between quantitative and qualitative scores that indicate the qualitative assessment is off. Use your judgment, but at least consider the alternatives.

How to use AI to generate a content score

To turbocharge your scorecard, you can use machine learning to extrapolate from a scored sample once patterns are identified. It takes some patience and iteration, but it’s possible to prompt your favorite AI tool to infer an initial qualitative score for content not in the original scorecard sample.

By showing the AI tool a series of content pieces, along with the respective scores for each piece, you can ask it to score additional similar content using the same logic. For example, if articles with a specific format, topic, or publish date all scored similarly for certain attributes, you can assume all others with those qualities will, too.
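
One way to set this up is a few-shot prompt: show the model several scored pieces, then ask it to score a new one by the same logic. A minimal sketch of the prompt assembly (the example texts and scores are placeholders; send the resulting string to whichever chat model you use):

```python
# Minimal sketch: build a few-shot prompt so a chat model can infer
# qualitative scores for unscored content. All examples and scores are
# placeholders; substitute pieces from your own scorecard sample.

scored_examples = [
    {"text": "How to choose a CRM (listicle, 2023) ...",
     "scores": {"consistency": 5, "clarity_accuracy": 4, "relevance": 3}},
    {"text": "Press release: Q3 product update ...",
     "scores": {"consistency": 1, "clarity_accuracy": 2, "relevance": 3}},
]

def build_prompt(examples, new_text):
    """Assemble a few-shot scoring prompt from already-scored examples."""
    lines = ["Score the final content piece on a 1-5 scale for each "
             "category, using the scored examples as your rubric.\n"]
    for ex in examples:
        lines.append(f"Content: {ex['text']}\nScores: {ex['scores']}\n")
    lines.append(f"Content: {new_text}\nScores:")
    return "\n".join(lines)

print(build_prompt(scored_examples, "10 email subject line tips ..."))
```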

Of course, these scores would be preliminary, and an expert would still need to assess their accuracy before you act on the score, but this AI assist could become a time-saving tool.

Start content scoring

Developing a content scorecard may seem like a daunting task, but don’t let that stop you. Don’t wait until the next big migration. Take bite-size chunks and make it an ongoing process. Start now and optimize every quarter, then the process won’t feel quite so Herculean.

How much and what content to evaluate depends largely on the variety of content types and the consistency of content within each type. In my experience, 100 to 200 assets has been sufficient. There is no hard-and-fast science to this sample size, but it should be large enough to reveal patterns in topic, content type, traffic, etc. Your number will depend on:

  • Total inventory size
  • Consistency within a content type
  • Frequency of audits

Review in batches so you don’t get overwhelmed. Set evaluation cycles and look at batches quarterly, revising, retiring, or repurposing your content based on the audit results every time. And remember to analyze content across the performance spectrum. If you only focus on high-performing content, you won’t identify the hidden gems.

Updated from a January 2022 article.

Want more content marketing tips, insights, and examples? Subscribe to workday or weekly emails from CMI.

Cover image by Joseph Kalinowski/Content Marketing Institute