The biggest challenge in measuring digital analytics at the BBG is the agency’s size. The BBG has more than 300 websites (most of them mobile responsive) and mobile apps, five separate organizations, and more than 100 units that need individual reporting.
When we committed to getting an enterprise web analytics tool, we were using at least three separate Google Analytics implementations, a SiteCatalyst implementation, and assorted other tools to measure whatever those didn’t catch. VOA alone had 50 Google Analytics profiles.
While everyone was doing due diligence to maintain their own analytics tools, we had no way to look across the whole organization’s digital properties. We couldn’t answer questions like “Which BBG network is most popular in Vietnam?” or “Of our Russian-language content (many sites’ worth), which topics are most read by Russian speakers in Kazakhstan?” And every report on the BBG’s overall digital performance required rounds of data requests, plus the assumption that data from different sources all meant the same thing. It’s hard to make business and content decisions based on shaky data.
Planning for the new web analytics tool revolved around answering those questions, and making nearly instantaneous feedback about people finding and engaging with our content accessible to everyone–journalists (content reports by byline, so writers can see how their content performs), editors (content reports by topic, so it’s easier to get a feel for topical interest in a target region), marketers (everything that the BBG does, as consumed by a given city or country) and strategists (the whole universe of the BBG’s content, consumed by the whole world).
After a lot of hard work, the expertise of some great thinkers and consultants, and really good feedback from our editorial teams, we’re anticipating an exciting outcome–usable information that tells stories about all BBG entities online, something nobody has ever had before.
As with any web analytics tool change, we anticipate changes in the numbers we get–different tools count things slightly differently, so all of our traffic numbers may shift up or down. When we start tracking mobile visits too, we’ll see another change in traffic. I’m already looking at how the numbers from our new setup differ from those of the tools we have been using.
None of these changes mean that our audience has changed how it behaves. It just means we’re recording it differently. And the specific number isn’t the most important thing in web analytics; the stories the data tells and the information it can help you find are the valuable insights.
When you move to any new tool, you have a new baseline and a new normal daily or weekly number. You want to keep an eye out for changes–good and bad–and determine what caused them. You want to monitor the projects you’re putting effort into to see if you’re getting the outcome you want. And if you’re targeting a certain audience, you can get to know them by what they do, and learn more about what they respond to by testing things you think they’ll respond to–for example, a slightly different headline, a different angle, more or fewer pictures, or promoting a story with a different hook on social media.
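One lightweight way to keep an eye on a new baseline is to flag any metric that swings past a threshold you choose. This sketch is purely illustrative–the 20% threshold and the sample numbers are assumptions, not anything from our setup:

```python
# A minimal sketch of watching a metric against its new baseline.
# The threshold and sample figures below are illustrative only.
def flag_change(baseline, current, threshold=0.20):
    """Return the fractional change if it exceeds the threshold, else None."""
    change = (current - baseline) / baseline
    return change if abs(change) >= threshold else None

print(flag_change(10000, 13000))  # 30% jump -> 0.3
print(flag_change(10000, 10500))  # within normal variation -> None
```

A check like this doesn’t tell you *why* a number moved–that’s still your job–but it tells you where to look first.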
If you’re planning an enterprise web analytics tool rollout, here are some tips:
1. Find great experts to advise you. We were lucky to work with outside consultants who helped us define reporting requirements before we selected a new tool, and with great vendor specialists who helped us turn those requirements into reality. We not only ended up with a best-practice setup; we also learned a lot about the tools and their uses by working with industry leaders.
2. Thoroughly assess what information is most useful to stakeholders across your organization before you start setting up or selecting a tool. We had consultants come in and hear from our key internal stakeholders what they wanted to know. They assembled the organization-wide feedback and made expert judgment calls on what data we needed to gather and how we should present it.
3. Decide whether to use a tag manager. We had the luxury of choosing whether or not to get a tag management tool. We chose to get one because we have many different groups managing the technical side of our digital properties but wanted to maintain a unified analytics/measurement system. Using a tag manager centralizes our web analytics management.
4. Plan a clear, specific structure for naming and tagging. We worked closely with the technical teams to create a data layer on all of our websites containing uniform information about the page and the site. This means the data in our web analytics tool is clearly named across all of our web analytics reports.
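As a rough sketch of the idea–the field names below are hypothetical, not the BBG’s actual schema–a CMS template might assemble a uniform data layer like this before embedding it in the page as JSON, so every site reports the same structure no matter which CMS rendered the page:

```python
import json

# Hypothetical uniform fields; every site emits the same structure
# regardless of which CMS renders the page.
def build_data_layer(entity, language, title, topic, byline):
    """Assemble the page- and site-level data a template would embed."""
    return {
        "site": {"entity": entity, "language": language},
        "page": {"title": title, "topic": topic, "byline": byline},
    }

layer = build_data_layer("VOA", "ru", "Пример заголовка", "politics", "A. Writer")
print(json.dumps(layer, ensure_ascii=False))
```

The payoff of a shared structure is that downstream reports never have to guess which field means what on which site.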
5. Keep a list of priorities. Know which reports or platforms you’re tracking are most important or most time-sensitive. Knowing what’s most important makes it easier to pick the elements that won’t get built when resources run low. Clear priorities make it easier to move forward to an actual release instead of waiting to complete everything perfectly.
6. Know the field limits for your analytics tool and any tag manager. The last thing you want in your reports is awkwardly truncated page titles or, worse, gibberish. Multibyte languages have more bytes than characters, and automatic truncation may garble them. One of our developers alerted me to this, and we made wiser decisions knowing exactly how much information we could track. (Further reading on truncating multibyte languages from RFA developer Flip McFadden: http://stackoverflow.com/questions/8719330/python-utf-8-xml-parsing-suds-removing-invalid-token)
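To make the failure mode concrete, here’s a small Python sketch (the sample string and byte limit are chosen for illustration): cutting a UTF-8 string at an arbitrary byte boundary can split a character in half, while a character-aware truncation stays valid:

```python
title = "Новости дня"  # Russian "News of the day": 11 characters, 21 bytes in UTF-8

# A naive 9-byte cut lands in the middle of the fifth character, so
# title.encode("utf-8")[:9].decode("utf-8") raises UnicodeDecodeError.

def truncate_utf8(text, max_bytes):
    """Truncate to at most max_bytes without splitting a character."""
    data = text.encode("utf-8")[:max_bytes]
    # errors="ignore" drops any dangling partial byte sequence at the end.
    return data.decode("utf-8", errors="ignore")

print(truncate_utf8(title, 9))  # -> Ново (8 bytes, no garbling)
```

The same caution applies anywhere a tool silently enforces a byte limit on a field you fill with non-Latin text.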
7. Find out what’s easy to set up but hard to change. Some things, like profile names, report suites, reporting hierarchy, and default values, are better to set correctly at the beginning. Other things, like dashboards, are easy to change later. Know what to commit to early, and what you can wait on or change later.
8. Organize page-level and site-level variables early. This really only applies to implementations where you need multiple content management systems to track the same way in analytics. We created a matrix of all variables for each type of page on each CMS with sample values and notes for the developers. We also created a matrix of all site-level variables for each property. Both of these reference documents continue to be invaluable.
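As an illustration of how such a matrix can double as a machine-readable check–the CMS names, page types, and variable names here are hypothetical, not our actual matrix–you can validate what a page emits against what the matrix requires:

```python
# Hypothetical variable matrix: which analytics variables each page type
# on each CMS must emit. Names are illustrative only.
VARIABLE_MATRIX = {
    ("cms_a", "article"): {"title", "topic", "byline", "language"},
    ("cms_a", "homepage"): {"title", "language"},
}

def missing_variables(cms, page_type, emitted):
    """Return required variables that the page failed to emit."""
    required = VARIABLE_MATRIX.get((cms, page_type), set())
    return required - set(emitted)

print(missing_variables("cms_a", "article", {"title", "language"}))
# -> {'topic', 'byline'} in some order
```

Keeping the matrix in one place means developers and analysts argue about a single document instead of each other’s memories.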
9. Make sure you know exactly which things happen in tool configuration and which things are coded onto your site. This is particularly important if you’re not technical. If your tag manager is separate from a web analytics tool, just give the developers tag management documentation. You’ll set up the web analytics tracking inside of the tag manager. If you conflate these, you’ll confuse yourself, and probably slow down development work.
10. Prepare to spend a lot of time checking that your new tool is configured correctly. Good documentation, including what domains you expect to see in what reports and a complete list of all reports, is really helpful to have here.
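As a sketch of what that checking can look like–the suite names and hostnames below are illustrative, not our actual configuration–you can turn the “which domains belong in which report” document into an automated pass:

```python
# Hypothetical mapping of report suites to the hostnames expected in them.
EXPECTED_HOSTS = {
    "suite-english": {"www.example-news.com"},
    "suite-russian": {"ru.example-news.com"},
}

def unexpected_hosts(report_suite, observed_hosts):
    """Return hostnames that appear in a report suite's data but shouldn't."""
    return set(observed_hosts) - EXPECTED_HOSTS.get(report_suite, set())

print(unexpected_hosts("suite-english",
                       ["www.example-news.com", "staging.example-news.com"]))
# -> {'staging.example-news.com'}
```

A stray staging or test domain in production data is exactly the kind of misconfiguration that’s invisible in topline numbers but easy to catch this way.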
Special thanks to Ahran Lee, Designer at BBG, for creating the artwork for this post.