
GA4 Guide Chapter 5: Best Practices for Designing your Events and Parameters

Sep 13, 2022

What good is a pirate’s treasure if there is no way to find it? Often in stories, a coveted and tattered map gives the protagonist the guidance they need to find the prize.

This chapter helps you get a map like that for your data layer and implementation. Let’s be clear: you create the map, and we’ll help you craft it so multiple stakeholders can use and interpret it—by the light of the waxing crescent moon and otherwise.

Once you create your measurement strategy, you can begin working on your solution design documentation to outline your events, parameters, test cases, and more. If done effectively using standard practices, any analyst should be able to derive the information needed about the implementation for their exploration into your dataset.

This chapter details best practices for designing your events and parameters. Upon completion, you will be able to understand the importance of a solution design reference (or SDR) for your implementation. You’ll also learn how to navigate the following: auditing your current implementation, naming conventions, creating a test plan, and understanding your data layer. While each implementation instance may be different, you should have the skill set to build an effective SDR for your website or application.

CREATE A SOLUTION DESIGN TO MAP TECHNICAL REQUIREMENTS TO YOUR BUSINESS NEEDS

A solution design reference (commonly called an SDR) is your step-by-step guide to your analytics implementation. It outlines events, parameters, and chosen conversions (determined by your stakeholders) to guide you through the data layer of your website or application.

You often begin this document by outlining your stakeholders’ goals or objectives, then determining what data to collect. Next, group that data into the related dimensions and metrics that will facilitate your analysis. To do this, follow a series of steps: determine goals, audit your current setup, define your metrics, dimensions, and events, and then test your implementation.
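To make the mapping concrete, a few SDR rows might be captured in a structure like the following minimal sketch. The objective, KPI, event, and parameter names here are illustrative assumptions, not a required schema:

```javascript
// Hypothetical sketch of solution design reference (SDR) rows, each
// mapping a business objective to the event and parameters that measure it.
const sdrRows = [
  {
    objective: "Grow newsletter subscriptions", // illustrative example
    kpi: "Sign-up completions",
    event: "sign_up",
    parameters: ["method", "form_location"],
    conversion: true, // flagged as a conversion by stakeholders
  },
  {
    objective: "Increase video engagement",
    kpi: "Video completion rate",
    event: "video_complete",
    parameters: ["video_title", "video_duration"],
    conversion: false,
  },
];

// A quick lookup: which events are flagged as conversions?
const conversions = sdrRows.filter((r) => r.conversion).map((r) => r.event);
console.log(conversions); // sign_up is the only conversion event here
```

Keeping the SDR in a structured form like this (spreadsheet, JSON, or a repository) makes it easy to extend and filter as the implementation changes.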

But wait—how does this differ from the measurement plan (discussed in the previous chapter) that also uses stakeholder input to list prioritized requirements? Let’s look at the difference between these two essential documents.

Measurement Plan:

  • List of what the group is trying to achieve, KPIs, and key drivers.
  • Created and rarely edited, other than to add additional business information.
  • Often used while developing the solution design document, but not used daily.

Solution Design:

  • Mapped dimensions, metrics, and events used to fulfill the measurement plan.
  • Often added to and edited as the implementation changes.
  • Used consistently when building reports and analyses.

The solution design reference is often referred to as a map to guide you through the data layer implementation. Continuing with the metaphor, if the solution design is the map, your measurement plan is the context and goal of finding the treasure. Establishing a thorough measurement plan to outline the objective of the implementation is crucial to having an effective solution design.

AUDIT YOUR CURRENT IMPLEMENTATION AND MAP YOUR PATH

Before creating your official solution design reference, it is important to understand the current implementation, if one exists. If your site has one, reach out to the stakeholders and review any existing documentation outlining data layer specifications, pre-existing goals, and any established event structure. You can build on the existing implementation and edit it as needed for the updated framework; in many cases, an existing implementation can be tweaked rather than removed completely.

It is also important to determine if the current implementation matches the existing documentation or if it has fallen out of sync. The implementation documentation is often forgotten when making changes, so it doesn’t get updated. This can cause issues for the analyst later on. By ensuring that your existing documentation matches the implementation and business requirements, you can understand what needs to be done for the new implementation.

NAMING CONVENTIONS AND SUGGESTED EVENTS

First, determine whether a naming convention already exists for your events and parameters. Consistency is vital in your dataset, so either match the existing events or restructure the naming convention entirely. The goal is a consistent way to name your events and parameters.

Google recommends naming your general event, then using an underscore to separate multiple words in your event and parameter names. This gives structure and clarity to the event that transpired. Essentially, you want your event parameters to be an extension of your event data. For example: Event = video_start; Parameters = video_title, video_percent, video_current_time.
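Following that convention, here is a minimal sketch of how the video_start event might be pushed onto the data layer. The GTM snippet normally creates window.dataLayer on the page; globalThis stands in for window here so the sketch runs on its own, and the parameter values are illustrative:

```javascript
// The GTM snippet normally creates window.dataLayer; create it here so
// the sketch runs standalone (globalThis stands in for window).
const w = globalThis;
w.dataLayer = w.dataLayer || [];

// snake_case event name; each parameter extends the event's meaning.
w.dataLayer.push({
  event: "video_start",
  video_title: "GA4 Guide Chapter 5", // illustrative values
  video_percent: 0,
  video_current_time: 0,
});
```

Notice that every name is lowercase with underscores, so events and parameters stay predictable across the whole dataset.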

Google Analytics 4 automatically collects certain events through the new Enhanced Measurement feature and also provides a list of recommended events. It is advisable to use them, as they are relatively standard across most online properties. Using the recommended events and associated parameters, you can get the most detail from your reports and benefit from future integrations as they are released. Google has provided recommendations for all properties, online sales, and gaming-related platforms.


While Google has outlined an array of recommendations for events, building out definitions for the events you are using is critical. This allows a user who did not participate in the implementation process to learn what your event is capturing, the type of event, and its associated parameters. Below is an example of how Search Discovery has mapped out varying events and all the associated data in a GitHub repository. While using GitHub is not mandatory, what is essential is creating your event outline so that others can find their path to your treasure chest of data.

An example of how Search Discovery has mapped out varying events and all the associated data in a GitHub repository.

CREATING A TEST PLAN

After you have invested time, money, and (possibly) tears into outlining your events and parameters to match the business requirements, it is time to create a test plan. The test plan marries the events, parameters, and business goals into event-specific test cases describing where each event should fire and what it should convey. It begins by outlining a scenario in which the event fires; for example, ‘page_view’ fires when a page loads. It then specifies the location and environment for testing (keeping with our example, ‘US’ and ‘Web and App’, respectively). The test plan then names a specific page where the event should fire and gives detailed instructions on how to trigger it. After setting the context, your test plan should provide the event name, associated parameters, and expected values.
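A single row of such a test plan could be represented like this. The field names are our own illustrative choices, not a standard format:

```javascript
// One hypothetical test plan entry for the page_view example above.
const testCase = {
  scenario: "Event fires when a page loads",
  location: "US",
  environment: "Web and App",
  testPage: "/", // illustrative: the homepage
  steps: "Navigate to the page and wait for it to finish loading",
  expected: {
    event: "page_view",
    // expected parameters and the type of value each should carry
    parameters: { page_location: "string", page_title: "string" },
  },
  result: null, // filled in as "pass" or "fail" during testing
};

// After a stakeholder runs the test, record the outcome:
testCase.result = "pass";
```

Each stakeholder can then work through their assigned entries and record pass/fail results in one shared document.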

Test Plan

Following this breakdown, you can assign stakeholders specific events to test on the site. In addition, you can note in your document whether each event passed or failed. After completing the test, you can refer to the specific event tags in Google Tag Manager based on whether your events or parameters failed. This simple but essential test plan ensures your implementation matches your business requirements.

A downloadable template for developing a solution design resource is available here.

PERFORMING THE DATA LAYER DEEP DIVE

What is a data layer? It sounds mysterious and intimidating, but your data layer is simply JavaScript-based code that sends information about activity on your website into your Google Tag Manager container. This data is then sorted and organized by your tags and variables and sent on to your connected platforms.
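At its simplest, the data layer is just a JavaScript array that the page pushes objects onto, typically declared before the Google Tag Manager snippet. A minimal sketch with illustrative values (globalThis stands in for window so it runs outside a browser):

```javascript
// The page declares the data layer before the GTM snippet loads;
// globalThis stands in for window so this sketch runs standalone.
globalThis.dataLayer = globalThis.dataLayer || [];

// Page-level information the site hands to the GTM container
// (the keys and values here are illustrative, not a standard):
globalThis.dataLayer.push({
  pageCategory: "blog",
  visitorType: "returning",
});
```

Tags and variables in the container can then read these values instead of scraping them out of the DOM.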

If you struggle to understand data layers, you are not alone. Check out this blog post, and note that the data layer has two main benefits: it makes data available that the JavaScript environment could not otherwise access, and it ensures that analytics tracking does not break when changes are made to the DOM.

Let’s look at an example using SearchDiscovery.com and our Google Tag Manager setup. After going to the homepage and selecting the “Learn More” button, we see a variety of events fire in Tag Manager. Let’s look at the event ‘Demandbase_Loaded’ as an example.

Data Layer Values

The event name appears on line 2 of the data layer output; everything below it is the event data that comes in from the JavaScript. This means that a backend developer has coded this as an event and provided all the subsequent lines as information associated with it.

This is the backend portion of the site; these are not yet the ‘events’ we are used to seeing in Google Analytics. This data can be relayed into tags and then sent to Google Analytics, other Google Marketing Platform products, LinkedIn, Facebook, and more. Once we have a complete data layer, we can build variables in Tag Manager.

Variables can be automatically extracted by Google Tag Manager, or they can be user defined. They are set to pull data from the data layer (or other forms of page-related data, elements, utilities, etc.) and import it into tags. If all the information needed for your event is present, you can create variables to pass the data into your event tags.
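Conceptually, a Data Layer Variable resolves a key to its most recently pushed value. The toy lookup below illustrates that idea; it is a simplified sketch, not GTM's actual implementation (which also merges nested values):

```javascript
// A toy data layer with two pushes (illustrative names and values).
const dataLayer = [
  { event: "page_view", page_type: "home" },
  { event: "video_start", video_title: "Intro" },
];

// Resolve a key the way a Data Layer Variable conceptually does:
// scan pushes from newest to oldest and return the first match.
function readDataLayer(key) {
  for (let i = dataLayer.length - 1; i >= 0; i--) {
    if (key in dataLayer[i]) return dataLayer[i][key];
  }
  return undefined;
}

console.log(readDataLayer("video_title")); // "Intro"
console.log(readDataLayer("page_type")); // "home"
```

A tag that references such a variable always sees the latest value for that key, which is why the order of pushes matters.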

However, if all of your desired event information is not coming in through your data layer, it is important to communicate with your developers on the best way to add to the data layer. If you are looking for a tool to easily manage and deploy your data layer while following industry best practices, Apollo may be for you.

CONCLUSION

Creating your solution design is generally less risky than a quest to find a treasure map and then a buried treasure. It contains the information an analyst needs to find their way around your implementation. The steps outlined here are all essential to laying the groundwork for future analysis and understanding of your dataset.

There is no such thing as a perfect implementation, but as analysts continue to refine the process, it becomes easier and easier. Previously, in Universal Analytics, Google had also established best practices. But as with most creator suggestions, they went out the window for most users.

Google Analytics 4 is different. It is more imperative to use the best practices, automatically collected events, and suggested naming in order to avoid an overcomplicated or faulty implementation. Google Analytics 4 is structured differently and allows for a higher data yield if used correctly. Avoiding best practices for your implementation is akin to building a 6,000-piece Lego castle without instructions. Possible to accomplish without following the steps? Yes. But one would need extensive experience and a fair bit of luck.

After you complete your measurement plan and solution design, you are prepared to begin your implementation. The first step in this process is to create your Google Analytics 4 property. Additional tools and resources are now available that were not obtainable in Universal Analytics. In the next chapter, we will discuss how to approach setting up the property that will be used for data collection and analysis.

