
GA4 Guide Chapter 4: Before You Implement: How to Plan

Tim Wilson
Senior Director of Analytics
Sep 13, 2022

If an ounce of prevention is worth a pound of cure, then when it comes to digital analytics implementations, an ounce of planning is worth a pound of impact. In this chapter, we will go through the research, thought work, and key decisions that make up the analytics implementation planning process. These will pay dividends throughout the rest of the process of successfully adopting Google Analytics 4 (or, if neglected, will add risk and cost to the rest of that process).


Research the Current State

No analytics implementation is perfect, complete, or pristinely maintained. But every analytics implementation represents the accumulated decisions and tradeoffs that have been made as the business, the website, and data priorities have evolved. This is valuable information to factor into the go-forward processes for capturing and maintaining site data.

A great starting point is to create an inventory of the current recurring reports that include digital analytics data and conduct an assessment of them.

The assessment criteria below are paired with how each factors into later planning:

  • Who produces the report? This individual/team will be stakeholders and subject-matter experts for subsequent planning steps, as well as part of the audience for both the communication planning and the training.

  • How is it prepared (Excel, Data Studio, Google Sheets, the Google Analytics web interface, etc.)? When planning and communicating the migration to Google Analytics 4, stakeholders will need to know either how to maintain continuity as they transition to a new data source or what they will need to change in their processes if continuity is not feasible.

  • Is it heavily used? By whom, and to what end? Just because a report is currently produced does not mean that it is valuable. Transitioning to a new platform is an opportunity to update and streamline reporting.

  • What aspects of the report (what specific metrics) are most referenced? Over time, recurring reports tend to grow as new data gets added to them. Rarely is data removed, even if stakeholders no longer need or use the information. Diving one level deeper into each report to understand the relative value of the specific information within it is key to the prioritization exercises that occur later in the planning process.
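One lightweight way to keep this inventory consistent is to capture each report's assessment as a structured record. A minimal sketch in Python (the report names, teams, and metrics are hypothetical examples, not from any real inventory):

```python
from dataclasses import dataclass, field

@dataclass
class ReportAssessment:
    """One row of the current-state report inventory."""
    name: str
    producer: str            # who produces the report (stakeholder/SME)
    tooling: str             # Excel, Data Studio, Google Sheets, GA UI, etc.
    audience: list = field(default_factory=list)      # who actually uses it
    key_metrics: list = field(default_factory=list)   # most-referenced metrics
    heavily_used: bool = False

# Hypothetical example entries
inventory = [
    ReportAssessment(
        name="Weekly Marketing Dashboard",
        producer="Digital Marketing Team",
        tooling="Data Studio",
        audience=["CMO", "channel managers"],
        key_metrics=["sessions", "lead form conversion rate"],
        heavily_used=True,
    ),
    ReportAssessment(
        name="Monthly Content Report",
        producer="Content Team",
        tooling="Excel",
        audience=["content editors"],
        key_metrics=["pageviews"],
        heavily_used=False,
    ),
]

# Reports that are produced but lightly used are candidates for
# streamlining (or retiring) during the migration.
streamline_candidates = [r.name for r in inventory if not r.heavily_used]
```

Whether this lives in a script or simply as columns in a spreadsheet matters less than applying the same criteria to every report.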

Recurring reports are not the only way that digital analytics data is used. Your organization likely has many ad hoc analyses where an analyst or stakeholder does a deeper dive on some aspect of the data to answer a specific question or validate a specific hypothesis. Depending on how your organization manages this ad hoc work, getting a sample of what those analyses were and what data was used and to what end may or may not be feasible.

In addition to determining what data is most actively used (and how) in the current environment, identifying any data that is known to be missing or unreliable is also a valuable input to the planning. This does not need to be an exhaustive exercise. Simply asking key stakeholders if there is data that they are regularly wishing they had access to and, if they have examples, getting some detail on the business need behind that data, can help you identify key considerations for the new implementation.

If your current digital analytics platform is integrated with other systems—pushing data into them or pulling data from them—then gather the details around those integrations. These may be in-house platforms or external systems or partners. For instance, is the data being piped—automatically or manually—into a business intelligence (BI) tool? If so, determine how and how often that is occurring, as you will need to engage with those platform owners to develop a plan for the transition.

Finally, assess the maintainability and reliability of the current implementation. The implementation of Google Analytics 4 is an opportunity to address some current technical debt while also minimizing the growth of future technical debt. Ultimately, you will need to decide whether you will primarily migrate your existing implementation or implement entirely from scratch, and the risk of carrying forward existing maintenance headaches is a key consideration in that decision.


Define the Requirements

No digital analytics implementation can anticipate every possible need and requirement, but the best implementations are designed from a structured set of requirements that lead to a solution design that is robust, extensible, and flexible. When done well, these implementations can address even unanticipated future requirements because they were built out with a thorough and structured set of capabilities.

One way to achieve that thoroughness is by approaching the needs of the platform from two directions:

  • Considering the core data needs for ongoing monitoring and measurement of performance
  • Brainstorming and then prioritizing a broad set of specific questions the platform may be expected to answer

There will be some inherent overlap across both of these steps, which is expected (if there is not, then one or both steps might be incomplete). Ultimately, these provide a rich set of business expectations against which the solution design can be assessed.


Measurement Planning

A measurement plan is a “top down” approach, in that it focuses on key performance indicators (KPIs) for a specific audience. Most websites have multiple measurement plans, as different stakeholder groups are tasked with driving business value in different ways: the overall site owner(s), the digital marketing team, the ecommerce team, the events team, the talent acquisition team, etc. The research described in the previous section should provide you with a good list of who those key groups are, as well as starting points for a measurement plan for each one.

The purpose of the measurement plan is to capture, in a clear and structured manner, the answers to the following questions:

  • What is the group trying to achieve? This is a concise articulation of the business outcomes for that team. It may or may not include directly stated metrics and data.
  • How will they know if they’ve done that? This is the follow-on question to the above and is a list of the metrics that will be used—as either direct measures or proxy measures—to determine if the group is achieving what it intends.
  • What are the key drivers of those results? This can be both additional metrics that directly feed the key metrics identified in the previous question (e.g., lead form conversion rate is a driver of total leads), as well as filters/slices that will be expected to be available (e.g., last touch traffic sources for leads).

A downloadable template for developing a measurement plan is available here.
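The three questions above can be captured per stakeholder group in a simple structure. A sketch in Python for a hypothetical digital marketing team (the objective, KPIs, and drivers shown are illustrative placeholders, not values from the template):

```python
# One measurement plan per stakeholder group; all values are hypothetical.
plan = {
    "team": "Digital Marketing",
    # What is the group trying to achieve?
    "objective": "Generate qualified leads from organic and paid channels",
    # How will they know if they've done that? (direct or proxy measures)
    "kpis": ["total leads", "cost per lead"],
    # What are the key drivers of those results?
    "drivers": {
        # metrics that directly feed the KPIs above
        "metrics": ["lead form conversion rate", "sessions by channel"],
        # filters/slices expected to be available
        "slices": ["last touch traffic source", "device category"],
    },
}
```

Keeping each plan in the same shape makes it easy to roll the plans up into a single set of data requirements for the solution design.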


Question Brainstorming and Prioritization

This is a “bottom up” approach to requirements, in that it captures specific data requirements through the lens of stakeholder questions. As with the measurement planning, the current state research is an excellent starting point for compiling this list of questions.

The first step is compiling as exhaustive a list of questions as possible. Depending on your site and organization, you may want to categorize them by functional area, but the main goal is comprehensiveness. Then, review the questions and prioritize them according to how critical it is that the implementation be able to answer them. Any question that directly links to something in a measurement plan should be a high priority, but every question cannot be a high priority. The second tab of the template referenced above includes a structure for capturing and prioritizing these questions. Finally, add the dimensions and metrics that will be needed to answer each question.
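The compile-prioritize-map flow can be sketched as a small script; the questions, priorities, dimensions, and metrics below are hypothetical placeholders:

```python
# Each entry: a stakeholder question, its priority, and the dimensions/
# metrics needed to answer it. All entries are illustrative examples.
questions = [
    {"question": "Do video views correlate with newsletter signups?",
     "priority": "low",
     "dimensions": ["page path"],
     "metrics": ["video_start events", "sign_up events"]},
    {"question": "Which channels drive the most leads?",
     "priority": "high",   # directly linked to a measurement plan KPI
     "dimensions": ["session source/medium"],
     "metrics": ["conversions"]},
]

# Review in priority order so the solution design addresses
# high-priority questions most directly.
order = {"high": 0, "medium": 1, "low": 2}
questions.sort(key=lambda q: order[q["priority"]])
```

The union of the dimensions and metrics across the high-priority questions becomes a checklist for assessing the solution design.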

The purpose of developing and prioritizing this list is to provide context as the solution is being designed. There are tradeoffs with any implementation. Just because a question gets flagged as a low priority doesn’t mean it won’t be included, but it may mean that it’s incorporated in a way that means the data is available, but not as readily and easily available as the answers to higher priority questions.


Migrate or Start Fresh?

This can seem like a scary step in the planning process, and it does have significant implications. But if you conducted the research and requirements gathering described earlier, the decision is usually self-evident.

The following are some reasons that would favor starting fresh rather than simply migrating the existing implementation into the Google Analytics 4 paradigm of events and attributes:

  • The current digital analytics implementation has incurred high technical debt due to its organic evolution over time
  • The measurement planning exercise uncovered significant gaps in the data that is currently available
  • Much of the data that is currently used is accessed through ad hoc and manual processes (i.e., there is not significant data from the current platform that is being automatically reported on a recurring schedule)
  • There is no or very little data from the current implementation that is integrated with other platforms (CRM, data warehouse, BI tools, etc.)

Ultimately, this decision is a judgment call. Even if you decide to start fresh, the legacy implementation may still be a useful reference to ensure that key data does not inadvertently get dropped in the new implementation.


Planning for Historical Data

At some point during the implementation of Google Analytics 4, a semi-panicked stakeholder will ask what your plan is for providing them with historical data—data that pre-dates when the solution was implemented. There are two aspects to this:

  • There will not be a pure apples-to-apples comparison of data from the legacy platform and Google Analytics 4. Some metrics that stakeholders are used to may simply go away (e.g., bounce rate, time on site), while other metrics will be similar in intention, but captured and calculated differently (e.g., users, sessions). So, even as historical data access is maintained, its utility for pure year-over-year or month-over-month comparisons is somewhat limited.
  • The duration—how many months/years of legacy data needs to be retained. By definition, this is a temporary issue. Time marches on, and, before they know it, Google Analytics 4 will have been in place for as much history as they care to look at. While this is not “the answer,” it is an important perspective to maintain: “legacy data access” is an inherently temporary solution, so you should have a bias towards “low effort” and “good enough for an interim solution” approaches.

The duration of the need for legacy data is driven by when you implement Google Analytics 4. Since the platform can run in parallel with Universal Analytics, there is no harm in having an extended period where both are in place on the site.

The earlier you can get Google Analytics 4 in place—collecting data—the better, as that will dictate when sufficient historical data is available natively in the platform to enable it to be the sole source of historical digital analytics data.

But, you still need a plan and mechanism in place to house historical data and provide stakeholders with access to it. What that mechanism is will depend on how you are currently collecting data and how much historical data is needed (again, while stakeholders may feel like they need 5-10 years of historical data, the reality is that site user experiences and user behavior itself continues to evolve rapidly, so the practical uses of data older than 13 months tend to be limited).

Your options and approach will differ based on your needs as well as your current implementation (free Universal Analytics vs. Analytics 360) and your new Google Analytics 4 implementation (free vs. Analytics 360). Some important considerations:

  • You will almost certainly want to extract historical data from the Universal Analytics native environment into another location.
  • This other location can be as robust as Google BigQuery or as basic as Google Sheets.
  • The more robust the storage destination, the more flexibility there will be to access different slices of the historical data.

The most robust approach is to use the native integration between Universal Analytics and Google BigQuery. Even if you do not currently have an Analytics 360 account, with a 1-month contract, you can backfill the previous 13 months of data into BigQuery. And, even with the free version of Google Analytics 4, you can stream hit-level data into BigQuery. Keep in mind, though, that the underlying data schema inside BigQuery is fundamentally different between Universal Analytics (each row is a session, with hits within the session nested in specific columns) and Google Analytics 4 (each row is an event). And, accessing data from BigQuery requires creating SQL queries to aggregate the data for specific uses.
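To make the GA4 side of that schema difference concrete: in the BigQuery export, each row is one event, so even familiar aggregates have to be rolled up from event rows. A minimal pure-Python sketch over a few mocked rows (the field names event_date, event_name, user_pseudo_id, and event_params follow the documented GA4 export schema; the values are made up):

```python
from collections import defaultdict

# Mocked rows shaped like the GA4 BigQuery export: one row per event,
# with event_params as a repeated key/value record. Values are made up.
events = [
    {"event_date": "20220913", "event_name": "page_view", "user_pseudo_id": "A",
     "event_params": [{"key": "page_location",
                       "value": {"string_value": "https://example.com/"}}]},
    {"event_date": "20220913", "event_name": "page_view", "user_pseudo_id": "B",
     "event_params": [{"key": "page_location",
                       "value": {"string_value": "https://example.com/pricing"}}]},
    {"event_date": "20220913", "event_name": "sign_up", "user_pseudo_id": "A",
     "event_params": []},
]

# Roughly what a GROUP BY event_date query against the events_* tables
# would do: count page_view events and distinct users per day.
pageviews = defaultdict(int)
daily_users = defaultdict(set)
for e in events:
    if e["event_name"] == "page_view":
        pageviews[e["event_date"]] += 1
    daily_users[e["event_date"]].add(e["user_pseudo_id"])

print(pageviews["20220913"], len(daily_users["20220913"]))  # 2 2
```

A real query would do this aggregation in SQL against the events_* tables, typically using UNNEST(event_params) to pull individual parameter values out of the repeated record.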

At the other extreme, you can simply extract static exports of the data: key metrics broken down by key timeframes and key dimensions. This may result in multiple sheets in Google Sheets (or multiple Google Sheets) and, depending on the detail included in the exports, may require working around various data export limits within the platform.


Communication and Training

Migrating from Universal Analytics to Google Analytics 4 is much more than a minor upgrade of an existing platform. It is a change both in how the underlying data is captured and structured and in how the data is accessed.

And, as described earlier, these changes mean that some measures will simply not be available (though new measures will be!), and some measures have behind-the-scenes changes to how they are calculated: they may have the same name in both environments, but they are not actually the same measure (“sessions” being, perhaps, the most prominent example).

This means that fear, uncertainty, and doubt can spread from you to your analyst peers, to your stakeholders, and even to your executives. The best way to address these concerns is to plan, communicate, and train early and often.

There will be surprises—unpleasant and pleasant—along the way, and proactive communication is one of the best ways to keep that to a minimum. Be very deliberate with identifying and grouping all of your affected stakeholders: who they are, the extent to which they directly interact with digital analytics tools and data, the extent to which they use the digital analytics data, and what their key uses of the data are. The measurement planning exercise described earlier should help with this, although there may be stakeholders who were not part of that process. When it comes to communication, err on the side of communicating more broadly.

For each group, identify the different types of training they will need. Depending on your organization, you may be able to work with your human resources or internal learning department to formalize training plans that can then be tracked for completion. The specifics will vary by organization, but the training areas may include the following:

  • An overview of the differences between Universal Analytics and Google Analytics 4, including the benefits of the new platform and the pending sunset of Universal Analytics by Google.
  • The Google Analytics 4 data model—from pageviews and events to events with parameters.
  • The native web interfaces for Google Analytics 4: Reports and Explore.
  • Using SQL to extract Google Analytics 4 data from Google BigQuery.
  • Accessing Google Analytics 4 data from other systems within the organization where it will be available.

Once the different training needs are identified, determine who will develop and deliver the training, and how. For some topics, training developed by various consultancies and agencies may be sufficient, but, if using such material, make sure the content strongly aligns with the identified need. You do not want stakeholders getting into a training and feeling like it is not relevant to their roles!

Using the implementation timeline, build a communication plan that includes:

  • What communications will be sent to ensure stakeholders are aware of what will be happening, why, and when.
  • What training resources will be available and who is expected to take advantage of them and when.
  • What the mechanism is for stakeholders to get any of their questions answered and concerns addressed before, during, and after the migration.

Communication is one of the easiest aspects of an implementation to overlook. Typically, the people responsible for the communication are deeply embedded in the process and suffer from the curse of knowledge—they wildly overestimate how much the broader set of stakeholders knows about what is happening. And they (rightly) realize that proactive, broad communication will invite inbound questions and concerns that will need to be addressed. Nevertheless, limiting the communication invites much larger issues later in the implementation: missed requirements, upset stakeholders, and skepticism about the implementation itself.
