TrustRadius: an HG Insights company

Google Content Experiments (discontinued)

Score: 7.3 out of 10

74 Reviews and Ratings

What is Google Content Experiments (discontinued)?

Google Content Experiments was a tool that could be used to create A/B tests from within Google Analytics. It was discontinued in 2019, and Google now recommends its Google Optimize service for A/B testing.

Categories & Use Cases

Top Performing Features

  • Split URL testing

    Test larger design changes by splitting your site traffic across two different landing pages to identify which page performs best. It can be used to test the impact and feasibility of things such as new designs, personalization efforts, and new site architecture.

    Category average: 8.2

  • Multivariate testing

    Ability to test multiple site design changes at once across one or multiple variations and identify which variation impacts conversion rates, or other predefined goals, the most.

    Category average: 8.2

  • Visual / WYSIWYG editor

    Set up A/B testing campaigns using a WYSIWYG editor to create site versions and preview design changes before testing them. These editors often don’t require coding knowledge in order to operate them.

    Category average: 7.4
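The split URL testing described above comes down to deterministically assigning each visitor to one of two page URLs, so repeat visits see the same variation. A minimal sketch of that idea in Python (the visitor-ID hashing, URL names, and 50/50 weights are illustrative assumptions, not how Google's tool was implemented):

```python
import hashlib

# Hypothetical landing-page variants for a split URL test.
VARIANTS = ["/landing-a", "/landing-b"]

def assign_variant(visitor_id: str, weights=(0.5, 0.5)) -> str:
    """Map a visitor ID to a variant URL, stable across visits."""
    # Hash the visitor ID into a stable number in [0, 1).
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000
    # Walk the cumulative weights to pick a variant.
    cumulative = 0.0
    for url, weight in zip(VARIANTS, weights):
        cumulative += weight
        if bucket < cumulative:
            return url
    return VARIANTS[-1]
```

Because the assignment is a pure function of the visitor ID, the same visitor always lands on the same page, which keeps the conversion data for each variation clean.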

Areas for Improvement

  • Visitor recordings

    Watch recordings of user sessions to gain insights on site visitor behavior and identify areas to improve site visitor experience.

    Category average: 8.5

  • Preview mode

    Preview your experiment before running it live on your site or app.

    Category average: 7.0

  • Experiment workflow and approval

    Ability to assign different phases of the experiment process to your team and approve next steps for an experiment or campaign.

    Category average: 6.7

Integrates well with Google Analytics

Pros

  • Real time data
  • Personalization

Cons

  • Back end development needed for detailed tests.
  • Some training required.

Return on Investment

  • Increased ROI
  • Faster decision making

Other Software Used

Optimizely, Mixpanel, Adobe Acrobat DC

Good tool for optimizing websites

Pros

  • Easy to use and implement.
  • Easy to understand the reports.
  • Seamless integration with Google Analytics.

Cons

  • No support for multivariate testing, a feature that was pulled from the product when Google Website Optimizer was rebranded as Google Content Experiments.

Return on Investment

  • Improved goal conversion rates.
  • Increased time spent on site.

Google Content Experiments: Quick and easy (and free) A/B/n testing you should already be using.

Pros

  • Free
  • Easy to use if you already use Google Analytics - literally a few minutes to set up a test
  • Fully integrated with Google Analytics - so you can use your existing conversion goals, and no confusion over metrics.

Cons

  • No multivariate testing
  • No way to drop losing variants during a test
  • No manual control over the traffic split between variants

Return on Investment

  • Higher quality leads
  • Reduced sales team effort (due to fewer low-quality leads)
  • Higher conversion from visitor to lead

Limited Testing Solution, but Not Without Value

Pros

  • It is free
  • Tests are easy to set up. Experiment creation is a 4-step process that can be completed with limited knowledge of Google Analytics
  • Experiments is integrated with your Google Analytics account, utilizing existing goals or allowing you to create new goals for testing directly within the setup area
  • Allows for advanced segmentation, filtering, and traffic allocation

Cons

  • No multivariate tests: Google Content Experiments tests on an A/B/n model, so you must create full iterations of landing pages to test against each other. If you are testing a full layout change, the A/B/n model will suit your needs just fine. However, if you'd like to test multiple elements at once, or a small element like button color, you would need to create a new landing page for each button color you want to test. This can become quite cumbersome.
  • Basic testing platform: This platform will also present challenges if you would like to test the influence of one element across multiple pages. You will need a much more sophisticated tool for this type of testing.
  • Content Experiments defaults to a multi-armed bandit algorithm. This automatically adjusts the amount of traffic allocated to each variation based on sample size and performance metrics, sending less traffic to "under-performing" variations. In theory the multi-armed bandit is much more efficient than the classic A/B testing method, concluding tests more quickly while reducing potential revenue loss. Unfortunately, we have found that for tests with small sample sizes, like those commonly run by B2Bs or SMBs, the multi-armed bandit can produce an invalid test that never declares a winner: traffic to a variation is typically reduced so severely and so quickly that it never accumulates a large enough sample to compete.
  • Testing on a specific segmentation is challenging: To test on a distinct segment of your visitors requires more advanced knowledge of Analytics to set up a new filter view and cannot be completed within Google Content Experiments.
  • Not ideal for marketers who want independence from IT: Content Experiments requires a snippet of code being placed on one experiment page for each test. It also requires a full version of the test page with a unique URL.

Return on Investment

  • Utilization of Google Content Experiments has allowed us to consistently conduct more tests, resulting in a 141% improvement in overall conversion rate.

A superb A/B Testing tool for smaller teams.

Pros

  • Has a great analytics engine in its backend which uses multi-armed bandit methodology, and thus can run multiple variations at once.
  • The multi-armed bandit approach also makes it really effective at finding the winning variation.
  • It can be based on Analytics Goals via Optimize so you can drive things that are important to the business.

Cons

  • Their documentation is not the best, and the learning curve is quite steep.
  • They also don't tell you particularly well what sorts of things you should be testing.
  • Compared to other A/B testing tools, it needs a simpler interface. Optimize is starting to address that, but still feels quite beta-like.

Return on Investment

  • Running good experiments in Optimize has helped take the guesswork out of the things we want to implement.
  • We have made fairly complex changes, such as changing navigation, and saw improved outcomes immediately, before we had to request developer time.
  • Our teams have become more data-centric in how they approach changes.

Other Software Used

Google Tag Manager, Mouseflow, Tint, ObservePoint