Whether launching a first test or scaling a sophisticated experimentation program, Optimizely Web Experimentation aims to deliver the insights needed to craft high-performing digital experiences that drive engagement, increase conversions, and accelerate growth.
I have not used other software all that similar to Crazy Egg. Though I have seen similar tools out there, I find Crazy Egg to be a unique one to incorporate into your analytics.
Crazy Egg is a bit light on features and has a not-so-friendly interface. But depending on the complexity of your team, projects, and experience in digital marketing, it is a great place to start, and it's budget friendly. If you have an advanced analytics or A/B testing solution it's a nice …
We had never used ClickTale, though it was obvious that they offered a lot for businesses. We chose Crazy Egg because it fit our business needs and scale perfectly. We do not utilize the software as much as we should; however, when we do use it, we gain a lot of …
The company actually still uses Optimizely and Google Analytics as well. Optimizely works well with Crazy Egg because we can run A/B tests to see not only whether the conversion rate goes up (Optimizely) but also how the user is interacting with the change (Crazy Egg). Google …
We use Crazy Egg in conjunction with Optimizely. They aren't really direct competitors: Crazy Egg is graphical heatmap tracking of a page and doesn't A/B test. We need Optimizely to split out our testing variations for us, and then we use Crazy Egg to supplement our knowledge for a …
Optimizely is highly intuitive, allowing marketers or non-technical folks to run experiments without complicated coding. It also allows for various types of experimentation, including A/B tests, multivariate tests, and personalization. This capability will enable teams to run …
Optimizely Web Experimentation was more robust and better able to handle the broad array of sites we run than VWO. It has been a great platform to easily add additional sites onto, while still providing a universal overview of all of them, making management a simple task.
Unbounce's Visual Editor is what I'd expect out of Optimizely Web Experimentation, but I believe it's missing. Otherwise, Optimizely Web Experimentation is better.
We also considered Monetate, but that tool was a little too far out of our budget. We picked Optimizely because it was cost effective and our ROI was justified against a defined baseline to increase conversions and continue the strategy of A/B testing.
Features

Testing and Experimentation
Comparison of Testing and Experimentation features of Crazy Egg and Optimizely Web Experimentation
Category average: Crazy Egg: no ratings; Optimizely Web Experimentation: 8.0 (150 Ratings), 3% below category average

Feature | Crazy Egg | Optimizely Web Experimentation
A/B experiment testing | 0 Ratings | 9.0 (150 Ratings)
Split URL testing | 0 Ratings | 8.5 (122 Ratings)
Multivariate testing | 0 Ratings | 8.4 (127 Ratings)
Multi-page/funnel testing | 0 Ratings | 7.9 (116 Ratings)
Cross-browser testing | 0 Ratings | 8.1 (86 Ratings)
Mobile app testing | 0 Ratings | 8.1 (66 Ratings)
Test significance | 0 Ratings | 8.4 (136 Ratings)
Visual / WYSIWYG editor | 0 Ratings | 8.1 (121 Ratings)
Advanced code editor | 0 Ratings | 8.0 (114 Ratings)
Page surveys | 0 Ratings | 6.2 (17 Ratings)
Visitor recordings | 0 Ratings | 8.4 (18 Ratings)
Preview mode | 0 Ratings | 7.6 (133 Ratings)
Test duration calculator | 0 Ratings | 7.8 (101 Ratings)
Experiment scheduler | 0 Ratings | 8.2 (101 Ratings)
Experiment workflow and approval | 0 Ratings | 7.8 (81 Ratings)
Dynamic experiment activation | 0 Ratings | 7.6 (65 Ratings)
Client-side tests | 0 Ratings | 7.8 (88 Ratings)
Server-side tests | 0 Ratings | 7.2 (44 Ratings)
Mutually exclusive tests | 0 Ratings | 8.1 (74 Ratings)
Audience Segmentation & Targeting
Comparison of Audience Segmentation & Targeting features of Crazy Egg and Optimizely Web Experimentation
Category average: Crazy Egg: no ratings; Optimizely Web Experimentation: 8.2 (142 Ratings), 4% below category average

Feature | Crazy Egg | Optimizely Web Experimentation
Standard visitor segmentation | 0 Ratings | 8.4 (137 Ratings)
Behavioral visitor segmentation | 0 Ratings | 7.7 (113 Ratings)
Traffic allocation control | 0 Ratings | 9.1 (135 Ratings)
Website personalization | 0 Ratings | 7.8 (103 Ratings)
Results and Analysis
Comparison of Results and Analysis features of Crazy Egg and Optimizely Web Experimentation
+ I strongly believe that this tool helps when a firm has a good user count (depending on the business model), as most of these tools are data hungry: more data, more valuable insights.
+ Best fit for someone who is looking for deeper insights into individual pages.
- Not suggested for websites with very few visits; improving visit count first is suggested.
I think it can serve the whole spectrum of experience, starting with people who are just getting used to web experimentation. It's really easy to pick up and use. If you're more experienced, then it works well because it just gets out of the way and lets you really focus on the experimentation side of things. So yeah, strongly recommend. I think it is well suited both to small businesses and large enterprises. It's got a really low barrier to entry: it's very easy to integrate on your website and get results quickly. Likewise, if you are a big business, it's incrementally adoptable, so you can start out with one component of optimization and build from there, bringing in things like data or CMS products to augment experimentation as well. So it's got a really strong pathway to grow your MarTech platform, whether you're a small company or a big company.
Provides heatmaps that show you which elements on your site are and aren't performing well.
Provides scrollmaps so you can see how far down a page users are scrolling and which content never gets seen.
Screenshots show you how your website looks across a variety of different devices.
Provides a type of clickmap called confetti that enables you to visualise clicks by segment: device, new/returning visitors, campaigns, and other metrics.
The platform contains drag-and-drop editor options for creating variations, which eases the A/B testing process, as it does not require any coding or development resources.
Setting it up is so simple that even a non-technical person can do it perfectly.
It provides real-time results and analytics through a robust dashboard, from which you can quickly analyze how different variations perform. With this, your team can easily make data-driven decisions fast.
The largest thing we've struggled with is the Optimizely integration. I've contacted customer service a few times to get it properly set up. Customer service is always friendly and helpful; they provide clear steps to get it set up. Unfortunately, despite clear instructions, the steps are tedious, and if not completed in the correct order, the integration with Optimizely does not work. My success rate with the integration is less than 55%.
It's a great tool considering how inexpensive it is. If used correctly, and you have a plan for tracking your websites, this tool can make a world of difference. If you are not going to sit down and take the time to make a plan for how to use this tool, I would say it is not worth your time. Yes, you can look at items on your website that need to be changed, but without a consistent plan, other important items that need changing can be lost in the mix. Make sure you have enough time and energy to invest in this and it will be well worth it.
I rated this question as I did because, at this stage, Optimizely does most everything we need, so I don't foresee a need to migrate to a new tool. We have the infrastructure already in place, and it would be a sizeable lift to pivot to another tool with no guarantee that it would work as well as or even better than Optimizely.
It's not clear what features there are. The navigation icon is not labeled. It's hard to know where to start when you're logging in as a first-time user. It's hard to know how to set up anything, and there aren't many helpful tutorials in-product. I don't want to be kicked out to a help center or have to read the documentation.
Optimizely Web Experimentation's visual editor is handy for non-technical or quick iterative testing. When it comes to content changes, it's as easy as going into WordPress: clicking around and then seeing your changes live; what you see is what you get. The preview and approval process for built experiments is also handy for sharing experiments across teams, for QA purposes or otherwise.
It's slow to post data, and slow to get a snapshot to finally be active (i.e., not pending). Not intolerable, but it would be nice to see data within a couple of hours. Often we have to wait until the next day.
I would rate Optimizely Web Experimentation's availability as a 10 out of 10. The software is reliable and does not experience any application errors or unplanned outages. Additionally, the customer service and technical support teams are always available to help with any issues or questions.
I would rate Optimizely Web Experimentation's performance as a 9 out of 10. Pages load quickly, reports are complete in a reasonable time frame, and the software does not slow down any other software or systems that it integrates with. Additionally, the customer service and technical support teams are always available to help with any issues or questions.
I think support is an area where Crazy Egg is lacking. I would love to have a quarterly check-in with a Crazy Egg rep to understand what kinds of changes have been made to the platform and what is on the horizon. I also think quick consulting sessions with a rep could be extremely beneficial, as I'm sure there are ways to use the tool that we haven't even thought about yet that would be extremely insightful for our team.
They always are quick to respond, and are so friendly and helpful. They always answer the phone right away. And [they are] always willing to not only help you with your problem, but if you need ideas they have suggestions as well.
The tool itself is not very difficult to use, so training was not very useful in my opinion. It also did not account for success events more complex than a click (which my company, being ecommerce, is looking to examine more than a mere click).
In retrospect:
- I think I should have stressed more demos/workshopping with the Optimizely team at the start. I felt too confident during the demo stages, and when it came time to actually start, I was a bit lost. (The answer is likely that I should have had them on hand for our first install; they offered, but I thought I was OK.)
- Really getting an understanding, and asking them prior to install, of how to make it work for checkout pages or any page that uses dynamic content or user interaction to determine what the UI does. We could have saved some time by addressing this at the beginning, as there were some things we needed to create on our site for Optimizely to "use" as a trigger for the variation test.
- Having a number of planned/hoped-for tests already in hand before working with the Optimizely team. Sharing those thoughts with them would likely have started conversations on additional things we needed to do to make the tests work (rather than figuring that out during the actual builds). Since I had development time available, I could have added more things to the baseline installation while my developers were already "looking under the hood" of the site.
Hotjar is more expensive than Crazy Egg, and we needed a tool to fit the budget of a small company. With more time, we could have tested it more deeply to form a better opinion; it seems to be great too.
The ability to do A/B testing in Optimizely, along with the associated statistical modelling and audience segmentation, means it is a much better solution than using something like Google Analytics, where a lot more effort is required to identify and isolate the specific data you need to confidently make changes.
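To make the "statistical modelling" behind test-significance readouts concrete, here is a minimal sketch of the classic fixed-horizon calculation for an A/B test: a two-proportion z-test in plain Python. This is an illustration only, not Optimizely's method; its Stats Engine uses a sequential testing approach, so its numbers will differ, and the function name and sample counts below are invented for the example.

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: z score and two-sided p-value for the
    difference in conversion rates between variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/2400 conversions on control, 150/2400 on the variant
z, p = ab_significance(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
```

With these made-up counts the p-value lands around 0.06, which is exactly the kind of "close but not yet significant" result a test-duration calculator helps you plan around: more traffic shrinks the standard error.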
Its reliability (not scalability, as the question asks for, sorry) is pretty good, but through our testing we know that some clicks do not get recorded. It doesn't bother us a lot because we look at the aggregate of thousands of visits, but we do know it misses things. As for scalability, it's about right. You really don't want zillions of clicks per snapshot; the screen just turns to 100% dots and you lose the ability to differentiate different screen areas. We find that 25,000 clicks for a page gives us a really good view.
We can use it flexibly across lines of business and have it in use across two departments. We have different use cases and slightly different outcomes, but can unify our results based on impact to the bottom line. Finally, we can generate value from anywhere in the org for any stakeholders as needed.
We're able to share definitive annualized revenue projections with our team, showing what would happen if we put a test into production.
Showing the results of a test on a new page or feature prior to full implementation on a site saves developer time (if a test proves the new element doesn't deliver a significant improvement).
Making a change via the WYSIWYG interface allows us to see multiple changes without developer intervention.