How One Tool Helped 25 Agency Marketers Manage Website Experiments – Part 1
This is a guest post by Desiree van der Horst from Fingerspitz on how her agency used Effective Experiments to manage website experiments. This is her contribution to our series “A Week in the Life of a CRO.”
Whenever I open my inbox, chances are high there are new, unread newsletters in it. CXL, Rockboost, InVision, UserTesting, and so on. That’s okay – I like the stuff they’re writing. It’s inspirational.
The newsletters give me new ideas to try out for my own clients, or maybe for other marketers’ clients. They shed light on what we need to test right now and what can wait a couple of months.
Or maybe we don’t have a client to test a new idea for yet, but in the future, we might.
I have about 8 fresh newsletters a day. That’s about 40ish per week. And then there are my 25+ marketer colleagues who also read other industry publications and come up with ideas. Every. Day.
You do the math.
When I started at Fingerspitz 3 years ago, things were very different. The team was much smaller, only about 10 people.
As a beginner in marketing, I needed to learn the basics, like SEA, SEO, social media advertising, and implementing Google Tag Manager and Google Analytics. It seemed like there were 65 different tools to manage website experiments.
It was the pre-Slack era, when everything was documented in Trello or Evernote. At the time, that worked.
Flash forward to today. Our company has grown quite a lot – award-winning growth. With that growth, our team’s knowledge and experience increased. Thankfully, experimenting is now at the forefront of our minds, as it should be.
Almost everything that we do is an experiment.
Daily tasks range from checking whether the same ad performs better on Facebook or Instagram, to testing whether adding an emoticon to a page title increases organic CTR, to monitoring whether an AdWords DSA/RLSA campaign works better than a DSA-only or RLSA-only campaign.
This raises the questions: do the changes that we make have significant positive outcomes for our clients? Can we learn from the ideas of other marketers and their clients?
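The first of those questions — whether a change really produced a significant positive outcome — is usually answered with a significance test on the conversion numbers. As a minimal sketch (the function name and conversion figures below are illustrative, not from Fingerspitz’s actual workflow), a one-sided two-proportion z-test in plain Python:

```python
from math import erf, sqrt

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test: is variant B's rate higher than A's?

    Returns the p-value; a small value (e.g. < 0.05) suggests the lift is
    unlikely to be random noise.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the standard normal CDF, via the error function
    return 1 - 0.5 * (1 + erf(z / sqrt(2)))

# Hypothetical experiment: 200/5000 conversions on control, 260/5000 on variant
p = ab_test_p_value(200, 5000, 260, 5000)
print(f"p-value: {p:.4f}")  # well below 0.05 → treat the variant as a winner
```

With equal sample sizes and a visible lift like this, the test comes out clearly significant; with smaller samples the same relative lift often would not, which is exactly why documenting how experiments *end* matters as much as announcing that they started.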
Carrying out experiments has become part of our daily work life, but the main challenge is to easily structure them. Nobody wants to spend more time on them than necessary. It shouldn’t feel like an obligation.
We all want to spend our time as efficiently as possible!
Fingerspitz started looking for a shiny new tool to add to our list. Thankfully, we had our graduate intern Ella doing some research for us.
What did we ask ourselves?
She came up with different questions that needed to be answered:
• How is Fingerspitz sharing its knowledge?
• What are the expectations and wishes of the marketers?
• How can we structure, centralize, and shape the sharing of our knowledge?
• Which tools are there, and which one fits Fingerspitz best?
Let’s start with how we shared our knowledge, because knowledge sharing is one of Fingerspitz’s core values. “Samen Super Slim” is our credo, which means – roughly translated – “being super smart, together.” But at that time, we used multiple tools and different channels to store and share information.
Tools we used
Some of these tools included:
1. Slack channel knowledge – FYI: for tips and tricks, short updates, articles relevant only for a short period, and blogs you might still need 6 months later. When an article or news item involved tasks, it was shared in the second Slack channel. Together, these served as our working knowledge base.
2. Slack channel knowledge – experiments: experiments were supposed to be written up using the STAR method, a good idea that in practice was mostly forgotten or only partly executed. Most of the time, we announced the start of a website experiment, but not how it ended. Screenshots and quick copy-paste texts were how we reported results.
3. Best practice documents (resources, presentations, training material, etc.) were stored across multiple Trello boards and in different folders on our local server. Not everybody knew all the boards existed, and it was unclear who did and didn’t have access to them.
We had different channels & tools, often for the same purpose. This clearly wasn’t a very structured way to manage our website experiments! Our new platform needed to provide a way to centralize all of our ideas and knowledge. It needed to be like a library, without the dusty books and the complex look-up tables.
To find out the marketers’ demands and wishes, Ella conducted interviews. From those interviews we compiled a feature list and ranked it by priority, which made clear what the marketers really wanted. Our site priorities, in order, were:
1. We want a great search bar to find documents quickly.
2. We want to be able to categorize our items.
3. We want to be able to add experiments and share them with colleagues.
4. We want to have a changelog to stay up to date.
5. We want to be able to watch videos within the tool.
6. We want to be able to upload our own content.
7. We want to be able to upload already existing documents.
8. We want to have a different shell for clients.
Knowing our priorities made it much easier to look for a tool. We now knew what Fingerspitz needed, and what the marketers wanted.
Armed with this information, we just started testing out tools – Mett, GrowthHackers Projects, Jira, etc. None of them offered what we were really searching for.
We then asked ourselves, hypothetically: if we had one tool to manage website experiments that could do it all, would we actually stop using Trello or Slack? No. We had been looking at it from the wrong angle.
Slack is perfect for those “only relevant for one week” kind of articles and ideas. Trello was still great for storing how-to guides, PDFs, checklists, and items we needed in the long run.
We were only missing a tool to structure and manage our website experiments.
What did our ideal tool achieve?
With this shifted focus from documentation to experimentation, our demands & wishes also changed – for management as well as the marketers. We knew this about our ideal tool:
1. There needed to be a template, so we could identify trends.
2. Prioritization should be possible.
3. We needed to categorize and label experiments.
4. Filtering on winning experiments should be possible.
5. We needed to be able to share the status and results of the experiments.
6. We needed the ability to work with multiple colleagues on one experiment.
7. A general report per marketer & client should be possible.
8. Clients should have their own specific environment.
We tried out some new and familiar tools to see which fit best. We looked at GrowthHackers Projects’ Experiment Engine and Effective Experiments.
Since I’m writing this for the Effective Experiments blog, it must be obvious which tool in our investigation came out best!
Experimenting (and documenting experiments) shouldn’t be difficult. We still have a long road ahead, but we are definitely taking steps in the right direction!