The evolution of user onboarding at Databox
PLG advisor Yaakov Carno on four major experiments and double-digit activation improvements
👋 Hi, it’s Kyle Poyar and welcome to Growth Unhinged, my weekly newsletter exploring the hidden playbooks behind the fastest-growing startups.
The second most-read post of last year was a deep dive into Miro’s user onboarding evolution. I wanted to bring you more of these stories that explore real-life product growth experiments.
Today I have a special edition in collaboration with Yaakov Carno, a PLG advisor, brilliant illustrator and author of the Product Led Growers newsletter. Yaakov unpacks how Databox improved activation from 30% to 40%+. (This post may be too long for email; read the full story here.)
I started working with Databox as a PLG advisor six months ago, focused on helping the team improve activation and unlock the next stage of growth.
Databox is an analytics platform that makes it easy for companies to centralize their data and then use it to make better decisions and improve performance. The company started in 2012 targeting larger enterprises, but pivoted to SMBs and self-service around 2017. With a tiny sales team and no investment in paid marketing channels, Databox has grown to over $8M ARR and 7,000 customers through organic channels like SEO and partnerships.
The company was looking to take growth to the next level; however, before investing in paid acquisition channels we focused on activation. Improving activation would lay a solid foundation for faster and more efficient growth, making the most of every signup and every dollar spent on acquisition.
Since August 2023 we have:
Implemented 20+ product improvements.
Increased activation from around 30% to over 40%.
Reduced the average time-to-value from 18 hours to less than an hour.
I’m going to let you in on how we approached activation and end-user success at Databox. I hope to educate and inspire you with new ways to think about your product experience. Let’s dive in.
The four major experiments
Our path has already involved over 20 enhancements. I'll highlight the four pivotal experiments that reshaped our product experience.
The first thing I did when I started working with the team was to deep dive into the data. I was on the hunt for insights and clues that could give me a stronger sense of where users were getting stuck and dropping off. I investigated every potential drop-off from the moment a user signed up until they activated.
The drop-off that bothered me the most was before users even got to see the product. The onboarding flow required new sign-ups to connect at least one of their data sources before they could progress in the user journey. Data showed that only 35% of new sign-ups were connecting a data source. That meant 65% of new sign-ups never even interacted with the product.
This is why the first experiment we ran was to explore how users reacted when landing in the product without any of their own data to play around with. Would they be more curious to connect their data after seeing what they’d be able to do with it, or would it increase confusion and friction?
We didn’t want to eliminate the existing flow yet, so we decided to run an A/B test.
Experiment 1: Exploring the product before connecting data
Right out of the gate, we established the two different approaches to getting users started:
The first approach (the existing A flow) was pretty straightforward: users had to connect a data source before they could do anything else.
The second approach (B flow) was more laid-back, letting users dive in and check out the app without connecting anything first, giving them a chance to poke around and see what's what before hitting them with the data setup.
These two paths laid the groundwork for all the changes we've made since.
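For teams running a similar split, here’s a minimal sketch of deterministic flow assignment, assuming hash-based bucketing; the post doesn’t describe how Databox actually implemented the split, so the function and flow names below are illustrative:

```python
import hashlib

def assign_flow(user_id: str, experiment: str = "onboarding-flow-test") -> str:
    """Deterministically bucket a user into the A or B onboarding flow.

    Hashing user_id + experiment name keeps assignment stable across
    sessions without storing any extra state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # 0-99
    return "A_connect_first" if bucket < 50 else "B_explore_first"

# The same user always lands in the same flow across sessions.
print(assign_flow("user_12345"))
```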
My thinking was that users might simply not have enough information about what Databox could do for them, and that this lack of understanding and context created a barrier, making it feel like a chore to connect their data instead of an exciting step towards solving their problem. So we aimed to boost motivation and curiosity by letting users experience the simplicity of creating a dashboard and explore features before the data connection step.
Lesson in activation
The foundation of successful product activation is building a path that connects the user from where they are today (with a problem) to where they want to be (problem solved). Use this equation to measure if you’re designing this experience in the user journey:
Motivation - Friction = Potential of User Moving Towards Activation
That being said, remember that not all friction is bad. A little bit of friction in the beginning can actually pave the way for a seamless path towards activation (i.e., users experiencing the value they were hoping for).
This first experiment played with both of these variables: reducing friction (the forced data connection before users could use or even see the product) and increasing motivation (boosting curiosity, context and education by letting users explore the product first, giving them more of a reason to take the step and connect their data).
But here's the catch: without your own data, it's tough to see the real magic of the tool. The way I view it, the path to activation is not necessarily about proving all of the product's value as quickly as possible, but about giving users a tangible experience that they are moving towards the solution they're looking for, and proof that the product can solve the problem if they continue with it.
In Journey B we also introduced personalization into the user experience. We added three questions to the onboarding flow:
“What type of data do you work with?”
“What tools do you use?”
“What would you like to do first?”
We used the answers to the first two questions to add personalized suggestions and templates when a user entered the product. We asked the third question right before they entered the product and used the answer to redirect users to different empty states.
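To illustrate how answers like these can drive the experience, here’s a minimal sketch of answer-to-experience mapping; the tool names, template names and routes are hypothetical, not Databox’s actual configuration:

```python
# Hypothetical mapping from onboarding answers to suggested templates
# and to the empty state a user is redirected to.
TEMPLATE_SUGGESTIONS = {
    "Google Analytics": ["Web Traffic Overview", "Landing Page Performance"],
    "HubSpot": ["Marketing Funnel", "Deal Pipeline"],
}

FIRST_ACTION_ROUTES = {
    "Build a dashboard": "/dashboards/new",
    "Explore metrics": "/metrics",
}

def personalize(tools: list[str], first_action: str) -> dict:
    """Turn onboarding answers into template suggestions and a landing route."""
    suggestions = [t for tool in tools for t in TEMPLATE_SUGGESTIONS.get(tool, [])]
    return {
        "templates": suggestions,
        "redirect_to": FIRST_ACTION_ROUTES.get(first_action, "/home"),
    }

print(personalize(["HubSpot"], "Build a dashboard"))
```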
This final question ended up doing more harm than good. The data made it clear that new users didn’t fully understand the difference in functionality between metrics and dashboards, and the resulting confusion was actually causing more drop-off. The question may have failed, but it gave us a powerful lesson in how users perceived and unlocked the product’s value, which ultimately led us towards bigger improvements in our product education.
Lesson in activation
Users need to be treated as complete beginners, and that needs to shape how you phrase and label features and functions. Even if users have experience in the overall area, you still need to educate them on things they most likely didn’t know before.
We got the results after collecting around two months of data. They were pretty disappointing at first. We saw an overall decrease in activation and fewer people were connecting data sources in the B flow.
The interesting part was that even with the higher initial drop-off, the users who did activate in the B flow activated more deeply, with higher average activation scores. And there were more PQLs and conversions. In other words: a lower activation rate, BUT the users that did activate were way better set up, with higher chances of long-term success and conversion.
We knew we were onto something, but we were still far from unlocking the results we knew we could achieve. We had confirmed that when users got to see the product first and understood why they needed to integrate their data, they would end up becoming more successful in the long-run. We still had to solve for the users who were dropping off without ever connecting their first source.
This pushed us to find more drop-offs and understand the why behind each of them. Mixpanel, LiveSession, and user interviews/surveys were our three guiding lights, helping us discover nuances in the user journey. Armed with these insights, we designed a guided onboarding journey for the B flow where users actually build their first dashboard in just a couple of clicks before entering the product.
Before we explore the experiment, I want to share how our activation definition was evolving and how it was guiding our efforts.
Defining and measuring activation
Originally, our activation definition was simple with just one milestone. Activation meant a user connected a data source and created a metric or dashboard.
We wanted deeper insight into how users were getting value from the product, so we added some new definitions and milestones. We started looking at activation through setup, aha and habit milestones, and we added “connected a data source” as a separate milestone.
Looking at the data, we clearly realized that “one data source + one metric or dashboard” wasn’t the best way to measure users getting value. Even though users were getting past the first step, it wasn’t setting them up for longer-term success. So we now monitor our new definitions closely:
Set up: “One data source + one dashboard + changed/edited the dashboard”
By adding just that one variable, we quickly saw how many users weren’t even touching their dashboard.
Aha: “2 data sources + 2 dashboards” OR “1 dashboard with metrics from 2 different data sources on the same board”
Here we were measuring more accurately the real value of Databox, which is the power of bringing multiple sources together.
Habit: “Reached Aha + activation score > 40 + 7 unique days active”
Here we were measuring if the value realization was indeed leading towards adoption.
We have unique custom events for each of these milestones in Mixpanel so that we can easily and accurately measure them all.
And we have a team of data wizards dedicated to improving and refining our different activation and PQL scores. (There is a lot of work that goes on behind the scenes for this.)
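To make those definitions concrete, here’s a minimal sketch of how milestone checks like these might be expressed in code. The field names and rollup structure are my assumptions, a simplified stand-in for the custom events and scoring Databox actually maintains in Mixpanel:

```python
from dataclasses import dataclass

@dataclass
class UserStats:
    """Simplified per-user rollup; real values would come from Mixpanel events."""
    data_sources: int
    dashboards: int
    edited_dashboard: bool
    max_sources_on_one_dashboard: int
    activation_score: float
    unique_active_days: int

def reached_setup(u: UserStats) -> bool:
    # One data source + one dashboard + changed/edited the dashboard
    return u.data_sources >= 1 and u.dashboards >= 1 and u.edited_dashboard

def reached_aha(u: UserStats) -> bool:
    # 2 sources + 2 dashboards, OR 1 dashboard mixing metrics from 2 sources
    return (u.data_sources >= 2 and u.dashboards >= 2) or u.max_sources_on_one_dashboard >= 2

def reached_habit(u: UserStats) -> bool:
    # Aha + activation score > 40 + 7 unique days active
    return reached_aha(u) and u.activation_score > 40 and u.unique_active_days >= 7
```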
Experiment 2: Guided onboarding
Coming back to the second experiment, the new guided onboarding we designed for the B flow, we were laser-focused on the “Set up” milestone. Inspired by the onboarding successes of SaaS giants like Airtable and Miro, we crafted a new B flow.
This redesigned journey gave users an experience of how easy it was to build a dashboard and got them to set it up before they even entered the product. This would give users a huge head start and get their creativity flowing right away.
We included ready-made metrics that could be added to a dashboard preview with the click of a button. These were personalized to the tools the user had selected on the previous screen. We also made sure this was not only a technical setup but a delightful one, asking users to name their new dashboard and customize its color. Finally, we added a progress bar to boost motivation and were intentional with friendly, educational copy throughout the journey.
The results came in. The initial drop-off of users connecting a data source remained, which bothered us. But we saw an increase in how many users were actually getting set up and playing around with the product, leading to the type of activation we wanted to create.
The beauty of this flow was that by letting users into the product without making them connect their data, we forced ourselves to take responsibility for every detail of the product experience. We had no choice but to constantly improve the product’s first impression and give users a stronger reason to connect data.
In an updated iteration we found a balance, combining our learnings from both the A and B flows while adding the forced connection back into the onboarding flow. This time the data connection step was only required after the user set up their first dashboard.
We had finally gotten both results we wanted: more users connecting their first data source AND more users getting past the set up milestone. Combined, these results seriously boosted the chances of users being fully activated and becoming PQLs.
Experiment 3: Reverse trial
We were simultaneously exploring the idea of implementing a reverse trial. This meant that every new user would start on a premium plan for 14 days; once their trial ended, they could either upgrade or remain on the free plan without premium features.
We had noticed that 20% of new users were hitting either premium feature gates or usage limits within the first couple days of signing up. Even though users had the option to start a free trial whenever they hit a limit, half of them never did and simply disappeared.
Implementing a reverse trial at Databox was smoother than anticipated, thanks to existing infrastructure for an opt-in free trial. We rolled out the first version within weeks. By refining our communication strategies and ensuring consistency across customer interactions, we were set to launch.
After about a month of data the results were clear:
Activation rate for users in reverse trial: 39%
Activation rate for users NOT in reverse trial: 33%
This strategy didn't just improve activation; it aligned with our goal to showcase more of the product's value to users, allowing users to explore premium features right away and removing any initial barriers.
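If you want to sanity-check an uplift like this on your own data, here’s a minimal two-proportion z-test sketch; the cohort sizes below are hypothetical, since the post doesn’t share Databox’s actual sample sizes:

```python
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for activated counts x out of cohorts n."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided via the normal tail
    return z, p_value

# Hypothetical cohorts of 1,000 users each at the reported 39% vs 33% rates.
z, p = two_proportion_z_test(390, 1000, 330, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a 6-point lift clears significance at this scale
```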
While reverse trials aren’t a new invention, I still see many teams underutilizing this method. I’ve heard things like “users won’t understand they’re in a trial and this will ultimately cause more friction when the trial ends” and “they might get upset because they just wanted to sign-up for the free plan” and many more excuses.
These concerns can be addressed easily with a little educational communication and guidance in the product. If you clearly explain that users won’t be charged and can simply stay on the free plan afterward, you remove the financial concern. And if you reframe the trial as a gifted opportunity that lets users get more value without any commitment or cost, most users will actually end up appreciating the gesture and having a better first experience of your product.
Experiment 4: “Getting started” setup screen
Even though we were quickly increasing the number of new users creating their first dashboard and connecting their first data source, over 50% of those users weren’t taking the critical next step: actually playing around with the dashboard and making it their own.
In this experiment we transformed the empty home screen into a “Getting started” screen: an empty state with a tailored checklist to help users get set up and guide their next steps in the product. We aligned the steps of the checklist with the set up activation milestone. The screen would remain until users completed the initial checklist; only then would they access the next screen, automatically pre-populated with metrics gathered during setup.
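Mechanically, a gate like this is simple. Here’s a minimal sketch of the underlying logic, assuming a basic checklist model; the step names mirror the set up milestone, and the screen names are my own illustrative placeholders, not Databox’s implementation:

```python
# Checklist steps aligned with the "Set up" milestone.
GETTING_STARTED_STEPS = ["connect_data_source", "create_dashboard", "edit_dashboard"]

def next_screen(completed_steps: set[str]) -> str:
    """Keep users on the Getting Started screen until every step is done."""
    if all(step in completed_steps for step in GETTING_STARTED_STEPS):
        return "home_prepopulated"  # pre-filled with metrics gathered during setup
    return "getting_started"

print(next_screen({"connect_data_source"}))     # "getting_started"
print(next_screen(set(GETTING_STARTED_STEPS)))  # "home_prepopulated"
```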
The results were fantastic! Such a simple addition seriously boosted the number of users taking those first critical steps. Starting from less than 15%, within three weeks over 25% of users were completing the set up milestone and moving towards deeper product adoption.
Wrap up
The biggest realization that I’ve gained from full-time PLG advising over the last year is that at the end of the day users are just people.
I know that may sound funny or a little weird, but honestly it has completely transformed the way I view and advise on product. If you want to build powerful products that users love, you need to truly empathize with people and understand what influences their decisions and what’s going to help them.
All of these experiments were ultimately influenced by the goal of helping each user solve their problem. If you focus more on how to add real value, then everything else will start to fall into place.
Try adopting this lens and you’ll see for yourself how much easier it becomes to figure out what you need to do in order to improve your product.
Recommended resources:
Follow Yaakov on LinkedIn and by subscribing to his newsletter, Product Led Growers. Yaakov currently has two open spots for advising — if you’re looking for help unlocking self-serve activation, just send him a DM on LinkedIn.
Check out other activation insights, including the evolution of Miro’s onboarding, in-product experiments from 7shifts and self-serve onboarding mistakes.
Meet other product-led GTM leaders at this year’s PLGTM Summit in San Francisco on April 16-17. Growth Unhinged readers get 50% off (use code Poyar). I can’t make this one 😔, but I spoke last year & it was a great event.
Catch up on the latest product benchmarks by joining my webinar tomorrow with Pulkit (Chameleon) and Ibrahim (Amplitude).