Collaborative interface design with RealtimeBoard

The mentors of the “Interfaces” column continue publishing cases and tools useful to people in the industry. Philip Konzarenco shares a story from Anna Boyarkina, a marketing manager at the collaboration service RealtimeBoard, about how the team works out its UX/UI solutions.

RealtimeBoard is a collaboration service that works like an online whiteboard: it lets teams visualize processes, work with content, discuss design, and plan.

We launched our first public beta in 2012, and the product is now actively used by 650,000 people around the world: product teams, UX designers, marketing specialists, consultants, and other creative professionals.

We can’t imagine our work without tests and experiments. We also use our own product when working on interface and UX solutions.


The services that help us make UX/UI decisions

To run experiments efficiently, we use a set of tools that shows us what users do in the product, feeds that data into design, and lets us measure how well our experiments perform.

KISSmetrics is a person-based analytics system that helps us spot problems and see where we’re heading (we track AARRR metrics across user cohorts) and, of course, check the success of our experiments.

KISSmetrics can be set up to capture data on key user actions. Reports are also configurable: you can view the data as funnels, cohorts, or by any other parameter.

Here’s what one of the reports (a funnel report) looks like:


“Yandex.Webvisor” and Inspectlet are session-recording services that let you watch “movies” of user actions. They help you reconcile your expectations with user reality.


  • RealtimeBoard is the service where we visualize, collect hypotheses, assemble the user action map, and discuss mockup variants before implementation and testing.
  • Optimizely is a service for A/B testing. It lets you set up experiment variants and goals and track clicks on the buttons you care about. Inside the product we use our own system and send all the data to KISSmetrics to keep a single database.


There are other services and tools we use from time to time, but the ones above cover most of our needs.

A basic scenario for all experiments looks like this:


  • Based on the results of cohort analysis (the data comes from KISSmetrics), we discover a problem, for example, low user activation.
  • We dig into the problem, moving from the key metric (activation) down to lower-level ones. Suppose we find that after registration users drop off on one of the first screens. We then formulate hypotheses and evaluate each one’s potential impact on the metric, our confidence that it will work, and the complexity (cost) of testing it (this is also known as the ICE method: Impact, Confidence, Effort).
  • We visualize and design solution variants on an online board; the whole team takes part. We go through several iterations with the working group, and the final versions go to a designer for polishing.
  • The mockups appear on RealtimeBoard, where we approve them; then they go to development and are shipped to users.
  • We collect data, check whether our hypotheses are confirmed, and, as a rule, plan the next experiment.
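The ICE scoring mentioned in the steps above can be sketched in a few lines. The hypothesis names and scores below are invented for illustration, and the formula used (Impact × Confidence ÷ Effort) is one common variant of the method, not necessarily the exact one the team applies:

```python
# Hypothetical ICE backlog: rank experiment ideas by Impact x Confidence / Effort.
# All names and numbers here are illustrative, not real data from the article.

def ice_score(impact, confidence, effort):
    """Higher impact and confidence raise the score; higher effort lowers it."""
    return impact * confidence / effort

hypotheses = [
    {"name": "tips on first screen", "impact": 8, "confidence": 6, "effort": 2},
    {"name": "redesigned signup form", "impact": 5, "confidence": 7, "effort": 5},
    {"name": "welcome email rewrite", "impact": 4, "confidence": 8, "effort": 1},
]

ranked = sorted(
    hypotheses,
    key=lambda h: ice_score(h["impact"], h["confidence"], h["effort"]),
    reverse=True,
)
for h in ranked:
    print(h["name"], ice_score(h["impact"], h["confidence"], h["effort"]))
```

Whatever scale you pick (1–10 is typical), the point is to compare hypotheses against each other, not to treat the scores as absolute truth.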



Onboarding is everything, or a few words about activation

We run a great many tests and experiments: emails, website pages (including individual elements on them), registration forms, the placement and look of tool buttons.

Here I want to tell you about our experiments with the first user session.

What’s the problem, or “come back and all is forgiven”

The first session largely defines a user’s subsequent behavior in the product: activation, returns, and purchase. We kept repeating this to ourselves, and it was obvious to everyone that onboarding was what we needed to work on.

It became even more obvious when we looked at the data and started searching for ways to increase Retention. We found a potential source of growth one step earlier in the funnel: Activation.

We had trouble defining our activation metric (the service is very flexible and is used for many different tasks). So we started from Retention and set out to find which actions during a user’s first session were most likely to lead to further visits.


After analyzing everything users do in the service during their first visit, we found that those who add files or text notes to a board are more likely to come back and keep working with RealtimeBoard. It became clear which actions to encourage.

If you can’t define your activation metric yet, watch what returning users do; you will almost certainly find patterns.
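A minimal way to look for such patterns is to group users by their first-session actions and compare return rates. The event log below is entirely made up; in practice these rows would come from your analytics system (KISSmetrics in our case):

```python
from collections import defaultdict

# Hypothetical first-session log: the actions each user performed in their
# first session, and whether they later returned. All data here is invented.
sessions = [
    ("u1", {"added_file"}, True),
    ("u2", {"added_text"}, True),
    ("u3", set(), False),
    ("u4", {"added_file", "added_text"}, True),
    ("u5", set(), True),
    ("u6", {"added_file"}, False),
]

def return_rate_by_action(sessions):
    """For each first-session action, compute the share of users who returned."""
    counts = defaultdict(lambda: [0, 0])  # action -> [returned, total]
    for _user, actions, returned in sessions:
        for action in actions:
            counts[action][1] += 1
            if returned:
                counts[action][0] += 1
    return {a: ret / total for a, (ret, total) in counts.items()}

rates = return_rate_by_action(sessions)
```

Actions whose return rate clearly beats the baseline are candidates for your activation metric, though correlation alone doesn’t prove they cause the returns.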


Numbers and facts

Point 0. Before the experiment, the first session looked like this: after a basic welcome scenario, an empty board was created where a person could do anything. The tool is fairly simple, and a user could figure everything out by clicking all the buttons. Here’s how it looked:


Clear, but a little empty and lonely, isn’t it?

We looked at the analytics and saw that the gap between our actual activation rate and our target was roughly 15 percentage points (which is a lot).



What to do? We decided to build tips nudging users toward these target actions. Before designing our own variant, we collected references from other products with a similar audience profile, as well as examples from successful, popular products.




Together with a designer and a copywriter, we created several variants of the tips; the process is shown in the picture below. The board holds a map of all the screens, and that’s where we discuss the logical sequence, visual solutions, and copy.

As a result, we created two types of tutorial: a “static” one (the screen is blocked, and the user clicks Next, Next, Next to finish) and an “interactive” one (the user has to perform a target action to move on).

Variant 1. Static: the user clicks the Next button.


Variant 2. Interactive: the user performs a target action to go further.



Checking hypotheses

Initially we bet on the interactive tutorial: the user performs the necessary action right away (forming a behavioral pattern) and sees the result (a feeling of progress).

Step 1. We launched an A/B test that ran for 7 days with 1,624 participants. Besides the data from the analytics system, we also checked Inspectlet and watched recordings of user sessions.

The result: more people abandoned the interactive tutorial. Only 8.4% completed it, while 74.3% finished the static one.

Activation and Retention rose across the board: by 8% for users who completed the interactive tutorial and by 16% for those who completed the static one.


So the static tutorial, the simpler variant to implement, worked better for us.
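With a gap as wide as 8.4% versus 74.3% the winner is obvious, but for closer results it’s worth checking that the difference isn’t noise. A standard pooled two-proportion z-test is sketched below; the article doesn’t give per-variant group sizes, so the even 812/812 split is an assumption for illustration:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-statistic; |z| > 1.96 ~ significant at the 5% level."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 1,624 participants total; the even 812/812 split is assumed, not reported.
z = two_proportion_z(int(812 * 0.743), 812, int(812 * 0.084), 812)
```

Here z is far beyond 1.96, so the completion-rate difference would be highly significant under these assumed group sizes.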


Step 2. It would have been easy to just ship the winning variant, but we thought about how to improve it. The next test had three variants: the standard one, one that couldn’t be closed by clicking the cross, and a mandatory tutorial (with no option to skip it).

Even though every additional screen or step risks losing the user, we decided to make the tutorial mandatory because it had only three steps and they were fairly simple.


The test ran for 7 days with 1,192 participants. The results showed that the static tutorial that couldn’t be closed by clicking the cross worked best.


Step 3. We added the winning variant to the overall welcome scenario and started optimizing each step where users dropped off.

Here’s how we present results within the team (the stickers show the percentage of users who move on to the next step). It lets us make decisions quickly:
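The step-to-step percentages on those stickers are straightforward to compute from raw step counts. The funnel below uses invented numbers purely to show the calculation:

```python
# Invented per-step counts for a three-step tutorial funnel.
funnel = [
    ("tutorial shown", 1000),
    ("step 1 done", 870),
    ("step 2 done", 820),
    ("step 3 done", 790),
]

def step_conversion(funnel):
    """Share of users at each step who came from the previous one."""
    return [
        (name, count / funnel[i - 1][1])
        for i, (name, count) in enumerate(funnel)
        if i > 0
    ]

for name, rate in step_conversion(funnel):
    print(f"{name}: {rate:.0%}")
```

The step with the lowest step-to-step conversion is the one to optimize first, since that’s where the funnel leaks most.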




The experiment is fairly simple, yet it shows that merely adding or removing an element on a screen can multiply conversion to a target action. For example, tutorial completion is now about 80%, and we’ve increased Activation by 16%.

Retention has also increased by 8%.

What we’ve learnt:


  • Users need guidance, but not hand-holding. What works will vary from case to case, so test.
  • If you design tips for the first session, make them easy to adapt to changes and universal in form. Ideally, build a framework you can use to create all the screens or tips for different scenarios (because no tutorial lasts forever).
  • Even the simplest and “cheapest” actions can deliver a good result.



What’s next?

We keep experimenting. Our product is very flexible and is used for a wide range of tasks, so we plan to adapt the first session to each task type (the Jobs to Be Done approach).
