Virtual conferencing tool

Recorder is an application for conference speakers to record their talks and send them to virtual conference organizers. It took 5 months, 13 iterations, and 22 user tests to take it from idea to production. 12,891 speakers used Recorder in its first 5 months.

About the project

Virtual conference organizers don't like live streaming because of connectivity issues. For the audience to have a quality virtual conference experience, the presentations need to be delivered in advance. Recorder is an application that allows speakers to record their talk directly in the browser and send it to the conference organizers in a few clicks. The client already had the first version of the product; my goal was to redesign it and optimize it based on feedback.

What I did

  • In-depth interviews with 11 speakers
  • In-depth interviews with 12 producers
  • Designing a prototype in Figma
  • Prototype testing with 24 speakers
  • User interface design

I used these methods

  • In-depth interviews
  • Double diamond
  • Gamification
  • Usability testing
  • Prototyping
  • Decision trees

Project timeline

Recorder 2.0

First, we needed to create an MVP. It had to be drafted within 14 days because the conference was taking place in September 2020.

Design Sprint Workshop

I used methods from the Design Sprint. In the initial workshop, we used the Braindump exercise to identify key issues from the previous version, used the Dot-voting method to define priorities, and formulated HMW (How Might We) questions. The Crazy Eights exercise helped us create the first sketches of the solution.


Based on the output from Crazy Eights, I designed a prototype in Figma.

User testing

I tested the prototype with 5 speakers and 4 producers via video call. The speakers were given a link to the prototype and shared their screen with me while I gradually gave them tasks to accomplish. I got additional feedback from the producers, who had processed hundreds of pre-recordings and therefore had very good insight into what speakers usually struggle with.


After the first two calls, I found that some of the labels were misleading and speakers expected them to mean something else. I rewrote them and found no major problems in the rest of the user testing.

The MVP deadline was met: the MVP was designed and tested within 14 days.

We later renamed the MVP to Recorder 2.0. During the design of Recorder 3.0, we actively collected feedback on Recorder 2.0 and proposed optimizations based on it.

Design system

We created a design system for Recorder 3.0, based on Material Design and the Atomic Design framework.

Recorder 3.0

The third iteration of Recorder combined optimizations based on user feedback with the implementation of new features. Recorder 3.0 was intended to be a friendly, easy-to-use tool.

Double diamond process

In designing this stage, I used the Double Diamond process. Based on feedback from the MVP, I defined the main problems I wanted to solve in this iteration.

01 - Research

I started with in-depth interviews with users. I was interested in how speakers felt about giving talks and what they thought of virtual conferences. I interviewed 6 speakers; each interview lasted approximately 40 minutes. I prepared questions in advance and took notes on what the speakers said.

Research questions:

  • Do you have experience in giving lectures?
  • What tool did you use?
  • Tell me about your last lecture.
  • How did you feel?
  • Are you nervous when you record something?
  • How do you prepare for it?

Key findings

I found that speakers prefer traditional conferences because they can see the audience and adjust their tone and pace of delivery accordingly. 5/6 speakers said they prefer live streaming to pre-recording because a live audience doesn't expect everything to be perfect; they feel that in a pre-recording everything has to be perfect, like in a movie. 4/6 speakers prepare their speech carefully, word by word. They would like to be able to pause the recording and edit out the parts that weren't perfect. All of them told me that what they miss most about virtual conferences is the coffee breaks and networking.

Feedback from MVP

Feedback from the MVP was positive. A form at the end of the MVP asked for feedback; the average rating from 6,800 speakers was 3.9/5 (median 4 stars).

02 - Define

In-depth interviews with producers

After the research, I started interviewing the producers, because they are the ones who send speakers instructions for recording their presentations. There can be up to 6,000 pre-recorded presentations per event. I presented the research data to them and asked about their experience with specific issues. We found that some of the problems identified in the research were not major (over 95% of speakers had no trouble with them), so we decided to focus on the major problems.

Issues we focused on

The most common feedback on the MVP was the lack of an editor: if speakers wanted to correct part of their talk, they had to re-record the whole thing. We knew this already during the MVP, but unfortunately there was no time to implement it then. It became the number one priority for Recorder 3.0; the second priority was to allow speakers to record multiple versions and share them with colleagues.

03 - Design

After we defined the problems we wanted to focus on in this iteration, I started to prototype the solution. I designed an editor, a versioning system, an interface for uploading, onboarding, and a special mode for multi-speaker group presentations.


First, I drafted wireframes. Based on past bad experiences with testing low-fidelity prototypes (wireframes) with end users, I decided to test them only with the producers. They gave me valuable feedback, and after a few iterations the wireframes were finished.

High-Fidelity prototype

The graphic design of the high-fidelity prototype was based on the established design system.

I wanted to make sure everything would work as planned, so I ran 22 user tests over 13 iterations on this project. I also consulted two external designers to make sure the design system would work and scale in the future.

04 - Development

I prepared the project specification for the developers, and the product was ready in 1.5 months.

Final version

The product went into use immediately. Annually, Recorder is used by over 30,000 speakers from around the world at various virtual conferences. After speakers submit a recording, they have the option to send us feedback on Recorder (1-5 stars). The median rating from over 10,000 speakers is 4 stars.

PS: I am very proud of this project 😊