Digital product designer with more than 10 years of experience working at companies large and small, and a track record of delivering simple, easy-to-use products that solve real user problems.
🐶 Loves corgis
📸 Street photographer
🌮 Tex-Mex lover
⚽ Soccer fanatic
😂 Too many dad jokes
Nutrisense
💼 Senior Product Designer
📅 09/2022 - 03/2024
📍 Remote · 👥 1,000s of members
Nutrisense is a Series A-backed startup that uses continuous glucose monitors (CGMs) to provide real-time glucose data, helping users understand the impact food and exercise have on their individual bodies.
Meal Logging
One of the key components of the Nutrisense experience is being able to track how different events correlate with changes to one's glucose throughout the day. In order to do so, users can log the foods they eat as well as other activities throughout the day such as exercise and sleep.
Nutritionist Tab
Another key component of the Nutrisense experience is the interaction a member has with their assigned nutritionist. This experience, however, had been duct-taped together using Intercom and commingled with the customer support experience. In this project, we wanted to make the nutritionist experience a first-class citizen within the app by creating a new tab dedicated to a member's interactions with their nutritionist.
DBK Labs is an app design studio I founded with the goal of fixing user experience issues larger companies seem to neglect. My category of focus was streaming services, such as Netflix and Hulu.
Hypothesis
I treated the entire company as one large experience design problem, and my hypothesis was that if I fixed annoying issues with streaming services, users would find enough value to pay actual money for the fixes.
Background
This endeavor grew out of a Chrome extension I created for Netflix called Netflix Classic, built to fix several personal frustrations with the Netflix experience in my browser. The main frustrations included auto-playing trailers Netflix would force you to watch, awkward hover states on posters that made scrolling painful, and the profile-selection prompt appearing every time you returned to the site.
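For a flavor of how fixes like these work, here is a minimal sketch of a content script in the spirit of Netflix Classic. To be clear, this is not the extension's actual code, and the selectors are hypothetical stand-ins, since Netflix's class names change frequently:

```javascript
// content.js — illustrative only; selectors are hypothetical stand-ins,
// not the real Netflix DOM or the extension's actual source.
const AUTOPLAY_SELECTOR = '.billboard video, .preview-modal video';

function muteAndPause(video) {
  video.muted = true;
  video.pause();
}

// Pause any autoplaying trailer the moment it is injected into the page.
const observer = new MutationObserver(() => {
  document.querySelectorAll(AUTOPLAY_SELECTOR).forEach(muteAndPause);
});
observer.observe(document.body, { childList: true, subtree: true });
```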
The extension received a strongly positive reception, which confirmed to me that fixing annoyances with streaming services could be something people would be willing to pay for.
One of the other main annoyances bothering me personally (and, I assumed, others too, based on the reception to my previous fixes) was that I had to go into the browser to watch Netflix. This forced me to navigate away from other work I was already doing in the browser, and the context switching got confusing. In other words, there was no standalone Netflix app for Mac, and I wanted to fix that.
Long story short, I doubled down on these problems to create a macOS app for Netflix that both lets users instantly launch the app from their Mac dock and fixes major experience issues these services were not addressing. After the Netflix app garnered a decent amount of initial attention, sales, and press, I continued down this path to develop a suite of macOS apps for all major streaming services with experience issues.
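The case study doesn't detail how the apps were built, but the core idea of a dock-launchable wrapper can be sketched in a few lines. Assuming an Electron-style stack (the real apps may well use something else, such as a native WKWebView wrapper), it might look like this:

```javascript
// main.js — minimal sketch only; the actual apps' stack and fixes differ.
const { app, BrowserWindow } = require('electron');

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 1280, height: 800 });
  win.loadURL('https://www.netflix.com');

  // Patch experience issues by injecting CSS after the page loads,
  // e.g. hiding autoplaying billboard trailers (selector is hypothetical).
  win.webContents.on('did-finish-load', () => {
    win.webContents.insertCSS('.billboard video { display: none !important; }');
  });
});
```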
Process
To identify experience issues in each app, I did several things:
Conducted an audit of the entire streaming service, testing out all features and functionality
Scoured online user forums to look for issues most complained about
Surveyed friends and family about things that particularly bothered them
Used my experience to pick out issues that users were likely to complain about
After releasing an app, I also listened to customer feedback and released updates to the apps to fix pain points I may have missed beforehand
For those interested in my visual design or landing page work: I designed all of the icons and landing pages myself.
Outcome
Tens of thousands of sales. The experience issues I identified were clearly enough of a problem that people were willing to pay money out of their own pockets to fix them, validating my overall hypothesis.
Press
Udemy
💼 Product Designer
📅 06/2015 - 02/2019
📍 San Francisco · 👥 40 million students & 50k instructors
Udemy is a two-sided, online educational platform where instructors can create and manage online courses, and students can discover and take courses.
Case Studies:
Other Work:
Meal Logging Case Study
📝 Summary
One of the key components of the Nutrisense experience is being able to track how different events correlate with changes to one's glucose throughout the day. In order to do so, users can log the foods they eat as well as other activities throughout the day such as exercise and sleep.
Up until this point, the meal logging experience had been "designed" by engineers, and there were many, many problems with it. The goal of this project was to determine all of the problems with the current experience and design a best-in-class meal logging experience as the output.
🧑🏻‍💻👩🏽‍💻 Team
1 product manager, 2 engineers, and 1 product designer (me).
⏰ Duration
3 months
🧠 Problem
At a high level, there were significant issues with the design of the existing meal logging experience, including but not limited to:
Too many taps required to log a meal
Tap targets not obvious
Not clear how to access Recent and Favorite Meals
No indication of what is and is not required
Multiple bugs
Bad search experience
🔄 Process
To make sure all of the problems with the current experience were properly evaluated, I:
Reviewed all customer support tickets related to meal logging
Reviewed comments related to meal logging in our members' Facebook community
Spoke with our in-house nutritionists who work with members one-on-one
Conducted a heuristic analysis of the existing experience
Conducted unmoderated user research with different meal logging tasks
I also conducted a competitive analysis to see what other best-in-class meal logging experiences look like, such as MyFitnessPal and Lose It!.
After gathering all of this data, I consolidated all of the information and grouped the problems into 5 buckets:
It requires too many taps to add a meal
Confusing / Overwhelming design
I already log meals in a different app
Barcode scanner does not work well
Food database is not good
Setting Principles
Upon analyzing the meal logging process, it became clear that we as a company needed a guiding principle for what we wanted users to do when logging meals. There were several options:
Require members to add individual ingredients with accurate quantities and nutritional information
Allow members to log a meal in as much or as little detail as they want, for example only a photo or only a text description.
Guide members to log the right level of detail based on their unique situation.
Nutritionists within the company wanted members to log meals with accurate nutritional information.
However, requiring members to be detailed about what they ate would increase complexity, the number of taps, and the time it takes to log a meal in general. We knew this would lead to fewer meals being logged, and we as a product team did not want that to happen.
It was difficult to get consensus within the company on a perspective, but I kept the overarching goal of the project, making meal logging easier, as a guiding principle throughout the process.
Explorations
Armed with a solid understanding of the problems, I explored many different solutions, including both small fixes to low-hanging fruit as well as more significant restructuring options.
After reviewing these explorations with the PM and engineers on the team, we decided to move forward primarily with the first option: asking for info in stages. This primarily addressed the problem of the experience being overwhelming and unclear about what needed to be done.
High Fidelity Designs
At this point, I created a fully functioning prototype in Figma of the multi-step option and started reviewing it with others in the company, including the rest of the design team, the product team, and nutritionists.
Review
During the review process, I started to get an uneasy feeling that while the design did solve the second problem of being confusing and overwhelming, it did not make logging quicker for those who did not want to add every type of information.
We had only one week of allotted time remaining to finalize the designs, but I felt strongly that this was not the right direction. We had to pivot.
Pivot
Around the same time, I had been digging into the docs for the third-party library we used for our food database. It just so happened that they had just released a new natural language processing feature that could instantly convert text into a list of ingredients with accurate quantities.
I quickly threw together a prototype of another option that utilized this new NLP technology and, I believed, better addressed all of the problems, and shared it with the team.
This new solution also gave us the best of both worlds. Logging a meal was quick and easy, yet it could still include individual ingredients with accurate quantity sizes.
The head of engineering was amazed, and everyone else internally loved it too. It made accurately logging a meal as simple as texting a friend. No more messing with convoluted input fields and buttons and switches.
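To make the flow concrete, here is a rough sketch of what the NLP logging path looks like. The case study does not name the third-party provider, so the endpoint, request shape, and response shape below are all hypothetical:

```javascript
// Hypothetical sketch: free-text meal description in, structured
// ingredient list out. Provider, endpoint, and fields are illustrative.
async function parseMeal(description) {
  const res = await fetch('https://api.example-food-db.com/v1/nlp/parse', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: description }),
  });
  const { ingredients } = await res.json();
  // e.g. [{ name: 'scrambled eggs', quantity: 2, unit: 'large' }, ...]
  return ingredients;
}

// Logging a meal becomes as simple as typing a sentence:
parseMeal('2 scrambled eggs and a slice of sourdough toast')
  .then((list) => console.log(list));
```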
Outcome
Armed with a design we were confident solved the problems, I worked closely with engineering to implement the designs over the next few weeks.
Feedback from members was very positive, and nutritionists reported a significant drop in confusion from members about how to log their meals.
One member even said: "Meal logging in Nutrisense is even easier than MyFitnessPal now. I love it!"
What's Next
Several pieces of feedback we received throughout this process did not fit into the tight timeline for this round, but a few other improvements that would make this experience truly best-in-class include:
Voice to text
Ability to favorite ingredients, not just meals
Faster entry / new entry points, such as iOS widgets, Siri and Alexa integration, and MyFitnessPal integration
Merchandising
📝 Summary
A series of projects related to merchandising:
Global Nav Bar
📝 Summary
One of my earlier projects at Udemy was to redesign the global nav bar. The nav bar contains the Udemy logo, category selector, search bar, shopping cart, notifications, account info, and more.
Become an Instructor Landing Page
📝 Summary
The Become an Instructor page is the first page prospective instructors see prior to signing up for Udemy. The goal of this page is to convert a prospect into a sign-up. As such, it is crucial to clearly communicate the benefits of teaching on Udemy in a straightforward manner while proactively answering the questions prospects will have.
Instructor Analytics Dashboard
📝 Summary
This project focused on the instructor side of the marketplace, and the overarching goal was to empower instructors to better understand their performance on Udemy and make better decisions as a result. Performance here refers to financial performance, marketing performance, and course quality as evidenced by student behavior in the courses. By decisions, we mean instructors improving existing courses, improving marketing tactics, and creating more courses that will be successful.
🧑🏻‍💻👩🏽‍💻 Team
1 product manager, 3 engineers, 1 product designer (me), and 1 user researcher.
⏰ Duration
~ 9 months
🧠 Problem(s)
In-depth user research made clear that instructors did not have a good grasp of how they were doing on Udemy and were resorting to all sorts of unusual behavior to figure it out. As a result, a team was formed to address this, with the hypothesis that surfacing more information in a simple, easy-to-use way would help instructors perform better. Specific problems included analytics data being scattered all over the product, making performance hard to track effectively, and key information missing from the analytics that did exist.
For example, there was no way to track revenue on a per-course basis over time short of manually entering it into a spreadsheet every month (sadly, some instructors were doing exactly this). Instructors also had no good way to track enrollment numbers and rating scores over time; spreadsheets were used for these as well. At the time, it was impossible for instructors to see the number of students while in the Analytics section of a course.
Additionally, Google Analytics, which many instructors used to get data on how their course landing pages were performing, was to be deprecated in a few months due to GDPR issues. Without some sort of replacement, instructors would be very upset.
💭 Opportunity
In addition to addressing the problems, we recognized a big opportunity to highlight the incredible global reach instructors have. We knew from past marketing emails that this information excites instructors, and that it would help motivate them to continue teaching on Udemy and create more courses.
📋 Plan
After considering several options, including squeezing additional analytics into the existing analytics of each course, our team decided to start fresh and create a new tab called Analytics that would house all instructor analytics going forward, both aggregate and per-course. We chose this direction because a central location where instructors can quickly and easily understand how they are performing directly addressed our hypothesis. We launched it as a closed beta to gather feedback and improve as we went. We planned not to change any other existing analytics locations for now, but to use the new tab as a foundation to start pulling in more and more of the data living in those other locations.
🔄 Process
We ran a very iterative design process with a strong emphasis on user research, as it is difficult to predict how users will react when seeing live data and when they are in many different performance situations. For example, some instructors have only one course on Udemy, while others have hundreds. Some instructors have only hundreds of dollars in revenue, while others have millions. A second reason is that instructors have notoriously been very sensitive to any changes to the platform and have not been shy about making their feelings known online.
We planned on breaking this project out into four phases as it was so large:
Phase 1 - General structure and Overview tab
Phase 2 - Students tab
Phase 3 - Traffic & Conversion
Phase 4 - Course Engagement tab
We first released the product as a closed beta consisting of 14 instructors who were a mix of beginner, intermediate, and expert. We planned for it to last 2 months but extended it an additional 2 months. We did rolling releases every 2 weeks and had baseline, midpoint, and end calls. We also required twice-weekly diary entries from the participants.
The process for each phase started with a one-pager laying out the context, the problem, and the hypotheses we had as a team. Then I conducted an audit of our existing analytics offerings, did competitive research, sketched out lots of ideas, discussed with the team, quickly threw together prototypes, conducted user research, tweaked designs, passed them to eng, and ran them in our closed beta.
✏️ Designing the Overall Structure
One of the first steps in designing the overall structure was to conduct an audit of all existing instructor analytics throughout the product. Then, I created a list of potential data points to include, taking into account what we learned from user research with instructors and their existing behaviors (tracking stats over time in spreadsheets).
I also researched analytics dashboards from many other companies, including Google Analytics, Shopify, Weebly, Etsy, YouTube Studio, Mixpanel, Skillshare, Airbnb, and Spotify, to spark ideas and identify best practices around data visualization. A general pattern emerged from this research: an overview page presents the most important pieces of information, and each piece links to its own page with more detailed information. After this, I had a general idea of how I wanted the structure to work.
The overview tab would consist of the most important information from each section, and to dig deeper you could go into an individual tab.
At this point, our team was happy with the progress made, and we decided to move forward with this design and structure to push to our closed beta.
Because I coded the prototype using HTML, CSS, and JavaScript and utilized existing style guide components, handoff to engineering was very easy. All design specs were determined from the prototype, and I offered support when any questions about the design came up.
✏️ Designing the Students Tab
Based on the predetermined structure, we wanted the Students tab to be a place to dig deeper into the instructor's students at both an aggregate level and on a per course level.
The problem we were trying to solve for this section is the same as the overarching goal of the project - to provide instructors deeper insight into their courses in order to improve and create more courses.
Our hypothesis for this section was that by giving instructors more insight into who their students are, they would have a better handle on who their courses appeal to and could create better courses for them in the future. An additional hypothesis was that they would feel more connected to their students, understand their global reach, and feel more motivated as a result.
To address this hypothesis, our team brainstormed different data points we could surface that would give them more insight into their students.
Things we considered included:
Countries students are from (list and map)
Languages students speak
Search terms students are searching for
What types of courses students are also enrolled in
What other topics students are interested in
Reason for enrolling (personal or professional)
Student level (beginner, intermediate, advanced)
Photos of students (if possible)
How many students are repeat customers
Based on this, I explored many different potential designs for this data through sketching and wireframing.
Then, we conducted user research with several wireframes in order to see if the data we were presenting was understandable and actionable, as well as trying to figure out what was most important to instructors. During the user research, I quickly modified the wireframes when it became clear some information was confusing or not useful and tried other ideas throughout the process.
After user research, I met with the team one final time to see what was feasible from an engineering perspective, as well as what data we could actually use from our student surveys.
After this, I refined a prototype until we were confident in the design. Again, eng used the prototype as the design spec and I offered support whenever any questions came up.
This design was then pushed, piece by piece, to our closed beta group as I started on the next section.
✏️ Designing the Traffic and Conversion Tab
The problem we were trying to address with the Traffic and Conversion tab was that instructors do not have an easy and straightforward way to understand their marketing performance. This includes the performance of their outside marketing efforts as well as the way they present their courses on Udemy. As a result, they are left unsure whether they should try to improve the way they are marketing and presenting their courses to students. Additionally, it is unclear what actions they can and should take to improve the way they market and present their courses to potential students.
We hypothesized that by giving instructors better insight into how they are doing from a marketing perspective, they can better market their courses and be more successful overall.
The first step I took to address this problem was to analyze the existing analytics we provide instructors related to marketing performance.
I also conducted a competitive analysis to help generate ideas of potential data points to share and how to represent them visually.
Then, I ran a kickoff brainstorm with my team to discuss different data points, how useful we think they might be, and what is feasible from eng as well. From the brainstorm, we discussed:
Course landing page data:
Visits real-time, today, yesterday, this week, this month
Conversion rate (how many went on to purchase)
Bounce rate (how many landed and immediately left)
Promo video watch rate (and impact on conversion)
Time spent on landing page
Course discovery on Udemy
How many visitors to my landing page came from Udemy discovery traffic
What pages did they come from? Browse, search, home, etc
If search, what was the search term used?
If browse, what page?
How effective is each source in terms of conversion?
Today, yesterday, this week, this month
Wishlisted?
In shopping cart?
External traffic
What traffic sources are driving traffic to landing page
What is the % breakdown of traffic sources driving traffic to the landing page
How effective is each traffic source in terms of conversion?
Today, yesterday, this week, this month
How many conversions came from my existing student base?
Promo announcements:
Did my promo announcement lead to visits to my landing page?
Did my promo announcement lead to conversion on my landing page?
Tracking changes
When I changed X (course description, target audience, etc.), did it have an impact on conversion rate?
Journey Map
To surface all the potential data points that could be useful, it was important to map out all the different ways a student can discover and enroll in a course.
Sketching / Ideation
From here, I started sketching out different data points and ways of representing them. In this step, I tried to consider as many different options as possible without limiting myself with constraints.
Prototyping
I then jumped straight into prototyping in code as I continued to iterate through different data visualizations. After several cycles of this, we ran a phase of user testing to see instructors' reactions to the data.
User Testing Findings
Too much data confuses and overwhelms the user
There are two types of people - those that just want high level information and those that want to dig deeper into the numbers
Most of the information on an aggregate level is not useful here
Most instructors do not understand or think about the course discovery process; their mental model goes no deeper than people searching on Udemy
Affiliates stealing their money is a real thing
The Udemy engine is a mystery to users, but are we willing to reveal any of it to instructors? There are no actions that can be taken, except for improving SEO
Wording needs to be very clear when dealing with data
Understanding the actions happening on their CLP (course landing page) is very useful. Can we interpret behavior even more, like how many people were "presold" and just came to the page to buy the course?
Positive reaction to the way we broke out coupon code performance. Maybe do even more here
It would be useful to see referrals from all websites, not just from the ones you drove, to understand if you can capitalize on organic traffic from blogs or other sources
If there isn't an action that can be taken, what's the point of seeing the information?
A lot of numbers are shown over a period of "All Time", but for some users monthly figures are more useful
After these findings, I modified the design for a few more cycles until we were happy with it. Then, again bit by bit, eng started pushing out the designs to our closed beta group.
✏️ Designing the Course Engagement Tab
Problem: Currently, it is really challenging for instructors to understand how their students are engaging with and consuming their courses due to: 1) the usability and discoverability of the existing engagement analytics and 2) lack of exposed data, particularly around practical activities.
Hypothesis: We hypothesized that by helping instructors better understand how their students are engaging with and consuming their courses, instructors would be able to:
Stay motivated to continuously create courses and knowledge on Udemy, as instructors realize their students are spending time learning from their course material.
Generate ideas for future course topics and teaching methods by gaining a deeper understanding of students' specific learning interests within a published course.
Make informed decisions when improving existing courses.
Current Analytics
Currently, instructors have access to some course engagement analytics within each course's dashboard page. However, it is clear from previous rounds of user research that it is not very discoverable, and it is hard to understand. It is also overwhelming in the way that it is displayed. A common source of confusion was understanding what the double bar chart meant.
Competitive Analysis
I also conducted a competitive analysis to see how others are displaying consumption data across products from learning management systems to Wistia to the Spotify Artist dashboard. I was also particularly curious how Spotify determined what a "play" is. It turns out a song is counted as a "play" if at least 30 seconds are listened to. This is relevant as I was trying to figure out if there was a way to represent a student "watching" a lecture.
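Applying that idea to lectures, a threshold-based definition of a "watch" could be sketched like this. The thresholds are illustrative; the case study does not say what values, if any, Udemy settled on:

```javascript
// Illustrative only: count a lecture as "watched" using fixed thresholds,
// in the spirit of Spotify's 30-second "play". Values are hypothetical.
const MIN_SECONDS = 30; // absolute floor, like Spotify's rule
const MIN_RATIO = 0.5;  // or half the lecture, whichever comes first

function isWatch({ secondsWatched, lectureDurationSeconds }) {
  return (
    secondsWatched >= MIN_SECONDS ||
    secondsWatched / lectureDurationSeconds >= MIN_RATIO
  );
}

console.log(isWatch({ secondsWatched: 45, lectureDurationSeconds: 300 })); // true
```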
Brainstorm
I then ran a brainstorm with my team to try to generate ideas, with the prompt being: "How might we best communicate that students are engaging with an instructor's course(s)?"
Data Analysis
Digging deeper into the question of lecture engagement, it was important to understand how students were currently watching lectures in order to decide how to represent a "watch" or other types of engagement with a lecture. To do this, I consulted with a data scientist at Udemy to help determine how far students get within a lecture. From this, we thought we found that more than 75% of students get through 76% of a lecture (taking into account students jumping around and skipping information they don't care about). However, in doing this analysis, we also learned that the way we record this information is flawed, and unfortunately the data could not be used in this analysis.
Sketching / Ideation
In this phase of the design process, I explored lots of different data points, raised lots of questions, and started on very rough design layouts.
Prototyping
Once I had some questions answered and ideas narrowed, I started exploring different ways to visually represent the data. I did this in code to get a realistic feel quickly and easily.
User Research
In the middle of prototyping, we also conducted further user research to see if the information we were presenting was useful, usable, and actionable. One thing that came out of this was that instructors were completely unaware of the bookmark feature, and found this information very useful. They also particularly liked seeing results from practice activities, such as quizzes and assignments. They previously lacked the ability to access any of this data. Additionally, just like with Traffic & Conversion, some instructors liked just looking at the overall figures, while others liked to go into the very detailed information. This helped inform the design to put key overall figures at the top, but also have the more detailed information lower down.
Finalize
After the user research, the prototypes were tweaked a little more until we were happy with them as a team. Then, we pushed the designs out bit by bit to our closed beta participants.
Outcome
The new Performance section was extremely well received by instructors. During Udemy's annual instructor conference, the VP of Product revealed the first 2 phases of the product, and here is what instructors had to say:
Practice Tests
📝 Summary
Practice tests are tests designed to prepare a student for a standardized test, most commonly a certification exam. They are typically timed, have a large (100+) set of questions, and require a certain score to pass. The goal of this project was to add practice tests as a feature instructors could create for their students.
📖 Background
Prior to the development of the practice test feature, the only testing feature on Udemy was quizzes. However, quizzes were designed to be short, quick activities that reinforce material just taught by the instructor. Quiz mechanics reinforced the material by preventing a student from advancing to the next question until they got the right answer.
Adding a new practice activity to the course-taking experience was very uncommon during my time at Udemy. This project came out of a special group that was short-lived, but there was enough agreement within the company that such a feature should be attempted in order to create an additional revenue stream and enrich the learning experience.
Certification Prep was a thriving category on Udemy, and before introducing this feature, instructors of this type of course did several things to overcome the limitations of the Udemy platform. This included linking to other websites that had practice tests or trying to hack the quiz feature for this purpose. Using the quiz feature was not a good experience for students, however, because of how different the mechanics of a quiz were from a certification exam. The last thing the company wanted was to have students leave the platform for a better certification exam prep experience if they didn't have to.
🧑🏻‍💻👩🏽‍💻 Team
In the beginning, another designer was assigned to this project on top of 2 other projects at the same time; however, it quickly became clear that this was not sustainable. As a result, I was moved from working on course discovery projects to being the lead product designer on Practice Tests.
One of the first things I was told was that the timeline was extremely aggressive (the business side thought this would require little design/eng work since we already had the existing quiz functionality; they didn't understand the true scope of the project). I was also told that, due to the condensed timeline, engineering was already working on core functionality while design was happening. It was clear to me that this was going to be anything but a normal project.
⏰ Duration
3 months
🔄 Process
There was already a general idea of the core features necessary for Practice Tests; however, much still needed to be fleshed out, and close work with engineering was required to get the project to completion. It was clear that the product manager and engineering team were desperate for a designer solely dedicated to the project, and I was thrilled to work on a project in such a challenging environment. I am always up for a challenge.
The first step I took was to meet individually with the previous designer, product manager, engineers, and business team involved in order to get caught up on the project as quickly as possible.
The second step I took was to conduct background research on the certification prep category. Even though the project was already in motion, I wanted to approach this as a fresh project to familiarize myself with the subject matter and help identify problem areas not already considered. Several things I did to familiarize myself included:
Conducting a competitive analysis of other practice test experiences on the internet
Sitting for an actual certification test put on by a testing company to better understand the needs from the student's perspective
Reviewing existing courses on Udemy to see how instructors were currently managing without a practice test feature. This also let me see the type of questions asked on these practice tests, and the way answers were written.
Reviewing existing quiz functionality
Reviewing the previous designer's work up to this point (note: while some work had been done by the previous designer, as lead designer I had the agency to review all of it and change anything I disagreed with; all design was ultimately my responsibility):
General user flow
Progress Bar with timer and question number
Pause button
Stop button
Mark for Review Button
Test Results page (partial)
After doing all of this background research, I noted several gaps in the original design, areas that had been missed or just not thought out well enough. The gaps I identified were:
Multi-select answer and question submission
Question to question navigation
Students could not navigate to the previous question or jump to another one, a holdover from quiz mechanics. The only option was to advance to the next question. This was not the way practice tests should work, nor the way real certification exams function.
They would only see the option to review skipped questions at the very end of the test.
Test results page design
What happens when the instructor makes changes to the test? There needed to be some way to manage different versions of the test, particularly with regard to students reviewing their test results. Imagine getting a question wrong originally, but then when you come back the next day, that question is now marked as correct. No bueno.
What happens when time runs out?
Immediately end test and show results?
Let student continue answering questions?
Give student the option to continue but the result will not count as pass?
Let student continue to answer but grade them based on what they answered before time ran out?
To go about this, I first prioritized the features with the product manager and engineers based on what was required for launch, then set about designing each. I would generally sketch lots of different ideas, then move to wireframes. Then, I created a high-fidelity prototype in HTML, CSS, and JavaScript. I did this to get a sense of the interactivity of the practice test features we were building. This is, after all, a very interactive experience with a lot of moving parts, such as a countdown clock, answer selection, and question navigation. It also allowed me to demonstrate our progress and decisions to the product manager, design team, and entire business team, and to test the usability of the features on the fly by conducting user research with random people in the company. Finally, the prototypes were used as the design spec for engineering (with any questions answered in person).
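As an illustration of the kind of logic those prototypes contained, here is a minimal countdown-clock sketch in the same HTML/CSS/JS spirit (a rebuild of the idea, not the original prototype code):

```javascript
// Minimal countdown sketch, not the original prototype code.
function startCountdown(totalSeconds, onTick, onExpire) {
  let remaining = totalSeconds;
  const timer = setInterval(() => {
    remaining -= 1;
    onTick(remaining);
    if (remaining <= 0) {
      clearInterval(timer);
      onExpire();
    }
  }, 1000);
  return () => clearInterval(timer); // handle for pause/stop buttons
}

// Usage: a 90-minute practice test
const stop = startCountdown(
  90 * 60,
  (s) => console.log(`${Math.floor(s / 60)}:${String(s % 60).padStart(2, '0')} remaining`),
  () => console.log('Time is up'),
);
```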
Multi-select Answers and Answer Submission
One of the first things I tackled, a definite requirement for launch, was allowing for multiple answers to a question, not just one. The answer design for quizzes simply had a number next to each answer, and when a user selected an answer, it would immediately reveal whether the answer was correct or incorrect. This was both incompatible with a multi-select scenario and not the behavior a student would experience taking a real exam.
To change this, I first knew the answer style would have to change. Concurrently, there was a big effort across the company to move all designs on Udemy to use re-usable components from a style guide. As such, I knew the design should try to use existing form components if possible.
Because we already had radio buttons and checkboxes in our style guide, I decided to use these form components for answer selection. I also worked with eng to modify submission behavior so that the user would not see results upon selecting an answer, and would instead only advance to the next question by pressing the "Next" button.
We received feedback from users that the standard form elements were quite small to press, especially with so much unused space on the rest of the screen. In response, we extended the reusable component with a bordered box spanning the majority of the allotted width to make selection easier.
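A simplified version of that extended component might look like the following; class names and styling values are hypothetical, not the style guide's actual tokens:

```html
<!-- Illustrative sketch: the style-guide checkbox wrapped in a full-width
     bordered label so the whole row is tappable. Names are hypothetical. -->
<label class="answer-option">
  <input type="checkbox" name="answer" value="a" />
  <span>Both A and C are correct</span>
</label>

<style>
  .answer-option {
    display: flex;
    align-items: center;
    gap: 12px;
    padding: 16px;
    border: 1px solid #ccc; /* the bordering box */
    border-radius: 4px;
    cursor: pointer; /* the entire row, not just the checkbox, is clickable */
  }
</style>
```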
Question-to-Question Navigation
Problem: A standard feature of most certification exams, and therefore practice tests, is the ability to navigate between questions. As mentioned above, practice tests were built on the existing quiz platform, and quizzes did not support this mechanic.
Process: To add this functionality to the practice test design, I drew on my knowledge of the different types of navigation a test needs, gained from the competitive research as well as from sitting for the real certification exam.
There were two main types of navigation:
A simple back button
A more complex method of jumping to any previous question, with the ability to see which ones were skipped or marked for review
With this in mind, I quickly iterated through a lot of different designs:
While exploring, I remembered that the course-taking platform within which practice tests would live already contained a Q&A feature for regular lectures, which behaves by sliding a panel out from the right of the screen. This felt perfect to reuse: I could stay consistent with the placement of the button that reveals questions as well as with the contents of the panel. Another requirement was letting the user view just the questions they had marked for review or skipped, so they could come back to them at any point during the test. After trying several different designs, I settled on a dropdown filter that sat at the top of the question list.
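The filter logic itself is simple; a sketch of it (with a hypothetical data shape, not the production code) is below:

```javascript
// Sketch of the question-panel filter: all, skipped, or marked for review.
// Data shape is hypothetical.
function filterQuestions(questions, filter) {
  switch (filter) {
    case 'skipped':
      return questions.filter((q) => q.skipped);
    case 'marked':
      return questions.filter((q) => q.markedForReview);
    default:
      return questions; // 'all'
  }
}

const questions = [
  { id: 1, skipped: false, markedForReview: true },
  { id: 2, skipped: true, markedForReview: false },
];
console.log(filterQuestions(questions, 'marked')); // [{ id: 1, ... }]
```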
After discussing the designs with the broader design team and the VP of Design, I made a couple of further tweaks, then prototyped it in HTML, CSS, and JavaScript to demonstrate the interactivity and to serve as a spec for engineers. You can find one of the prototypes below:
Test Results Page
For the test results page, I took what the previous designer did as a starting point but still went through the whole design process to get the best design possible. I wireframed quite a few different designs, with the goal of creating a hierarchy of data based on what we thought was most important to the student in this scenario.
While the previous designer put equal weight on Correct, Incorrect, Skipped, and Elapsed Time, I believed the primary data point students cared about most was % correct. I also believed the original design was quite cluttered, so I simplified it by displaying one pie chart including all the data instead of splitting it into multiple charts. One additional change was the position of the "Review Questions" button, as this is the primary action a student can take. It was previously located below "Knowledge areas".
Once a student had taken the practice test multiple times, results other than the latest would show collapsed in an accordion surfacing the most important data. I modified this as well to fit the established hierarchy.
Standalone Practice Test Bundles
The launch of this feature was also different from a lot of other feature releases at Udemy: an instructor account manager worked with some of the most popular certification prep instructors on Udemy to prepare practice tests for the launch, and product marketing was needed to promote the new feature to give it as good a shot as possible at success. Because of all the other teams involved, the launch was set to a fixed date.
Only 3 weeks before launch, the business team decided it would be a good idea to not only include practice tests within an existing course, but also sell a standalone unit consisting solely of practice tests. Clearly, our team was concerned given the enormity of the changes occurring at the last moment, but due to the company structure and decision-making at the time, we had to try our best to launch with something resembling this. I tried to explain the complexity of such a request, but it fell on deaf ears.
To approach this request, I took a step back and reviewed the entire course creation experience, marketplace experience, and course-taking experience, identifying all areas that didn't make sense for this sort of bundled practice-test-only course. There were many.
We also didn't know what to even call this object, as we had never sold anything other than a course before at Udemy. Should it still be called a course? Should the visual representation be differentiated in the marketplace to signify this?
Because of the extreme time crunch, we went with the approach that was easiest from an eng perspective while I did my best to keep customers from being completely confused, with the promise that after launch we would be given time to design this feature properly.
Outcome / Results
Despite the hectic process, we successfully launched the features into the market. After a year, the number of practice tests created and the revenue generated were lauded as a success by the company; the feature still exists to this day and is touted as a benefit when signing new accounts up for Udemy for Business. I can't reveal financial figures, but in its first year it did almost as much in sales alone as the entire UFB team.
Course Comparability
📖 Background
Previous research conducted by the User Research team at Udemy revealed that one of the biggest problems facing prospective students is distinguishing between the myriad very similar courses on Udemy. The problem stems from natural competition between instructors on the platform, compounded by the fact that Udemy put no restrictions on the number of courses of a given type. For example, search for SQL on Udemy and you will see thousands of courses. There are differences between them, but sussing out those differences and finding the right course for you is slow and confusing. As a result, the company feared it was losing customers who gave up before finding what they were looking for.
🧑🏻‍💻👩🏽‍💻 Team
1 product manager and 1 product designer (me). The engineering team on this project was remote, and it was very difficult to involve them collaboratively. Additionally, the company culture at this stage was transitioning from one where product managers dictate what gets built to whole-team collaboration, so engineering was not yet involved.
🧠 Problem
As mentioned in the Background section, the problem was that it was difficult for prospective students to find the right course amid an overwhelming and confusing sea of similar courses. We also knew that our search algorithm was not very good at the time (though another team was working on improving it).
Hypothesis
My product manager and I ran a kickoff session to discuss the problem and come up with hypotheses on how to address it. The first step we took was to map out the different user flows a prospective student on Udemy takes during the course discovery process. This laid all of the potential areas for improvement out in the open and stimulated better discussion on the topic.
We also discussed many possible solutions we hypothesized could address the problem at hand.
As you can see from the whiteboard, we considered:
Comparison tool
Quickview on hover of course cards with additional information
Adding the subtitle of the courses
Adding instructor responsiveness to questions they receive from students within a course
How active discussion boards are in the course
When the course was last updated
How many students are watching right now
Curated experience, like a questionnaire to narrow down the options shown
Guided experience like course playlists
Course categorization
While we thought about and did work on many of these hypotheses across the product, the search results page was one of the biggest offenders of this problem, so we decided to tackle it first.
🔄 Process
To approach this problem on the search results page, I first conducted a competitive analysis to generate ideas from improvements other companies had decided would help their discovery experiences. For this, I looked at a wide range of companies, from those in the same category, such as Coursera, to completely different products, such as Google and Amazon.
I also conducted an audit of all existing metadata shown on each course within the search results page, as well as all other metadata we had on a course and new metadata we would have to collect for the first time. One tricky bit to consider was that all metadata was written by instructors and therefore had variable lengths. I actually had to count the max number of characters of a course title, for example (funnily enough, no one on the team knew what this was beforehand).
I also started sketching out the general structure within which I needed to fit all metadata.
After more discussion with the product manager, I created a lot of different prototypes with many different pieces of metadata included, to help better visualize potential designs and stimulate discussion about which routes to try during user research.
Something to note: after seeing these prototypes, we felt confident enough that the course subtitle would help with the problem that we decided to include it without testing. We did, however, test several designs that included other pieces of information.
User Research
We conducted interviews with 13 prospective Udemy students to try to determine what information is most helpful to them when trying to compare similar courses. In these sessions, we first asked the person to search for a topic and describe what they look at when deciding between courses. Then, we showed them several of the prototypes to get their reaction to this new metadata.
Learnings:
Course fit (VALIDATED): Understanding course fit is more important at the search results stage of the discovery process. Hence, things like course goals, target audience, and curriculum are important for comparison.
Users understand at a high level what the course is about by looking at the title and subtitle, and hence course goals don't add much value.
Target Audience (VALIDATED)
Curriculum tells them about the specific topics taught in the course (VALIDATED)
Instructor fit (INVALIDATED): Understanding the quality of the instructor comes after the course fit question. Hence, reviews, instructor information, etc. come after shortlisting certain courses and before making the final decision. This information does not need to be included on the search results page.
Reputation/Legitimacy (INVALIDATED):
Most helpful and critical review (INVALIDATED): "Most helpful review" feels like the instructor picked the most favorable review themselves. When asked if a "Most critical review" would help solve this, multiple users mentioned that reviews don't matter at this stage of filtering.
Number of active students (students currently taking) (INVALIDATED): Users did not see our platform as social and did not place any importance on showing the faces of students currently taking the course. However, one user took this as a proxy for how up-to-date the course is. We have heard repeatedly that users want to know when a course was created/updated to judge whether it is still relevant.
Active discussions (INVALIDATED): Users felt that active discussions did not really add value, since they did not know whether these discussions were questions, useless comments, or something else.
Current active students (INVALIDATED)
Instructor responsiveness (INVALIDATED)
Some users just expect Udemy to provide solid recommendations rather than making the effort to find a course themselves, while others search for specific topics when they know what they want to learn.
Post-Research
Two of the most important takeaways from the research were 1) that just including the title and subtitle gives students a lot of information about course fit, and 2) that students really liked seeing the curriculum at this stage. One of the first things they would do after clicking on a course was to go directly to its curriculum anyway.
As a result, I went back to the drawing board to explore more deeply how to visualize the curriculum, as the full curriculum would not fit in the space originally laid out in the prototype.
After many discussions with the design team and the product manager, we decided to run a multivariate test between 3 different designs: the existing design, the new design with the subtitle but not the curriculum, and the new design with both the subtitle and the curriculum. Our primary measure of success was impact on revenue; we also monitored bounce rate, clicks on search results, and time spent before purchase after visiting the search page.
In the end, the new version with just the subtitle performed best. Our guess as to why is that the curriculum design caused fewer items to be visible at one time, but we are not certain.
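As a side note on mechanics: tests like this typically rely on deterministic bucketing so each visitor consistently sees the same variant. The sketch below shows the common pattern; it is not Udemy's actual experiment infrastructure:

```javascript
// Common deterministic-bucketing pattern (illustrative, not Udemy's infra).
const VARIANTS = ['control', 'subtitle_only', 'subtitle_plus_curriculum'];

function assignVariant(userId) {
  // Simple string hash -> stable bucket per user
  let hash = 0;
  for (const ch of String(userId)) {
    hash = (hash * 31 + ch.charCodeAt(0)) | 0;
  }
  return VARIANTS[Math.abs(hash) % VARIANTS.length];
}

console.log(assignVariant('user-42')); // same user always gets the same variant
```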
Course Card Popover
Using the learnings above, we also wanted to address the problem of distinguishing between courses on our featured and category/subcategory pages by including more information within some sort of popover. This was a short-turnaround project since we already had lots of research and thinking done on the topic, so I quickly sketched out a few ideas, prototyped them in the browser, and ran with it. We were also inspired by competitive research, such as Audible.com's homepage.
We ran a multivariate test between three options: metadata and just the subtitle; metadata, the subtitle, and "What you'll learn"; and no popover at all. Here is the winning design, based on impact on revenue as the primary metric:
Learnings
One of the biggest learnings I took from this project was that, despite thinking more information would always be better, that is not always the case. There is a fine line between not enough and too much information, depending on the stage of decision-making the user is in. Another was that, however tempting it is to go with a complex and clever design like the curriculum, sometimes your hypothesis is wrong and you have to go with the less fancy design.
Amazon.com
💼 Design Technologist, ACS Global Browse UX Team
📅 06/2014 - 02/2015
📍 Seattle · 👥 275 million customers
The Amazon Creative Services Global Browse UX Team designs and develops interactive, reusable components that live on Amazon Browse/Category pages across the globe. The main goal of this team is to elevate the retail browse experience for Amazon's customers, and the widgets created are used by millions of people every day.
During my time on this team, I operated as a Design Technologist, a role somewhat specific to Amazon in title but not in intention. As a member of a cross-functional team of about 10, I worked closely with UX designers, web developers, and an SDE to create delightful, interactive experiences for Amazon's customers. Depending on the project, I played an active role in everything from design ideation and concept development, to prototyping and iterating on design concepts, to final front-end development.
Farmplicity (acquired 2014)
💼 Co-founder / CEO / Lead Designer
📅 01/2013 - 05/2014
📍 St. Louis · 👥 Hundreds of chefs and farmers
HTML5 · CSS3 · jQuery · Twitter Bootstrap · Ruby on Rails
Farmplicity is an online marketplace where farmers can sell local ingredients directly to chefs in an easy-to-use virtual farmers market. Farmplicity was founded in January 2013 with the intention of making local ingredients easier to obtain, thus benefiting local economies and, over time, reducing prices. To date, over 100 farmers and 100 chefs have utilized Farmplicity in the St. Louis market.
For this project, I worked on a team of four to develop the concept, create the business plan, and launch the company. I then assumed the role of CEO and led all aspects of the company for six months. Additionally, I led design for the entire project. This included designing the logo and building the brand, creating marketing materials, and designing and developing the front-end of the website.
For the design of Farmplicity, I wanted to achieve a warm, homey, down-to-earth feel that would lend itself well to the concept of local food. I also wanted to create an air of professionalism while still being fun and a little quirky.
The first step was to create a logo. I went with a script font that was fun, colorful, and exciting, in the hope that our company would bring new life to the local food distribution industry.
To start on the website, I built out a user flow of the web app. While the main home page is accessible to everyone, farmers and chefs have different permissions: farmers can only upload listings, and chefs can only access the marketplace.
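The actual site was built in Ruby on Rails, but the permission idea can be sketched in a few lines. Here it is as Express-style JavaScript middleware, purely for illustration:

```javascript
// Illustrative role check, not the original Rails implementation.
function requireRole(role) {
  return (req, res, next) => {
    if (req.user && req.user.role === role) return next();
    res.status(403).send('Not allowed');
  };
}

// Hypothetical wiring: farmers create listings, chefs browse the marketplace.
// app.post('/listings', requireRole('farmer'), createListing);
// app.get('/marketplace', requireRole('chef'), showMarketplace);
```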
I then began wireframing each of the individual pages to give a blueprint of the components needed to build out in code.
After sketching out wireframes, I developed each page using HTML5, CSS3, and jQuery within the Bootstrap framework. The pages were then integrated into the Ruby on Rails app in coordination with a back-end developer. The "finished" results can be seen below.
Pantastic
💼 Founder
📅 12/2012 - 03/2013 · 📍 St. Louis
HTML5 · CSS3 · jQuery · Twitter Bootstrap
Pantastic was a company I founded that allowed anyone to order three-foot-long prints of iPhone panoramas for $25, at the click of a button. This was my first foray into startups and building out websites, and the learnings I gleaned from this experience were invaluable.
I have always had a passion for photography, and I remember how difficult it was pre-iPhone to take panoramas. The panorama naturally lends itself to large-format printing and wall display, as its dimensions make it difficult to view at smaller sizes. While the iPhone allowed for panorama capture, there was no good way to display the beautiful panoramas people were taking. This is why I started Pantastic: I wanted to make it dead simple to get a large, beautiful panorama on your wall.
Though I knew very little at the time about developing a website, I leaned on the Bootstrap framework to build the following site. With a large-format printer and lots of shipping boxes in tow, Pantastic was up and running!