CodeBaby's AI Avatar
Exploring why small businesses hesitate to adopt AI-powered avatars and how usability testing points the way forward.
CodeBaby is an online platform that lets users create interactive, AI-driven 3D avatars. This usability evaluation addressed the client's concern about low conversion rates from free trials to paid plans, particularly among small businesses lacking technical expertise. For this study, a team of four graduate students from Pratt Institute conducted a series of moderated usability tests to evaluate how easily users could create, customize, train, and deploy avatars through the CodeBaby portal. The goal was to ensure the platform was intuitive and approachable for these users.
We uncovered significant friction points in avatar creation, interaction, and integration workflows. Users often experienced uncertainty about task completion, encountered confusing system feedback, and struggled with deploying avatars effectively, impacting their confidence and likelihood to convert to paid plans.
Our journey began with a kickoff meeting with CodeBaby’s COO to clarify business goals, target users, and key pain points—primarily, why non-technical users weren’t converting to paid plans. Drawing on these insights, we defined our research focus and developed a testing protocol that mirrored real-world onboarding and deployment scenarios.
Key details
We recruited eight participants—mostly small business owners and professionals with limited technical backgrounds—through PanelFox and our networks. Each participant was screened for familiarity with generative AI and web development, ensuring our research reflected CodeBaby’s target user base: people eager to adopt new technology but needing extra clarity and support. By focusing on non-technical small business users, we ensured our insights addressed real adoption barriers rather than technical edge cases. Our findings now guide improvements that make the product more accessible and effective for its core audience.
8
small business participants with non-technical backgrounds tested CodeBaby’s usability
Why?
Non-technical users weren’t converting to paid plans.
How?
We ran usability testing based on real onboarding and deployment scenarios, focusing on the core chat and avatar features.
Who?
Participants with:
Moderate skills in generative AI (avg. 3.38/5)
Basic to moderate skills in web development (avg. 2.88/5)
Research Process
The process included recruitment, consent, task-based usability testing (using a think-aloud protocol), post-test interviews, and a synthesis session to extract actionable insights.
Usability Hurdles & Solutions
100%
of participants struggled with navigation at some point.
6/8
found the preview avatar page confusing.
Insight 1: Avatar Experience
Problem 1.1
The chat experience was unclear and inconsistent, leaving users confused about how to navigate the conversation.
CodeBaby features an in-browser chat assistant called Wizard, designed to help users navigate the website. Within the Avatar Portal, users can also create their own custom avatars to interact with. The top view displays the Wizard chat interface, while the bottom view shows a preview of the user-created avatar chat, highlighting key UI differences between the two experiences.
💬 “It would be nice to see my questions, since it disappears, I don’t understand what is happening”
Recommendation 1.1
Simplify and unify the chat experience to reduce confusion and improve usability.
Standardize icon and chatbox placement across all avatar screens (Wizard and Preview Avatar) to ensure a consistent layout. Remove overlapping text elements and group icons together to avoid confusion and guide user focus.
Problem 1.2
6 out of 8 participants noted uncertainty due to a lack of feedback while the avatar processed their input.
The system did not show any visual indicator during response delays, making it unclear if the input was received. Users assumed the system was frozen or their internet was failing, which reduced their trust and disrupted the experience.
💬 “I wish there was an indicator that shows my result is processing my question and not a internet issue from my end”
Recommendation 1.2
Add an animated typing indicator to the chat interface to provide visual feedback during response processing.
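One lightweight way to implement this recommendation is to toggle the indicator around the pending request, so it appears as soon as the user submits input and disappears when the reply (or an error) arrives. A minimal TypeScript sketch, assuming a hypothetical `replyWithIndicator` wrapper and a UI callback of our own naming — this is not CodeBaby's actual API:

```typescript
// Callback the chat UI provides to show/hide the "avatar is typing…" animation.
type IndicatorToggle = (visible: boolean) => void;

// Wrap any avatar-reply request so the typing indicator is shown while the
// response is pending, and always hidden afterward — even if the request fails.
async function replyWithIndicator<T>(
  request: () => Promise<T>,
  setTyping: IndicatorToggle,
): Promise<T> {
  setTyping(true); // show indicator immediately on submit
  try {
    return await request(); // e.g. a fetch() to the avatar backend
  } finally {
    setTyping(false); // hide indicator on success *and* on error
  }
}
```

The `try`/`finally` matters: without it, a failed request would leave the indicator animating forever, recreating the very "is it frozen?" uncertainty the recommendation addresses.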
Problem 1.3
5 out of 8 users mentioned that they don’t fully understand the purpose or value of the avatar preview.
The lack of cues and lengthy responses left users unsure of what to do next, making the experience feel passive and limiting the avatar’s perceived value.
💬 “Maybe I don’t even know what I want to learn, there should be something to help me ask the right questions.”
Recommendation 1.3
Redesign the chat experience to feel more guided, approachable, and easy to follow.
Add auto-generated follow-up questions to guide users on what to ask next, and keep avatar responses brief to avoid overwhelming users.
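As a rough illustration of the two changes above, a reply can be trimmed to a readable length and paired with a capped set of suggestion chips before rendering. The helper below is a hypothetical sketch — `guideReply`, the 280-character limit, and the three-chip cap are our assumptions, not CodeBaby's implementation:

```typescript
// Shape handed to the chat UI: a short reply plus follow-up suggestion chips.
interface GuidedReply {
  text: string;
  suggestions: string[];
}

// Trim overly long avatar replies and cap follow-up suggestions so users
// aren't overwhelmed and always see a clear next question to ask.
function guideReply(raw: string, suggestions: string[], maxChars = 280): GuidedReply {
  const text =
    raw.length <= maxChars
      ? raw
      : raw.slice(0, maxChars - 1).trimEnd() + "…"; // ellipsis marks the cut
  return { text, suggestions: suggestions.slice(0, 3) }; // show at most 3 chips
}
```

In practice the trimmed reply might instead link to a fuller answer, but the principle is the same: short responses plus visible next steps keep the conversation feeling active rather than passive.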
Insight 2: Preview Avatar Page
Problem 2.1
Only 1 out of the 8 participants discovered how to change the background on the preview page.
The Preview Avatar page is where users review the avatar they created; they can also swap the background for a preview of their own website to see how the avatar would look once embedded. Several participants found the page confusing and had difficulty with key actions such as exiting the preview and changing the background.
💬 “I don’t understand what I’m supposed to do here.”
💬 “This page is really loud.”
Recommendation 2.1
Streamline the process of uploading an image and changing the preview background.
Add a clear call-to-action for uploading a preview of the user’s website page.
We delivered our findings to CodeBaby’s team, who responded enthusiastically, noting our research validated some of their hypotheses while also surfacing new, actionable insights. The client expressed appreciation for our user-centered perspective and indicated that several recommendations would be prioritized in the next product sprint.
If we were to continue:
Run follow-up tests after design changes.
Explore onboarding for even less technical users.
Evaluate long-term user retention.
Key Takeaways
User testing validates real pain points
Clarity is essential for non-technical users
Clear steps, plain language, and intuitive navigation are crucial for building user confidence.
Small UX changes drive big impact
Targeted tweaks to chat, navigation, and onboarding are expected to improve user engagement and satisfaction.