Hiya AI Phone

Building trust in an AI-powered calling experience

UX Research

Interaction Design

Trust & Safety Design

Design Strategy

Usability Testing & Analysis

OVERVIEW

When Hiya launched a new AI-powered calling app, our users were curious but hesitant. The app could identify spam calls, summarize conversations, and auto-create events, but users weren’t sure they could trust what it was doing behind the scenes.


I led a 10-week usability research sprint to uncover friction points, evaluate user trust, and deliver design recommendations that made these AI features more transparent and trustworthy.

WHAT I DID

I planned and led end-to-end usability research, from participant screening to synthesis and presentation. I partnered with 4 researchers, 4 designers, and 1 senior researcher over 10 weeks.

Built on Origami Studio and Lottielab.

THE CHALLENGE

Hiya launched a new AI-powered mobile calling app with features for scam detection, call summarization, calendar event creation, and incognito calling. After launch, Hiya noticed a lag in adoption across its flagship AI features. The challenge was to determine whether users trusted these features, could find them easily, and would adopt them in their daily workflows.

How can we help users feel confident that Hiya protects their privacy and truly delivers on safety?

3 KEY PAIN POINTS WE IDENTIFIED

Users couldn’t tell what the AI was doing or why.

Users lacked control over AI-generated actions.

Users weren’t sure when private calling features were active.

TARGET USERS

Busy business professionals: small business owners, real estate agents, and service workers

We focused on busy professionals who rely on their phones daily and value both productivity and privacy.

RESEARCH GOALS

We began with a broader set of 8 exploratory questions covering onboarding, feature accessibility, interaction with AI features, and user behavior. After early participant screening and alignment with the PM and Research teams, we narrowed them to the 4 questions that directly impacted adoption, trust, and AI workflows.

Ease of Use

Can users successfully navigate Hiya’s key features (onboarding, scam & event detection, incognito mode)?

Effectiveness & Usefulness

Can users easily navigate and manage incoming and outgoing calls using the dial screen?

Effectiveness & Usefulness

What challenges do users face when completing tasks such as answering, summarizing, or managing calls?

Confidence & Perceived Safety

Do users feel confident that Hiya protects their privacy and accurately detects scams?

TRANSLATING OUR FINDINGS

From 30+ screened participants, we conducted 13 moderated sessions that surfaced over 15 usability issues grouped under 3 themes: discoverability, clarity, and trust.

Affinity Mapping User Feedback

Likert Scale Responses to Identify Patterns

INSIGHT 1

Calendar Event detection was useful but rigid.

FINDINGS

Auto-generated events were valued but lacked flexibility in structure and control.

Rated 4.9 out of 5 for usefulness, but professionals couldn't change end times, add guests, or sync events to Google/Outlook, which limited adoption.

BEFORE

Confusing and rigid event creation reduced clarity and control.

The calendar icon looked tappable, misleading people into unplanned detours.

Events showed only start times, leaving meetings open-ended and uneditable.

All events defaulted to the system calendar, frustrating users loyal to Google or Outlook.

AFTER

Users regained control and confidence through flexible event creation.

Added color-coded event states [Unconfirmed] & [Confirmed] for instant clarity.

Introduced third-party calendar integration so users could save events where they actually manage their time.

Integrated a review and edit step to adjust titles, times, and details before saving.

INSIGHT 2

Incognito Mode caused confusion & distrust.

FINDINGS

Users were not sure private calls were truly private.

6 of 8 participants found the feature, but only 4 trusted it. There were no visual indicators that Incognito was active, behavior varied between speaker and earpiece, and nothing confirmed the call wasn't recorded after it ended.

BEFORE

Inconsistent behavior led to missing signals and distrust.

On the dial pad, discoverability varied between speaker and handset mode, making activation unreliable.

No confirmation cue told users whether recording had stopped.

People were left unsure the feature even worked, which undermined its purpose.

AFTER

Clear confirmations turned uncertainty into trust.

Added an onboarding intro explaining how Incognito works and what protections it provides.

When active, users now hear a verbal confirmation and see a visual pop-up that reaffirms privacy is enabled.

INSIGHT 3

Spam Alerts were too easy to miss.

FINDINGS

Half of users missed spam cues.

In-call haptic and visual cues weren’t noticed, undermining the app’s core promise of safety.

BEFORE

Weak visibility made critical warnings easy to miss.

The “Potential Scam Call” label sat unnoticed beneath the number.

Users faced basic [Accept] or [Decline] options with no context.

Weak haptic feedback failed to interrupt attention.

The AI offered no explanation for its risk judgment, which left users unsure why they should trust it.

AFTER

Alerts became visible and impossible to miss.

Added a pre-call banner with a sharp haptic vibration and a distinct tone to command attention before the call connects.

Designed a bold warning screen that clearly states risk and gives users time to decide safely.

Surfaced an AI explainer that shows the reasoning behind alerts: "We flagged this number based on reported spam patterns."

These findings represent 3 major insights from our comprehensive usability testing of the Hiya application.

For more detailed findings, design recommendations, and the full usability testing case study, please reach out to me directly!


IMPACT

Increased task success by 15% after applying design recommendations.

Influenced roadmap planning by prioritizing trust signals, transparency, and platform consistency.

Improved user trust and understanding of Hiya’s AI features (spam detection, Incognito Mode, event creation).

Revealed cross-platform inconsistencies and reliability gaps between Android and iOS users, informing fixes and onboarding improvements.

REFLECTION

Because Hiya’s app was newly launched, we encountered several technical issues during usability sessions, including onboarding failures that prevented some users from progressing. I worked closely with engineering to surface these issues immediately so they could ship quick fixes and unblock testing. During planning, our research and design teams also flagged that our initial test script was too long for a single session. I partnered with them to condense the flow while preserving the core insights we needed across trust, clarity, and task success.

Hiya is currently only available in the U.S., but expanding internationally would require adapting scam detection, call behaviors, and trust cues to different cultural norms. For example, spam-call patterns vary widely by region, and trust signals (e.g. colors, haptics, labels, and AI explanations) would need localization to match local expectations and regulatory requirements.