TikTok RD Assistant

An efficiency-focused AI platform for TikTok's development team that streamlines knowledge access and daily workflows.

My Role

UX Design Intern

Responsibility

AI Design, Visual Design, User Research

Team

UX Designers, Product Managers, Software Engineers

Duration

2024/11 – 2025/1 (2 months)

The Mission

Design Smart Q&A to transform MagicSurvey into a trusted, workflow-integrated AI research assistant, enabling users to query internal knowledge and research documents with transparent source attribution.

The Impact

+15%

Monthly Active User

+18%

Retention Rate

What I Did

  1. Led the end-to-end design of Smart Q&A, from early concept to final delivery.

  2. Synthesized 15+ insights from competitive analysis and user research to guide design decisions.

  3. Created high-fidelity designs and improved usability & discoverability through 10+ UX iterations.

  4. Aligned cross-functional teams to resolve interaction and implementation challenges.

The Outcome

Home Page

An inviting entry point for curiosity and idea discovery.

Answer Page

A structured, source-backed answer flow with quick navigation.

The Process

Design Goal

Create a unified AI that consolidates internal research knowledge, making insights proactively accessible, understandable, and trustworthy.

1

Passive and narrow usage

AI only supported generation tasks (reports or surveys) and couldn’t proactively answer research questions.

2

Unclear source transparency

AI responses didn’t show their sources, reducing user trust in the outputs.

3

Isolated AI capabilities

AI functions were tied to specific modules and couldn't search across all internal documents.

Target Users

Collaborating with the PM, I ran 240 surveys and 10 interviews to better understand our users' goals.

PMs & Researchers

Age: 20–40 years old, predominantly male.

Traits: Tech-driven, analytical, efficiency-focused.

Goal 01

Quickly query past research documents

31% of users struggled to quickly query past research documents and access relevant insights across internal projects.

Goal 02

Extract actionable insights confidently

22% rarely relied on AI answers due to unclear sources, highlighting a trust gap in the experience.

Competitive Research

I also conducted competitive research on 4 AI tools to understand how AI can guide users, establish trust, and integrate into workflows.

Insights:

Recommended prompts reduce activation friction.

Mode selection clarifies intent.

Source transparency drives trust.

Design Decisions

Feature 01: Dynamic Entry to Highlight the New Q&A

A dynamic icon distinguishes the new entry, with a hover state revealing a brief explanation. I also designed a top banner to promote the feature, increasing visibility and click-through rates.

Architecture Trade-off

After a team review, I realized the new AI Q&A function was fundamentally different from existing features, creating an architecture conflict. So I designed a distinct entry that opens a separate page, keeping it set apart from the current structure while highlighting its uniqueness.

Previous Design

Separate Page

Feature 02: Recommendation Cards to Guide AI Exploration

Recommendation cards surface suggested questions upfront, helping users quickly explore AI capabilities, structure their research, and reduce cognitive load when starting a survey.

Discussion with team members beforehand

Designer: Put the questions at the top?

Keeps the input frame at the bottom for consistency.

Shifts visual focus to the recommendation cards, weakening the input frame’s hierarchy.

PM: Keep the cards expanded?

Larger card exposure makes them easier to notice.

Less engaging than the collapsed version and less memorable for new users.

Feature 03: Mode-Specific Guidance for Effective Q&A

By providing different modes, the input frame guides users to ask questions in the right context, and makes AI proactively usable across different research scenarios.

Feature 04: Clear Separation of Sources to Build Trust

By separating internal (Lark) and external sources, the feature increases transparency, builds confidence in AI responses, and helps users decide which insights are more reliable.

Driving Execution

When vendor constraints affected development quality, I led QA, creating annotated comparisons and collaborating with engineering to preserve the design intent.

Thoughts:

This was my first close collaboration with SDEs, where I learned that simple design details can be technically complex. It reinforced the importance of engineering alignment and rigorous QA to deliver a polished UI.

Impact

+15%

Monthly Active User

More users engaged with the new Smart Q&A feature on MagicSurvey, showing stronger adoption and repeat usage.

+18%

Retention Rate

Users returned to MagicSurvey more often, indicating sustained value and trust in AI-assisted survey creation.

Takeaways

Being a Proactive Learner

It's common to design for a product you've never used, or a field you're not familiar with. Staying curious and learning proactively prepares you to pick up new knowledge.

Listen Carefully, Speak Boldly

It's common to communicate with people who think differently than you do. Always listen carefully to understand, and speak boldly to contribute your perspective.

↓ Photos with my amazing team — grateful for them all.