Challenge
Role: Strategic Lead, Product Designer (Staff level)
Timeframe: 6 months
Team: Product, Engineering, Data Science, Growth
Surface Area: Sales Hub Pro onboarding (multi-team, revenue-critical flow)
Impact:
✔ $270K in generated revenue
✔ +22% activation rate
✔ -28% support dependency
✔ Scalable onboarding framework across 5 product lines
Context:
When I joined the initiative, onboarding was fragmented across multiple product groups. Each team owned “their part” of setup, but no one owned the onboarding system holistically. The experience was checklist-based, feature-oriented, and heavily dependent on Customer Success intervention.
Note to Reader: There is no true “before” state: Pro+ tier users previously had no product-led onboarding at all. They either paid for guided onboarding or, if the fee was waived, completed setup on their own.
Activation had plateaued. Support dependency was increasing. Onboarding decisions were being made locally within teams, without a unifying strategy tied to business outcomes. The company was investing in product-led growth, but onboarding was not designed to support that motion. At the same time, the company was investing heavily in AI capabilities across the product.
The opportunity was clear: onboarding should not be a static flow - it should be an adaptive intelligence layer powered by real account context.
This was not a UI problem.
It was a system design and intelligence orchestration problem.
Strategic Reframe
Rather than iterating on the existing checklist, I worked with product leadership to reframe onboarding entirely. We shifted from:
Feature completion → Value realization
Linear flows → Milestone-based progression
Manual intervention → Contextual AI guidance
Local ownership → System-level strategy
I positioned onboarding as a strategic activation layer within the PLG motion - not a static entry experience. This reframing aligned stakeholders around a new principle:
Onboarding should dynamically guide users toward measurable outcomes tied to expansion potential.


Process
My focus was not just shipping improvements; it was creating a compounding growth system.
I introduced:
• Clear milestone-based activation definitions
• A structured experimentation framework tied to milestone velocity
• Hypothesis definitions for each product area, written both directly and in partnership with the owning teams
• Cross-team alignment around a single growth KPI
• Self-serve access to experiment data via internal tools, so teams could check whether an experiment had reached statistical significance
This allowed onboarding to operate as an ongoing optimization surface rather than a one-time redesign.
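The self-serve significance check can be sketched with a standard two-proportion z-test on activation rates. This is an illustrative stand-in for the internal tooling; the function name, arguments, and example numbers are all assumptions, not the actual implementation.

```python
# Hypothetical sketch of the kind of check the internal self-serve
# tooling exposed: a two-sided two-proportion z-test comparing
# activation rates between control and variant arms.
from math import sqrt
from statistics import NormalDist

def activation_significance(conv_a: int, n_a: int,
                            conv_b: int, n_b: int,
                            alpha: float = 0.05) -> tuple[float, bool]:
    """Return (p_value, significant) for a two-sided two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return p_value, p_value < alpha

# Example: 18% vs 22% activation on 2,000 users per arm → significant
p, sig = activation_significance(360, 2000, 440, 2000)
```

Putting this behind a simple internal UI is what let teams answer "can we call this experiment?" without waiting on a data scientist.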
Designing the AI Layer
I led the definition of how AI would operate within onboarding - not just where it would appear.
We structured the system around three intelligence layers:
1. Account State Detection
We defined signals that evaluated setup completeness, usage depth, and workflow maturity.
The system continuously scanned:
• Data imports
• Pipeline configuration
• Automation rules
• User behaviour patterns
These signals determined onboarding stage dynamically.
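The stage-detection logic described above can be sketched as a simple ordered evaluation of account signals. The signal fields, stage names, and thresholds here are invented for illustration; the real taxonomy was internal.

```python
# Illustrative sketch (all names and thresholds assumed) of how account
# signals could map to a dynamically detected onboarding stage.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    data_imported: bool
    pipelines_configured: int
    automation_rules: int
    weekly_active_users: int

def detect_stage(s: AccountSignals) -> str:
    """Evaluate signals in order of setup maturity and return the stage."""
    if not s.data_imported:
        return "setup"
    if s.pipelines_configured == 0:
        return "configuring"
    if s.automation_rules == 0:
        return "activating"
    if s.weekly_active_users < 3:
        return "adopting"
    return "realizing_value"

stage = detect_stage(AccountSignals(True, 2, 0, 5))  # → "activating"
```

Because the stage is recomputed from live signals rather than stored as checklist progress, an account that regresses (e.g. deletes its only automation) is re-detected correctly.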
2. Next Best Action Engine
Rather than prescribing a fixed order of tasks, the system generated context-specific recommendations based on detected gaps.
Examples:
• If no automation existed → Suggest automation template
• If pipeline incomplete → Trigger guided setup
• If usage plateaued → Recommend workflow optimization
This required close collaboration with engineering to define trigger logic and fallback behavior when signals were incomplete.
We intentionally designed the AI as assistive - not autonomous.
Users retained control while benefiting from contextual nudges.
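The trigger-plus-fallback behavior can be sketched as a rule cascade that degrades to a safe generic prompt when signals are missing. Rule priority, action names, and signal shapes are my illustrative assumptions, not HubSpot's actual trigger logic.

```python
# Hedged sketch of a rule-based next-best-action engine. Missing signals
# (None) trigger the fallback path rather than a confident recommendation.
from typing import Optional

def next_best_action(automation_count: Optional[int],
                     pipeline_complete: Optional[bool],
                     usage_trend: Optional[str]) -> str:
    # Fallback: if a core signal is unavailable, degrade to a generic
    # checklist instead of guessing.
    if automation_count is None or pipeline_complete is None:
        return "show_generic_checklist"
    if pipeline_complete is False:
        return "trigger_guided_pipeline_setup"
    if automation_count == 0:
        return "suggest_automation_template"
    if usage_trend == "plateau":
        return "recommend_workflow_optimization"
    return "no_action"
```

The key design choice is that every branch returns an assistive prompt the user can dismiss or edit; no branch applies a change on the user's behalf.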
3. Milestone-Based Progression
Instead of tracking “tasks completed,” we defined measurable value milestones.
For example:
• First pipeline created
• First automated workflow live
• First successful deal closed
AI recommendations were anchored to these milestones - ensuring onboarding aligned with real business impact. This prevented the common AI trap of generating suggestions that feel clever but lack outcome relevance.
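Milestone anchoring can be expressed as a filter: a recommendation only survives if it maps to a defined value milestone. The milestone names come from the list above; the recommendation-to-milestone mapping is illustrative.

```python
# Sketch of milestone-anchored recommendations: suggestions without a
# mapped value milestone are filtered out by construction, which is what
# blocks "clever but irrelevant" AI prompts.
VALUE_MILESTONES = {
    "first_pipeline_created",
    "first_automation_live",
    "first_deal_closed",
}

# Illustrative mapping; a prompt mapped to None has no outcome relevance.
RECOMMENDATION_TARGETS = {
    "trigger_guided_pipeline_setup": "first_pipeline_created",
    "suggest_automation_template": "first_automation_live",
    "recommend_deal_review": "first_deal_closed",
    "show_fun_fact": None,
}

def anchored(recs: list[str]) -> list[str]:
    """Keep only recommendations that advance a defined value milestone."""
    return [r for r in recs
            if RECOMMENDATION_TARGETS.get(r) in VALUE_MILESTONES]
```

Any new recommendation type added later must declare its milestone to pass the filter, so the guardrail scales with the system.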
Trust, Transparency, and Guardrails
One of the biggest challenges was designing AI guidance that users trusted.
We introduced:
• Clear reasoning indicators (“Recommended because no automation detected”)
• Editable suggestions rather than auto-applied actions
• Visible progress tied to business goals
• Fallback states when confidence thresholds were low
I worked with data partners to establish confidence scoring thresholds before surfacing recommendations.
Shipping AI in onboarding required balancing proactivity with user agency.
The goal was guidance, not automation anxiety.
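The confidence gate plus reasoning indicator can be sketched as follows. The threshold value and field names are assumptions; only the principles (gate on confidence, always attach a reason, never auto-apply) come from the case study.

```python
# Illustrative confidence gate: a recommendation surfaces only when its
# signal confidence clears a threshold, and it always carries a
# plain-language reason the UI can display.
from typing import Optional

CONFIDENCE_THRESHOLD = 0.8  # assumed value, tuned with data partners

def surface(recommendation: str, reason: str,
            confidence: float) -> Optional[dict]:
    """Return a surfaceable suggestion, or None for the fallback state."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # fallback: show nothing rather than a weak guess
    return {
        "action": recommendation,
        "reason": reason,      # e.g. "no automation detected"
        "auto_apply": False,   # suggestions are editable, never auto-applied
    }
```

Returning `None` below threshold is what produces the visible fallback states mentioned above: the UI simply withholds the nudge instead of hedging in front of the user.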


Outcome
Organizational Impact
The AI onboarding framework required alignment across three onboarding squads and multiple AI initiatives.
My contributions included:
• Defined principles for AI-guided product experiences
• Educated partner teams on experimentation and PLG motions
• Standardized guidance components within the design system
• Established confidence and trigger governance for AI surfaces
• Partnered with PM leadership to connect activation milestones to expansion revenue
• Mentored designers in modeling behavioral and system logic
• Championed AI tooling as an early adopter and upskilled other designers
This initiative shifted onboarding from being “owned by feature teams” to being treated as a shared intelligence infrastructure layer.
Execution Tradeoffs
To deliver measurable impact within a single planning cycle:
We prioritized rule-based intelligence layered with contextual AI prompts before investing in deeper predictive modeling. We narrowed milestone definitions to ensure clean measurement.
We removed legacy setup flows that conflicted with the adaptive framework - despite internal resistance tied to feature exposure goals. These decisions ensured coherence and measurable results rather than shipping fragmented AI surfaces.
Results - Within two quarters:
Activation increased by 22%
Time-to-value decreased by 40%
Onboarding-related support tickets dropped by 28%
The initiative generated $270K in revenue
More importantly, onboarding evolved from a static entry flow into a scalable AI-powered growth engine.
The intelligence framework is now being extended into expansion and optimization workflows beyond initial onboarding.
How I influenced beyond the project
✔ Established a horizontal onboarding design framework
✔ Mentored 2 UX designers in systemic modeling
✔ Drove cross-team alignment across Growth, PM, Eng
✔ Created shared design tokens for onboarding components
✔ Built the first A/B experimentation naming standards


AI-First Onboarding System @ HubSpot
Transforming complex activation into contextual, data-driven guidance
Role
UX DRI
Industry
AI · B2B
Status
Live
Year
2025
