Haoxuan’s AI Product Finder: How Building a Recommendation System Helps with College Admissions
- BetterMind Labs

- Jan 16
- 4 min read
Introduction
Product discovery looks simple from the outside. You type what you want, a system suggests options, and you choose. Because of this familiarity, many student-built product recommendation systems feel shallow. They retrieve items, rank them loosely, and stop there.
Admissions officers are not impressed by that.
What they evaluate instead is whether a student understands why recommendation systems are difficult, where they fail, and how AI should support decision-making rather than replace it. A strong project treats recommendation as a reasoning problem, not a lookup problem.
Haoxuan Liu’s AI Product Finder reflects this deeper approach. The project shows how hands-on experimentation, guided mentorship, and iterative thinking turn a common idea into a meaningful AI system.
Why Product Recommendation Is a Serious AI Problem
Recommendation systems sit at the intersection of several hard problems:
User intent is often incomplete or ambiguous
Preferences change over time
Popularity bias skews results
Over-personalization limits discovery
In real-world systems, recommending the “right” product is less about accuracy and more about decision quality. Poor recommendations overwhelm users or push them toward suboptimal choices.
Haoxuan’s project began by recognizing a core truth:
The challenge is not finding products. It is helping users decide.
That framing alone elevates the project beyond surface-level implementations.
From Idea to Definition: Framing the Right Problem
Many students start recommendation projects by asking, “How do I suggest products?” Haoxuan instead focused on a more disciplined question:
“How can AI narrow options based on user needs while keeping recommendations relevant and understandable?”
This led to clear system goals:
Capture user preferences meaningfully
Match those preferences to product attributes
Rank results rather than force a single answer
Allow flexibility as inputs change
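The project’s code is not published here, but those goals map onto a fairly small core: represent user preferences, score each product against them, and return a ranked shortlist rather than a single answer. Below is a minimal sketch of that idea, assuming products tagged with simple attributes; the Product fields and the score and recommend helpers are illustrative assumptions, not Haoxuan’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Product:
    name: str
    attributes: dict  # e.g. {"category": "laptop", "budget": "mid", "portability": "high"}

def score(product: Product, preferences: dict) -> float:
    """Count how many stated preferences a product satisfies.

    Missing attributes simply contribute nothing, so partial matches
    still surface instead of being discarded outright.
    """
    return sum(
        1.0
        for key, wanted in preferences.items()
        if product.attributes.get(key) == wanted
    )

def recommend(products: list[Product], preferences: dict, top_k: int = 3) -> list[Product]:
    """Rank the whole catalog by match score and return the best few,
    rather than forcing a single 'correct' answer."""
    ranked = sorted(products, key=lambda p: score(p, preferences), reverse=True)
    return ranked[:top_k]

# Illustrative catalog entries (made up for this sketch)
catalog = [
    Product("UltraBook 13", {"category": "laptop", "budget": "high", "portability": "high"}),
    Product("WorkStation 17", {"category": "laptop", "budget": "high", "portability": "low"}),
    Product("BudgetBook 14", {"category": "laptop", "budget": "mid", "portability": "medium"}),
]
print([p.name for p in recommend(catalog, {"category": "laptop", "budget": "mid"})])
```

Even at this size, the design choice is visible: scoring and ranking keep the system flexible as inputs change, which a hard-coded filter cannot do.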
From an admissions standpoint, this shows problem-definition skill, one of the strongest predictors of success in research and engineering.
System Design: Learning Through Hands-On Labs
One reason Haoxuan’s project developed depth was the emphasis on hands-on labs rather than abstract lectures. These labs forced early engagement with practical constraints.
Key components of the AI Product Finder included:
Product feature representation
Preference matching logic
Ranking and filtering mechanisms
Iterative testing with different inputs
Each lab exposed trade-offs. Overly strict filters produced empty results. Loose filters overwhelmed users. Balancing these extremes required experimentation, not guesswork.
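To make that strict-versus-loose trade-off concrete, here is a hedged sketch of one common balancing pattern: apply the hard filter first, then fall back to soft scoring instead of returning an empty page. The dict-based catalog and function names are assumptions for illustration, not the project’s code.

```python
def hard_filter(products, preferences):
    """Strict filtering: keep only products that match every stated preference.
    With narrow preferences this easily returns an empty list."""
    return [
        p for p in products
        if all(p.get(key) == value for key, value in preferences.items())
    ]

def soft_rank(products, preferences, top_k=3):
    """Loose alternative: score by how many preferences match and keep the top few."""
    scored = sorted(
        products,
        key=lambda p: sum(p.get(k) == v for k, v in preferences.items()),
        reverse=True,
    )
    return scored[:top_k]

def recommend_with_fallback(products, preferences, top_k=3):
    """Balance the two extremes: try the strict filter first, then fall back
    to soft ranking rather than showing the user nothing."""
    exact = hard_filter(products, preferences)
    return exact[:top_k] if exact else soft_rank(products, preferences, top_k)

# Made-up catalog; the strict filter finds nothing, so the fallback kicks in.
catalog = [
    {"name": "UltraBook 13", "budget": "high", "portability": "high"},
    {"name": "BudgetBook 14", "budget": "mid", "portability": "medium"},
]
print(recommend_with_fallback(catalog, {"budget": "low", "portability": "high"}))
```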
This is exactly how professional AI systems are built.
The Role of Mentorship in Technical Growth
Haoxuan noted that mentorship played a critical role in improving his AI skills. From an admissions perspective, this matters more than many students realize.
Mentorship does not mean giving answers. It means:
Asking why a design choice was made
Pointing out blind spots
Encouraging iteration rather than acceptance
In this project, mentorship helped Haoxuan move beyond a functional system to a thoughtful one. Feedback guided refinements in how recommendations were ranked, how user inputs were interpreted, and how outputs were presented.
Admissions officers consistently favor students who can absorb feedback and adjust their thinking.
Recommendation Systems: Where Many Student Projects Go Wrong
Understanding why this project works requires comparing it to common pitfalls.
Typical Student Product Finder
Hard-coded filters
Minimal personalization
No reflection on user decision-making
Haoxuan’s AI Product Finder
Preference-aware recommendation logic
Ranking rather than binary selection
Emphasis on usability and relevance
This difference signals readiness for advanced coursework, where systems are judged by outcomes, not just execution.
Technical Learning Without Overstatement
One strength of Haoxuan’s project is restraint. It does not claim to revolutionize e-commerce. Instead, it focuses on learning core AI concepts correctly.
The project reinforced:
Feature selection and representation
Matching algorithms and ranking logic
Evaluating outputs qualitatively, not just quantitatively
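Qualitative evaluation can be as simple as printing each recommendation alongside the preferences it did and did not satisfy, so a human reviewer can judge relevance directly instead of relying on a single accuracy number. The explain_match helper below is a hypothetical example of that habit, not part of the project.

```python
def explain_match(product: dict, preferences: dict) -> str:
    """Produce a human-readable note on which preferences a product satisfies,
    so outputs can be reviewed qualitatively rather than only counted."""
    hits = [k for k, v in preferences.items() if product.get(k) == v]
    misses = [k for k in preferences if k not in hits]
    return f"{product['name']}: matches {hits or 'nothing'}, misses {misses or 'nothing'}"

prefs = {"budget": "mid", "portability": "high"}
for item in [
    {"name": "UltraBook 13", "budget": "high", "portability": "high"},
    {"name": "BudgetBook 14", "budget": "mid", "portability": "medium"},
]:
    print(explain_match(item, prefs))
```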
This approach avoids a common admissions red flag: inflated claims unsupported by evidence.
Growth Through Iteration

At the start, the system produced generic recommendations. Early versions treated all preferences equally, which diluted relevance.
Through hands-on labs and mentor feedback, Haoxuan learned to:
Weight preferences differently
Refine ranking logic
Test edge cases where preferences conflicted
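A hedged sketch of what weighting preferences might look like is shown below, using made-up weights and a deliberately conflicting pair of preferences; it illustrates the general technique rather than Haoxuan’s actual ranking logic.

```python
def weighted_score(product: dict, preferences: dict, weights: dict) -> float:
    """Score a product by weighted preference matches, so a must-have
    (e.g. budget) counts more than a nice-to-have."""
    return sum(
        weights.get(key, 1.0)
        for key, wanted in preferences.items()
        if product.get(key) == wanted
    )

def rank(products, preferences, weights, top_k=3):
    """Rank by weighted score; when preferences conflict, the heavier
    preference wins instead of both being treated equally."""
    return sorted(
        products,
        key=lambda p: weighted_score(p, preferences, weights),
        reverse=True,
    )[:top_k]

# Conflicting preferences: the cheapest option and flagship performance rarely coexist.
catalog = [
    {"name": "Flagship Pro", "budget": "high", "performance": "high"},
    {"name": "Value Plus", "budget": "low", "performance": "medium"},
]
prefs = {"budget": "low", "performance": "high"}
weights = {"budget": 2.0, "performance": 1.0}  # this user cares more about price
print(rank(catalog, prefs, weights))  # Value Plus ranks first because budget carries more weight
```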
The final system demonstrated clear improvement over its initial iterations. This evolution is exactly what admissions committees look for when evaluating long-term projects.
What This Project Signals to Admissions Committees
From a reviewer’s perspective, the AI Product Finder communicates several important traits:
Applied reasoning: The student understands the real problem being solved
Learning mindset: Skills improved through labs and feedback
Technical judgment: Awareness of trade-offs in recommendation logic
Professional awareness: Treating AI as a decision-support system
These signals matter far more than flashy terminology or excessive complexity.
Why Hands-On, Guided Programs Produce Better Outcomes
Projects like this rarely emerge from purely self-directed learning. Without structure, students often stop once something “works.”
Guided programs accelerate learning by:
Forcing students to test assumptions
Introducing real constraints
Encouraging iteration rather than completion
Haoxuan’s reflection about mentorship and labs aligns with what admissions officers repeatedly observe: students grow fastest when challenged consistently.
Frequently Asked Questions
Is an AI product finder too common for college applications?
It is common in name. Projects with strong reasoning and iteration are rare.
Does mentorship reduce authenticity?
No. Authenticity comes from ownership of decisions, not isolation.
What do colleges value more: complexity or clarity?
Clarity. Clear reasoning beats unnecessary complexity every time.
Can this project support majors beyond computer science?
Yes. It connects to business, data science, and human-computer interaction.
Final Perspective and Where to Go Next

Recommendation systems shape how people make choices every day. Building one responsibly requires understanding users, trade-offs, and system limitations.
Haoxuan Liu’s AI Product Finder shows how hands-on labs combined with thoughtful mentorship can significantly improve both technical skill and problem-solving ability. The project is not memorable because it is flashy. It is memorable because it is considered.
Programs like the AI & ML initiatives at BetterMind Labs are designed to support this kind of growth, pairing structured labs with close mentorship so students move beyond surface-level AI and develop real judgment.
To explore similar student projects or learn how guided, applied learning shapes admissions-ready outcomes, visit bettermindlabs.org.
Interested in more projects? Read about Maansi’s AI Note Taker Bot.




