
Perfume Recommendation with Sentence-BERT

Intermediate Guided Project

We meet people every day, and making a memorable impression is not easy. Memory is strongly linked to our sense of smell, so a carefully chosen personal perfume can not only evoke feelings of happiness and energy but also project our unique identity to the people we meet. Because a perfume's notes unfold over time, selecting the right one from different brands' collections by smell alone is far from effortless. Why not use machine learning to build a perfume recommender system? This guided project teaches you how to build such a system based on documented perfume notes.

4.4 (17 Reviews)

Language

  • English

Topic

  • Machine Learning

Enrollment Count

  • 191

Skills You Will Learn

  • NLP, BERT, Recommendation, Sentence Embedding, Embeddable AI

Offered By

  • IBM

Platform

  • SkillsNetwork

Last Update

  • May 17, 2024
About This Guided Project

You are a Data Scientist hired by a fragrance retailer that wants to boost its online sales by deploying a recommender system on its website. You are asked to design a system that recommends five perfumes whose notes are similar to those of the perfume a customer most recently searched for. What you have in hand is a text file containing the notes of all the perfumes the retailer carries from its primary perfumer. It's time to start building the perfume recommendation system!

In this guided project, you will first learn about Sentence-BERT, which transforms the notes of the perfumes into semantically meaningful sentence embeddings that can be compared using similarity metrics such as cosine similarity. Then you will use the embeddings to build a recommender system that outputs similar types of perfumes, as per the requirements of your manager at the fragrance retail company!
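The idea can be sketched in a few lines. In the project, the embeddings come from the sentence-transformers library (e.g. `SentenceTransformer("all-MiniLM-L6-v2").encode(notes)`); here, toy made-up vectors and perfume names stand in for real embeddings so the sketch runs on its own, and the `recommend` helper is illustrative, not the project's actual code:

```python
import numpy as np

# Stand-ins for real perfume-note embeddings; in the project these would be
# produced by sentence-transformers from each perfume's note description.
perfumes = ["Perfume A", "Perfume B", "Perfume C", "Perfume D"]
embeddings = np.array([
    [0.9, 0.1, 0.0],
    [0.8, 0.2, 0.1],
    [0.0, 0.9, 0.4],
    [0.1, 0.8, 0.5],
])

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(query_index, k=5):
    """Return the k perfumes most similar to the queried one."""
    sims = [
        (cosine_similarity(embeddings[query_index], emb), name)
        for i, (emb, name) in enumerate(zip(embeddings, perfumes))
        if i != query_index  # never recommend the query itself
    ]
    sims.sort(reverse=True)  # highest cosine similarity first
    return [name for _, name in sims[:k]]

print(recommend(0, k=2))  # Perfume B is closest to Perfume A's vector
```

With real embeddings and the full catalog, `recommend(query_index, k=5)` would return the five similar perfumes the retailer asked for.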



A Look at the Project Ahead

After completing this guided project you will be able to:


  • Describe the working mechanics of Sentence-BERT
  • Compute sentence embeddings using Python's sentence-transformers framework
  • Perform Semantic Textual Similarity (STS) analysis on sentence embeddings
  • Build a perfume recommender system 

What You'll Need

For this guided project, it's recommended that you have a basic understanding of the BERT framework (Bidirectional Encoder Representations from Transformers) and the Transformer network. Prior experience with the Pandas library for manipulating data frames in Python would also be helpful. You will also use PCA (Principal Component Analysis) to visualize the embeddings of the perfume notes.
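As a refresher on the PCA step, here is a minimal sketch of projecting high-dimensional embeddings down to two dimensions via SVD. It uses random toy data in place of real perfume-note embeddings, and the project itself may well use `sklearn.decomposition.PCA` rather than this hand-rolled version:

```python
import numpy as np

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10, 4))  # toy stand-in for perfume-note embeddings

# PCA requires mean-centered data; the top right-singular vectors of the
# centered matrix are the principal components.
centered = embeddings - embeddings.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
coords_2d = centered @ vt[:2].T  # project onto the top 2 components

print(coords_2d.shape)  # (10, 2) -- ready for a 2-D scatter plot
```

Each row of `coords_2d` can then be plotted as one point, letting you eyeball which perfumes cluster together in embedding space.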

Frequently Asked Questions


Do I need to install any software to participate in this project?
Everything you need to complete this project will be provided to you via the Skills Network Labs and it will all be available via a standard web browser.

What web browser should I use?
The Skills Network Labs platform works best with current versions of Chrome, Edge, Firefox, Internet Explorer, or Safari.

Instructors

Roxanne Li

Data Scientist at IBM

I am an aspiring Data Scientist at IBM with extensive theoretical, research, and work experience in different areas of Machine Learning, including Classification, Clustering, Computer Vision, NLP, and Generative AI. I have applied Machine Learning to build data products for the P&C insurance industry in the past. I also recently became an instructor of the Unsupervised Machine Learning course by IBM on Coursera!


Joseph Santarcangelo

Senior Data Scientist at IBM

Joseph has a Ph.D. in Electrical Engineering; his research focused on using machine learning, signal processing, and computer vision to determine how videos impact human cognition. Joseph has been working for IBM since he completed his Ph.D.


Contributors

Cindy Huang

Data Science Intern at IBM

Hey there! I'm a senior at the University of Toronto studying data science. My passion for machine learning lies in NLP and using technology to improve human experience.
