Implements the Encoder from the "Attention Is All You Need" paper in PyTorch, and uses it to build a basic recommendation system in PyTorch from scratch.


Harsh3231qubit/Attention-is-All-You-Need


Basic Implementation of the Transformer Encoder (Attention Is All You Need)

📄 Paper link: Attention Is All You Need (arXiv:1706.03762)

📂 This Repo Includes

 ✅ Encoder implementation in PyTorch with explanations of critical concepts.

 🏘️ A dataset of apartment listings on which the recommendation program was built.

 🤖 An apartment recommendation program using a custom Encoder built from scratch in PyTorch.

🔧 Encoder Implementation in PyTorch

Key components covered:

 ✨ Residual Connections

 🌀 Positional Encoding

 🧠 Self-Attention and Multi-Head Attention

 📏 Why the dot product of query and key vectors can grow large, and how to scale it properly

🏠 Dataset and Recommendation Program

  • The dataset consists of apartment listings with various columns.

  • For building the recommendation system, I primarily used the TopFacilities and PropertyName columns.

  • The model leverages the Attention Mechanism to capture relationships between facilities, creating a vector representation for each apartment by averaging the embeddings of its facilities.

  • I used cosine similarity to find the most similar apartments and torch.topk() to retrieve the top-k most similar listings to the apartment a user is viewing.
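The steps above can be sketched end to end. The listing names and facilities below are made up for illustration (the real repo reads them from the TopFacilities and PropertyName columns), and random embeddings stand in for the Encoder's learned ones:

```python
import torch
import torch.nn.functional as F

# Hypothetical toy data mirroring the TopFacilities / PropertyName columns.
listings = {
    "Sunrise Towers": ["gym", "pool", "parking"],
    "Maple Court":    ["gym", "garden"],
    "Lake View":      ["pool", "parking", "garden"],
}

vocab = sorted({f for facs in listings.values() for f in facs})
idx = {f: i for i, f in enumerate(vocab)}
emb = torch.randn(len(vocab), 16)  # stand-in facility embeddings (random here)

# One vector per apartment: the mean of its facility embeddings.
vecs = torch.stack([
    emb[[idx[f] for f in facs]].mean(dim=0) for facs in listings.values()
])

names = list(listings)
query = names.index("Sunrise Towers")              # apartment being viewed
sims = F.cosine_similarity(vecs[query].unsqueeze(0), vecs, dim=1)
sims[query] = -1.0                                 # exclude the query itself
top = torch.topk(sims, k=2)                        # top-k most similar listings
print([names[i] for i in top.indices])
```

With trained embeddings from the Encoder, the same cosine-similarity and torch.topk() pipeline returns the most similar apartments to the one a user is viewing.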
