🔥 In this DataHour, Shiladitya will discuss the fundamentals of Transformers, the backbone of many state-of-the-art models in Natural Language Processing (NLP). The session deep-dives into the self-attention mechanism, which is central to the Transformer architecture, and codes it from scratch. Attendees will also get a broad overview of the Transformer model ecosystem, specifically Hugging Face.

🔥 Prerequisites: A basic understanding of the Python programming language and familiarity with linear regression would be helpful.

🔥 Who is this DataHour for?
- Students and freshers who want to build a career in the data-tech domain.
- Working professionals who want to transition to the data-tech domain.
- Data science professionals who want to accelerate their career growth.

🔥 About the Speaker
Shiladitya Banerjee is currently working as a Data Scientist at PhonePe and has 9+ years of industry experience applying machine learning models, first in the digital advertising domain and currently in fintech. Before PhonePe, he worked with several other organizations, including NVIDIA, PubMatic, and Walnut App. He completed his Master of Technology at IISc Bangalore in 2013 and his Bachelor of Engineering in 2010.
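The self-attention mechanism the session builds from scratch can be sketched in a few lines of NumPy. This is a minimal single-head, scaled dot-product version; the matrix sizes and variable names are illustrative assumptions, not taken from the session itself:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v      # project inputs to queries, keys, values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)       # each row is a distribution over tokens
    return weights @ V, weights              # weighted sum of values per token

# Toy example: 3 tokens, model dimension 4, head dimension 2
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
W_q, W_k, W_v = (rng.normal(size=(4, 2)) for _ in range(3))
out, weights = self_attention(X, W_q, W_k, W_v)
print(out.shape)             # (3, 2): one 2-dim output per token
print(weights.sum(axis=-1))  # each row of attention weights sums to 1
```

Each token's output is a weighted average of all tokens' value vectors, with the weights determined by query–key similarity; this is the mechanism the full Transformer stacks and parallelizes across multiple heads.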