In this DataHour, Shiladitya will discuss the fundamentals of Transformers. Transformers are the backbone powering many state-of-the-art models in the field of Natural Language Processing (NLP).
In this session we will dive deep into the self-attention mechanism, which is central to the Transformer architecture, and code it from scratch. Attendees will also get a broad understanding of the Transformer model ecosystem (specifically Hugging Face).
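As a preview of what the session covers, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. The function and variable names (`self_attention`, `W_q`, `W_k`, `W_v`) are illustrative choices, not code from the session itself.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d_model) token embeddings.
    W_q, W_k, W_v: (d_model, d_k) learned projection matrices.
    Returns: (seq_len, d_k) context-mixed representations.
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted sum of value vectors

# Toy example: 3 tokens, model dimension 4, head dimension 2
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
W_q = rng.normal(size=(4, 2))
W_k = rng.normal(size=(4, 2))
W_v = rng.normal(size=(4, 2))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (3, 2)
```

Each output row is a weighted average of the value vectors, with weights determined by how strongly each token's query matches every token's key.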
Prerequisites: A basic understanding of the Python programming language and familiarity with linear regression would be helpful.
Note: E-certificates will be provided within 24-48 hours of the session, only to those who have attended the entire webinar. Please make sure to join the Zoom webinar with your correct name and email address so that your certificate is properly credited to you.
Shiladitya Banerjee
Data Scientist at PhonePe
Shiladitya Banerjee is currently working as a Data Scientist at PhonePe, and has 9+ years of industry experience applying machine learning models, first in the digital advertising domain and currently in the fintech domain. Before PhonePe, he worked with several other organizations, including NVIDIA, PubMatic, and Walnut App. He completed his Master of Technology from IISc Bangalore in 2013 and his Bachelor of Engineering in 2010.