As amazing as state-of-the-art machine learning models are, training, optimizing, and deploying them remains a challenging endeavor that requires a significant amount of time, resources, and skills, all the more so when different languages are involved. Unfortunately, this complexity prevents most organizations from using these models effectively, if at all. Wouldn’t it be great if we could just start from pre-trained versions and put them to work immediately?
This is the exact challenge that Hugging Face is tackling. Its tools make it easy to add state-of-the-art Transformer models to your applications. Thanks to Hugging Face’s open-source libraries, developers can easily work with 5,000+ datasets and 50,000+ pre-trained models in 160+ languages. In fact, with over 65,000 stars on GitHub, the transformers library has become the de facto standard tool for developers and data scientists who need state-of-the-art models for natural language processing, computer vision, and speech.
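To make the "put a pre-trained model to work immediately" idea concrete, here is a minimal sketch using the transformers library's `pipeline` API. The task name and example sentences are illustrative; the first call downloads a default model, so an internet connection is assumed.

```python
# Minimal example of using a pre-trained model via the transformers library.
# Assumes `pip install transformers` (and a backend such as PyTorch) is done.
from transformers import pipeline

# Load a default pre-trained sentiment-analysis model.
classifier = pipeline("sentiment-analysis")

# Run inference on a couple of example sentences (illustrative inputs).
results = classifier([
    "Pre-trained models save us weeks of work!",
    "Training from scratch was painfully slow.",
])

# Each result is a dict with a predicted label and a confidence score.
for text_result in results:
    print(text_result["label"], round(text_result["score"], 3))
```

No task-specific training is required: the pipeline handles tokenization, model inference, and post-processing, which is exactly the "start from a pre-trained version" workflow described above.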
In this DataHour, Julien will introduce you to Transformer models and what business problems you can solve with them. Then, he’ll show you how you can simplify and accelerate your machine learning projects end-to-end: experimenting, training, optimizing, and deploying. Along the way, he’ll run some demos to keep things concrete and exciting!
Prerequisites: enthusiasm for learning, plus a basic grasp of machine learning and Python.
Julien Simon
Chief Evangelist at Hugging Face
Julien is currently Chief Evangelist at Hugging Face. He previously spent six years at Amazon Web Services, where he was the Global Technical Evangelist for AI & Machine Learning. Prior to joining AWS, Julien served for 10 years as CTO/VP of Engineering at large-scale startups.
You can follow him on LinkedIn, Twitter, and YouTube.
Please register/login to participate in the contest