When building ML/DL models, one often has to handle large volumes of data, which increases model run time and makes the predictions harder to interpret.
Dimensionality reduction is a way to reduce many columns of data to a few; the transformed data describes the original data concisely and makes it easier for classic ML models to generate output. The process is the transformation of data from a high-dimensional space into a low-dimensional space such that the low-dimensional representation retains meaningful properties of the original data, ideally close to its intrinsic dimension.
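The idea described above can be sketched with PCA, one of the techniques covered in this session. This is a minimal illustration using scikit-learn on synthetic data (the data, dimensions, and component count are hypothetical, not from the session itself):

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data: 200 samples with 10 observed columns that are
# actually driven by only 2 underlying factors, plus a little noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))    # 2 hidden factors
mixing = rng.normal(size=(2, 10))     # how factors map to 10 observed columns
X = latent @ mixing + 0.05 * rng.normal(size=(200, 10))

# Transform the high-dimensional data into a 2-dimensional space.
pca = PCA(n_components=2)
X_low = pca.fit_transform(X)

print(X.shape, "->", X_low.shape)     # (200, 10) -> (200, 2)
print("variance explained:", pca.explained_variance_ratio_.sum())
```

Because the 10 columns here are generated from 2 latent factors, the two principal components retain almost all of the variance, which is the sense in which the low-dimensional representation keeps the "meaningful properties" of the original data.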
In this DataHour, Varun will discuss PCA and Factor Analysis to show how dimensionality reduction is performed.
Prerequisites: Enthusiasm for learning Data Science and a basic understanding of data dimensions, eigenvalues, eigenvectors, and linear equations
Varun Behl
Data Scientist at Adobe
Varun currently works as a Data Science Engineer with Adobe in Bangalore. He has more than 5 years of experience in the Data Science profession and has worked with various kinds of data across areas such as Telco, Ecommerce, Pharma, Retail, Musigma, and Web Analytics. He also has a keen interest in the research side of machine learning, with NLP, forecasting, segmentation, and recommendations being areas that have contributed significantly to enhancing the scope of traditional modeling techniques.
Connect with Varun on LinkedIn