Base location: Pune. Open to remote working options.
AI coaching that goes beyond traditional speech analytics
Cogito (www.cogitocorp.com) is the market leader in human-centred AI coaching, with over a decade of advanced R&D and hundreds of millions of analysed phone conversations. Our combination of rich human behaviour insights with real-time streaming natural language processing, unique to Cogito, delivers the best of both worlds: a truer understanding of employee behaviour and customer sentiment in real time.
Cogito is advancing the way people communicate by applying machine learning to enhance conversations in real time. This is your chance to be part of an industry-leading team that is making a real difference using the latest cloud technologies to build highly scalable products. Cogito performs in-call voice analysis, delivering real-time guidance to call centre agents and unprecedented insights to managers across an impressive portfolio of Fortune 500 clients.
Cogito is searching for a Data Engineer to join our organisation. The ideal candidate combines data modelling, dimensional modelling and analytical skills with strong software engineering and QA testing skills.
You have experience participating in the product design phase, modelling and documenting data requirements. You can translate data requirements into database schemas and document models. You are skilled at establishing data standards and at creating data dictionaries and business glossaries to provide clarity around business and data definitions. You enjoy working with teams such as Data Science, Customer Success and Customer Support to support their data needs.
You have worked on operational data stores, data warehouses, data marts and data lakes in previous roles, and understand the role each of these plays in an enterprise SaaS environment. You have experience with data migrations and have successfully moved large production workloads from legacy systems to a Kubernetes-based containerised microservices architecture. You possess very strong troubleshooting skills and have developed software using CI/CD pipelines in a cloud-native Kubernetes environment.
You have strong software engineering skills: you are confident writing good code and have solid experience writing unit tests, integration tests, and test fixtures. You have experience writing ETL and data pipeline jobs, packaging them into Docker containers and running them in Kubernetes-based environments. You have used Apache Airflow (or AWS Glue) to create high-performing data pipelines.