Holcim : Data Engineer

Brief Description of position:

About Holcim

Holcim is the global leader in building materials and solutions, active in four business segments: Cement, Aggregates, Ready-Mix Concrete, and Solutions & Products. It is our ambition to lead the industry in reducing carbon emissions and accelerating the transition towards low-carbon construction. With the strongest R&D organization in the industry and a position at the forefront of innovation in building materials, we constantly introduce and promote high-quality, sustainable building materials and solutions to our customers worldwide, whether they are building individual homes or major infrastructure projects. Holcim has over 70,000 employees in over 70 countries and a portfolio balanced equally between developing and mature markets.

About The Role

Overview

Holcim’s APAC IT Services team has the charter to build a competitive edge for the business by proactively delivering world-class, high-quality, innovative IT, Digital, Analytics and Business Process Improvement solutions and services for the APAC region, with a focus on Operating Companies (OpCos) such as ACC and Ambuja in India, and those in Australia & New Zealand, the Philippines and Bangladesh. There is also a mandate to set up a Global Digital Hub focused on Advanced Analytics, Digital Business Platforms and Digital Solutions, providing services across the globe in a phased manner. To enable this vision, the IT team focuses on:

  1. Co-creating an IT, Digital & Data strategy that not only aligns with business priorities but also creates and enables new business models that give the business a competitive advantage.
  2. Building products and platforms that enable the business and create tangible value.
  3. Building thought leadership and expertise in emerging technologies and data-driven operations, and creatively applying them to different business needs for value creation.
  4. Driving continuous business process simplification, improvement and excellence.
  5. Implementing projects in an Agile manner, on time and within budget, addressing critical business needs with appropriate technology.
  6. Driving value creation, adoption and change management, and supporting value capture for implemented projects.
  7. Building and running a world-class NextGen IT function to enable business outcomes.

The Data Engineer will play an important role in enabling the business for data-driven operations and decision-making in an Agile, product-centric IT environment.

 

Responsibilities

  • Design and build solutions to ingest data from a variety of sources into Holcim’s data and analytics platforms, including cloud platforms such as AWS; exposure to GCP is an added advantage
  • Build data pipelines that clean, transform, and aggregate data from disparate sources, applying strong analytical skills to structured and unstructured datasets
  • Develop and optimize scalable data pipelines, architectures & data sets and build new API integrations
  • Build processes to support data mining, data transformation, data structures, data modelling, metadata and workload management
  • Create custom software components using languages such as R and Python, and use tools to integrate different systems and build a strong analytics infrastructure
  • Support Data Scientists to build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Enable delivery of business solutions on Holcim’s analytics platform, covering data security, governance, cataloging, preparation, automated testing, and data quality metrics
  • Automate, optimize, migrate and enhance existing solutions.
  • Provide high operational excellence guaranteeing high availability and platform stability
  • Collaborate with analytics and business teams to improve data models, increasing data accessibility and fostering data-driven decision making

Education / Qualification

  • BE / B. Tech from IIT or Tier I / II colleges
  • Certification in Big Data Technologies
  • Certification in cloud platforms (AWS or GCP)

Experience

  • Total experience of 8-10 years
  • 5+ years of experience in data engineering, including data ingestion, preparation, provisioning, automated testing, and quality checks
  • 3+ years of hands-on experience with Big Data cloud platforms such as AWS, data lakes, and data warehouses
  • 3+ years with Big Data and analytics technologies, including SQL and writing Spark jobs in Python, Scala, or Java; experience with Spark, Scala, and Kafka
  • Experience in designing, building, and maintaining ETL systems like SAP BW
  • Experience with data pipeline and workflow management tools (such as Azkaban, Luigi, and Airflow)
  • Application Development background along with knowledge of Analytics libraries, open-source statistical and big data computing libraries
  • Familiarity with Visualization and Reporting Tools like Qliksense (preferred), Power BI, Tableau

Key Personal Attributes

  • Constructive and Collaborative Team Player
  • Innovative and Continuous Improvement Mind-set
  • Business focused, Customer & Service minded
  • Strong Consultative and Management skills
  • Good Communication and Interpersonal skills
  • Confident in advising on, developing and articulating solutions
  • Result oriented and with a work ethic of delivering on-time and in scope
  • Open to Change and Disruption and Attitude to challenge the Status Quo

Language Requirements

  • Fluent written and spoken English with a good command of business communication
Minimum Work Experience:
8 years
Maximum Work Experience:
10 years
Mandatory SkillSet:
Statistics & EDA, SQL, Python, Machine Learning, Dashboard, Cloud Computing.