Publicis Sapient : Manager – Data Engineering

Brief Description of position:

Publicis Sapient Overview

At Publicis Sapient, we enable our clients to thrive in the Next and to create business value through expert strategies, customer-centric experience design, and world-class product engineering.

The future of business is disruptive, transformative and becoming digital to the core.

In our 20+ years in IT, we have never seen such a dire need for transformation in every major industry - from financial services to automotive, consumer products, retail, energy, and travel.

To make this transformative journey a reality in these exciting times, we seek thought leaders and rock stars who:

  • brave the unknown to pursue "what will be" from "what is"
  • exhibit the optimism that says there is no limit to what we can achieve
  • are deeply skilled, bold, collaborative, and flexible
  • reimagine the way the world works to help businesses improve the daily lives of people and the world

Our people thrive because of the belief that it is both our privilege and responsibility to usher our clients and the world into Next.

Our work is fueled by 

  • challenging boundaries, 
  • multidisciplinary collaboration, 
  • highly agile teams, and 
  • the power of the newest technologies and platforms.

If that’s you, come talk to us!

This is the world-class engineering team where you should build your career.

Job Summary: 

As Manager, Data Engineering, you will be responsible for translating client requirements into design and for architecting and implementing cloud-based and on-premise big data solutions for clients. Your role will focus on delivering high-quality solutions by independently driving design discussions on the following aspects:

  • Data Ingestion, Transformation & Consumption
  • Data Storage and Computation Frameworks
  • Performance Optimizations
  • Infrastructure, Automation & Cloud Computing
  • Data Governance & Security

The role requires a hands-on technologist with expertise in big data solution architecture and a strong programming background in Java, Scala, or Python. You should have experience in creating data ingestion pipelines for streaming and batch datasets, building ETL/ELT data pipelines using distributed computing frameworks such as Spark, Storm, or Flink, orchestrating data pipelines, and setting up secure big data platforms. You are also required to have hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms.

Role & Responsibilities:

    1. Provide technical leadership and a hands-on implementation role in the areas of data engineering, including data ingestion, data access, modeling, data processing, visualization, design, and implementation.
    2. Lead a team to deliver high-quality big data solutions, either on premise or on the cloud; manage functional and non-functional scope and quality.
    3. Help establish standard data practices such as governance, and address other non-functional concerns such as data security, privacy, and quality.
    4. Manage and provide technical leadership to a data program implementation, based on the requirements, using agile methodologies.
    5. Participate in workshops with clients and align client stakeholders on optimal solutions.
    6. Provide consulting, soft skills, thought leadership, and mentorship.
    7. Manage people and contribute to hiring and capability building.

Experience Guidelines: 

Mandatory Experience and Competencies: 

    1. Overall 8+ years of IT experience, with 3+ years in data-related technologies.
    2. 3+ years of experience in big data technologies, 1+ years of expertise in data-related cloud services (AWS / Azure / GCP), and delivery of at least one project as an architect.
    3. Knowledge of big data architecture patterns and experience in delivering end-to-end big data solutions, either on premise or on the cloud (mandatory).
    4. Expertise in the Hadoop ecosystem with one or more distributions, such as Cloudera or cloud-specific distributions.
    5. Expertise in programming languages such as Java or Scala; Python is good to have.
    6. Expertise in one or more big data ingestion tools (Sqoop, Flume, NiFi, etc.) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.); knowledge of traditional tools such as Informatica or Talend is good to have.
    7. Expertise in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm, or Flink.
    8. Experience with MPP-style query engines such as Impala, Presto, or Athena.
    9. Experience with NoSQL solutions such as MongoDB, Cassandra, or HBase, or cloud-based NoSQL offerings such as DynamoDB or Bigtable.
    10. A good understanding of how to set up big data cluster security: authorization/authentication, security for data at rest, and security for data in transit.
    11. A basic understanding of how to set up and manage monitoring and alerting for a big data cluster.
    12. Experience with orchestration tools such as Oozie, Airflow, Control-M, or similar.
    13. Experience with performance tuning, optimization, and data security.


Preferred Experience and Knowledge:

    1. Excellent understanding of the data technologies landscape/ecosystem.
    2. Well versed in the pros and cons of various database technologies: relational, NoSQL, MPP, and columnar databases.
    3. Good exposure to development with CI/CD pipelines; knowledge of containerization, orchestration, and Kubernetes would be an added advantage.
    4. Well versed in multi-dimensional modeling, such as star schema, snowflake schema, and normalized and de-normalized models.
    5. Exposure to data governance, cataloging, lineage, and associated tools would be an added advantage.
    6. Well versed in Software as a Service, Platform as a Service, and Infrastructure as a Service concepts, and able to drive clients to a decision.
    7. Thought leadership: blogs, keynote sessions, POVs/POCs, hackathons.
    8. Certification in one of the cloud platforms or big data technologies.

Personal Attributes: 

  • Strong analytical and problem-solving skills
  • Strong communication skills: verbal, written, and visual presentations
  • Strong coordination and negotiation skills
  • Self-starter who requires minimal oversight
  • Ability to prioritize and manage multiple tasks
  • Multi-geo and distributed delivery experience in large programs

Education

Bachelor’s/Master’s Degree in Computer Engineering, Computer Science, or a related field.

ABOUT US 

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. As digital pioneers with 20,000 people and 53 offices around the globe, our experience spanning technology, data sciences, consulting, and customer obsession – combined with our culture of curiosity and relentlessness – enables us to accelerate our clients' businesses through designing the products and services their customers truly value. Publicis Sapient is the digital business transformation hub of Publicis Groupe. For more information, visit publicissapient.com.

Minimum Work Experience:
3 years
Minimum Qualification:
Graduate
Mandatory SkillSet:
Spark, NoSQL, AWS, Azure, Python, GCP, Java, SQL, Sqoop.
