Publicis Sapient : Senior Manager – Data Engineering

Brief Description of position:

Publicis Sapient Overview

We at Publicis Sapient enable our clients to thrive in Next and to create business value through expert strategies, customer-centric experience design, and world-class product engineering.

The future of business is disruptive, transformative and becoming digital to the core.

In our 20+ years in IT, never before have we seen such a dire need for transformation in every major industry - from financial services to automotive, consumer products, retail, energy, and travel.

To make this transformative journey a reality in these exciting times, we seek thought leaders and rock stars who will:

  • brave it out to go do the Next - creating "what will be" from "what is"
  • exhibit the optimism that says there is no limit to what we can achieve
  • be deeply skilled, bold, collaborative, and flexible
  • reimagine the way the world works to help businesses improve the daily lives of people and the world

Our people thrive because of the belief that it is both our privilege and responsibility to usher our clients and the world into Next.

Our work is fueled by 

  • challenging boundaries, 
  • multidisciplinary collaboration, 
  • highly agile teams, and 
  • the power of the newest technologies and platforms.

If that’s you, come talk to us!

This is the world-class engineering team where you should build your career.

Job Summary: 

As Senior Manager in Data Engineering, you will be responsible for architecting and implementing big data solutions for clients. Your role focuses on high-quality strategy, definition and delivery of solutions involving:

Data Integration, Transformation & Consumption 

Data Storage and Computation Frameworks, Performance Optimizations

Infrastructure, Automation & Cloud Computing

Data Governance & Security

The role requires a hands-on technologist with expertise in Big Data solution architecture who can provide strategic and tactical direction to customers across various industry domains.

As a data engineering practitioner, you should have a point of view on and understanding of build vs. buy, performance considerations, hosting, business intelligence, reporting & analytics. Ideally, you have experience integrating data platforms in scenarios like segmentation, targeting, consumer 360 view, etc.

Role & Responsibilities:

    1. Provide inputs to define and execute strategic roadmap for enterprise data architecture by identifying the current landscape and future business goals
    2. Provide technical leadership and hands-on implementation role in the areas of data techniques including data access, integration, modeling, visualization, mining, design and implementation
    3. Lead a globally distributed team to deliver high quality solutions. Manage functional & non-functional scope and quality
    4. Help establish standard data practices like governance and address other non-functional issues like data security, privacy and quality
    5. Help establish best practice like standards and guidelines for design & development, deployment, support and analytics and mining
    6. Help establish best practice in acquiring, storing and analyzing structured, semi-structured and un-structured data from the enterprise and outside like social data
    7. Manage and provide technical leadership to large data programs or multiple programs implementation based on the requirement using agile technologies
    8. Define and drive key transformational programs like customer 360 degree view for the clients
    9. Bring functional understanding of digital transformation & customer data platforms
    10. Run workshops with clients and align client stakeholders to optimal solutions
    11. Provide consulting, thought leadership and mentorship, applying strong soft skills
    12. Manage people and contribute to hiring and capability building

Experience Guidelines: 

Mandatory Experience and Competencies: 

    1. Overall 12+ years of IT experience with 5+ years in data-related technologies
    2. 5+ years of experience in Big Data technologies and expertise in one or more cloud data services (AWS / Azure / GCP)
    3. Expert in Big Data architecture patterns and experienced in delivering end-to-end Big Data solutions
    4. Expert in the Hadoop ecosystem with one or more distributions, such as Cloudera or cloud-specific distributions
    5. Expert in programming languages like Java/Scala; Python is good to have
    6. Expert in one or more big data ingestion tools (Sqoop, Flume, NiFi, etc.), streaming frameworks (Kafka, Pub/Sub, etc.) and other big data technologies used in building end-to-end pipelines. Good to know traditional tools like Informatica, Talend, etc.
    7. Good exposure to development with CI/CD pipelines; knowledge of containerization, orchestration and Kubernetes


Preferred Experience and Knowledge:

    1. Excellent understanding of the data technologies landscape, strategy, assessments and RFPs
    2. Well versed in the pros and cons of various database technologies: relational, NoSQL, MPP and columnar databases
    3. Well versed in multi-dimensional modeling: star schema, snowflake, normalized and de-normalized models
    4. Good exposure to data governance, catalog, lineage and associated tools
    5. Experience with performance tuning, optimization and data security
    6. Knowledge of master data management and building customer 360-degree views
    7. Well versed in Software as a Service, Platform as a Service and Infrastructure as a Service concepts, and able to drive clients to a decision
    8. Thought leadership: blogs, keynote sessions, POV/POC, hackathons
    9. Certification in one of the cloud platforms or big data technologies


Personal Attributes: 

  • Strong analytical and problem-solving skills
  • Strong communication skills: verbal, written and visual presentations
  • Strong coordination and negotiation skills
  • Self-starter who requires minimal oversight
  • Ability to prioritize and manage multiple tasks
  • Multi geo experience and distributed delivery experience in large programs

Education

Bachelor’s/Master’s Degree in Computer Engineering, Computer Science, or a related field.


ABOUT US 

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. As digital pioneers with 20,000 people and 53 offices around the globe, our experience spanning technology, data sciences, consulting and customer obsession – combined with our culture of curiosity and relentlessness – enables us to accelerate our clients’ businesses through designing the products and services their customers truly value. Publicis Sapient is the digital business transformation hub of Publicis Groupe. For more information, visit publicissapient.com

Minimum Work Experience:
5 years
Minimum Qualification:
Graduate
Mandatory SkillSet:
Java, Scala, Hadoop, NoSQL, Sqoop, AWS, Flume, Azure, GCP, Kafka, Python.