Dr. Reddy's: Technical Architect (Data)

Brief Description of position:

We are looking for candidates with architecture experience on at least 1-2 Data Lake/DWH implementations, a very good understanding of data management tech stacks on any of the major clouds, an understanding of data quality, etc.

Reports to: Head of Data

Job location: Banjara Hills

Context

Data is at the heart of a pharmaceutical organization - from manufacturing operations, Quality Control & Assurance and Research & Development to all the front-end teams' interactions with the doctor and patient community. Analytics initiatives at Dr. Reddy's are hence becoming more strategic to the performance of various business units and functions. While the journey is ongoing, the current priority is to drive better leverage of all this data through advanced analytics techniques that help answer business questions across the organization.

The position is responsible for ensuring that the data platforms at Dr. Reddy's are designed and implemented for flexibility and scalability, leveraging the latest technology stack and data management best practices. The objective is an organization-wide data platform that enables analytics and advanced machine learning tools and technologies to solve real-world business problems.

What makes this assignment interesting? 

  • A role that gives a broader view of Dr. Reddy's business context and various functional scenarios
  • Opportunity to apply data architecture capabilities in an industry where data governance and leverage are fast becoming a competitive advantage
  • Operate in an environment with strong executive sponsorship at a reputed organization in the industry
  • Opportunity to provide thought leadership for end-to-end data and analytics architecture
  • Opportunity to work on cutting-edge and emerging technologies such as Big Data, Hadoop and cloud platforms (GCP, AWS etc.)

Key Responsibilities –

  • Provide technical leadership and strategic direction for data platform and data management architecture, grounded in a good understanding of Data Lake and integration technologies.
  • Responsible for choosing, designing and building the data management/engineering strategy through evaluation and identification of data acquisition and data flow solutions.
  • Own the database structures, data artefacts (dictionaries, catalogues) and data models for a data engineering platform that enables the organization's objectives through a tool-agnostic approach.
  • Liaise with business analyst & solutioning teams to build a data flow & platform strategy across business functions, with a solid understanding of the integration points with applications, platforms and network infrastructure.
  • Advocate a tool-agnostic approach to building the data engineering platform for Dr. Reddy's into the future, bringing together Data Warehousing, Data Virtualization and ETL concepts.
  • Work with consulting and development partners to deliver scalable and effective analytics solutions that leverage the best of the latest advancements in the field.

Operating Network 

Internal: All functions and teams within the organization across India and overseas locations 

External: Vendors, implementation partners, knowledge networks in the industry

Technical Requirements:

  • Drive the advancement of Enterprise Data Architecture by designing and implementing the underlying logic and structure for how data is set up, cleansed, and ultimately stored for organizational usage
  • Working knowledge of Data Integration (structured/unstructured), Data Quality, Metadata Management, Data Governance, Master Data Management etc.
  • Solid understanding of designing data engineering platforms (preferably cloud-based - GCP - and Big Data platforms - Hadoop, Hive, Spark etc.)
  • Exposure to at least one data modelling tool - ERwin, ER/Studio etc.
  • Experience architecting at least 3 Data Lakes on Hadoop (Hive, HDFS), Google BigQuery etc.
  • Hands-on exposure to Big Data technologies like Spark and Hive, and GCP data engineering stacks like Composer, Pub/Sub, Dataproc, Dataflow etc.
  • Good understanding of different kinds of databases - RDBMS, NoSQL, graph databases, MPP etc.
  • Good understanding of data processing architectures like Kappa, Lambda etc.
  • Good understanding of DataOps (understanding, designing, developing and managing the lifecycle of ingesting, storing, processing and analysing data, and reporting/visualizing the insights)
  • Comfortable with the needs of advanced analytics solutions (R, Python etc.), data mining/analytics techniques, and visual analytics tools like Tableau/MicroStrategy
  • Good to have: exposure to microservices-based architecture, API orchestration etc.
  • Good understanding of SAP Data tools – Data Store, CDS views, RFC, OData, PI etc.
  • Project management background, with familiarity with PMI project methodologies and the SDLC
  • Good to have: Google Professional Cloud Architect certification

Educational Qualifications: 

  1. B. Tech/M. Tech. Certifications or academic qualifications in data warehousing, data management and data platform design

Experience Required: 

Minimum 7 to 12 years of experience in a reputed medium- to large-scale consulting/analytics services or captive analytics organization, preferably in Pharmaceuticals or with Manufacturing/CPG customers.

Key Personal Attributes 

  • Good communication skills, reasoning and analytical ability 
  • Must be collaborative in approach and a team player 
  • Very good at networking and influencing 
  • Willingness to learn new concepts, tools and techniques related to the subject
  • Open to travel across locations in India and overseas

Mandatory Skills: Standard RDBMS like Oracle; SQL; cloud/big data platforms like the Hadoop stack - HDFS, Hive, Spark, Kafka; any one ETL tool - Talend, Informatica, NiFi; GCP tech stack - Google BigQuery, Dataproc, Dataflow, Pub/Sub, Composer etc.

Data Structures & Algorithms in the preferred language (Java/Python/Scala): 5

  • Hadoop - 5.6 Yrs.
  • DWH - NA
  • Data Lake - 4 Yrs.
  • Athena - 2 Yrs.
  • ETL Tools - 2 Yrs.
  • Cloud - 3.6 Yrs.
  • Architecture Exp. - 3 Yrs.
  • Big Data - 5.6 Yrs.
Location: Hyderabad

Mandatory Skill Set: Hadoop, AWS, NoSQL, Spark, Kafka, Java, ETL, GCP, Data Warehousing, Tableau