Maveric: Hadoop Developer

Brief Description of position:

MAVERIC SYSTEMS

Started in 2000, Maveric Systems helps global banking and fintech leaders drive business agility through effective integration of development, operations and quality engineering initiatives. Our strong banking domain competency combined with expertise across legacy and new age technology landscapes makes us a preferred partner for customers worldwide.

We offer Product Implementation, Integration and Quality Engineering services across Digital platforms, Banking solutions and Regulatory systems. Our insight-led engagement approach helps our clients quickly adapt to dynamic technology and competitive landscapes with a sharp focus on quality.

Role: Developer

We are looking for a Developer/Senior Developer to be part of building an advanced analytical platform leveraging Big Data technologies and transforming legacy systems. This role offers an exciting, fast-paced, constantly changing and challenging work environment, and will play an important part in resolving and influencing high-level decisions across the Standard Chartered Retail Banking Foundation programme.

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall 3 to 9 years of software development experience, including a minimum of 2 years of Data Warehousing domain knowledge
  • Must have at least 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, NiFi, Scala and Kafka
  • Excellent knowledge of SQL and Linux shell scripting
  • Bachelor's/Master's/Engineering degree from a well-reputed university
  • Strong communication, interpersonal, learning and organizing skills, matched with the ability to manage stress, time and people effectively
  • Proven experience in coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge of and experience in working on Agile deliveries and Scrum teams

Responsibilities

  • Responsible for the documentation, design, development, and architecture of Hadoop applications
  • Converting complex technical and functional requirements into detailed designs
  • Work as a senior developer or individual contributor depending on the situation
  • Adhere to Scrum timelines and deliver accordingly
  • Prepare unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; gather feedback and provide necessary remediation/recommendations in time
  • Drive small projects individually
  • Coordinate changes and deployments on time
Location
Bangalore
Minimum Work Experience:
3 years
Maximum Work Experience:
9 years
Minimum Qualification:
Graduate
Mandatory SkillSet:
Big Data, Hadoop, Hive, HBase, Spark, NiFi, Scala, Kafka, SQL, Linux shell scripting, Data Warehousing.
