
Job Summary

The primary purpose of this role is to translate business requirements and functional specifications into logical program designs and to deliver modules, stable application systems, and data or platform solutions. This includes developing, configuring, or modifying integrated business and/or enterprise infrastructure or application solutions within various computing environments. The role also facilitates the implementation and maintenance of business and enterprise data or platform solutions to ensure successful deployment of released applications.
Data Engineering

• 3 years of experience in Hadoop or GCP (specific to the Data Engineering role)
• Expertise in Python, SQL, scripting, Teradata, Hadoop (Sqoop, Hive), Spark (including Spark Streaming), and Kafka, or equivalent GCP big data components (specific to the Data Engineering role)
Roles and Responsibilities:
· Help build and support ETL jobs and components in Apache Spark, Python, PySpark, Talend, Kafka, Hadoop, Airflow, and GCP.
· Gather and understand requirements; analyze and convert functional requirements into concrete technical tasks using Spark/Airflow.
· Prepare technical ETL design documents; develop and support ETL processes.
· Perform systems analysis: design, coding, unit testing, and other SDLC activities.
· Support business continuity by monitoring, triaging, and resolving integration issues as they arise, ensuring availability of data within SLA; includes production support on a 24x7 model with rotational shifts.