Job Description / Duties & Responsibilities
• Develop, program, and maintain applications using the Apache Spark open-source framework.
• Work with different aspects of the Spark ecosystem, including Spark SQL, DataFrames, Datasets, and streaming.
• Demonstrate strong programming skills in Java, Scala, or Python.
• Stay familiar with big data processing tools and techniques.
• Adhere to the ISMS policies and procedures.
Job Specification / Skills and Competencies
• Proven experience as a Spark Developer or in a related role.
• Strong programming skills in Scala, PySpark, or Python.
• Familiarity with big data processing tools and techniques.
• Understanding of:
  o Apache Spark 2.x and 3.x
  o Apache Spark RDD API
  o Apache Spark SQL DataFrame API
  o Apache Spark Streaming API
  o Spark query tuning and performance optimization
  o Database integration (Cassandra, Elasticsearch)
• Experience working with HDFS.
• Deep understanding of distributed systems (e.g. CAP theorem, partitioning, replication, consistency, and consensus).
• Experience with the Hadoop ecosystem.
• Experience with streaming data platforms.
• Excellent analytical and problem-solving skills.
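For illustration, a minimal PySpark sketch of the DataFrame API and Spark SQL work named above. The input file "events.json" and the "timestamp" column are hypothetical placeholders, not part of this posting.

    # Minimal PySpark sketch; assumes a local Spark 3.x installation.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("spark-developer-example").getOrCreate()

    # DataFrame API: load semi-structured data and aggregate it.
    events = spark.read.json("events.json")  # hypothetical input file
    daily_counts = (
        events
        .groupBy(F.to_date("timestamp").alias("day"))  # assumes a "timestamp" column
        .agg(F.count("*").alias("event_count"))
    )

    # Spark SQL: the same aggregation expressed against a temporary view.
    events.createOrReplaceTempView("events")
    daily_counts_sql = spark.sql("""
        SELECT to_date(timestamp) AS day, COUNT(*) AS event_count
        FROM events
        GROUP BY to_date(timestamp)
    """)

    daily_counts.show()
    spark.stop()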