Position: Hadoop Expert
Location: Remote
Employment Type: Contract, 50 days
Key Responsibilities:

Hadoop Administration:
- Install, configure, and manage Hadoop clusters.
- Monitor cluster health and performance.
- Implement and manage security policies for Hadoop clusters.
- Ensure high availability and scalability of Hadoop clusters.

HDFS Management:
- Perform HDFS maintenance operations such as balancing and upgrades.
- Manage HDFS storage policies and quotas.
- Troubleshoot HDFS issues and perform necessary recovery operations.

Hive Management:
- Set up and optimize Hive environments.
- Develop and maintain Hive scripts for data processing.
- Implement Hive security and compliance policies.
- Optimize Hive queries for performance and resource utilization.

Spark Development:
- Design and implement Spark applications for data processing and analytics.
- Optimize Spark jobs for performance and resource efficiency.
- Integrate Spark with Hadoop and Hive for seamless data processing.

Data Management:
- Collaborate with data engineers and data scientists to design and implement data workflows.
- Ensure data quality, integrity, and governance.
- Develop and maintain ETL processes for data ingestion and transformation.

Technical Support:
- Provide technical support for Hadoop, HDFS, Hive, and Spark environments.
- Resolve performance issues and optimize resource usage.
- Document configurations, procedures, and best practices.

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience with Hadoop, HDFS, Hive, and Spark.
- Proficiency in programming languages such as Java, Scala, or Python.
- Strong understanding of distributed systems and big data architectures.
- Experience with Linux/Unix systems administration.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.