We're seeking a skilled individual with expertise in Amazon Web Services (AWS) and Extract, Transform, Load (ETL) processes. Your role will involve crafting and maintaining ETL pipelines, implementing data integration solutions on AWS cloud services, and fine-tuning data processing workflows. If you're passionate about shaping the future of Salesforce data implementations, this opportunity is for you.
Key Responsibilities
- Work with our Enterprise customers to migrate their data into the cloud.
- Set up the ETL process to move data into the cloud and refresh it daily (a representative sketch follows the Qualifications list below).
- Help the team optimize queries and evaluate different architectures.
- Work with internal teams to help them make the most of our data lake.
- Help identify processes and tasks that can be automated with internal tools.
- Participate in data domain technical and business discussions about future architecture direction.
- Research and evaluate emerging data technologies and industry and market trends to support project development and operational activities.

Qualifications
- A minimum of 10 years of hands-on experience in Salesforce cross-cloud testing with a proven track record.
- Mastery of Amazon Web Services (AWS), including S3, Glue, Redshift, and Athena.
- Proficiency in Extract, Transform, Load (ETL) processes and associated tools.
- Strong programming skills, especially in languages such as Python or Java.
- Advanced SQL skills for efficient data querying and manipulation.
- Experience with data modeling, database concepts, and data warehousing principles.
- Ability to design, develop, and maintain ETL pipelines and data integration solutions.
- Familiarity with Salesforce Data Cloud and its implementation.
- Knowledge of big data frameworks such as Hadoop, Spark, or Kafka for processing and analyzing large datasets.
- Expertise in JEE, AWS, SOA, web development, web services, XML, JSON, DHTML, and Oracle 11g (or higher) or SQL Server.
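To give candidates a feel for the day-to-day work, here is a minimal sketch of the kind of daily refresh described under Key Responsibilities: extract a source export, apply a simple transform, load it to an S3-based data lake, and refresh the Athena table that sits on top of it. The bucket, database, table, and file names are illustrative assumptions, not details of this role.

```python
"""Minimal sketch of a daily ETL refresh into an S3 data lake.

All resource names (bucket, database, table, file paths) are
hypothetical placeholders used for illustration only.
"""
import csv
import datetime
import boto3


def transform(rows):
    """Representative cleanup step: trim whitespace and drop rows
    that are missing an account id."""
    for row in rows:
        cleaned = {k: (v or "").strip() for k, v in row.items()}
        if cleaned.get("account_id"):
            yield cleaned


def run_daily_refresh(source_csv="export.csv",
                      bucket="example-data-lake",
                      database="analytics"):
    # Extract: read the day's export (assumed to be a local CSV dump).
    with open(source_csv, newline="") as f:
        rows = list(transform(csv.DictReader(f)))
    if not rows:
        return

    # Load: stage the transformed data and upload it to a
    # date-partitioned prefix in the data lake bucket.
    today = datetime.date.today().isoformat()
    staged = f"/tmp/accounts_{today}.csv"
    key = f"salesforce/accounts/dt={today}/accounts.csv"
    with open(staged, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    boto3.client("s3").upload_file(staged, bucket, key)

    # Refresh: ask Athena to pick up the new partition so downstream
    # queries see today's data.
    boto3.client("athena").start_query_execution(
        QueryString="MSCK REPAIR TABLE accounts",
        QueryExecutionContext={"Database": database},
        ResultConfiguration={
            "OutputLocation": f"s3://{bucket}/athena-results/"
        },
    )


if __name__ == "__main__":
    run_daily_refresh()
```

In production this kind of job would typically run on a schedule (for example AWS Glue or a cron-triggered container) rather than as a standalone script, but the extract-transform-load-refresh shape is the same.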