Responsibilities:
Airflow Pipeline Development: Design, develop, test, and maintain data pipelines and workflows using Apache Airflow for data ingestion, transformation, and orchestration.
Snowflake Integration: Integrate Airflow with Snowflake, ensuring seamless data transfer, loading, and transformation processes (a minimal sketch appears at the end of this posting).
Monitoring and Optimization: Implement monitoring, alerting, and logging systems to ensure the health and performance of Airflow workflows. Optimize pipeline performance and efficiency.
Data Transformation: Develop and maintain Python scripts and transformations as required to process and prepare data for analysis and reporting.
Code Review: Review and provide guidance on code written by team members to maintain coding standards and best practices.
Documentation: Create and maintain comprehensive documentation for data pipelines, workflows, and integration processes.
Collaboration: Collaborate with data engineers, data scientists, and other stakeholders to understand data requirements and ensure data availability and quality.
Security and Compliance: Ensure data security and compliance with relevant regulations, such as GDPR and HIPAA, by implementing appropriate access controls and encryption.

Qualifications:
Bachelor's degree in computer science, data engineering, or a related field (or equivalent work experience).
Proven experience in Airflow development, including designing and building complex workflows.
Strong experience with DBT.
Strong proficiency in Snowflake data warehousing, including data loading, transformation, and querying.
Proficiency in Python programming and SQL.
Experience with data modelling and ETL processes.
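The sketch below is illustrative only and not part of the role requirements: it shows the kind of Airflow-to-Snowflake workflow described under the responsibilities above. It assumes Airflow 2.4 or later with the apache-airflow-providers-snowflake package installed and a pre-configured Airflow connection named snowflake_default; the DAG id, table, and stage names are hypothetical placeholders.

# Illustrative sketch: a daily DAG that loads raw files into Snowflake and
# then transforms them into an analytics table. Connection, table, and stage
# names below are placeholders, not values from this posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="load_orders_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Ingest raw files from an external stage into a landing table.
    load_raw = SnowflakeOperator(
        task_id="load_raw_orders",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw.orders FROM @raw_stage/orders/ FILE_FORMAT = (TYPE = CSV);",
    )

    # Transform the landed data into an analytics-ready table.
    transform = SnowflakeOperator(
        task_id="transform_orders",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE analytics.daily_orders AS
            SELECT order_date, COUNT(*) AS order_count
            FROM raw.orders
            GROUP BY order_date;
        """,
    )

    load_raw >> transform

In newer releases of the Snowflake provider, the same tasks are typically written with SQLExecuteQueryOperator instead of the deprecated SnowflakeOperator; the overall DAG structure is unchanged.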