Project Description:
You will be a critical team member, developing the means to collect and ingest data, building data models and data architectures, creating automated data pipelines, and taking the lead in making these production-ready. You will assist with integration into existing applications, and you will work with leading organizations and teams in the region. We are interested in hearing from individuals with a background in data, both SQL and non-SQL, and a very strong Python scripting background.
Responsibilities:
You will work alongside a strong, global team of individuals with diverse backgrounds and skills to:
- Analyse data sources and acquire data
- Create data pipelines and integrate them with final data destinations
- Create appropriate data models, architectures, and pipelines
- Move the models and pipelines into production
You will assist the practice by:
- Developing templates and accelerators across a variety of libraries and platforms
- Participating in data workshops and client work as necessary
- Collaborating with business and technology partners to grow and develop the data engineering practice
Mandatory Skills:
- Strong understanding of DWH concepts and implementation methodology using Data Vault, Snowflake Schema, and Star Schema design patterns.
- Experience in the design, implementation, and development of complete data warehousing solutions using Azure, DBT (Data Build Tool), Snowflake, Python, etc.
- Must demonstrate very good DBT (Data Build Tool) experience.
- Strong experience in ETL/ELT framework design and development, adhering to best practices and standards.
- Good exposure to solution design and migration approaches for cloud-based data platforms using Microsoft Azure.
- Excellent documentation skills in producing and maintaining technical design artifacts for different BI projects.
- Leadership ability to lead dev/test teams and provide technical expertise.
- Excellent communication and stakeholder management skills.
- Hands-on experience carrying out POCs, conducting technical impact assessments, and troubleshooting live issues.
- Performance optimization: identifying and implementing optimizations to enhance the performance of BI/DW systems, ensuring timely and accurate data delivery.
- Security: implementing robust security measures to protect sensitive data and ensure compliance with regulations.
- Collaboration: working with stakeholders, including business analysts, data scientists, and IT teams, to understand requirements and deliver effective solutions.
- Scalability: planning for scalability and future growth, considering data volume and user requirements.
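As an illustration of the ETL/ELT pipeline work this role centres on, here is a minimal, hypothetical Python sketch; all table names, field names, and sample values are invented for illustration and do not describe any specific client system:

```python
# Minimal illustrative ETL sketch: extract records from a source,
# transform them (type casting and normalisation), and load them
# into a target store. An in-memory list stands in for a warehouse
# table here; a real pipeline would read from and write to actual
# data platforms.

def extract():
    # Stand-in for reading rows from a source system or API.
    return [
        {"order_id": 1, "amount": "19.99", "country": "de"},
        {"order_id": 2, "amount": "5.50", "country": "fr"},
    ]

def transform(rows):
    # Cast amounts to floats and normalise country codes to uppercase.
    return [
        {
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "country": r["country"].upper(),
        }
        for r in rows
    ]

def load(rows, target):
    # Append transformed rows to the target "table"; return row count.
    target.extend(rows)
    return len(rows)

warehouse_orders = []
loaded = load(transform(extract()), warehouse_orders)
```

In production, each of these steps would typically be a node in an orchestrated, automated pipeline rather than plain function calls.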
Nice-to-Have Skills:
- Good understanding of infrastructure deployment as IaC (Infrastructure as Code)
- Experience with data modelling tools
- Azure, Snowflake, and DBT certifications
Languages:
English: C1 (Advanced)