Key Responsibilities
• Design, develop, and maintain data pipelines on GCP
• Build and optimize ETL/ELT workflows using tools such as Dataflow, Dataproc, Cloud Composer (Airflow), or dbt
• Develop and manage data models in BigQuery
• Ensure data quality, reliability, and performance across pipelines
• Implement CI/CD for data workflows using Git and automation tools
• Collaborate with stakeholders to translate business requirements into data solutions
• Monitor, troubleshoot, and optimize data jobs and infrastructure
• Apply best practices for security, cost optimization, and scalability
Required Skills & Qualifications
• GCP certification (e.g., Professional Data Engineer)
• Strong experience with Google Cloud Platform (GCP)
• Hands-on expertise with BigQuery
• Proficiency in SQL and Python
• Experience with orchestration tools such as Airflow or Cloud Composer
• Knowledge of data modeling and analytics engineering best practices
• Experience with dbt (preferred)
• Familiarity with Docker and CI/CD pipelines
• Solid understanding of cloud IAM, networking, and security fundamentals