About the company
Our client is a global leader in energy technology. Present in 90 countries, the company operates across the whole energy landscape: from conventional to renewable power, from grid technology to storage, to electrifying complex industrial processes. Its mission is to support companies and countries with what they need to reduce greenhouse gas emissions and make energy reliable, affordable, and more sustainable.
Job Summary
Responsibilities:
📍Designing, building, and maintaining scalable data pipelines using technologies such as Dataform, Google BigQuery, Docker, Pub/Sub, and Airflow;
📍Belonging to a cross-functional team in which you will work on requirements, testing, development, and deployment, and maintain a dialogue with stakeholders around the solutions;
📍Collaborating with other developers on the team by coaching, performing code reviews, and being involved in design decisions;
📍Working with the technological stack: Google Cloud Platform (or equivalent experience with another cloud provider), Dataform, Google BigQuery, Docker, Pub/Sub, Airflow, Terraform, Python, SQL.
Requirements
📍A minimum of 5 years' experience in software development, including experience with a cloud or containerized stack on virtual infrastructure, preferably GCP;
📍A deep understanding of data modeling concepts, ETL processes, and how to build good data infrastructure with a modern setup;
📍Experience with modern DevOps practices, including CI/CD pipelines, version control using GitLab, and infrastructure as code using Terraform;
📍Experience with modern data architectures such as Event-driven, Data Lake, Data Warehouse, and Lakehouse;
📍Experience with modern data modeling techniques such as Dimensional Modelling and Data Vault;
📍Advanced knowledge of Python and SQL.