About the company
We work with businesses globally to deliver tailored, end-to-end Artificial Intelligence, Consulting, Data, Digital, Cloud & DevOps, and Software Engineering solutions that drive value and growth. Our business domain expertise covers hi-tech, financial services, and insurance, while we explore the art of the possible in our groundbreaking Financial Labs (FinLabs).
Job Summary
Responsibilities:
Data Pipeline Development:
- Build and maintain scalable ETL/ELT pipelines using Databricks (see the sketch after this list).
- Leverage PySpark/Spark and SQL to transform and process large datasets.
- Integrate data from multiple sources, including Azure Blob Storage, ADLS, and other relational and non-relational systems.

Collaboration & Analysis:
- Work closely with multiple teams to prepare data for dashboards and BI tools.
- Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.
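For illustration, here is a minimal PySpark sketch of the kind of ETL pipeline described above. The ADLS source path, target table, and column names are hypothetical, not taken from the posting:

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical source and target for illustration only.
SOURCE_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
TARGET_TABLE = "analytics.orders_clean"

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw JSON files landed in ADLS.
raw = spark.read.format("json").load(SOURCE_PATH)

# Transform: deduplicate, fix types, and drop bad rows before BI use.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("order_id").isNotNull())
)

# Load: persist as a Delta table for downstream dashboards.
clean.write.format("delta").mode("overwrite").saveAsTable(TARGET_TABLE)
```

On Databricks, Delta is the default table format, so a batch job like this can be scheduled as-is; the same transform logic also carries over to streaming or Delta Live Tables pipelines.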
Requirements:
- Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables pipelines, Databricks Runtime, etc.); a minimal Delta Live Tables sketch follows this list.
- Proficiency in Azure cloud services.
- Solid understanding of Spark and PySpark for big data processing.
- Experience with relational databases.
- Knowledge of Databricks Asset Bundles and GitLab.
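A minimal sketch of a Delta Live Tables pipeline of the kind referenced above. This is illustrative only: the table names, source path, and columns are hypothetical, and the `dlt` and `spark` objects are supplied by the Databricks pipeline runtime, so this code runs inside a DLT pipeline rather than standalone:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from ADLS (hypothetical path).")
def raw_orders():
    return spark.read.format("json").load(
        "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
    )

@dlt.table(comment="Cleansed orders for BI consumption.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def clean_orders():
    # Declarative dependency: DLT resolves raw_orders and orders the DAG.
    return (
        dlt.read("raw_orders")
           .withColumn("order_ts", F.to_timestamp("order_ts"))
    )
```

A pipeline like this is typically versioned in GitLab and deployed across dev/prod workspaces with a Databricks Asset Bundle, which ties the last two requirements together.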