About the company
The World's Leading Cryptocurrency Platform
Job Summary
Key Responsibilities:
📍Design, implement, and manage scalable ClickHouse infrastructure for large datasets, ensuring optimal performance, availability, and maintainability.
📍Develop robust, scalable ETL pipelines using Python to handle data ingestion, transformation, and storage.
📍Collaborate with infrastructure teams to ensure proper AWS setup, including EC2, S3, RDS, and Lambda.
📍Build and maintain data orchestration workflows using Airflow, and integrate Kafka for real-time data streaming and processing.
📍Work closely with cross-functional teams, including trading, risk management, and middle office, to ensure the data infrastructure supports all business needs.
📍Optimize the performance and scalability of data pipelines, ensuring data integrity and accuracy.
📍Monitor and troubleshoot data infrastructure, implementing best practices for data governance and security.
📍Mentor junior engineers and provide technical leadership within the team.
Qualifications:
📍8+ years of experience in data engineering, with a focus on building and managing large-scale data systems.
📍Strong hands-on experience with ClickHouse, both in infrastructure setup and operational use.
📍Proficiency in Python, especially for building ETL pipelines and data workflows.
📍Deep understanding of AWS infrastructure, including networking, storage, compute, and security.
📍Experience with data orchestration and workflow automation using Airflow.
📍Familiarity with Kafka for building real-time data pipelines.
📍Exposure to trading, crypto, middle office, or trade capture/trade matching systems is a significant plus.
📍Experience or knowledge in machine learning and AI is a bonus.
📍Strong problem-solving skills, attention to detail, and the ability to work under pressure in a fast-paced environment.