About the company
Founded in Belgium in 2017, Keyrock is a cryptocurrency market maker building scalable, self-adaptive algorithmic technologies to support efficient digital asset markets. Through a combination of in-house algorithmic trading tools, high-frequency trading infrastructure and industry expertise, Keyrock provides liquidity services to tokens, exchanges and brokerages across the cryptocurrency ecosystem. Keyrock operates with the vision of democratizing cryptocurrency liquidity through a strict dedication to transparency, operational integrity and regulatory compliance.
Job description
📍 Design, implement, and maintain scalable and efficient ETL pipelines using Python and SQL to process and store large volumes of financial data.
📍 Develop and optimize data models and schemas in PostgreSQL and ClickHouse to support complex queries and analytics.
📍 Build and maintain real-time data streaming and processing systems using AWS Kinesis and Lambda.
📍 Automate infrastructure management using Terraform to deploy and manage AWS resources, ensuring a secure and scalable environment.
📍 Collaborate with trading and analytics teams to understand data requirements and ensure the availability and quality of data for decision-making.
📍 Implement data monitoring and alerting mechanisms to proactively identify and resolve issues in data pipelines.
📍 Write shell scripts to automate routine data processing tasks and system maintenance.
📍 Ensure compliance with data governance and security best practices, especially around handling sensitive financial data.
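To illustrate the kind of validation and normalization step such an ETL pipeline typically includes before loading records into a warehouse, here is a minimal Python sketch. The record schema, field names, and `transform_trades` function are hypothetical examples for illustration only, not Keyrock's actual data format.

```python
from datetime import datetime, timezone

def transform_trades(raw_records):
    """Validate and normalize raw trade records before loading.

    Skips malformed or non-positive entries and derives a notional
    value, mirroring a common pre-load cleaning step in an ETL job.
    """
    cleaned = []
    for rec in raw_records:
        try:
            price = float(rec["price"])
            qty = float(rec["qty"])
        except (KeyError, TypeError, ValueError):
            continue  # drop records missing or mistyping required fields
        if price <= 0 or qty <= 0:
            continue  # drop economically invalid trades
        cleaned.append({
            "symbol": rec.get("symbol", "").upper(),
            "price": price,
            "qty": qty,
            "notional": price * qty,
            "ts": rec.get("ts") or datetime.now(timezone.utc).isoformat(),
        })
    return cleaned

raw = [
    {"symbol": "btcusd", "price": "42000.5", "qty": "0.1", "ts": "2024-01-01T00:00:00Z"},
    {"symbol": "ethusd", "price": "not-a-number", "qty": "1"},  # rejected
]
rows = transform_trades(raw)
print(rows)  # only the valid BTCUSD record survives, with notional 4200.05
```

In a production pipeline this logic would typically sit inside a Lambda consumer reading from a Kinesis stream, with the cleaned rows batch-inserted into PostgreSQL or ClickHouse.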
Background and experience
📍 3+ years of experience as a Data Engineer or in a similar role within financial services, ideally with exposure to digital assets or cryptocurrency markets.
📍 Proficiency in Python for data engineering tasks and automation.
📍 Strong SQL skills for complex query design, optimization, and data modeling.
📍 Experience with AWS services, specifically Lambda, Kinesis, RDS, S3, and IAM.
📍 Hands-on experience with Terraform for infrastructure as code and managing AWS environments.
📍 Familiarity with ClickHouse for handling large datasets and real-time analytics is highly desirable.
📍 Experience with PostgreSQL for data warehousing and analytics.