About the company
Founded in Belgium in 2017, Keyrock is a cryptocurrency market maker building scalable, self-adaptive algorithmic technologies to support efficient digital asset markets. Through a combination of in-house algorithmic trading tools, high-frequency trading infrastructure and industry expertise, Keyrock provides unparalleled liquidity services to tokens, exchanges and brokerages within the cryptocurrency ecosystem. Keyrock operates with the vision of democratizing cryptocurrency liquidity through a strict dedication to transparency, operational integrity and regulatory compliance.
Job description
- Design, implement, and maintain scalable and efficient ETL pipelines using Python and SQL to process and store large volumes of financial data.
- Develop and optimize data models and schemas in PostgreSQL and ClickHouse to support complex queries and analytics.
- Build and maintain real-time data streaming and processing systems using AWS Kinesis and Lambda.
- Automate infrastructure management using Terraform to deploy and manage AWS resources, ensuring a secure and scalable environment.
- Collaborate with trading and analytics teams to understand data requirements and ensure the availability and quality of data for decision-making.
- Implement data monitoring and alerting mechanisms to proactively identify and resolve issues in data pipelines.
- Write shell scripts to automate routine data processing tasks and system maintenance.
- Ensure compliance with data governance and security best practices, especially around handling sensitive financial data.
Background and experience
- 3+ years of experience as a Data Engineer or in a similar role within financial services, ideally with exposure to digital assets or cryptocurrency markets.
- Proficiency in Python for data engineering tasks and automation.
- Strong SQL skills for complex query design, optimization, and data modeling.
- Experience with AWS services, specifically Lambda, Kinesis, RDS, S3, and IAM.
- Hands-on experience with Terraform for infrastructure as code and managing AWS environments.
- Familiarity with ClickHouse for handling large datasets and real-time analytics is highly desirable.
- Experience with PostgreSQL for data warehousing and analytics.