About the company
Founded in Belgium in 2017, Keyrock is a cryptocurrency market maker building scalable, self-adaptive algorithmic technologies to support efficient digital asset markets. Through a combination of in-house algorithmic trading tools, high-frequency trading infrastructure and industry expertise, Keyrock provides unparalleled liquidity services to tokens, exchanges and brokerages within the cryptocurrency ecosystem. Keyrock operates with the vision of democratizing cryptocurrency liquidity through a strict dedication to transparency, operational integrity and regulatory compliance.
Job Summary
Key Responsibilities
- Designing Data Architecture: Plan and implement a robust, scalable data architecture that integrates data from various sources and supports diverse analytical needs, while optimizing costs and meeting business requirements.
- Implementing Data Engineering Pipelines: Design and develop data pipelines for extraction, transformation, and loading (ETL), ensuring data quality and consistency (see the sketch after this list).
- Enabling Data Intelligence and Analytics: Build and maintain data warehouses, data marts, and data lakes to support business intelligence and data analytics initiatives.
- Supporting MLOps Practices: Collaborate with data scientists and machine learning engineers to design and implement data infrastructure and processes that support machine learning model development, deployment, and maintenance.
- Ensuring Data Security and Compliance: Implement security measures, policies, and procedures to safeguard data privacy and comply with relevant regulations.
- Data Governance and Management: Establish and enforce data governance policies and standards to ensure data quality, integrity, and accessibility.
- Collaborating with Cross-Functional Teams: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
- Staying Abreast of Technological Advancements: Keep up to date with emerging technologies and trends in data architecture, data engineering, and MLOps to identify opportunities for improvement and innovation.
- Optimizing Data Performance: Monitor and analyze data processing performance, identify bottlenecks, and implement optimizations to enhance efficiency and scalability.
- Documentation and Knowledge Sharing: Create and maintain comprehensive documentation of data architecture, models, and processing workflows.
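For a flavour of the pipeline work involved, here is a minimal sketch of a batch ETL job in PySpark: extract raw events, apply a basic quality gate, and load curated Parquet. The bucket paths, column names, and quality rules are hypothetical placeholders, not Keyrock's actual stack.

```python
# Minimal batch ETL sketch in PySpark. All paths, columns, and quality
# rules below are hypothetical placeholders, not an actual pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades-etl-sketch").getOrCreate()

# Extract: raw trade events from a (hypothetical) S3 landing zone.
raw = spark.read.json("s3a://example-bucket/landing/trades/")

# Transform: normalise types and drop malformed rows (a basic quality gate).
clean = (
    raw.withColumn("ts", F.to_timestamp("ts"))
       .withColumn("price", F.col("price").cast("double"))
       .withColumn("qty", F.col("qty").cast("double"))
       .filter(F.col("price").isNotNull() & (F.col("qty") > 0))
)

# Load: partitioned Parquet in a curated zone for downstream analytics.
clean.write.mode("append").partitionBy("symbol").parquet(
    "s3a://example-bucket/curated/trades/"
)
```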
Technical Requirements
- Extensive experience in data architecture design and implementation.
- Strong knowledge of data engineering principles and practices.
- Expertise in data warehousing, data modelling, and data integration.
- Experience with MLOps and machine learning pipelines.
- Proficiency in SQL and data manipulation languages.
- Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS; a short example follows this list.
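As a purely illustrative example of how two of these technologies fit together, the snippet below pulls a ClickHouse aggregate straight into an Apache Arrow table via the clickhouse-connect client library. The host, credentials, and trades table are all hypothetical.

```python
# Illustrative only: query ClickHouse and receive the result as an Arrow
# table, keeping the data columnar end to end. Host, credentials, and the
# `trades` table are hypothetical; assumes `pip install clickhouse-connect`.
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost", username="default")

# query_arrow returns a pyarrow.Table rather than Python row objects.
daily_notional = client.query_arrow(
    """
    SELECT toDate(ts) AS day, symbol, sum(price * qty) AS notional
    FROM trades
    GROUP BY day, symbol
    ORDER BY day, symbol
    """
)
print(daily_notional.schema)
```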