About the company
P2P helps investors compound their cryptocurrency investments through non-custodial staking. We provide secure, high-uptime staking with advanced monitoring and support.
Job Summary
We are looking for a Data Engineer to design and scale the streaming pipelines that turn multi-chain blockchain data into fast, reliable analytics powering the Lambda app.
What You’ll Do
📍Design, maintain, and scale streaming ETL pipelines for blockchain data.
📍Build and optimize ClickHouse data models and materialized views for high-performance analytics (see the sketch after this list).
📍Develop and maintain data exporters using orchestration tools.
📍Implement data transformations and decoding logic.
📍Establish and improve testing, monitoring, automation, and migration processes for pipelines.
📍Ensure timely delivery of new data features in alignment with product goals.
📍Combine multiple data sources, such as indexers and third-party Kafka topics, and aggregate them into tables for our API.
📍Create automation tools that keep data analyst inputs, such as dictionaries, up to date.
📍Collaborate within the team to deliver accurate, reliable, and scalable data services that power the Lambda app.
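To illustrate the Kafka-to-ClickHouse aggregation pattern described above, here is a minimal sketch using the clickhouse-connect client. The topic, table, and column names (raw_events, kafka_raw_events, events_hourly) are illustrative assumptions, not our actual schema.

```python
# A minimal sketch, assuming clickhouse-connect and illustrative names;
# nothing here is P2P's production schema.
import clickhouse_connect

client = clickhouse_connect.get_client(host="localhost")

# Kafka engine table: ClickHouse consumes the topic directly.
client.command("""
CREATE TABLE IF NOT EXISTS kafka_raw_events (
    chain String,
    tx_hash String,
    amount Float64,
    ts DateTime
) ENGINE = Kafka
SETTINGS kafka_broker_list = 'kafka:9092',
         kafka_topic_list = 'raw_events',
         kafka_group_name = 'ch_consumer',
         kafka_format = 'JSONEachRow'
""")

# Destination table for aggregated rows served to the API.
client.command("""
CREATE TABLE IF NOT EXISTS events_hourly (
    chain String,
    hour DateTime,
    tx_count UInt64,
    total_amount Float64
) ENGINE = SummingMergeTree
ORDER BY (chain, hour)
""")

# Materialized view: turns each consumed batch into hourly aggregates.
client.command("""
CREATE MATERIALIZED VIEW IF NOT EXISTS events_hourly_mv
TO events_hourly AS
SELECT
    chain,
    toStartOfHour(ts) AS hour,
    count() AS tx_count,
    sum(amount) AS total_amount
FROM kafka_raw_events
GROUP BY chain, hour
""")
```

With this in place, ClickHouse consumes the topic continuously and the materialized view keeps events_hourly current without a separate batch job.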
Tech Stack
📍Streaming & ETL: Managed Flink-based pipelines (real-time event & transaction processing), Apache Kafka
📍Data Warehouse: ClickHouse (Cloud)
📍Workflow orchestration: Airflow (a DAG sketch follows this list)
📍Programming: Python (data processing, services, automation)
📍Domain: Multi-chain crypto data (EVM & non-EVM ecosystems)
📍Experience: 4+ years in Data Engineering (ETL/ELT, data pipelines, streaming systems)
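As a taste of the orchestration side, below is a minimal Airflow 2.x DAG sketch for a scheduled exporter task. The DAG id, schedule, and exporter body are hypothetical placeholders rather than our production code.

```python
# A minimal Airflow DAG sketch, assuming Airflow 2.x; all names are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def export_transactions(**context):
    """Export decoded transactions for the run's logical date.

    Placeholder body: a real exporter would read from an indexer or Kafka
    topic and write batches into ClickHouse.
    """
    logical_date = context["logical_date"]
    print(f"exporting transactions for {logical_date}")


with DAG(
    dag_id="export_decoded_transactions",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PythonOperator(
        task_id="export_transactions",
        python_callable=export_transactions,
    )
```

Retries with a short delay are a common default for exporters like this, since transient source outages are far more frequent than genuine data errors.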
If this role isn't the perfect fit, there are plenty of exciting opportunities in blockchain technology, cryptocurrency startups, and remote crypto jobs to explore. Check them out on our Jobs Board.