About the company
Nansen is a blockchain analytics platform that enriches on-chain data with millions of wallet labels. Crypto investors use Nansen to discover opportunities, perform due diligence, and defend their portfolios with our real-time dashboards and alerts.
Job Summary
What You'll Do:
- Collaborate closely with crypto researchers, other engineers, and product managers to shape how data is modeled, surfaced, and productized.
- Tackle large-scale data challenges, processing terabytes of streaming and batch data daily.
- Design, build, and scale performant data pipelines and infrastructure, primarily using ClickHouse and dbt.
- Ensure high standards for data quality, reliability, and observability across all systems.
- Bring fresh thinking to the table, staying current with best practices and evolving your toolkit over time.
- Use AI tools and agents such as Cursor, MCPs, and LLMs to accelerate development, automate repetitive work, and boost quality.
What We're Looking For:
- A proven track record of building and scaling high-performance data systems in production.
- Expertise in SQL and Python, with hands-on experience using dbt, ClickHouse, and BigQuery.
- Strong grasp of streaming data architectures and experience handling large-scale data volumes.
- Comfortable working full-stack with data, from ingestion and transformation to storage, modeling, and serving.
- Experience using AI-powered tools in day-to-day development and a natural curiosity to push boundaries with them.
The crypto industry is evolving rapidly, offering new opportunities in blockchain, web3, and remote crypto roles. Don't miss your chance to be part of it.