About the company
Decentraland is the world's first fully decentralized, Ethereum blockchain-based virtual world, built, governed, and owned by its users. It's a truly unique ecosystem with its own decentralized autonomous organization, currency, marketplace, and system of property, and it's growing fast! New creations are added daily by creators who use proprietary developer tools to make games, puzzles, scenes, artworks: whatever their imaginations allow. Via their personal avatars, users attend live music events, conferences, exhibitions, dance parties, and other experiences every day of the year. What began as a proof of concept for assigning ownership of digital real estate to users of a blockchain is now an immersive, ever-expanding, and richly detailed metaverse where anything is possible. Check it out at: http://play.decentraland.org/
Job Summary
Responsibilities
- Design, build, and maintain complex ETL/ELT pipelines and data workflows using Snowflake, dbt, Airflow, and Meltano within our AWS-based infrastructure.
- Ensure high availability, scalability, and security of data services running on our own infrastructure, supporting batch and real-time data processing needs.
- Collaborate closely with analytics, product, and engineering teams to translate business requirements into effective data models, KPIs, and dashboards using Metabase and Segment.
- Develop automated solutions for ingesting, transforming, and modeling blockchain, gaming telemetry, and user interaction data to drive actionable insights.
- Participate in cross-functional initiatives to evolve our data platform architecture and enable data democratization across Decentraland.
- Explore innovative approaches to capturing and leveraging decentralized data aligned with the blockchain and gaming domains.
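As a rough illustration of the extract/transform/load pattern these responsibilities center on, here is a minimal Python sketch. All names (`Event`, `extract`, `transform`, `load`) are hypothetical stand-ins: a production pipeline here would use Airflow operators, dbt models, and Snowflake loaders rather than plain functions.

```python
# Minimal ETL sketch; purely illustrative, not Decentraland's actual pipeline.
from dataclasses import dataclass
from typing import Iterable


@dataclass
class Event:
    user_id: str
    event_type: str
    value: float


def extract(raw_rows: Iterable[dict]) -> list[Event]:
    """Parse raw telemetry rows into typed events, skipping malformed ones."""
    events = []
    for row in raw_rows:
        try:
            events.append(Event(row["user_id"], row["event_type"], float(row["value"])))
        except (KeyError, ValueError):
            continue  # drop rows that fail validation
    return events


def transform(events: list[Event]) -> dict[str, float]:
    """Aggregate total value per event type (stand-in for a dbt model)."""
    totals: dict[str, float] = {}
    for e in events:
        totals[e.event_type] = totals.get(e.event_type, 0.0) + e.value
    return totals


def load(totals: dict[str, float]) -> list[tuple[str, float]]:
    """Emit sorted rows for a warehouse table (stand-in for a Snowflake load)."""
    return sorted(totals.items())
```

In a real deployment each stage would typically be a separate Airflow task so failures can be retried independently.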
The Requirements
- 5+ years of hands-on experience as a data engineer or in a similar role focused on large-scale data infrastructure.
- Advanced proficiency with Python for building data pipelines and automation.
- Expertise in building and optimizing data warehouses and transformations with Snowflake and dbt.
- Strong experience orchestrating workflows with Apache Airflow and managing ETL/ELT data pipelines.
- Deep knowledge of AWS services related to data processing and storage.
- Familiarity with modern data stack components such as Meltano, Metabase, and Segment, including deployment and monitoring on our own infrastructure.
- Proven ability to collaborate effectively with cross-functional teams and communicate technical concepts clearly.
- Demonstrated experience extracting, processing, or integrating blockchain data is highly desirable.
- Passion for working in the blockchain/gaming industry and on decentralized architectures.