About the company
Figment is the world's leading provider of blockchain infrastructure. We provide the most comprehensive staking solution for more than 200 institutional clients, including exchanges, wallets, foundations, custodians, and large token holders, enabling them to earn rewards on their crypto assets. These clients rely on Figment's institutional staking service, including rewards optimization, rapid API development, rewards reporting, partner integrations, governance, and slashing protection. Figment is backed by industry experts, financial institutions, and our global team across twenty-three countries. All of this supports our mission: the adoption, growth, and long-term success of the Web3 ecosystem. We are a growth-stage technology company looking for people who are builders and doers, people who are comfortable plotting their course through ambiguity and uncertainty to drive impact, and who are excited to work in new ways and empower a generative company culture.
Job Summary
Responsibilities
- Implement and maintain reliable data pipelines and data storage solutions.
- Implement data modeling and integrate technologies according to project needs.
- Manage specific data pipelines and oversee the technical aspects of data operations.
- Ensure data processes are optimized and aligned with business requirements.
- Identify areas for process improvement and suggest tools and technologies to enhance efficiency.
- Continuously improve data infrastructure automation, ensuring reliable and efficient data processing.
- Develop and maintain data pipelines and ETL processes using technologies such as Dagster and dbt to ensure efficient data flow and processing.
- Automate data ingestion, transformation, and loading processes to support blockchain data analytics and reporting.
- Use Snowflake data warehousing solutions to manage and optimize data storage and retrieval.
- Collaborate with Engineering Leadership and Product teams to articulate data strategies and progress.
- Promote best practices in data engineering, cloud infrastructure, networking, and security.
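To give a flavor of the extract-transform-load work described above, here is a minimal sketch of a pipeline step in plain Python. The function names, the sample staking-reward records, and the aggregation logic are purely illustrative assumptions, not Figment's actual pipeline; in production this shape would typically be expressed as orchestrated assets (e.g. in Dagster) with the load step writing to a warehouse such as Snowflake.

```python
# Illustrative ETL sketch: extract sample records, aggregate rewards
# per validator, and "load" the result. All names and data are
# hypothetical, for explanation only.
from typing import Iterable


def extract() -> Iterable[dict]:
    # A real extract step would read from a chain indexer or API;
    # here we return hard-coded sample staking-reward records.
    return [
        {"validator": "v1", "epoch": 100, "reward": 1.25},
        {"validator": "v1", "epoch": 101, "reward": 1.5},
        {"validator": "v2", "epoch": 100, "reward": 0.0},
    ]


def transform(records: Iterable[dict]) -> list[dict]:
    # Drop zero-reward rows and total rewards per validator.
    totals: dict[str, float] = {}
    for rec in records:
        if rec["reward"] > 0:
            totals[rec["validator"]] = totals.get(rec["validator"], 0.0) + rec["reward"]
    return [{"validator": v, "total_reward": t} for v, t in sorted(totals.items())]


def load(rows: list[dict]) -> None:
    # A real load step would write to a warehouse table;
    # here we just print each row.
    for row in rows:
        print(row)


load(transform(extract()))
```

The value of an orchestrator like Dagster or a transformation tool like dbt is precisely that it replaces hand-wired chains like `load(transform(extract()))` with declared, observable dependencies.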
Qualifications
- Extensive experience with data engineering, including building and managing data pipelines and ETL processes.
- Proficiency in programming languages such as Golang and Python.
- Strong foundation in data networking, storage, and security best practices.
- Experience developing CI/CD pipelines for automated data infrastructure provisioning and application deployment.
- Familiarity with a data orchestration tool (Dagster, Airflow, Mage, etc.).
- Familiarity with a data transformation tool (dbt, Beam, Dataform, Talend, etc.).
- Experience with data warehousing solutions such as Snowflake or similar technologies.
- Experience managing infrastructure across multiple cloud providers (AWS, GCP), with a focus on performance and security.