About the company
Flipside Crypto enables on-demand analytics for blockchains, driving ecosystem growth and retention. Through a free, open data platform, it enables 60,000 analysts to learn, collaborate, and compete to solve analytical challenges via structured bounty programs. Flipside activates on projects with kinetic energy, including Flow, Solana, Algorand, THORChain, SushiSwap, and Osmosis. Founded in 2017, Flipside Crypto recently raised $50M in a Series A financing led by Republic Capital, with participation from True Ventures, Galaxy Digital, M13, Dapper Labs, Collab Currency, Tribe Capital, HashKey, and others. Flipside is remote-first (originally headquartered in Boston, MA) with a growing team of 75+ employees.
Job Summary
Primary Responsibilities:
Data Pipelines and Modeling:
- Design, develop, and maintain robust data pipelines to support the Finance and Product team's data needs.
- Integrate data from sources including internally-generated product data, human-collected operational data, and publicly available blockchain data into useful models and views.
- Utilize industry-standard data design patterns (Medallion Architecture) to simplify maintenance and enable adoption across the Internal Analytics team.
- Build and support business intelligence models (Star Schema) for downstream reporting and analysis.

Data Analysis and Reporting:
- Analyze financial, product, and operational data to provide actionable insights and build understanding.
- Develop and maintain dashboards and reports using SQL and Python-based data visualization tools.
- Communicate findings in written and verbal formats, both synchronously and asynchronously, as required.

Cross-Functional Collaboration and Data Integrity:
- Work closely with the Finance and Product teams to define and document their requirements, and deliver solutions that meet their needs.
- Ensure organization-wide visibility into business intelligence work with clear and consistent communication, employing basic project management skills to ensure timely delivery of commitments.
- Build a deep understanding of our business systems and data pipelines to investigate anomalous or interesting data, and ensure quality and intelligibility of downstream reporting.
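For candidates unfamiliar with the term, a star schema pairs a central fact table with surrounding dimension tables that downstream reports join against. A minimal sketch in Python using SQLite is below; all table and column names are illustrative assumptions, not actual Flipside models:

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to one dimension table.
# Names (dim_date, fact_revenue, etc.) are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    calendar_date TEXT,
    month         TEXT
);
CREATE TABLE fact_revenue (
    date_key   INTEGER REFERENCES dim_date(date_key),
    product    TEXT,
    amount_usd REAL
);
INSERT INTO dim_date VALUES (20240115, '2024-01-15', '2024-01');
INSERT INTO dim_date VALUES (20240216, '2024-02-16', '2024-02');
INSERT INTO fact_revenue VALUES (20240115, 'api',  100.0);
INSERT INTO fact_revenue VALUES (20240115, 'data',  50.0);
INSERT INTO fact_revenue VALUES (20240216, 'api',   75.0);
""")

# A typical downstream report: aggregate the fact table,
# grouping by an attribute of the joined dimension.
rows = conn.execute("""
    SELECT d.month, SUM(f.amount_usd) AS revenue
    FROM fact_revenue f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)  # [('2024-01', 150.0), ('2024-02', 75.0)]
```

The same join-and-aggregate pattern carries over directly to Snowflake SQL; only the surrounding tooling changes.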
Qualifications:
- Bachelor’s degree in Analytics, Data Science, Finance, Computer Science, or related technical field, or equivalent practical experience.
- 2+ years of experience in data engineering, data analysis, business intelligence, or a similar role.
- Proficiency in SQL (experience in any dialect is acceptable, though we use Snowflake SQL).
- Experience with ELT tools and data pipeline frameworks.
- Experience with data warehouse/lakehouse architectures for business intelligence.
- Strong analytical skills with the ability to integrate and interpret complex data sets.
- Excellent communication skills and the ability to work collaboratively with cross-functional teams.