About the company
We are committed to developing products and applications on the Aptos blockchain that redefine the web3 user experience. Our team of accomplished technical experts is dedicated to building better network tooling and seamless usability to bring the benefits of decentralization to the masses.
Job Summary
What you'll be doing:
- Define, build, and deliver data pipelining architecture, including ingesting data from different data sources and creating aggregate views.
- Work with stakeholders to assemble large, complex data sets for ready business consumption.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Ensure that the Aptos Data Warehouse has high-quality, high-trust data the company can use to drive decisions. Build a core data model that serves as the foundation for all use cases.
- Manage data scalability, partitioning, growth, and availability using cloud data warehousing technologies like BigQuery.
- Assist analytics and data science team members in building and optimizing our product into an innovative industry leader.
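The "ingest from sources, then build aggregate views" pattern above can be sketched minimally in Python, using the stdlib sqlite3 module as a stand-in for a warehouse like BigQuery; the table, view, and column names are illustrative assumptions, not Aptos schemas.

```python
import sqlite3

# In-memory database standing in for a cloud data warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (account TEXT, amount REAL)")

# Ingest: load rows from a hypothetical upstream data source.
rows = [("alice", 10.0), ("bob", 5.0), ("alice", 2.5)]
conn.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)

# Aggregate view: a per-account rollup ready for business consumption.
conn.execute("""
    CREATE VIEW account_totals AS
    SELECT account, COUNT(*) AS tx_count, SUM(amount) AS total_amount
    FROM raw_events
    GROUP BY account
""")

totals = {
    account: total
    for account, _, total in conn.execute(
        "SELECT account, tx_count, total_amount FROM account_totals ORDER BY account"
    )
}
print(totals)  # {'alice': 12.5, 'bob': 5.0}
```

In a production pipeline the ingest step would typically be an orchestrated task (e.g. in Airflow) and the view a materialized table, but the shape of the work is the same.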
What we're looking for:
- 3+ years of relevant experience
- A degree in a technical field such as Finance, Data Science, Statistics, Computer Science, or a similar field
- Strong data wrangling and SQL skills, with a track record of optimizing large pipelines to run efficiently
- Experience in at least one programming language (e.g., Python)
- Experience manipulating large amounts of structured and unstructured data with pipeline development tools like Airflow
- Ability to proactively prioritize work, deliver high-quality results, and create leverage for the broader team
- Hands-on experience building dashboards with a data visualization tool such as Tableau, Looker, or Data Studio