Data Engineering Lead
at BitMEX
about 2 months ago | 181 views

Hong Kong
$105,000 to $120,000 per year

About the company

BitMEX is the world's leading cryptocurrency derivatives trading platform. It has pioneered cryptocurrency trading through a relentless commitment to change, and continues to set benchmarks for innovation, liquidity, and security today. As the world's most advanced peer-to-peer crypto-products trading platform and API, BitMEX gives knowledge, confidence, and precision to hundreds of thousands of traders transacting billions of USD a day. Join us as we build a thriving cryptocurrency ecosystem through strategic investments in emerging cryptocurrency technology, and create the future of digital financial services.

Job Summary


Responsibilities

📍 Design and maintain enhancements to our data warehouse, data lake, and data pipelines to increase their reliability and consistency
📍 Improve queryability of large historical datasets through industry-standard tools and careful data representation and aggregation, for both technical and business units
📍 Ensure data governance and security/retention policies can be implemented and enforced
📍 Ensure that operational system integrations driven from the data stack are running, monitored, and available
📍 Continually review the data platform to ensure that it is fit for purpose and meets the needs of the business
📍 Support and maintain downstream integrations from our data lake, for example business intelligence and visualization tools and third-party systems


Requirements

📍 Experience in the data engineering field, with demonstrated design and technical implementation of data warehouses
📍 Experience with OLAP databases and how they differ from OLTP databases, and with data structuring/modeling, including an understanding of key business data points for trade-offs between storage/performance and usability
📍 Experience building, deploying, and troubleshooting reliable, idempotent, and consistent data pipelines that work with disparate in-house and external data sources, e.g. using Airflow DAGs
📍 Experience with AWS Redshift, Glue Data Catalog, S3, PostgreSQL, Parquet, Iceberg, Trino, and how they are managed using Terraform and Kubernetes
📍 Experience with data loading, extraction, and manipulation, and with preparing data for ingestion and integration with visualization platforms such as Tableau
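The requirements above stress idempotent, consistent pipeline steps. A minimal sketch of what idempotency means in practice: a load keyed on a primary key via an upsert, so retries and replays leave the target table unchanged. This uses stdlib sqlite3 purely as a stand-in for a warehouse table; the table and function names are hypothetical, not from the posting.

```python
import sqlite3

def load_trades(conn, rows):
    """Idempotent load: re-running with the same batch is a no-op.

    The upsert is keyed on trade_id, so a retried or replayed batch
    overwrites rows with identical values instead of duplicating them.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades ("
        "trade_id TEXT PRIMARY KEY, symbol TEXT, qty REAL)"
    )
    conn.executemany(
        "INSERT INTO trades (trade_id, symbol, qty) VALUES (?, ?, ?) "
        "ON CONFLICT(trade_id) DO UPDATE SET "
        "symbol = excluded.symbol, qty = excluded.qty",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
batch = [("t1", "XBTUSD", 2.0), ("t2", "ETHUSD", 5.0)]
load_trades(conn, batch)
load_trades(conn, batch)  # replaying the same batch adds nothing
print(conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0])  # → 2
```

In an Airflow DAG, each task would follow this pattern so that a failed run can simply be retried without manual cleanup.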
