
Data Engineer, Data Platform at Kraken

Full-time
Canada

About the company

Kraken, the trusted and secure digital asset exchange, is on a mission to accelerate the adoption of cryptocurrency so that you and the rest of the world can achieve financial freedom and inclusion. Our 2,350+ Krakenites are a world-class team ranging from the crypto-curious to industry experts, united by our desire to discover and unlock the potential of crypto and blockchain technology. As a fully remote company, we already have Krakenites in 70+ countries (speaking 50+ languages). We're one of the most diverse organizations on the planet and this remains key to our values. We continue to lead the industry with new product advancements like Kraken NFT, on- and off-chain staking and instant bitcoin transfers via the Lightning Network.

Job Summary

The opportunity

šŸ“Build scalable and reliable data pipelines that collect, transform, load and curate data from internal systems šŸ“Augment data platform with data pipelines from external systems. šŸ“Ensure high data quality for pipelines you build and make them auditable šŸ“Drive data systems to be as near real-time as possible šŸ“Support design and deployment of distributed data store that will be central source of truth across the organization šŸ“Build data connections to company's internal IT systems šŸ“Develop, customize, configure self service tools that help our data consumers to extract and analyze data from our massive internal data store šŸ“Evaluate new technologies and build prototypes for continuous improvements in data engineering.

Skills you should HODL

šŸ“5+ years of work experience in relevant field (Data Engineer, DWH Engineer, Software Engineer, etc) šŸ“Experience with data-lake and data-warehousing technologies and relevant data modeling best practices (Presto, Athena, Glue, etc) šŸ“Proficiency in at least one of the main programming languages used: Python and Scala. Additional programming languages expertise is a big plus! šŸ“Experience building data pipelines/ETL in Airflow, and familiarity with software design principles. šŸ“Excellent SQL and data manipulation skills using common frameworks like Spark/PySpark, or similar. šŸ“Expertise in Apache Spark, or similar Big Data technologies, with a proven record of processing high volumes and velocity of datasets. šŸ“Experience with business requirements gathering for data sourcing. šŸ“Bonus - Kafka and other streaming technologies like Apache Flink.

