Blockchain.com
Data Engineer

About the company

Blockchain.com (formerly Blockchain.info) is a cryptocurrency financial services company. It began in 2011 as the first Bitcoin blockchain explorer and later created a cryptocurrency wallet that accounted for 28% of bitcoin transactions between 2012 and 2020. It also operates a cryptocurrency exchange, runs an institutional markets lending business, and provides data, charts, and analytics.

Job Summary

We are looking for someone with experience in designing, building, and maintaining scalable and robust data infrastructure that makes data easily accessible to the Data Science team and the broader audience through different tools. As a data engineer, you will be involved in all aspects of the data infrastructure, from understanding current bottlenecks and requirements to ensuring the quality and availability of data. You will collaborate closely with data scientists and platform and front-end engineers, defining requirements and designing new data processes for both streaming and batch processing, as well as maintaining and improving existing ones. We are looking for someone passionate about high-quality data who understands its impact on solving real-life problems. Being proactive in identifying issues, digging deep into their source, and developing solutions is at the heart of this role.

Junior:

What you will do

- Maintain the current data lake infrastructure and evolve it to meet new requirements
- Maintain and extend our core data infrastructure and existing data pipelines and ETLs
- Provide best practices and frameworks for data testing and validation, and ensure the reliability and accuracy of data
- Design, develop, and implement data visualization and analytics tools and data products

What you will need

- Bachelor’s degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field
- Previous experience working on a data engineering project or in a data engineering role
- Fluency in Python
- Previous experience with ETL pipelines and data processing
- Good knowledge of SQL and NoSQL databases
- Good knowledge of coding principles, including object-oriented programming
- Experience with Git

Nice to have

- Experience with Airflow, Google Cloud Composer, or Kubernetes Engine
- Experience working with Google Cloud Platform
- Experience with other programming languages, such as Java, Kotlin, or Scala
- Experience with Spark or other big data frameworks
- Experience with distributed and real-time technologies (Kafka, etc.)
- 1-2 years of commercial experience in a related role

Middle:

What you will do

- Maintain the current data infrastructure and evolve it to meet new requirements
- Maintain and extend our core data infrastructure and existing data pipelines and ETLs
- Provide best practices and frameworks for data testing and validation, and ensure the reliability and accuracy of data
- Design, develop, and implement data visualization and analytics tools and data products

What you will need

- Bachelor’s degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field
- Previous experience working in a data engineering role
- Fluency in Python
- Previous experience with ETL pipelines
- Experience working with Google Cloud Platform
- In-depth knowledge of SQL and NoSQL databases
- In-depth knowledge of coding principles, including object-oriented programming
- Experience with Git

Nice to have

- Experience with code optimisation and parallel processing
- Experience with Airflow, Google Cloud Composer, or Kubernetes Engine
- Experience working with Google Cloud Platform
- Experience with other programming languages, such as Java, Kotlin, or Scala
- Experience with Spark or other big data frameworks
- Experience with distributed and real-time technologies (Kafka, etc.)
- 2-5 years of commercial experience in a related role

Senior:

What you will do

- Maintain the current data lake infrastructure and evolve it to meet new requirements
- Maintain and extend our core data infrastructure and existing data pipelines and ETLs
- Provide best practices and frameworks for data testing and validation, and ensure the reliability and accuracy of data
- Design, develop, and implement data visualization and analytics tools and data products

What you will need

- Bachelor’s degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field
- Previous experience working in a data engineering role
- Fluency in Python
- Experience in both batch processing and streaming data pipelines
- Experience working with Google Cloud Platform
- In-depth knowledge of SQL and NoSQL databases
- In-depth knowledge of coding principles, including object-oriented programming
- Experience with Git

Nice to have

- Experience with code optimisation and parallel processing
- Experience with Airflow, Google Cloud Composer, or Kubernetes Engine
- Experience with other programming languages, such as Java, Kotlin, or Scala
- Experience with Spark or other big data frameworks
- Experience with distributed and real-time technologies (Kafka, etc.)
- 5-8 years of commercial experience in a related role

Staff:

What you will do

- Maintain the current data infrastructure and evolve it to meet new requirements
- Maintain and extend our core data infrastructure and existing data pipelines and ETLs
- Provide best practices and frameworks for data testing and validation, and ensure the reliability and accuracy of data
- Design, develop, and implement data visualization and analytics tools and data products
- Play a critical role in helping to set the direction and goals for the team
- Build and ship high-quality code; provide thorough code reviews, testing, monitoring, and proactive changes to improve stability
- Implement the hardest parts of the system or feature

What you will need

- Bachelor’s degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field
- Previous experience working in a data engineering role
- Fluency in Python
- Experience in both batch processing and streaming data pipelines
- Experience working with Google Cloud Platform
- In-depth knowledge of SQL and NoSQL databases
- In-depth knowledge of coding principles, including object-oriented programming
- Experience with Git
- Ability to solve technical problems that few others can
- Ability to lead and coordinate rollouts and releases of major initiatives

Nice to have

- Experience with code optimisation and parallel processing
- Experience with Airflow, Google Cloud Composer, or Kubernetes Engine
- Experience with other programming languages, such as Java, Kotlin, or Scala
- Experience with Spark or other big data frameworks
- Experience with distributed and real-time technologies (Kafka, etc.)
- 8+ years of commercial experience in a related role

Privacy Notice

You may contact our Data Protection Officer by email at [email protected]. Your personal data will be processed for the purposes of managing the Controller’s recruitment-related activities, which include setting up and conducting interviews and tests for applicants, evaluating and assessing the results thereof, and as is otherwise needed in the recruitment and hiring processes. Such processing is legally permissible under Art. 6(1)(f) of Regulation (EU) 2016/679 (General Data Protection Regulation) as necessary for the purposes of the legitimate interests pursued by the Controller, which are the solicitation, evaluation, and selection of applicants for employment.

Your personal data will be shared with Greenhouse Software, Inc., a cloud services provider located in the United States of America and engaged by Controller to help manage its recruitment and hiring process on Controller’s behalf. Accordingly, if you are located outside of the United States, your personal data will be transferred to the United States once you submit it through this site. Because the European Commission has determined that United States data privacy laws do not ensure an adequate level of protection for personal data collected from EU data subjects, the transfer will be subject to appropriate additional safeguards under the standard contractual clauses.

Your personal data will be retained by Controller as long as Controller determines it is necessary to evaluate your application for employment. Under the GDPR, you have the right to request access to your personal data, to request that your personal data be rectified or erased, and to request that processing of your personal data be restricted. You also have the right to data portability. In addition, you may lodge a complaint with an EU supervisory authority.
