Senior Data Engineer
at Pintu

Full-time
Remote
$81,000 to $100,000 per year

About the company

At PINTU, we are building the #1 crypto investment platform focused on new investors in Indonesia and Southeast Asia. We know that 99% of new investors are underserved because existing solutions cater to the 1% who are pros and early adopters, so we built an app that helps them learn about, invest in, and sell cryptocurrencies with one click.

Job Summary

In this role, you will:

šŸ“Conceptualize and generate infrastructure that allows big data to be accessed and analyzed; šŸ“Reformulate existing frameworks to optimize their functioning; šŸ“Test such structures to ensure that they are fit for use; šŸ“Liaise with coworkers and specific stakeholders to elucidate the requirements for each task šŸ“Keep up-to-date with blockchain standards and technological advancements that will improve the quality of your outputs.

Who We Are Looking For

šŸ“A Bachelorā€™s Degree in Computer Science or a related field preferred šŸ“5+ years of Data Integration experience. šŸ“3+ years of hands-on experience with one of the following technologies: Apache Spark, SQL or BigQuery or PostgreSQL šŸ“Proficiency in at least one of the following programming languages: Python, Scala, and Java šŸ“Experience in writing Apache Spark or Apache Beam including an understanding of optimization techniques. šŸ“Experience in data streaming and integration with Kafka. šŸ“Having experience in GCP Products like Bigquery, Dataflow, PubSub, Bigtable, Composer, and GCS šŸ“Proficiency in traditional RDBMS with an emphasis on Postgres, and MySQL. šŸ“General understanding of ETL/ELT frameworks, error handling techniques, data quality techniques and their overall operation šŸ“Proficient in developing and supporting all aspects of a big data cluster: Ingestion, šŸ“Processing, integration (Python, Spark, Scala), data cleansing, workflow management (Airflow), and querying (SQL). šŸ“Proficient in Docker and containerization šŸ“Capable of navigating and working effectively in a DevOps model including leveraging related technologies: Jenkins, GitLab, Git etc.
