About the company
Coins is the most established crypto brand in the Philippines and has gained the trust of more than 18 million users. Through its easy-to-use mobile app, users can buy and sell a variety of cryptocurrencies and access a wide range of financial services. Coins is fully regulated by the Bangko Sentral ng Pilipinas (BSP) and is the first crypto-based company in Asia to hold both Virtual Currency and Electronic Money Issuer licenses from a central bank.
Job Summary
Responsibilities:
- Design, develop, and maintain highly scalable, reliable, and efficient data processing systems with a strong emphasis on code quality and performance.
- Collaborate closely with data analysts, software developers, and business stakeholders to understand data requirements and architect robust solutions that address their needs.
- Develop and maintain ETL pipelines, ensuring seamless extraction, transformation, and loading of data from diverse sources into our Databricks-based data warehouse.
- Spearhead the development and maintenance of real-time data processing systems using big data technologies such as Spark Streaming and Kafka.
- Establish and enforce rigorous data quality and validation checks to uphold the accuracy and consistency of our data assets.
- Act as a point of contact for troubleshooting and resolving complex data processing issues, collaborating with cross-functional teams as needed to ensure timely resolution.
- Proactively monitor and optimize data processing systems for peak performance, scalability, and reliability, leveraging advanced AWS operational knowledge.
- Use AWS services such as EC2, S3, Glue, and Databricks to architect, deploy, and manage data processing infrastructure in the cloud.
- Implement robust security measures and access controls to safeguard sensitive data assets within the AWS environment.
- Stay abreast of the latest advancements in AWS technologies and best practices, incorporating new tools and services to continually improve our data processing capabilities.
Requirements:
- Bachelor's or Master's degree in Computer Science or a related field.
- Minimum of 5 years of hands-on experience as a Data Engineer, with a proven track record of designing and implementing sophisticated data processing systems.
- Good understanding of the Databricks platform and Delta Lake; familiarity with a data job scheduler such as Dagster.
- Proficiency in one or more programming languages such as Scala, Java, or Python.
- Deep expertise in big data technologies, including Apache Spark, for ETL processing and optimization.
- Proficiency with BI tools such as Metabase for data visualization and analysis.
- Advanced understanding of data modeling, data quality, and data governance best practices.
- Outstanding communication and collaboration skills, with the ability to engage effectively with diverse stakeholders across the organization.