About the company
Coins is the most established crypto brand in the Philippines and has gained the trust of more than 18 million users. Through its easy-to-use mobile app, users can buy and sell a variety of cryptocurrencies and access a wide range of financial services. Coins is fully regulated by the Bangko Sentral ng Pilipinas (BSP) and is the first crypto-based company in Asia to hold both Virtual Currency and Electronic Money Issuer licenses from a central bank.
Job Summary
Responsibilities:
📍 Design, develop, and maintain highly scalable, reliable, and efficient data processing systems with a strong emphasis on code quality and performance.
📍 Collaborate closely with data analysts, software developers, and business stakeholders to understand data requirements and architect robust solutions that address them.
📍 Develop and maintain ETL pipelines, ensuring seamless extraction, transformation, and loading of data from diverse sources into our data warehouse built on the Databricks platform.
📍 Spearhead the development and maintenance of real-time data processing systems using big data technologies such as Spark Streaming and Kafka.
📍 Establish and enforce rigorous data quality and validation checks to uphold the accuracy and consistency of our data assets.
📍 Act as a point of contact for troubleshooting and resolving complex data processing issues, collaborating with cross-functional teams as needed to ensure timely resolution.
📍 Proactively monitor and optimize data processing systems to maintain peak performance, scalability, and reliability, leveraging advanced AWS operational knowledge.
📍 Use AWS services such as EC2, S3, Glue, and Databricks to architect, deploy, and manage data processing infrastructure in the cloud.
📍 Implement robust security measures and access controls to safeguard sensitive data assets within the AWS environment.
📍 Stay abreast of the latest advancements in AWS technologies and best practices, incorporating new tools and services to continually improve our data processing capabilities.
Requirements:
📍 Bachelor’s or Master’s degree in Computer Science or a related field.
📍 Minimum of 5 years of hands-on experience as a Data Engineer, with a proven track record of designing and implementing sophisticated data processing systems.
📍 Good understanding of the Databricks platform and Delta Lake; familiarity with a data job scheduler such as Dagster.
📍 Proficiency in one or more programming languages such as Scala, Java, or Python.
📍 Deep expertise in big data technologies, including Apache Spark for ETL processing and optimization.
📍 Proficiency with BI tools such as Metabase for data visualization and analysis.
📍 Advanced understanding of data modeling, data quality, and data governance best practices.
📍 Outstanding communication and collaboration skills, with the ability to engage effectively with diverse stakeholders across the organization.