About the company
Easygo is the Australian powerhouse behind some of the world's fastest-growing online brands, including Kick and Stake. At Easygo we proudly stand as a prominent service provider to a portfolio of brands within the iGaming industry, including Stake.com, Kick.com and Twist Gaming. Stake is the world's largest crypto casino and leads the industry with a seamless online casino and sportsbook experience. Level up your online entertainment with Kick.com, the vibrant live-streaming platform that connects millions of gamers and content creators worldwide. Alongside them sits Twist Gaming, the innovative game design studio that takes creativity to new heights by crafting cutting-edge and captivating games. Our commitment to placing our clients and their communities' entertainment at the forefront of everything we do has solidified us as the ultimate online service provider for entertainment companies.
Job Summary
What you'll do:
- ETL Development: Design, develop, and maintain ETL processes using Matillion to extract, transform, and load data from various source systems into our data warehouse.
- Data Ingestion and Integration: Integrate data from diverse sources such as databases, APIs, flat files, and cloud storage into a centralised data repository. Use AWS Glue to automate data extraction and transformation from sources like S3, databases, and streaming services; for real-time ingestion, leverage Amazon Kinesis streams or AWS Data Pipeline (see the sketch after this list).
- Data Transformation: Transform and clean data to ensure accuracy, consistency, and quality, while optimising performance for analytical queries.
- Performance Optimisation: Monitor and optimise Matillion ETL jobs for efficiency, scalability, and performance to meet business needs.
- Data Modelling: Collaborate with data analysts and data scientists to design and implement data models that support analytical reporting and data visualisation.
- Data Governance: Ensure data governance best practices are followed, including data lineage, data quality, and data security.
- Troubleshooting: Identify and resolve ETL job failures, data issues, and performance bottlenecks in a timely manner.
- Scalability and Automation: Plan for scalability as data volumes grow and automate repetitive tasks.
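To give a flavour of the ingestion work described above, here is a minimal sketch of an AWS Glue job in Python that reads raw JSON from S3, applies a basic cleaning filter, and writes Parquet to a staging location. It is illustrative only, not our actual pipeline: the bucket paths and the event_id field are hypothetical, and the cleaning rule stands in for whatever quality checks a given source requires.

```python
import sys
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

# Standard Glue job bootstrap: resolve the job name passed by the Glue runtime
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glueContext = GlueContext(SparkContext())
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read raw events from S3 (hypothetical bucket and path)
raw = glueContext.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/raw/events/"]},
    format="json",
)

# Basic cleaning: drop records missing their primary key (hypothetical field)
cleaned = Filter.apply(frame=raw, f=lambda row: row["event_id"] is not None)

# Land cleaned data in a staging area as Parquet for the warehouse to pick up
glueContext.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/staging/events/"},
    format="parquet",
)

job.commit()
```

For genuinely real-time sources, the same cleaning logic would instead consume from an Amazon Kinesis stream via Glue streaming jobs rather than batch-reading S3.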
What you'll bring:
- Bachelor's degree in Computer Science, Information Systems, or a related field (Master's degree preferred).
- Proven experience in data engineering or a related role, with a strong background in data modelling, ETL, and database management.
- Proficiency in programming languages such as SQL, Python, or Java.
- Familiarity with data warehousing solutions (e.g., Snowflake, Redshift) and big data technologies (e.g., Hadoop, Spark).
- Knowledge of data visualisation tools (e.g., Tableau, Power BI) is a plus.
- Solid understanding of data governance, data security, and compliance standards.