About the company
Instead of waiting for the best talent to come to you (the majority of inbound applications are irrelevant), do targeted outreach using AI and India's largest talent database.
Job Summary
Design, build, and maintain large-scale data pipelines on Google Cloud Platform, with a focus on BigQuery, DataFlow (Apache Beam), GCS, and knowledge-graph data modeling.
Key Responsibilities
📍 Design and develop robust, scalable data pipelines for the ingestion, transformation, and integration of structured and unstructured data.
📍 Perform comprehensive data wrangling, cleaning, and transformation across multiple formats (API, CSV, XLS, JSON, etc.).
📍 Work with BigQuery, DataFlow (Apache Beam), and Google Cloud Storage (GCS) to manage large-scale data workflows (see the sketch after this list).
📍 Implement and maintain data validation and quality-assurance processes.
📍 Contribute to data modeling and schema design, especially for knowledge graph development (Schema.org, RDF, SPARQL, JSON-LD).
📍 Collaborate within Agile teams to deliver scalable, reliable, and efficient data solutions.
📍 Apply CI/CD practices (e.g., Cloud Build) to ensure seamless development and deployment workflows.
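As an illustration of the kind of pipeline this role involves, below is a minimal Apache Beam sketch that reads CSV files from GCS, parses each row, and writes the results to BigQuery. The bucket path, table name, and three-column schema are hypothetical placeholders, not part of this posting.

```python
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line):
    """Parse one CSV line into a BigQuery-ready dict (hypothetical 3-column schema)."""
    row = next(csv.reader([line]))
    return {"id": int(row[0]), "name": row[1], "amount": float(row[2])}


def run():
    # Uses the local DirectRunner by default; pass --runner=DataflowRunner
    # plus project/region/temp_location flags to execute on Dataflow.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://example-bucket/input/*.csv",  # hypothetical path
                skip_header_lines=1)
            | "ParseRows" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:example_dataset.orders",  # hypothetical table
                schema="id:INTEGER,name:STRING,amount:FLOAT",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```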
Core Competencies
Must Have:
📍 Proficiency in SQL and Python.
📍 Experience with BigQuery, GCS, and GCP DataFlow / Apache Beam.
📍 Proven ability to handle complex data transformations across diverse data formats (see the sketch below).
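To make the "diverse data formats" requirement concrete, here is a small Python sketch that normalizes the same records arriving as a flat CSV file and as nested JSON into one pandas DataFrame. The file names, column names, and rename mapping are illustrative assumptions.

```python
import json

import pandas as pd

# Hypothetical inputs: the same "orders" data delivered in two formats.
csv_df = pd.read_csv("orders.csv")  # flat file with columns: id, name, amount

with open("orders.json") as f:
    payload = json.load(f)  # nested: {"orders": [{"order_id": ..., ...}, ...]}
json_df = pd.json_normalize(payload, record_path="orders")

# Align the JSON column names with the CSV schema (assumed mapping),
# then combine both sources into a single frame.
json_df = json_df.rename(columns={"order_id": "id"})
combined = pd.concat([csv_df, json_df], ignore_index=True)

# Basic data-quality check: no duplicate ids after merging.
assert combined["id"].is_unique, "duplicate ids after merge"
```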