Skills and Qualifications:
- At least 2 years of experience with the Snowflake database
- Expertise with various ETL technologies and familiarity with ETL tools
- Ability to articulate Snowflake standard methodologies and find inventive ways of implementing them
- Strong technical experience with large distributed systems, Data Warehousing, and Data Lakes at scale
- Ability to design, develop, and deploy production-ready algorithms at scale
- Knowledge of orchestrating workloads in the cloud
- Knowledge of one or more scripting languages
- Solid understanding of, and implementation experience with, Data Governance, Data Quality, and Data Modelling
- Upper-Intermediate English (strong communication skills, both verbal and written)
Responsibilities:
- Build and maintain the data lake, including the data pipeline and data governance
- Design and develop data pipelines with transformations
- Tune performance and troubleshoot data pipelines and transformation scripts
- Apply best practices and methodologies
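The pipeline-with-transformation work described above can be sketched in miniature. This is an illustrative, framework-free example, assuming simple in-memory records; the names (`clean_record`, `run_pipeline`) and the quality rule are hypothetical, not from any specific tool used on the role.

```python
# Minimal sketch of a transform step with a basic data-quality gate.
# All names and rules here are illustrative assumptions, not a real framework.

def clean_record(record: dict) -> dict:
    """Normalize a raw record: coerce types and trim/lowercase the name."""
    return {
        "id": int(record["id"]),
        "name": record["name"].strip().lower(),
        "amount": float(record["amount"]),
    }

def run_pipeline(raw_records: list[dict]) -> list[dict]:
    """Transform records, dropping any that fail coercion or the quality rule."""
    cleaned = []
    for rec in raw_records:
        try:
            row = clean_record(rec)
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine malformed rows
        if row["amount"] >= 0:  # simple data-quality rule (non-negative amounts)
            cleaned.append(row)
    return cleaned

raw = [
    {"id": "1", "name": "  Alice ", "amount": "10.5"},
    {"id": "2", "name": "Bob", "amount": "oops"},  # fails type coercion
]
print(run_pipeline(raw))  # only the valid, normalized record survives
```

In production this shape typically maps onto a distributed engine (e.g. PySpark DataFrame transformations) with quarantine tables and metrics instead of silent drops.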
Nice to have:
- Experience with Snowflake performance tuning, capacity planning, and managing cloud spend and utilization
- Experience with PySpark and Python programming
- Experience with NoSQL and streaming platforms, e.g. Kafka, MongoDB, Neo4j
- Proficiency with APIs, containerization, and orchestration
- Informatica CAI experience