Posted: 2 months ago
Remote
Full Time
Key Responsibilities:
- Snowflake Architecture & Setup: Design and implement Snowflake environments, ensuring best practices in RBAC, network security policies, and external access integrations.
- Iceberg Catalog Implementation: Configure and manage Apache Iceberg catalogs within Snowflake and integrate with Azure ADLS Gen2 for external storage.
- External Storage & Access: Set up external tables, storage integrations, and access policies for ADLS Gen2, AWS S3, and GCS.
- Data Ingestion & Streaming: Implement Snowpipe, Dynamic Tables, and batch/streaming ETL pipelines for real-time and scheduled data processing.
- CI/CD & Automation: Develop CI/CD pipelines for Snowflake schema changes, security updates, and data workflows using Terraform, dbt, GitHub Actions, or Azure DevOps.
- Snowflake Notebooks & Snowpark: Use Snowflake Notebooks for analytics and data exploration, and develop Snowpark applications for machine learning and complex data transformations in Python, Java, or Scala.
- Security & Compliance: Implement RBAC, Okta SSO authentication, OAuth, network security policies, and governance frameworks for Snowflake environments.
- Notification & Monitoring Integration: Set up event-driven notifications and alerting using Azure Event Grid, SNS, or cloud-native services.
- Performance & Cost Optimization: Continuously monitor query performance, warehouse utilization, and costs, and optimize to improve efficiency.
- Documentation & Best Practices: Define best practices for Snowflake architecture, automation, security, and performance tuning.

Required Skills & Experience:
- 7+ years of experience in data architecture and engineering, specializing in Snowflake.
- Expertise in SQL, Python, and Snowpark APIs.
- Hands-on experience with Iceberg catalogs, Snowflake Notebooks, and external storage (Azure ADLS Gen2, S3, GCS).
- Strong understanding of CI/CD for Snowflake, including automation with Terraform, dbt, and DevOps tools.
- Experience with Snowpipe, Dynamic Tables, and real-time/batch ingestion pipelines.
- Proven ability to analyze and optimize Snowflake performance, storage costs, and compute efficiency.
- Knowledge of Okta SSO, OAuth, federated authentication, and network security in Snowflake.
- Cloud experience in Azure, AWS, or GCP, including cloud networking and security configurations.

Additional Details:
- This is a contract position with a duration of 6-12 months.
- This is a fully remote opportunity.