Posted: 2 months ago
Work from Office
Full Time
As a Consultant Data Engineer at Hakkoda, you will be more than just a builder: you will be a trusted consultant, working closely with clients to design and implement scalable data solutions on Snowflake and other cloud data platforms. Your expertise and curiosity will help drive meaningful change in data-driven organizations.

What We Are Looking For:
We are searching for a skilled Consultant Data Engineer to join our expanding team of experts. This role is pivotal in the design and development of Snowflake Data Cloud solutions, with responsibilities that include constructing data ingestion pipelines, establishing sound data architecture, and implementing stringent data governance and security protocols. The ideal candidate is a proficient data pipeline builder and adept data wrangler who enjoys optimizing data systems from their foundational stages. Collaborating closely with database architects, data analysts, and data scientists, the Data Engineer plays a crucial role in ensuring a consistent and optimal data delivery architecture across ongoing customer projects. The position demands a self-directed individual comfortable navigating the diverse data needs of multiple teams, systems, and products. If you are enthusiastic about contributing to a startup environment and supporting our customers in their next generation of data initiatives, we invite you to explore this opportunity.

Qualifications:
Location: Jaipur, Rajasthan (Work from Office). Looking for candidates who can join within a month.
- Bachelor's degree in engineering, computer science, or a related field.
- 3-6 years of experience in relevant technical roles, demonstrating proficiency in data management, database development, ETL, and/or data preparation.
- At least 1 year of experience within the Snowflake Data Cloud environment.
- Hands-on Snowflake experience, including architectural design, data modelling, and implementation.
- Proven experience in developing data warehouses and constructing ETL/ELT ingestion pipelines (a minimal illustrative sketch follows this list).
- Adept at manipulating, processing, and extracting value from large, disconnected datasets.
- Proficiency in SQL and Python scripting is required; additional proficiency in Scala and JavaScript is advantageous.
- Exposure to cloud platforms (AWS, Azure, or GCP) is a favourable attribute.
- Proven experience with Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) tools, especially those compatible with Snowflake (e.g., Matillion, Fivetran). Knowledge of dbt is an advantage.
- Strong interpersonal skills, including assertiveness and the ability to foster robust client relationships.
- Demonstrated proficiency in project management and organizational skills.
- Ability to collaborate with and support cross-functional, agile teams in a dynamic environment.
- Advanced proficiency in English is mandatory.
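For context on the ELT pipeline work this role describes, here is a minimal illustrative sketch in Python using the snowflake-connector-python library. The stage, table, warehouse, and schema names below are hypothetical placeholders, not details from this posting; a real engagement would use the client's own objects and an orchestration tool rather than a standalone script.

```python
# Minimal ELT-style ingestion sketch for Snowflake (illustrative only).
# All object names below (LOAD_WH, ANALYTICS, RAW.ORDERS, ORDERS_STAGE,
# CLEAN.ORDERS) are hypothetical placeholders.
import os

import snowflake.connector  # pip install snowflake-connector-python

# Connect using credentials from environment variables; never hardcode secrets.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Load: bulk-copy staged CSV files into a raw landing table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @ORDERS_STAGE/daily/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Transform: derive a cleaned table inside Snowflake (the "T" in ELT).
    cur.execute("""
        CREATE OR REPLACE TABLE CLEAN.ORDERS AS
        SELECT order_id, customer_id, TO_DATE(order_date) AS order_date, amount
        FROM RAW.ORDERS
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```

In practice, a tool such as Fivetran or Matillion would handle the load step and dbt the transform step; the sketch simply shows the load-then-transform pattern the posting refers to.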