Data Engineer - PySpark / Kafka

6 - 8 years

8.0 - 14.0 Lacs P.A.

Delhi NCR, Mumbai, Bengaluru

Posted: 3 months ago | Platform: Naukri


Skills Required

PySpark, Data Ingestion, Data Engineering, Data Pipelines, Google Cloud Platform, Big Data, Kafka, Data Warehousing, ETL, SQL, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

We are seeking a Data Engineer.

Responsibilities:

- Design and implement scalable solutions for ever-increasing data volumes using big data/cloud technologies such as PySpark and Kafka.
- Collaborate with cross-functional teams to understand data requirements and provide effective solutions.
- Implement real-time data ingestion and processing solutions.
- Develop and maintain ETL/ELT processes to support data analytics and reporting.
- Implement best practices for data security, integrity, and quality.
- Optimize and troubleshoot data-related issues to ensure seamless operations.

Requirements:

- Bachelor's degree in Engineering or Master's degree in Computer Science, Information Systems, or a related field.
- 6-8 years of experience in data engineering.
- Experience with databases and data warehousing concepts (PostgreSQL and Snowflake preferred).
- Strong SQL skills, including experience writing complex queries.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.
- Ability to handle large-scale structured and unstructured data from internal and third-party sources.
- Hands-on experience with Python, PySpark, and Kafka.
- Experience with data engineering tools/technologies in a GCP cloud environment.
- Proficiency in designing and maintaining scalable data architectures.
- Experience with CI/CD tools such as GitHub.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
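To illustrate the kind of "complex query" and warehousing skill the role calls for, here is a minimal, self-contained sketch of a common deduplication pattern: keeping only the latest row per business key with a ROW_NUMBER() window function. The table and column names (`staging_orders`, `order_id`, `updated_at`) are illustrative assumptions, and SQLite stands in for PostgreSQL/Snowflake here purely so the snippet runs without external infrastructure; the SQL itself is portable.

```python
import sqlite3

# Hypothetical staging table: raw ingested events often contain multiple
# versions of the same record. A standard warehouse pattern keeps only the
# latest row per business key using a ROW_NUMBER() window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, status TEXT, updated_at TEXT);
    INSERT INTO staging_orders VALUES
        (1, 'created', '2024-01-01'),
        (1, 'shipped', '2024-01-03'),
        (2, 'created', '2024-01-02');
""")

rows = conn.execute("""
    SELECT order_id, status FROM (
        SELECT order_id, status,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM staging_orders
    )
    WHERE rn = 1
    ORDER BY order_id
""").fetchall()

print(rows)  # latest status per order: [(1, 'shipped'), (2, 'created')]
```

The same PARTITION BY / ORDER BY dedup shape carries over directly to Spark SQL or the PySpark DataFrame API (`Window.partitionBy(...).orderBy(...)`) when the staging data lives in a data lake rather than a relational store.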

Information Technology
Techville
