Data Engineer

5 - 10 years

10.0 - 20.0 Lacs P.A.

Pune

Posted: 2 months ago | Platform: Naukri


Skills Required

Pyspark, Data Engineering, Scala, Hadoop Framework, Python, Airflow, Hive, GCP, Hadoop, AWS

Work Mode

Hybrid

Job Type

Full Time

Job Description

Role: Sr. Data Engineer
Location: Pune
Mode: Hybrid
Notice: Immediate joiner / 20 days

To be successful in this role, you should meet the following requirements (must have):

- At least 5+ years of experience working in data engineering.
- Scala and Python development; able to understand requirements and come up with solutions.
- Experience using scheduling tools such as Airflow.
- Experience with most of the following technologies: Apache Hadoop, Pyspark, Apache Spark, YARN, Hive, Python, ETL frameworks, MapReduce, SQL, RESTful services.
- Sound knowledge of working on Unix/Linux platforms.
- Hands-on experience building data pipelines using Hadoop components: Hive, Spark, Spark SQL.
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins), and requirement management in JIRA.
- Understanding of big data modelling using relational and non-relational techniques.
- Experience debugging code issues and communicating the highlighted differences to the development team.
- Flexibility to adapt to new tooling.

Job responsibilities:

- Software design and Scala/Python Spark development, including automated testing of new and existing components in an Agile, DevOps, dynamic environment.
- Minimum 1 year of experience in Scala.
- Promoting development standards, code reviews, mentoring, and knowledge sharing.
- Production support and troubleshooting.
- Implementing tools and processes; handling performance, scale, availability, accuracy, and monitoring.
- Liaising with BAs to ensure that requirements are correctly interpreted and implemented.
- Participating in regular planning and status meetings.
- Contributing to the development process through involvement in sprint reviews and retrospectives.
- Contributing to system architecture and design.

The successful candidate will also meet the following requirements (good to have):

- Experience with Elasticsearch.
- Experience developing Java APIs.
- Experience with data ingestion.
- Understanding or experience of cloud design patterns.
- GCP development experience.
- Exposure to DevOps and Agile project methodologies such as Scrum and Kanban.

This role offers the opportunity to be part of an innovative team, delivering impactful solutions while developing your technical expertise.

Share resume at: Pratikshya.nayak@purviewservices.com
