Posted: 3 days ago
Work from Office
Full Time
Notice Period: Immediate - 30 days

Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud

Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems using a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyze source and target system data, and map the transformations needed to meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve data management issues to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and restructure the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, test plan, and dataset implementation work performed by other data engineers in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of AWS, EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related development process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions.
- A strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in Computer Science.
INR 10.0 - 20.0 Lacs P.A.
Mumbai, Hyderabad, Bengaluru