Posted: 3 weeks ago
Work from Office
Full Time
" Senior Data Engineer Location: Bengaluru, Karnataka, India About the Role: Were looking for an experienced Senior Data Engineer (6-8 years) to join our data team. Youll be key in building and maintaining our data systems on AWS. Youll use your strong skills in big data tools and cloud technology to help our analytics team get valuable insights from our data. Youll be in charge of the whole process of our data pipelines, making sure the data is good, reliable, and fast. What Youll Do: Design and build efficient data pipelines using Spark / PySpark / Scala . Manage complex data processes with Airflow , creating and fixing any issues with the workflows ( DAGs ). Clean, transform, and prepare data for analysis. Use Python for data tasks, automation, and building tools. Work with AWS services like S3, Redshift, EMR, Glue, and Athena to manage our data infrastructure. Collaborate closely with the Analytics team to understand what data they need and provide solutions. Help develop and maintain our Node.js backend, using Typescript , for data services. Use YAML to manage the settings for our data tools. Set up and manage automated deployment processes ( CI/CD ) using GitHub Actions . Monitor and fix problems in our data pipelines to keep them running smoothly. Implement checks to ensure our data is accurate and consistent. Help design and build data warehouses and data lakes. Use SQL extensively to query and work with data in different systems. Work with streaming data using technologies like Kafka for real-time data processing. Stay updated on the latest data engineering technologies. Guide and mentor junior data engineers. Help create data management rules and procedures. What Youll Need: Bachelors or Masters degree in Computer Science, Engineering, or a related field. 6-8 years of experience as a Data Engineer. Strong skills in Spark and Scala for handling large amounts of data. Good experience with Airflow for managing data workflows and understanding DAGs . Solid understanding of how to transform and prepare data. Strong programming skills in Python for data tasks and automation.. Proven experience working with AWS cloud services (S3, Redshift, EMR, Glue, IAM, EC2, and Athena ). Experience building data solutions for Analytics teams. Familiarity with Node.js for backend development. Experience with Typescript for backend development is a plus. Experience using YAML for configuration management. Hands-on experience with GitHub Actions for automated deployment ( CI/CD ). Good understanding of data warehousing concepts. Strong database skills - OLAP/OLTP Excellent command of SQL for data querying and manipulation. Experience with stream processing using Kafka or similar technologies. Excellent problem-solving, analytical, and communication skills. Ability to work well independently and as part of a team. Bonus Points: Familiarity with data lake technologies (e.g., Delta Lake, Apache Iceberg). Experience with other stream processing technologies (e.g., Flink, Kinesis). Knowledge of data management, data quality, statistics and data governance frameworks. Experience with tools for managing infrastructure as code (e.g., Terraform). Familiarity with container technologies (e.g., Docker, Kubernetes). Experience with monitoring and logging tools (e.g., Prometheus, Grafana).