Lead Data Engineer

10 - 14 years

35.0 - 40.0 Lacs P.A.

Bengaluru

Posted: 2 months ago | Platform: Naukri


Skills Required

Data Engineering, Java, Scala, Hadoop, Big Data, Data Harmonization, SQL, MapReduce, Master Data Management, Data Warehousing, Spark, AWS, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

Your Job

The Lead Data Engineer will be part of an international team that designs, develops, and delivers new applications for Koch Industries. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out Koch Global Services (KGS) over the next several years. Working closely with global colleagues provides significant international exposure.

What You Will Do

- Work with business partners to understand key business drivers and use that knowledge to experiment with and transform Business Intelligence and Advanced Analytics solutions to capture the value of potential business opportunities.
- Improve data pipeline reliability, scalability, and security.
- Design, build, and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
- Translate a business process or problem into a conceptual and logical data model and a proposed technical implementation plan.
- Work closely with Product Owners and stakeholders to design the technical architecture of the data platform to meet the requirements of the proposed solution.
- Provide technical leadership in the Big Data space (Hadoop stack such as Spark, MapReduce, HDFS, and Hive; NoSQL stores such as Cassandra and HBase; databases such as Snowflake and RDS).
- Assist in developing and implementing consistent processes for data modeling, mining, and production.
- Architect software solutions on public cloud.
- Help the Data Engineering team produce high-quality code that allows solutions to be put into production.
- Create reusable and scalable data pipelines.
- Implement development processes and tools that support the collection of and access to metadata in a way that allows widespread code reuse (e.g., ETL frameworks, generic metadata-driven tools, shared data dimensions) and enables impact analysis as well as source-to-target tracking and reporting.
- Create and own the technical product backlogs for products; help the team close backlog items on time.
- Refactor code into reusable libraries, APIs, and tools.

Who You Are (Basic Qualifications)

- 10+ years of industry professional experience, or a bachelor's degree in MIS, CS, or an industry equivalent, with consultative/complex deployment project, architecture, design, implementation, and/or support of data and analytics solutions.
- At least 6-8 years of Data Engineering experience (AWS) delivering Advanced Analytics, Data Warehousing, Big Data, or Cloud solutions. Strong knowledge of SQL and of developing, deploying, and modeling DWH and data pipelines on AWS or similar cloud environments.
- 5+ years of experience with business and technical requirements analysis, elicitation, data modeling, verification, and methodology development, with a strong ability to communicate complex technical ideas to technical and non-technical team members.
- Ability to manage data-related requests, analyze issues, and provide efficient resolution; design program specifications and perform required tests.
- Experience authoring or reviewing system design documents for enterprise solutions.
- Knowledge of Big Data technologies such as Spark and Hadoop/MapReduce.
- Strong coding skills in Java, and in Python or Scala.
- Demonstrated experience using Git-based source control management platforms (GitLab, GitHub, DevOps, etc.).
- Experience working in Agile delivery.
- Experience in Data Harmonization, Master Data Management, and Critical Data Elements management.

What Will Put You Ahead

- 8+ years of experience with the Amazon Web Services stack, including S3, Athena, Redshift, Glue, or Lambda.
- 8+ years of experience with cloud data warehousing solutions, including Snowflake, with experience developing and implementing dimensional models.
- Experience with open-source tools and their integration with the AWS platform preferred.
- Cloud Architect certification from a reputed public cloud provider.
- Experience with Git and CI/CD pipelines.
- Development experience with Docker and a Kubernetes environment (a plus).
- Understanding of infrastructure (including hosting, container-based deployments, and storage architectures) would be advantageous.
