Posted: 2 months ago
Work from Office
Full Time
Your Job

The Lead Data Engineer will be part of an international team that designs, develops, and delivers new applications for Koch Industries. This role offers the opportunity to join on the ground floor and will play a critical part in helping build out Koch Global Services (KGS) over the next several years. Working closely with global colleagues provides significant international exposure.

Our Team

What You Will Do

- Work with business partners to understand key business drivers and use that knowledge to experiment with and transform Business Intelligence & Advanced Analytics solutions to capture the value of potential business opportunities
- Improve data pipeline reliability, scalability, and security
- Design, build, and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements
- Translate a business process or problem into a conceptual and logical data model and a proposed technical implementation plan
- Work closely with Product Owners and stakeholders to design the technical architecture of the data platform to meet the requirements of the proposed solution
- Provide technical leadership in the Big Data space (Hadoop stack such as Spark, MapReduce, HDFS, Hive; NoSQL stores such as Cassandra, HBase; databases such as Snowflake, RDS, etc.)
- Assist in developing and implementing consistent processes for data modeling, mining, and production
- Experience in architecting software solutions on the public cloud
- Help the Data Engineering team produce high-quality code that allows us to put solutions into production
- Create reusable and scalable data pipelines
- Implement development processes and tools that allow for the collection of and access to metadata, in a way that enables widespread code reuse (e.g., ETL frameworks, generic metadata-driven tools, shared data dimensions) as well as impact analysis and source-to-target tracking and reporting
- Create and own the technical product backlogs for products and help the team close backlog items on time
- Refactor code into reusable libraries, APIs, and tools

Who You Are (Basic Qualifications)

- 10+ years of industry professional experience or a bachelor's degree in MIS, CS, or an industry equivalent, with consultative/complex deployment project, architecture, design, implementation, and/or support of data and analytics solutions
- At least 6-8 years of Data Engineering experience (AWS) delivering Advanced Analytics, Data Warehousing, Big Data, or Cloud solutions
- Strong knowledge of SQL and of developing, deploying, and modelling data warehouses and data pipelines on AWS or similar cloud environments
- 5+ years of experience with business and technical requirements analysis, elicitation, data modeling, verification, and methodology development, with the ability to communicate complex technical ideas to technical and non-technical team members
- Manage data-related requests, analyze issues, and provide efficient resolution
- Design program specifications and perform required tests
- Experience in authoring or reviewing system design documents for enterprise solutions
- Knowledge of Big Data technologies such as Spark and Hadoop/MapReduce
- Strong coding skills in Java and Python or Scala
- Demonstrated experience using Git-based source control management platforms (GitLab, GitHub, DevOps, etc.)
- Experience working in Agile delivery
- Experience in Data Harmonization, Master Data Management, and Critical Data Elements management

What Will Put You Ahead

- 8+ years of experience with the Amazon Web Services stack, including S3, Athena, Redshift, Glue, or Lambda
- 8+ years of experience with cloud data warehousing solutions, including Snowflake, with development and implementation of dimensional modeling
- Experience with open-source tools and their integration with the AWS platform is preferred
- Cloud Architect certification from a reputed public cloud provider
- Experience with Git and CI/CD pipelines
- Development experience with Docker and a Kubernetes environment (a plus)
- Understanding of infrastructure (including hosting, container-based deployments, and storage architectures) is advantageous