Posted: 2 months ago
Work from Office
Full Time
About ProcDNA:
ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What we are looking for:
You'll build and maintain systems for efficient data collection, storage, and processing, ensuring data pipelines are robust and scalable for seamless integration and analysis. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.

What you'll do:
- Design and implement complex, scalable enterprise data processing and BI reporting solutions.
- Design, document, and implement data pipelines that feed data models for subsequent consumption in Snowflake, using dbt and Airflow.
- Ensure strict data compliance, security, and cost optimization.
- Re-architect data solutions for scalability, reliability, and resilience.
- Manage data schemas and data flow to ensure compliance, integrity, and security.
- Deliver end-to-end data solutions across multiple infrastructures and applications.
- Actively monitor and triage technical challenges in critical situations that require immediate resolution.
- Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
- Review work from other tech team members and provide feedback for growth.
- Effectively mentor and develop your team members.

Must have:
- 8-13 years of experience in a data engineering role, with an engineering degree.
- Expertise in writing SQL and database objects: stored procedures, functions, and views.
- Hands-on experience with ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, APIs, Apache Airflow).
- Knowledge of data warehouses (Redshift, Snowflake, Databricks, Cloudera).
- Proficiency in Python scripting, PySpark, and Spark.
- Experience with data ingestion, storage, and consumption.
- Skilled in SQL and data schema management.
- Excellent leadership, communication, and interpersonal skills, with the ability to build and motivate high-performing teams.
- Strong project management skills, with experience defining project scope, objectives, and deliverables, developing project plans, and managing project budgets and resources.
- Domain knowledge of the pharma landscape is a plus.