Posted: 6 days ago
On-site
Full Time
We Are the Perfect Match If You Have

- 4-6 years of experience creating and managing data ingestion pipelines based on the ELT (Extract, Load & Transform) model, at scale (an illustrative pipeline sketch appears at the end of this posting).
- Worked on coding for data models, transformation logic, and data flow in your current role.
- Know-how of Logstash, Apache Beam & Dataflow, Apache Airflow, ClickHouse, Grafana, InfluxDB/VictoriaMetrics, and BigQuery.
- An understanding of product development methodologies; we follow Agile.
- A minimum of 4 years of hands-on experience in any data warehousing stack.
- A knack for understanding data and deriving insights from it.
- Experience with any time-series DB (we use InfluxDB and VictoriaMetrics) and alerting/anomaly detection frameworks.
- Visualization tools: Metabase, Power BI, or Tableau.
- Experience developing software in the cloud, such as GCP or AWS.
- A passion for exploring new technologies and expressing yourself through technical blogs.

Here's What Your Day Will Look Like

- Coding and implementing data pipelines and frameworks to provide a better developer experience for our dev teams.
- Helping other PODs in IDfy define their data landscape and onboarding them onto our platform.
- Ensuring high-quality, secure code with unit test cases and appropriate monitoring.
- Providing guidance and support to your peers within the team and cross-functionally.
- Identifying potential issues, troubleshooting, devising analytics, and helping develop contingency plans to ensure that projects pass the IDfy quality gates and are delivered on time and within budget.
- Keeping abreast of the latest trends and technologies in Data Engineering, GenAI, and Data Science.

What's It Like Working at IDfy

We build products that detect and prevent fraud. At IDfy, you will apply your skills to stay one step ahead of fraudsters: mind-mapping fraudsters' modus operandi, predicting the evolution of fraud techniques, and designing solutions to prevent new and emerging fraud. You will work on the entire end-to-end solution rather than being a small cog in a giant wheel. Thanks to our problem-centric approach, in which we find the right technology to solve a problem rather than the other way around, you will always be working on the latest technologies. We work hard and party hard: there are weekly sessions on emerging technologies, and work weeks are usually capped off with board games, poker, karaoke, and other fun activities.

Experience Range: 3-5 years
Educational Qualifications: Any graduation
Job Responsibilities: As listed under "Here's What Your Day Will Look Like" above.
Skills Required: Software Engineering, API Development
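For illustration only (this is not IDfy code, and all project, bucket, dataset, and field names are hypothetical): a minimal sketch of the kind of ELT ingestion step described above, using Apache Beam on the Dataflow runner to load raw JSON events into BigQuery as-is, with transformation deferred to the warehouse.

```python
# Minimal ELT ingestion sketch: read raw JSON events from Cloud Storage and
# load them unchanged into a BigQuery "raw" table. Transformation happens
# later, inside the warehouse, which is what makes this ELT rather than ETL.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        runner="DataflowRunner",              # use "DirectRunner" for local testing
        project="example-project",            # hypothetical GCP project
        region="asia-south1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRawEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:raw.events",  # hypothetical dataset and table
                schema="event_id:STRING,payload:STRING,ingested_at:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Keeping the load step dumb and pushing transformation into the warehouse is the core ELT trade-off: ingestion stays simple and replayable, while modelling logic lives where it can be iterated on with SQL.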