Hanuma InfoTech works across a range of technologies. Our team has significant experience in web-based application development, database-driven systems, mobile application development, cloud services, big data, and outsourcing services, delivered with quick turnaround at an economical cost.
Not specified
INR 5.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Workday Studio, BIRT Reporting, EIBs, Core Connectors, iLoads, XML, and data migration.
Not specified
INR 5.0 - 15.0 Lacs P.A.
Work from Office
Full Time
HCM modules, business process configurations (Hire, Termination, Change Job, etc.), tenant setup, custom reports, HR lifecycle knowledge.
Not specified
INR 5.0 - 15.0 Lacs P.A.
Work from Office
Full Time
Adaptive Planning (Workforce Planning, data loads, supply & demand, skill gaps), Workday HCM staging, Agile (Rally, Jira), documentation skills.
Not specified
INR 10.0 - 18.0 Lacs P.A.
Remote
Full Time
We are seeking SAP CPQ offshore support for an immediate long-term contract. This techno-functional role requires strong SAP CPQ technical expertise, excellent communication skills, and experience in requirements gathering and documentation with stakeholders.
Not specified
INR 10.0 - 18.0 Lacs P.A.
Remote
Full Time
The customer is seeking SAP BASIS offshore support for an immediate long-term contract need. The right consultant will have extensive BASIS experience in S/4HANA Private Cloud environments (2-3 projects in the last 3 years).
Not specified
INR 15.0 - 30.0 Lacs P.A.
Remote
Full Time
Seeking profiles of individuals focused solely on design architecture. A strong understanding of Java and the JVM ecosystem, along with expertise in Spring Boot and Kafka, is essential.
Not specified
INR 10.0 - 20.0 Lacs P.A.
Remote
Full Time
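The listing above centres on Java, the JVM ecosystem, Spring Boot, and Kafka. Purely as an illustrative sketch of the kind of work implied, here is a minimal plain-Java Kafka producer; the broker address, topic name, payload, and the OrderEventPublisher class are assumptions for illustration and are not taken from the listing. A Spring Boot project would typically wrap this behaviour in spring-kafka's KafkaTemplate instead of using the raw client directly.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventPublisher {

    public static void main(String[] args) {
        // Broker address and topic name below are placeholders, not values from the listing.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all waits for the full in-sync replica set, trading latency for durability.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("orders", "order-42", "{\"status\":\"CREATED\"}");
            // send() is asynchronous; the callback reports the partition/offset or the failure.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Published to %s-%d at offset %d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```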
Seeking Java with AWS.
• Experience with database systems such as RDBMS (Oracle, PostgreSQL, DB2) for data storage and retrieval.
• Experience working in OpenShift, Docker, and Kubernetes.
• Experience with code versioning tools such as Git, and knowledge of CI/CD.
Not specified
INR 22.5 - 27.5 Lacs P.A.
Work from Office
Full Time
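The listing above asks for hands-on RDBMS data storage and retrieval from Java. As a minimal sketch only (the database URL, credentials, table, and columns are invented for illustration, and the PostgreSQL JDBC driver is assumed to be on the classpath), a parameterised JDBC query might look like this:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CustomerLookup {

    public static void main(String[] args) throws SQLException {
        // Connection details are placeholders; real deployments would read them
        // from environment variables or a secrets manager, not from source code.
        String url = "jdbc:postgresql://localhost:5432/appdb";

        try (Connection conn = DriverManager.getConnection(url, "app_user", "change_me");
             // Parameterised query: the driver escapes the bind value, avoiding SQL injection.
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT id, name FROM customers WHERE region = ?")) {

            stmt.setString(1, "APAC");
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%d  %s%n", rs.getLong("id"), rs.getString("name"));
                }
            }
        }
    }
}
```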
Key Responsibilities
• Data Pipeline Development: Design, develop, and optimize robust data pipelines to efficiently collect, process, and store large-scale datasets for AI/ML applications.
• ETL Processes: Develop and maintain Extract, Transform, and Load (ETL) processes to ensure accurate and timely data delivery for machine learning models.
• Data Integration: Integrate diverse data sources (structured, unstructured, and semi-structured data) into a unified and scalable data architecture.
• Data Warehousing & Management: Design and manage data warehouses to store processed and raw data in a highly structured, accessible format for analytics and AI/ML models.
• AI/ML Model Development: Collaborate with Data Scientists to build, fine-tune, and deploy machine learning models into production environments, with a focus on model optimization, scalability, and operationalization.
• Automation: Implement automation techniques to support model retraining, monitoring, and reporting.
• Cloud & Distributed Systems: Work with cloud platforms (AWS, Azure, GCP) and distributed systems to store and process data efficiently, ensuring that AI/ML models are scalable and maintainable in the cloud environment.
• Data Quality & Governance: Implement data quality checks, monitoring, and governance frameworks to ensure the integrity and security of the data used for AI/ML models.
• Collaboration: Work cross-functionally with Data Science, Business Intelligence, and other engineering teams to meet organizational data needs and ensure seamless integration with analytics platforms.

Required Skills and Qualifications
• Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
• Strong proficiency in Python for AI/ML and data engineering tasks.
• Experience with AI/ML frameworks such as TensorFlow, PyTorch, Scikit-learn, and Keras.
• Proficiency in SQL and relational databases (e.g., MySQL, PostgreSQL, SQL Server).
• Strong experience with ETL pipelines and data wrangling on large datasets.
• Familiarity with cloud-based data engineering tools and services (e.g., AWS S3, Lambda, Redshift; Azure; GCP).
• Solid understanding of big data technologies such as Hadoop, Spark, and Kafka for data processing at scale.
• Experience managing and processing both structured and unstructured data.
• Knowledge of version control systems (e.g., Git) and agile development methodologies.
• Experience with containers and orchestration tools such as Docker and Kubernetes.
• Strong communication skills to collaborate effectively with cross-functional teams.

Preferred Skills
• Experience with data warehouses (e.g., Amazon Redshift, Google BigQuery, Snowflake).
• Familiarity with CI/CD pipelines for ML model deployment and automation.
• Familiarity with machine learning model monitoring and performance optimization.
• Experience with data visualization tools such as Tableau, Power BI, or Plotly.
• Knowledge of deep learning models and frameworks.
• DevOps or MLOps experience for automating model deployment.
• Advanced statistics or mathematics background for improving model performance and accuracy.
Not specified
INR 15.0 - 25.0 Lacs P.A.
Remote
Full Time
• 10+ years of overall IT experience.
• 2+ years of experience with Angular 8+ using NgRx.
• 4+ years of experience implementing REST services with Java and Spring Boot.
• Experience deploying to a cloud platform (Kubernetes, GCP, Azure, etc.).
Not specified
INR 22.5 - 27.5 Lacs P.A.
Work from Office
Full Time
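The listing above pairs an Angular/NgRx front end with REST services built in Java and Spring Boot. The following is a minimal sketch of the server side only, not the company's actual service: the /api/products endpoint, the Product record, and the class names are illustrative assumptions, and it presumes Java 16+ with spring-boot-starter-web on the classpath.

```java
import java.util.List;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class CatalogApplication {
    public static void main(String[] args) {
        SpringApplication.run(CatalogApplication.class, args);
    }
}

// Record used as a simple JSON payload; Jackson serialises it automatically.
record Product(long id, String name) {}

@RestController
@RequestMapping("/api/products")
class ProductController {

    // GET /api/products returns a fixed list here; a real service would delegate to a repository.
    @GetMapping
    public List<Product> all() {
        return List.of(new Product(1, "Keyboard"), new Product(2, "Monitor"));
    }

    // GET /api/products/{id} shows path-variable binding, which the Angular client would call.
    @GetMapping("/{id}")
    public Product byId(@PathVariable long id) {
        return new Product(id, "Sample product " + id);
    }
}
```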