IQHQ is the intelligence (IQ) headquarters (HQ) of today's organizations, maximizing the potential of Big Data by powering data-driven businesses with foremost intelligence resources. Although the volume of data generated every day keeps increasing, only a small fraction of it is ever used. As data grows in importance as one of the most valuable resources, organizations demand a solution that helps them efficiently aggregate, process, and analyze Big Data to make the right impactful decisions, discover hidden insights, leverage every emerging opportunity, and gain an inimitable competitive advantage. That solution is IQHQ. IQHQ enables users to:
- Explore aggregated external data in structured form
- Import internal data and integrate it with external data
- Analyze and visualize data to discover hidden insights and opportunities
- Monetize data or acquire data from other users
- Request "Custom Intelligence", a flagship custom data sourcing service for organizations with special data needs.
Not specified
INR 4.0 - 7.0 Lacs P.A.
Work from Office
Full Time
Job Description of the Data Modeler Role
The Data Modeler will work towards the design and implementation of new data structures to support the project teams delivering on ETL and data warehouse design, managing the enterprise data model, maintaining the data, and shaping enterprise data integration approaches.

Technical Responsibilities:
- Build and maintain standards-based data models to report disparate data sets in a reliable, consistent, and interpretable manner.
- Gather, distil, and harmonize data requirements, and design coherent conceptual, logical, and physical data models and associated physical feed formats to support these data flows.
- Articulate business requirements and build source-to-target mappings involving complex ETL transformations.
- Write complex SQL statements and profile source data to validate data transformations.
- Contribute to requirement analysis and database design, covering both transactional and dimensional data modelling.
- Normalize or de-normalize data structures, introducing hierarchies and inheritance wherever required in existing and new data models.
- Develop and implement data warehouse projects independently.
- Work with data consumers and data suppliers to understand detailed requirements and to propose standardized data models.
- Contribute to improving the Data Management data models.
- Be an influencer: present and facilitate discussions to understand business requirements, and develop dimensional data models based on these capabilities and industry best practices.

Requirements:
- Extensive practical experience in Information Technology and software development projects, with at least 8 years of experience in designing operational data stores and data warehouses.
- Extensive experience in any of the data modelling tools, such as Erwin or SAP PowerDesigner.
- Strong understanding of ETL and data warehouse concepts, processes, and best practices.
- Proficient in data modelling, including conceptual, logical, and physical data modelling for both OLTP and OLAP.
- Ability to write complex SQL for data transformations and data profiling in source and target systems.
- Basic understanding of SQL vs. NoSQL databases.
- A combination of solid business knowledge and technical expertise, with strong communication skills.
- Excellent analytical and logical thinking.
- Good verbal and written communication skills, and the ability to work independently as well as in a team environment, providing structure in ambiguous situations.

Good to have:
- Understanding of the insurance domain.
- Basic understanding of the AWS cloud.
- Good understanding of Master Data Management, Data Quality, and Data Governance.
- Basic understanding of data visualization tools such as SAS VA and Tableau.
- Good understanding of implementing and architecting data solutions using Informatica and SQL Server/Oracle.
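The "profile source data to validate data transformations" duty above can be illustrated with a minimal sketch. The `customers` table and its columns are hypothetical; the same pattern (row count, null count, distinct count per column) applies to any source table:

```python
import sqlite3

# Profile a source table before writing ETL transformations: row count,
# null count, and distinct count per column. Table and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, region TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "APAC"), (2, "EMEA"), (3, None), (4, "APAC")])

profile = conn.execute("""
    SELECT COUNT(*)                 AS total_rows,
           COUNT(*) - COUNT(region) AS null_regions,
           COUNT(DISTINCT region)   AS distinct_regions
    FROM customers
""").fetchone()

print(profile)  # → (4, 1, 2)
```

Comparing these counts between source and target after a load is a quick way to catch dropped rows or lost values.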
Not specified
INR 22.5 - 27.5 Lacs P.A.
Work from Office
Full Time
Role: Technical Project Manager
Qualification: BTech/MTech/MCA
Job Location: Pune
Experience: 12+ years
Work Mode: Hybrid

Job Overview: We are seeking a skilled and motivated Technical Project Manager to lead, plan, and oversee projects focused on data engineering and business intelligence tools. This role requires a deep understanding of data engineering processes, BI technologies, and the ability to manage cross-functional teams while ensuring the delivery of high-quality solutions that meet business requirements.

Key Responsibilities:
- Lead and manage end-to-end project lifecycles, ensuring successful delivery of data engineering projects and BI tools initiatives.
- Collaborate with data engineers, data analysts, and business stakeholders to define project scope, requirements, and goals.
- Develop detailed project plans, timelines, and schedules, and ensure resources are allocated effectively.
- Track and report on project progress, addressing risks, issues, and delays promptly.
- Serve as the primary point of contact for business stakeholders, ensuring their needs and expectations are met while managing project priorities.
- Facilitate communication between technical teams and non-technical stakeholders, ensuring that all parties are informed of project status and milestones.
- Work closely with data engineering teams to ensure the successful design, development, and deployment of data pipelines, ETL processes, and integration of data systems.
- Help evaluate and select BI tools (e.g., Power BI, Tableau, Looker) and ensure their alignment with the organization's needs.
- Oversee the implementation of BI reporting solutions, dashboards, and analytics tools to deliver actionable insights to business users.
- Proactively identify project risks and issues and develop mitigation strategies.
- Lead problem-solving efforts to overcome roadblocks and ensure timely project delivery.
- Continuously evaluate and refine project management processes and methodologies to improve efficiency and effectiveness in data-driven projects.
- Stay up to date with industry trends and best practices in data engineering and BI tools to ensure the team uses cutting-edge technologies and techniques.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Proven experience as a Technical Project Manager in a data engineering or BI-focused environment.
- Strong understanding of data engineering concepts, ETL processes, and database technologies (e.g., SQL, NoSQL, data lakes, cloud platforms).
- Familiarity with business intelligence tools (e.g., Power BI, Tableau, Looker, Qlik) and data visualization best practices.
- Experience with Agile or Scrum methodologies and tools (Jira, Confluence, etc.).
- Excellent leadership, communication, and interpersonal skills to work with both technical and non-technical teams.
- Strong organizational and problem-solving skills with the ability to manage multiple projects simultaneously.
- Knowledge of cloud platforms like AWS, Azure, or Google Cloud is a plus.

Benefits: Company standard benefits.
Not specified
INR 25.0 - 32.0 Lacs P.A.
Work from Office
Full Time
We are seeking an experienced DevOps Engineer with a strong background in container orchestration, tracing, and CI/CD tools. In this role, you will be responsible for managing Kubernetes clusters, troubleshooting container environments, and ensuring seamless application performance through effective tracing and CI/CD pipeline management. Cloud experience is a plus but not essential.

Key Responsibilities:
Kubernetes & Container Management:
- Deploy, manage, and troubleshoot Kubernetes clusters using both open-source Kubernetes and Red Hat OpenShift.
- Utilize Istio and Calico for service mesh management and network policy enforcement.
- Leverage strong troubleshooting skills to resolve container and Docker-related issues.
Application Tracing & Monitoring:
- Implement and maintain Datadog monitoring and tracing systems.
- Analyze and troubleshoot application performance issues using deep tracing insights.
CI/CD Pipeline Management:
- Design, build, and maintain CI/CD pipelines using Jenkins, ensuring integration with Hoover libraries where applicable.
- Manage artifact repositories with Nexus and JFrog.
- Oversee SSL certificate management through Venafi to ensure secure deployments.
- Utilize Linux systems for continuous integration and deployment tasks.
Collaboration & Documentation:
- Work closely with development and operations teams to streamline the deployment process.
- Document processes, configurations, and troubleshooting steps for future reference.

Requirements
Qualifications:
- Proven experience with Kubernetes (open source and Red Hat OpenShift) and container technologies such as Docker.
- Strong knowledge of service meshes and network policies using Istio and Calico.
- Hands-on experience with Datadog for application tracing and performance monitoring.
- Proficiency in designing and managing CI/CD pipelines with Jenkins (experience with Hoover libraries is a plus), as well as familiarity with Nexus, JFrog, and Venafi for SSL certificate management.
Solid working knowledge of Linux operating systems. Cloud platform experience is optional but will be considered an asset.
Not specified
INR 9.0 - 13.0 Lacs P.A.
Work from Office
Full Time
We are seeking a Lead ETL Data Engineer to design, develop, and optimize data pipelines, ensuring smooth data integration across our platforms. This role will lead a team of ETL developers and work closely with data analysts, engineers, and business stakeholders to drive data solutions in a cloud environment.

Key Responsibilities:
✅ ETL Development & Data Pipeline Design
- Lead the design, development, and optimization of ETL processes using Talend (or similar ETL tools).
- Build, automate, and maintain scalable data pipelines for efficient data processing.
- Ensure data quality, consistency, and performance across ETL workflows.
✅ Database & Data Warehouse Management
- Work with relational and NoSQL databases, ensuring optimized SQL queries for performance.
- Implement data warehouse (DWH) solutions on AWS (Redshift, S3, Glue, RDS) or other cloud environments.
- Perform data modeling to support business intelligence and analytics.
✅ Leadership & Collaboration
- Guide and mentor a team of ETL developers and data engineers.
- Collaborate with data scientists, analysts, and business teams to understand data needs.
- Drive best practices in data governance, security, and compliance.
✅ Performance Optimization & Troubleshooting
- Monitor and troubleshoot ETL performance issues.
- Optimize database performance and ensure low-latency data processing.
- Automate error handling and data recovery strategies.

Requirements
Required Skills & Qualifications:
✔ 10+ years of experience in ETL development and data engineering.
✔ Expertise in ETL tools like Talend, Informatica, or Apache NiFi.
✔ Strong proficiency in SQL and database optimization techniques.
✔ Hands-on experience with AWS cloud services (Redshift, Glue, Lambda, S3, RDS, etc.).
✔ Experience with big data technologies (Spark, Hadoop, or Kafka) is a plus.
✔ Solid understanding of data modeling, warehousing (DWH), and governance.
✔ Excellent problem-solving and communication skills.
✔ Experience in leading a team and driving technical best practices.

Benefits: As per company standards.
Not specified
INR 4.0 - 7.0 Lacs P.A.
Work from Office
Full Time
Job Summary: We are seeking a skilled ETL Data Engineer to design, build, and maintain efficient and reliable ETL pipelines, ensuring seamless data integration, transformation, and delivery to support business intelligence and analytics. The ideal candidate should have hands-on experience with ETL tools like Talend, strong database knowledge, and familiarity with AWS services.

Key Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines using Talend or similar ETL tools.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Integrate data from various sources, including databases, APIs, and cloud platforms, into data warehouses or data lakes.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Manage and monitor ETL processes to ensure data integrity, accuracy, and efficiency.
- Work with AWS services like S3, Redshift, RDS, and Glue for data storage and processing.
- Implement data quality checks and ensure compliance with data governance standards.
- Troubleshoot and resolve data discrepancies and performance issues.
- Document ETL processes, workflows, and technical specifications for future reference.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of experience in ETL development, data engineering, or data warehousing.
- Hands-on experience with Talend or similar ETL tools (Informatica, SSIS, etc.).
- Proficiency in SQL and a strong understanding of database concepts (relational and non-relational).
- Experience working in an AWS environment with services like S3, Redshift, RDS, or Glue.
- Strong problem-solving skills and the ability to troubleshoot data-related issues.
- Knowledge of scripting languages like Python or shell scripting is a plus.
- Good communication skills to collaborate with cross-functional teams.

Benefits: As per company standards.
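The extract-transform-load cycle this role centres on can be sketched minimally. The CSV source, the `fact_sales` target table, and the data-quality rule (drop rows with a missing amount) are all illustrative assumptions, not part of any real pipeline here:

```python
import csv
import io
import sqlite3

# Minimal extract-transform-load sketch; source data and schema are made up.
raw = "id,amount\n1,100\n2,\n3,250\n"             # extract: CSV from a source system

rows = list(csv.DictReader(io.StringIO(raw)))
clean = [(int(r["id"]), int(r["amount"]))          # transform: drop rows that fail
         for r in rows if r["amount"]]             # a simple data-quality check

db = sqlite3.connect(":memory:")                   # load: target warehouse table
db.execute("CREATE TABLE fact_sales (id INTEGER PRIMARY KEY, amount INTEGER)")
db.executemany("INSERT INTO fact_sales VALUES (?, ?)", clean)

print(db.execute("SELECT COUNT(*), SUM(amount) FROM fact_sales").fetchone())  # → (2, 350)
```

Tools like Talend or Glue generate and orchestrate this same extract/transform/load shape at scale; the structure of the work is identical.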
Not specified
INR 10.0 - 14.0 Lacs P.A.
Work from Office
Full Time
We are on the lookout for a resourceful and proficient AI engineer/developer to join our forward-thinking team. The ideal candidate will not only possess a strong foundation in Python programming, advanced mathematics, and algorithms, but also have specialized knowledge in generative AI (GenAI) and large language models (LLMs). This role is pivotal in developing machine learning and deep learning models, understanding and applying various neural network architectures, and handling intricate data processing and visualization. The successful candidate will be adept in natural language processing, deploying AI and ML solutions, and upholding AI security.

Experience Required: 5 to 10 years of relevant experience in AI/ML development.

Key Responsibilities:
- Develop and refine machine learning and deep learning models.
- Apply expertise in neural network architectures, specifically for GenAI and LLM applications.
- Handle complex data processing, cleaning, and visualization tasks.
- Utilize natural language processing techniques for advanced AI solutions.
- Efficiently deploy AI/ML models in production environments, focusing on scalability and robustness.
- Uphold and enhance AI security measures to protect systems and data.
- Collaborate with cross-functional teams to integrate AI solutions, particularly GenAI and LLMs, into broader systems and applications.
- Stay abreast of the latest trends and advancements in AI, machine learning, GenAI, and LLMs.

Requirements:
- Proficiency in Python programming.
- Advanced knowledge of mathematics and algorithm development.
- Experience in developing machine learning and deep learning models.
- Strong understanding of neural network architectures, with emphasis on GenAI and LLMs.
- Skilled in data processing and visualization.
- Experienced in natural language processing.
- Knowledgeable in AI/ML deployment, DevOps practices, and cloud services.
- In-depth understanding of AI security principles and practices.
Benefits: Standard company benefits.
Not specified
INR 12.0 - 16.0 Lacs P.A.
Work from Office
Full Time
":" Job Title: Data Governance Analyst Location: Pune (Hybrid; thrice a week in-officerequirement) Company: Leading Insurance and Investments Firm We are seeking a skilled anddetail-oriented DataGovernance Analyst to join our Data Lakehouse program team . The ideal candidatewill play a crucial role in ensuring data integrity, quality, and complianceacross our organization, with a focus on Data Ownership/Stewardship, Metadata Management, DataQuality, and Reference Data Management. Key Responsibilities: 1. Metadata Management: - Reviewand validate metadata documents and ingestion templates populated by sourcesystem Subject Matter Experts (SMEs) and Business Analysts (BAs) - Analyzeand recommend improvements to existing data dictionaries, business glossaries,access controls, data classification, and data quality requirements - Ensuremetadata accuracy and completeness across all data assets 2. Data Ownership and Stewardship - Collaborateclosely with Data Owners and Stewards to obtain approvals and sign-offs on datagovernance initiatives - Align data governance standards with business requirements and needs - Facilitatecommunication between technical teams and business stakeholders 3. Data Quality: - Reviewand enforce data quality requirements across the organization - Developand implement data quality metrics and monitoring processes - Identifyand address data quality issues in collaboration with relevant teams 4. Reference Data Management: - Reviewand standardize reference data and Lists of Values (LOVs) - Ensureproper maintenance and version control of reference data - Collaboratewith business units to define and implement reference data standards 5. Cross-functional Collaboration: - Workclosely with Business Systems Analysts, Data Architects, Change Management andTechnology Governance Teams - Participatein data governance meetings and initiatives - Contribute to the development andimplementation of data governance policies and procedures Preferred Qualifications: 1. 
Professionalcertifications in data governance or data management (e.g., CDMP, DGCP) 2. Experiencewith data lakehousearchitectures and technologies 3. Familiaritywith Agile methodologies and project management practices 4. Experiencewith data governance tools and applications (e.g. Talend, Erwin Data Modeler ) Requirements Requirements: 1. Bachelorsdegree in Computer Science, Information Systems, or a related field 2. 5+ years of experience in datagovernance, data management, or a similar role 3. Strongunderstanding of data governance principles, metadata management, and dataquality concepts 4. Experiencewith data dictionaries,business glossaries, and data classification methodologies 5. Familiaritywith insurance and investment industry data standards and regulations 6. Excellentanalytical and problem-solving skills 7. Strongcommunication and interpersonal skills, with the ability to work effectivelywith both technical and non-technical stakeholders 8. Proficiencyin data governance tools and technologies (e.g., data catalogs, metadata repositories ) 9. Knowledgeof data privacy regulations and best practices ","
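The reference-data and LOV work described above often boils down to checking records against a governed List of Values. A minimal sketch, with an assumed `REGION_LOV` and made-up policy records:

```python
# Sketch of a reference-data (List of Values) conformance check; the LOV
# and the records are illustrative, not drawn from any real system.
REGION_LOV = {"APAC", "EMEA", "AMER"}   # governed reference data

records = [
    {"policy_id": "P1", "region": "APAC"},
    {"policy_id": "P2", "region": "emea"},   # wrong case: fails the check
    {"policy_id": "P3", "region": "LATAM"},  # value not in the LOV: fails
]

violations = [r["policy_id"] for r in records if r["region"] not in REGION_LOV]
print(violations)  # → ['P2', 'P3']
```

In practice such checks are codified as data quality rules in a governance tool, with violation counts feeding the quality metrics the role is asked to develop.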
Not specified
INR 5.0 - 8.0 Lacs P.A.
Work from Office
Full Time
Position: Tableau Server Administrator
Location: Balewadi Highstreet
Job Type: Full-time

Introduction: We are looking for an experienced and highly motivated Tableau Server Administrator to join our dynamic team. As a Tableau Server Administrator, you will be responsible for the installation, configuration, maintenance, and management of Tableau Server environments. You will also ensure the optimal performance, security, and uptime of Tableau Server while providing administrative support and troubleshooting assistance to users.

Key Responsibilities:
Tableau Server Administration:
- Install, configure, and manage Tableau Server environments (version upgrades, patches, and migrations).
- Configure Tableau Server for optimal performance and scalability.
- Set up and maintain security settings, user roles, and permissions.
- Manage Tableau Server environments, including user access, server monitoring, and resource management.
- Perform regular backups, data recovery, and disaster recovery procedures for Tableau Server.
Server Monitoring & Performance Optimization:
- Monitor and troubleshoot Tableau Server performance issues.
- Optimize server performance by analyzing and improving configurations.
- Analyze Tableau Server logs, troubleshoot issues, and implement fixes.
- Ensure the server's high availability, stability, and reliability.
Collaboration & User Support:
- Work closely with business analysts, developers, and other stakeholders to ensure effective Tableau usage.
- Provide technical support and training to end-users to maximize the utility of Tableau Server.
- Respond to Tableau user queries and issues in a timely manner, providing troubleshooting and solutions.
Security and Compliance:
- Implement Tableau Server security best practices and ensure data protection.
- Maintain user authentication and access control policies.
- Ensure compliance with internal and external security standards.
Integration and Automation:
- Support integration of Tableau Server with other data tools and systems (e.g., databases, data warehouses).
- Automate and schedule tasks such as data extracts, report distribution, and server monitoring.
Documentation & Reporting:
- Maintain detailed documentation for Tableau Server configurations, processes, and troubleshooting steps.
- Provide regular performance and usage reports to senior management.
- Create and maintain knowledge articles for Tableau-related procedures and guidelines.

Qualifications:
Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
Experience:
- 5+ years of experience as a Tableau Server Administrator or in a similar role.
- Experience with Tableau Server installation, configuration, administration, and performance tuning.
- Hands-on experience with SQL and database management (e.g., MS SQL Server, MySQL).
Technical Skills:
- Expertise in Tableau Server setup, administration, and deployment.
- Proficiency in configuring Tableau Server for high availability and failover.
- Familiarity with server and data security protocols.
- Experience with scripting languages (e.g., Python, PowerShell, or Bash).
- Strong understanding of Tableau architecture, including data sources, workbooks, views, and server infrastructure.
Soft Skills:
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Team player with a collaborative approach to work.
Preferred Skills:
- Tableau Server certification (Tableau Server Certified Associate or Tableau Server Certified Professional).
- Experience with Tableau Desktop or Tableau Prep.
- Knowledge of cloud-based Tableau deployments (AWS, Azure).
- Familiarity with networking and firewall configuration for Tableau Server.
Not specified
INR 7.0 - 11.0 Lacs P.A.
Work from Office
Full Time
We are seeking a dynamic and experienced Lead Software Test Engineer with a strong background in Selenium and API automation testing using Java. As a key member of our testing team, you will be responsible for leading and executing test strategies, mentoring team members, and ensuring the delivery of high-quality software products. The ideal candidate should have in-depth knowledge of automation testing, excellent leadership skills, and a passion for driving excellence in testing practices.

Key Responsibilities:
- Define and implement test strategies, methodologies, and best practices for Selenium and API automation testing by developing an effective automation framework.
- Design, develop, and maintain robust and scalable automation frameworks using Selenium WebDriver and Java.
- Create and execute automated test scripts for web applications, ensuring comprehensive test coverage.
- Develop and implement automated tests for APIs and microservices using tools such as Rest Assured or similar.
- Verify data integrity, security, and performance of APIs through systematic testing.
- Collaborate with cross-functional teams to develop test plans, test cases, and test scenarios.
- Execute test cases and ensure the timely identification and resolution of defects.
- Integrate automated tests into CI/CD pipelines to support continuous testing and deployment.
- Implement and optimize automated regression testing to maintain software stability.
- Work closely with development teams, product managers, and other stakeholders to ensure alignment with project goals and requirements.
- Provide timely and accurate testing status reports to project stakeholders.
- Champion and enforce quality assurance processes and standards throughout the software development lifecycle.
- Conduct code reviews and ensure the adoption of best coding practices within the testing team.
- Lead and mentor a team of software test engineers, providing technical guidance and support.
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in leading Selenium and API automation testing efforts.
- Expertise in understanding requirements and developing an automation framework from scratch.
- Strong programming skills in Java and hands-on experience with testing frameworks such as TestNG or JUnit.
- Extensive experience in designing and implementing automation frameworks for web applications.
- Solid understanding of API testing principles and tools.
- Experience with version control systems (e.g., Git) and build tools (e.g., Maven, Gradle).
- Familiarity with CI/CD tools (e.g., Jenkins, Bamboo).
- Excellent leadership, communication, and interpersonal skills.
- Ability to drive innovation and continuous improvement within the testing team.
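Automation frameworks of the kind described are commonly built around the Page Object Model, where each page class owns its locators and actions and test code never touches raw locators. The sketch below illustrates the pattern in Python with a stub driver standing in for Selenium WebDriver (the role itself calls for Java); `LoginPage`, its locators, and the URL are all hypothetical:

```python
# Page Object Model sketch. StubDriver stands in for Selenium WebDriver
# so the example is self-contained; a real framework would pass a driver
# with the same get/type-style operations.
class StubDriver:
    def __init__(self):
        self.fields, self.url = {}, None

    def get(self, url):
        self.url = url                 # navigate to a page

    def type(self, locator, text):
        self.fields[locator] = text    # fill a field located by a selector

class LoginPage:
    """Encapsulates the locators and actions of one page."""
    USER, PASSWORD = "#user", "#password"   # hypothetical CSS locators

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get("https://example.test/login")
        return self

    def login(self, user, password):
        self.driver.type(self.USER, user)
        self.driver.type(self.PASSWORD, password)

driver = StubDriver()
LoginPage(driver).open().login("qa_user", "secret")
print(driver.fields)  # → {'#user': 'qa_user', '#password': 'secret'}
```

Keeping locators inside page classes means a UI change touches one class rather than every test, which is why the pattern underpins most scalable Selenium frameworks.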
Not specified
INR 20.0 - 25.0 Lacs P.A.
Work from Office
Full Time
Position: Senior .NET Developer
Location: Hyderabad
Mode of Employment: Full-time
Salary: Industry standard.

Job Description: We are seeking a talented and experienced Senior .NET Developer to join our growing team. You will be responsible for the design, development, deployment, and maintenance of software applications using the .NET framework. You will work closely with other developers, designers, and product managers to deliver high-quality, scalable solutions.

Responsibilities:
- Participate in requirements gathering and analysis sessions.
- Design, develop, test, and deploy web applications using ASP.NET, ASP.NET Core, and C#.
- Develop and consume web services, WCF services, and .NET Core microservices.
- Work with XML and perform XSLT transformations (3+ years of experience required).
- Manage data using SQL databases (4+ years of experience); experience with PL/SQL is a plus.
- Utilize source code management tools like Git, Bitbucket, and SVN.
- Collaborate effectively in an Agile development environment using Jira and Bitbucket.
- Experience with cloud platforms like Azure or AWS is a must.
- 3+ years of web application development is a must.
- Development experience in WPF is good to have.

Requirements
Qualifications:
- 5-9 years of experience in software development using the .NET Framework 4.8 and above, .NET 6 and above, and .NET Core 2 and above.
- Proven ability to write clean, maintainable, and well-documented code.
- Strong understanding of object-oriented programming principles.
- Experience with unit testing and code coverage is a plus.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.

Benefits: Company standard benefits.
Not specified
INR 30.0 - 35.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 3.0 - 6.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 7.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Not specified
INR 15.0 - 30.0 Lacs P.A.
Work from Office
Full Time