10 Job openings at Frisco Analytics
About Frisco Analytics

Frisco Analytics specializes in data analytics solutions to enhance business intelligence and operational efficiency.

VP of Engineering

Not specified

10 - 12 years

INR 30.0 - 34.0 Lacs P.A.

Work from Office

Full Time

About The Role
The VP of Engineering will lead the technical vision and execution of LakeFusionAI's platform, leveraging cutting-edge technologies such as Large Language Models (LLMs), Vector Search, and Databricks. This role requires a deep understanding of scalable data platforms, modern AI/ML methodologies, and experience working within or alongside Databricks ecosystems to drive innovation in Master Data Management (MDM) and AI-powered solutions.

Requirements
Key Responsibilities: Define and execute the engineering roadmap with a strong emphasis on Databricks capabilities, LLMs, and Vector Search technologies. Architect scalable and high-performance data pipelines using Databricks, Delta Lake, and Unity Catalog, ensuring optimal data governance and analytics. Lead the integration of LLMs and vector databases to enhance entity resolution, semantic search, and intelligent data enrichment. Drive collaboration with Databricks' tools and features to implement real-time data processing workflows and AI/ML solutions. Oversee the design and development of AI-powered MDM solutions, ensuring compliance with business rules such as match and merge, survivorship, and anomaly detection. Build and mentor a high-performing engineering team, fostering a culture of innovation and collaboration. Collaborate with internal stakeholders, clients, and partners to deliver scalable solutions that align with business goals.

Qualifications
Educational Background: Master's or Ph.D. in Computer Science, Artificial Intelligence, Data Science, or related fields.
Experience: 12+ years of engineering leadership experience in AI/ML or data platform-driven organizations. Proven expertise in deploying LLMs (e.g., GPT, BERT) and vector search technologies (e.g., Pinecone, Milvus, Weaviate). Hands-on experience with Databricks workflows, including Delta Lake, Unity Catalog, and Machine Learning pipelines. Candidates with direct exposure to working in or with Databricks teams are highly preferred. Strong background in MDM processes and data engineering best practices. Demonstrated success in building scalable, cloud-native architectures on AWS or Azure.
Technical Skills: Advanced knowledge of AI/ML frameworks such as PyTorch, TensorFlow, and Hugging Face. Proficiency in Databricks features, including Auto Loader, SQL Analytics, and real-time processing pipelines. Expertise in vector database solutions and tools for building LLM-based applications. Familiarity with compliance standards like HIPAA and GDPR.
Soft Skills: Strategic thinker with the ability to align engineering initiatives with broader business objectives. Strong leadership and communication skills, capable of engaging with clients, partners, and stakeholders.

Preferred Qualifications
Direct experience working within Databricks or significant collaboration with their ecosystem. Industry expertise in healthcare, life sciences, retail, etc. Familiarity with Databricks Machine Learning and tools for scaling AI/ML models.

About The Company
Frisco Analytics is a forward-thinking data consulting firm dedicated to empowering businesses with cutting-edge analytics and insights. We specialize in transforming complex data into actionable strategies that drive growth and innovation. Our expert team leverages advanced technologies and a deep understanding of industry trends to deliver tailored solutions that meet the unique needs of our clients. At Frisco Analytics, we believe in the power of data to unlock potential and create lasting impact, partnering with businesses to navigate the ever-evolving landscape of modern analytics.

Apply Now
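To make the entity-resolution theme of this role concrete, here is a minimal illustrative Python sketch (not part of the posting) of the basic pattern behind combining LLM embeddings with vector search for matching: records are embedded and near-duplicates are flagged by cosine similarity. The sentence-transformers model name, the sample records, and the 0.85 threshold are assumptions chosen for the example; a production MDM system would add blocking, match rules, and survivorship on top.

```python
# Minimal sketch: embedding-based duplicate detection for entity resolution.
# Assumes the sentence-transformers package; the model name, sample records,
# and threshold are illustrative choices, not taken from the job posting.
import numpy as np
from sentence_transformers import SentenceTransformer

records = [
    "Acme Corp, 12 Main St, Frisco TX",
    "ACME Corporation, 12 Main Street, Frisco, Texas",
    "Globex LLC, 99 Elm Ave, Plano TX",
]

model = SentenceTransformer("all-MiniLM-L6-v2")            # assumed embedding model
emb = model.encode(records)
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)     # unit-normalise rows

sim = emb @ emb.T                                           # cosine similarity matrix
THRESHOLD = 0.85                                            # assumed match threshold

for i in range(len(records)):
    for j in range(i + 1, len(records)):
        if sim[i, j] >= THRESHOLD:
            print(f"Likely match ({sim[i, j]:.2f}): {records[i]} <-> {records[j]}")
```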

Senior Data Engineer

Not specified

8 - 13 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

About The Role
The Senior Data Engineer will play a pivotal role in building, optimizing, and maintaining data pipelines for LakeFusionAI's platform. This role emphasizes advanced expertise in Databricks and its ecosystem, requiring a proven ability to design scalable, high-performance data solutions aligned with best practices for data quality, governance, and compliance. Ideal candidates will have a Databricks MVP-type skillset, showcasing deep technical knowledge and leadership in Databricks workflows and features.

Requirements
Key Responsibilities: Design, build, and maintain scalable and high-performance data pipelines using Databricks and Delta Lake. Implement robust data ingestion processes, including real-time ingestion using tools like Databricks Auto Loader and structured streaming. Work extensively with Unity Catalog to manage metadata, enforce governance, and ensure data quality across the platform. Optimize data storage and transformations for structured and unstructured data from multiple sources (e.g., Salesforce, Workday, ADLS). Collaborate with data scientists and engineers to integrate data pipelines with LLMs and vector database workflows. Define and implement advanced ETL/ELT processes, focusing on performance tuning and cost optimization in a cloud-native environment. Ensure compliance with HIPAA and other relevant data privacy and security standards. Provide guidance on best practices for Databricks usage, staying up-to-date with the latest Databricks features and enhancements.

Qualifications
Educational Background: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering roles, with at least 3 years of hands-on experience with Databricks. Proven expertise in designing and optimizing data pipelines using Spark on Databricks. Experience with Delta Lake for data storage and processing. Strong understanding of data governance and metadata management using Unity Catalog. Experience integrating data workflows with advanced AI/ML models, including LLMs and graph-based solutions.
Technical Skills: Proficiency in Databricks SQL, Python, and Scala for data engineering tasks. Expertise in Databricks Auto Loader and real-time streaming pipelines. Familiarity with vector databases (e.g., Pinecone, Milvus) and their integration with Databricks workflows. Knowledge of cloud platforms such as AWS or Azure, with hands-on experience in cloud-native data solutions. Strong debugging and optimization skills for large-scale data pipelines.
Soft Skills: Ability to collaborate with cross-functional teams and translate business needs into technical requirements. Excellent problem-solving and analytical skills. Strong communication skills for explaining technical concepts to non-technical stakeholders.

Preferred Qualifications
Certification as a Databricks Certified Data Engineer or Databricks Certified Professional. Hands-on experience with Databricks MLflow for experiment tracking and model deployment. Previous contributions to Databricks-related open-source projects, blogs, or technical community engagements. Familiarity with HIPAA compliance and healthcare data management.

About The Company
Frisco Analytics is a forward-thinking data consulting firm dedicated to empowering businesses with cutting-edge analytics and insights. We specialize in transforming complex data into actionable strategies that drive growth and innovation. Our expert team leverages advanced technologies and a deep understanding of industry trends to deliver tailored solutions that meet the unique needs of our clients. At Frisco Analytics, we believe in the power of data to unlock potential and create lasting impact, partnering with businesses to navigate the ever-evolving landscape of modern analytics.

Apply Now
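The responsibilities for this role refer to real-time ingestion with Databricks Auto Loader and structured streaming into Delta Lake. The snippet below is a minimal sketch of that pattern, intended for a Databricks notebook where `spark` is already provided by the runtime; the storage paths are hypothetical placeholders, and a real pipeline would add schema hints, data quality checks, and Unity Catalog table names.

```python
# Minimal sketch of Auto Loader ingestion into a Delta table (Databricks notebook).
# `spark` is the SparkSession supplied by the Databricks runtime; all paths below
# are hypothetical placeholders for the example.
raw_path = "abfss://landing@examplestorage.dfs.core.windows.net/salesforce/accounts/"
bronze_path = "abfss://lake@examplestorage.dfs.core.windows.net/bronze/accounts/"
checkpoint = "abfss://lake@examplestorage.dfs.core.windows.net/_checkpoints/accounts/"

stream = (
    spark.readStream.format("cloudFiles")               # Auto Loader source
    .option("cloudFiles.format", "json")                # format of incoming landing files
    .option("cloudFiles.schemaLocation", checkpoint)    # schema inference/evolution state
    .load(raw_path)
)

(
    stream.writeStream.format("delta")                  # land the stream as a Delta table
    .option("checkpointLocation", checkpoint)           # exactly-once progress tracking
    .trigger(availableNow=True)                         # drain new files, then stop (batch-style run)
    .start(bronze_path)
)
```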

Senior MDM Developer

Not specified

1 - 6 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

About The Role
The Senior MDM Developer will be responsible for designing, implementing, and optimizing Master Data Management (MDM) solutions using industry-leading tools such as Informatica, Reltio, or similar platforms. This role focuses on ensuring robust data governance, accurate entity resolution, and scalable MDM processes to deliver clean, reliable, and actionable data for enterprise use.

Requirements
Key Responsibilities: Develop and configure MDM solutions using platforms such as Informatica MDM, Reltio, or other MDM tools to support entity resolution, match-and-merge, and survivorship rules. Implement data governance and data quality frameworks, ensuring accuracy and consistency across datasets. Create workflows to address common MDM challenges, including data deduplication, enrichment, and anomaly detection. Build scalable pipelines for integrating data from multiple sources, including structured and unstructured data, into the MDM platform. Collaborate with business and technical teams to define and implement MDM business rules and workflows based on organizational requirements. Provide technical support and troubleshooting for MDM-related issues, ensuring system reliability and uptime. Conduct regular audits and validation processes to maintain data accuracy and compliance with industry standards such as HIPAA or GDPR. Stay updated on the latest advancements in MDM platforms, tools, and methodologies, bringing best practices to the team.

Qualifications
Educational Background: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience: 6+ years of experience in MDM development, with hands-on expertise in tools like Informatica MDM, Reltio, or similar platforms. Strong experience in implementing workflows for match-and-merge, survivorship, and hierarchy management. Deep understanding of data quality principles, governance frameworks, and metadata management. Proven experience in integrating multiple data sources (e.g., CRMs, ERPs, third-party APIs) into MDM platforms. Familiarity with graph-based MDM methodologies and hierarchy visualizations.
Technical Skills: Expertise in Informatica MDM, Reltio, or comparable platforms, including configuration and administration. Proficiency in SQL and scripting languages (e.g., Python) for data processing and analysis. Knowledge of REST APIs and integration with external systems such as Salesforce, SAP, or Workday. Strong understanding of data modeling techniques, particularly for MDM use cases.
Soft Skills: Strong analytical and problem-solving skills for addressing complex MDM challenges. Excellent communication skills to collaborate with business and technical teams. Ability to prioritize tasks and deliver solutions in a fast-paced environment.

Preferred Qualifications
Certification in Informatica MDM, Reltio, or similar platforms. Experience in implementing MDM solutions for healthcare, life sciences, or financial services industries. Knowledge of compliance standards such as HIPAA, GDPR, or CCPA. Experience with other data management tools (e.g., Talend, Collibra) is a plus.

About The Company
Frisco Analytics is a forward-thinking data consulting firm dedicated to empowering businesses with cutting-edge analytics and insights. We specialize in transforming complex data into actionable strategies that drive growth and innovation. Our expert team leverages advanced technologies and a deep understanding of industry trends to deliver tailored solutions that meet the unique needs of our clients. At Frisco Analytics, we believe in the power of data to unlock potential and create lasting impact, partnering with businesses to navigate the ever-evolving landscape of modern analytics.

Apply Now
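As a toy illustration of the match-and-merge and survivorship rules mentioned in this role (not drawn from the posting), the Python sketch below groups duplicate records on a normalised email key and keeps the most recent non-null value per field. The field names and the matching key are illustrative assumptions; platforms like Informatica MDM or Reltio express this logic as match and survivorship configuration rather than code, but the underlying idea is similar.

```python
# Toy sketch of match-and-merge with a "most recent non-null wins" survivorship rule.
# Field names and the matching key (normalised email) are illustrative assumptions.
from datetime import date

records = [
    {"email": "Jane.Doe@Example.com", "phone": None,       "city": "Frisco", "updated": date(2023, 1, 5)},
    {"email": "jane.doe@example.com", "phone": "555-0101", "city": "Plano",  "updated": date(2024, 3, 2)},
    {"email": "bob@example.com",      "phone": "555-0102", "city": "Dallas", "updated": date(2024, 1, 1)},
]

def merge(group):
    """Survivorship: for each field, keep the non-null value from the newest record."""
    golden = {}
    for rec in sorted(group, key=lambda r: r["updated"]):    # oldest -> newest
        for field, value in rec.items():
            if value is not None:
                golden[field] = value                         # newer non-null values overwrite older ones
    return golden

groups = {}
for rec in records:
    groups.setdefault(rec["email"].lower(), []).append(rec)  # match rule: same normalised email

golden_records = [merge(group) for group in groups.values()]
print(golden_records)
```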

Delivery Manager

Not specified

2 - 7 years

INR 11.0 - 15.0 Lacs P.A.

Work from Office

Full Time

About The Role
The Delivery Manager will oversee the successful delivery of projects within LakeFusionAI. This role requires strong project management skills, team coordination, and a deep understanding of AI/ML and MDM solutions to ensure on-time and high-quality outcomes for clients.

Requirements
Key Responsibilities: Manage the end-to-end delivery of AI/ML and MDM projects for clients. Coordinate across engineering, data science, and client teams to meet milestones. Track progress, risks, and dependencies while maintaining stakeholder alignment. Ensure compliance with industry standards (e.g., HIPAA for healthcare). Optimize resource allocation and delivery processes for efficiency.

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field. 8+ years of experience in project delivery or program management in the tech industry. Strong knowledge of Agile methodologies and tools like JIRA or Azure DevOps. Excellent communication and stakeholder management skills.

About The Company
Frisco Analytics is a forward-thinking data consulting firm dedicated to empowering businesses with cutting-edge analytics and insights. We specialize in transforming complex data into actionable strategies that drive growth and innovation. Our expert team leverages advanced technologies and a deep understanding of industry trends to deliver tailored solutions that meet the unique needs of our clients. At Frisco Analytics, we believe in the power of data to unlock potential and create lasting impact, partnering with businesses to navigate the ever-evolving landscape of modern analytics.

Apply Now

DevOps Engineer

Not specified

8 - 13 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

About The Role
LakeFusionAI is seeking a skilled DevOps Engineer to streamline our development and deployment processes, ensuring reliable and scalable delivery pipelines for our AI-powered Master Data Management (MDM) platform. This role involves optimizing cloud infrastructure, enhancing CI/CD workflows, and contributing to our long-term vision of automating deployments through infrastructure-as-code (IaC) solutions.

Requirements
Key Responsibilities: CI/CD Pipeline Management: Design, implement, and maintain CI/CD pipelines for AI/ML applications. Cloud Infrastructure Optimization: Build and manage cloud-based infrastructure on platforms such as AWS, Azure, and Databricks. System Monitoring & Reliability: Monitor system performance, set up failover mechanisms, and ensure high availability. Security & Compliance: Ensure deployments adhere to security and compliance standards, including HIPAA. Collaboration: Work closely with engineering teams to enhance deployment processes, resolve operational issues, and promote DevOps best practices. Infrastructure as Code (IaC): Develop Terraform scripts for automating product deployment in customer environments.

Qualifications
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 6+ years of hands-on DevOps experience.
Technical Expertise: Proficiency in Kubernetes, Docker, Jenkins, and cloud platforms (AWS, Azure, Databricks).
IaC & Automation: Experience writing and managing Terraform scripts.
Security & Compliance: Familiarity with security best practices and compliance frameworks like HIPAA.
Preferred Skills: Understanding of Databricks and Unity Catalog is a strong plus.

About The Company
Frisco Analytics is a forward-thinking data consulting firm dedicated to empowering businesses with cutting-edge analytics and insights. We specialize in transforming complex data into actionable strategies that drive growth and innovation. Our expert team leverages advanced technologies and a deep understanding of industry trends to deliver tailored solutions that meet the unique needs of our clients. At Frisco Analytics, we believe in the power of data to unlock potential and create lasting impact, partnering with businesses to navigate the ever-evolving landscape of modern analytics.

Apply Now
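As a small illustration of the monitoring and failover responsibilities listed for this role (not part of the posting), the Python sketch below probes a primary health endpoint and falls back to a standby when it is unreachable. The URLs, timeout, and status check are hypothetical; in practice this logic usually lives in a load balancer, Kubernetes probes, or CI/CD tooling rather than a standalone script.

```python
# Minimal sketch of an HTTP health probe with failover to a standby endpoint.
# Endpoints, timeout, and status check are illustrative assumptions.
import urllib.request

PRIMARY = "https://mdm.example.com/health"     # hypothetical primary endpoint
STANDBY = "https://mdm-dr.example.com/health"  # hypothetical standby endpoint

def healthy(url: str, timeout: float = 2.0) -> bool:
    """Return True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, connection failures, and timeouts
        return False

def pick_endpoint() -> str:
    """Route to the primary if it is healthy, otherwise fail over to the standby."""
    return PRIMARY if healthy(PRIMARY) else STANDBY

if __name__ == "__main__":
    print("Serving from:", pick_endpoint())
```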

Senior Data Engineer

Not specified

8 - 13 years

INR 25.0 - 30.0 Lacs P.A.

Work from Office

Full Time

The Senior Data Engineer will play a pivotal role in building, optimizing, and maintaining data pipelines for LakeFusion.AI's platform. This role emphasizes advanced expertise in Databricks and its ecosystem, requiring a proven ability to design scalable, high-performance data solutions aligned with best practices for data quality, governance, and compliance. Ideal candidates will have a Databricks MVP-type skillset, showcasing deep technical knowledge and leadership in Databricks workflows and features.

Requirements
Key Responsibilities: Design, build, and maintain scalable and high-performance data pipelines using Databricks and Delta Lake. Implement robust data ingestion processes, including real-time ingestion using tools like Databricks Auto Loader and structured streaming. Work extensively with Unity Catalog to manage metadata, enforce governance, and ensure data quality across the platform. Optimize data storage and transformations for structured and unstructured data from multiple sources (e.g., Salesforce, Workday, ADLS). Collaborate with data scientists and engineers to integrate data pipelines with LLMs and vector database workflows. Define and implement advanced ETL/ELT processes, focusing on performance tuning and cost optimization in a cloud-native environment. Ensure compliance with HIPAA and other relevant data privacy and security standards. Provide guidance on best practices for Databricks usage, staying up-to-date with the latest Databricks features and enhancements.

Qualifications
Educational Background: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering roles, with at least 3 years of hands-on experience with Databricks. Proven expertise in designing and optimizing data pipelines using Spark on Databricks. Experience with Delta Lake for data storage and processing. Strong understanding of data governance and metadata management using Unity Catalog. Experience integrating data workflows with advanced AI/ML models, including LLMs and graph-based solutions.
Technical Skills: Proficiency in Databricks SQL, Python, and Scala for data engineering tasks. Expertise in Databricks Auto Loader and real-time streaming pipelines. Familiarity with vector databases (e.g., Pinecone, Milvus) and their integration with Databricks workflows. Knowledge of cloud platforms such as AWS or Azure, with hands-on experience in cloud-native data solutions. Strong debugging and optimization skills for large-scale data pipelines.
Soft Skills: Ability to collaborate with cross-functional teams and translate business needs into technical requirements. Excellent problem-solving and analytical skills. Strong communication skills for explaining technical concepts to non-technical stakeholders.

Preferred Qualifications
Certification as a Databricks Certified Data Engineer or Databricks Certified Professional. Hands-on experience with Databricks MLflow for experiment tracking and model deployment. Previous contributions to Databricks-related open-source projects, blogs, or technical community engagements. Familiarity with HIPAA compliance and healthcare data management.

Senior MDM Developer

Not specified

6 - 8 years

INR 4.0 - 8.0 Lacs P.A.

Work from Office

Full Time

The Senior MDM Developer will be responsible for designing, implementing, and optimizing Master Data Management (MDM) solutions using industry-leading tools such as Informatica, Reltio, or similar platforms. This role focuses on ensuring robust data governance, accurate entity resolution, and scalable MDM processes to deliver clean, reliable, and actionable data for enterprise use.

Requirements
Key Responsibilities: Develop and configure MDM solutions using platforms such as Informatica MDM, Reltio, or other MDM tools to support entity resolution, match-and-merge, and survivorship rules. Implement data governance and data quality frameworks, ensuring accuracy and consistency across datasets. Create workflows to address common MDM challenges, including data deduplication, enrichment, and anomaly detection. Build scalable pipelines for integrating data from multiple sources, including structured and unstructured data, into the MDM platform. Collaborate with business and technical teams to define and implement MDM business rules and workflows based on organizational requirements. Provide technical support and troubleshooting for MDM-related issues, ensuring system reliability and uptime. Conduct regular audits and validation processes to maintain data accuracy and compliance with industry standards such as HIPAA or GDPR. Stay updated on the latest advancements in MDM platforms, tools, and methodologies, bringing best practices to the team.

Qualifications
Educational Background: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience: 6+ years of experience in MDM development, with hands-on expertise in tools like Informatica MDM, Reltio, or similar platforms. Strong experience in implementing workflows for match-and-merge, survivorship, and hierarchy management. Deep understanding of data quality principles, governance frameworks, and metadata management. Proven experience in integrating multiple data sources (e.g., CRMs, ERPs, third-party APIs) into MDM platforms. Familiarity with graph-based MDM methodologies and hierarchy visualizations.
Technical Skills: Expertise in Informatica MDM, Reltio, or comparable platforms, including configuration and administration. Proficiency in SQL and scripting languages (e.g., Python) for data processing and analysis. Knowledge of REST APIs and integration with external systems such as Salesforce, SAP, or Workday. Strong understanding of data modeling techniques, particularly for MDM use cases.
Soft Skills: Strong analytical and problem-solving skills for addressing complex MDM challenges. Excellent communication skills to collaborate with business and technical teams. Ability to prioritize tasks and deliver solutions in a fast-paced environment.

Preferred Qualifications
Certification in Informatica MDM, Reltio, or similar platforms. Experience in implementing MDM solutions for healthcare, life sciences, or financial services industries. Knowledge of compliance standards such as HIPAA, GDPR, or CCPA. Experience with other data management tools (e.g., Talend, Collibra) is a plus.

VP of Engineering

Not specified

12 - 15 years

INR 30.0 - 34.0 Lacs P.A.

Work from Office

Full Time

The VP of Engineering will lead the technical vision and execution of LakeFusion.AI's platform, leveraging cutting-edge technologies such as Large Language Models (LLMs), Vector Search, and Databricks. This role requires a deep understanding of scalable data platforms, modern AI/ML methodologies, and experience working within or alongside Databricks ecosystems to drive innovation in Master Data Management (MDM) and AI-powered solutions.

Requirements
Key Responsibilities: Define and execute the engineering roadmap with a strong emphasis on Databricks capabilities, LLMs, and Vector Search technologies. Architect scalable and high-performance data pipelines using Databricks, Delta Lake, and Unity Catalog, ensuring optimal data governance and analytics. Lead the integration of LLMs and vector databases to enhance entity resolution, semantic search, and intelligent data enrichment. Drive collaboration with Databricks tools and features to implement real-time data processing workflows and AI/ML solutions. Oversee the design and development of AI-powered MDM solutions, ensuring compliance with business rules such as match and merge, survivorship, and anomaly detection. Build and mentor a high-performing engineering team, fostering a culture of innovation and collaboration. Collaborate with internal stakeholders, clients, and partners to deliver scalable solutions that align with business goals.

Qualifications
Educational Background: Master's or Ph.D. in Computer Science, Artificial Intelligence, Data Science, or related fields.
Experience: 12+ years of engineering leadership experience in AI/ML or data platform-driven organizations. Proven expertise in deploying LLMs (e.g., GPT, BERT) and vector search technologies (e.g., Pinecone, Milvus, Weaviate). Hands-on experience with Databricks workflows, including Delta Lake, Unity Catalog, and Machine Learning pipelines. Candidates with direct exposure to working in or with Databricks teams are highly preferred. Strong background in MDM processes and data engineering best practices. Demonstrated success in building scalable, cloud-native architectures on AWS or Azure.
Technical Skills: Advanced knowledge of AI/ML frameworks such as PyTorch, TensorFlow, and Hugging Face. Proficiency in Databricks features, including Auto Loader, SQL Analytics, and real-time processing pipelines. Expertise in vector database solutions and tools for building LLM-based applications. Familiarity with compliance standards like HIPAA and GDPR.
Soft Skills: Strategic thinker with the ability to align engineering initiatives with broader business objectives. Strong leadership and communication skills, capable of engaging with clients, partners, and stakeholders.

Preferred Qualifications
Direct experience working within Databricks or significant collaboration with their ecosystem. Industry expertise in healthcare, life sciences, retail, etc. Familiarity with Databricks Machine Learning and tools for scaling AI/ML models.

Delivery Manager

Not specified

2 - 7 years

INR 11.0 - 15.0 Lacs P.A.

Work from Office

Full Time

The Delivery Manager will oversee the successful delivery of projects within LakeFusion.AI. This role requires strong project management skills, team coordination, and a deep understanding of AI/ML and MDM solutions to ensure on-time and high-quality outcomes for clients.

Requirements
Key Responsibilities: Manage the end-to-end delivery of AI/ML and MDM projects for clients. Coordinate across engineering, data science, and client teams to meet milestones. Track progress, risks, and dependencies while maintaining stakeholder alignment. Ensure compliance with industry standards (e.g., HIPAA for healthcare). Optimize resource allocation and delivery processes for efficiency.

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field. 8+ years of experience in project delivery or program management in the tech industry. Strong knowledge of Agile methodologies and tools like JIRA or Azure DevOps. Excellent communication and stakeholder management skills.

Sr. DevOps Engineer

Not specified

1 - 6 years

INR 6.0 - 10.0 Lacs P.A.

Work from Office

Full Time

LakeFusion.AI is seeking a skilled DevOps Engineer to streamline our development and deployment processes, ensuring reliable and scalable delivery pipelines for our AI-powered Master Data Management (MDM) platform. This role involves optimizing cloud infrastructure, enhancing CI/CD workflows, and contributing to our long-term vision of automating deployments through infrastructure-as-code (IaC) solutions.

Requirements
Key Responsibilities: CI/CD Pipeline Management: Design, implement, and maintain CI/CD pipelines for AI/ML applications. Cloud Infrastructure Optimization: Build and manage cloud-based infrastructure on platforms such as AWS, Azure, and Databricks. System Monitoring & Reliability: Monitor system performance, set up failover mechanisms, and ensure high availability. Security & Compliance: Ensure deployments adhere to security and compliance standards, including HIPAA. Collaboration: Work closely with engineering teams to enhance deployment processes, resolve operational issues, and promote DevOps best practices. Infrastructure as Code (IaC): Develop Terraform scripts for automating product deployment in customer environments.

Qualifications
Education: Bachelor's degree in Computer Science, Engineering, or a related field.
Experience: 6+ years of hands-on DevOps experience.
Technical Expertise: Proficiency in Kubernetes, Docker, Jenkins, and cloud platforms (AWS, Azure, Databricks).
IaC & Automation: Experience writing and managing Terraform scripts.
Security & Compliance: Familiarity with security best practices and compliance frameworks like HIPAA.
Preferred Skills: Understanding of Databricks and Unity Catalog is a strong plus.
