Optimizing On-Call Workflows with AI at TigerConnect

The Challenge

TigerConnect, a leading provider of communication solutions for healthcare organizations, sought to address a longstanding pain point in clinical operations: the friction and inefficiency surrounding on-call scheduling and shift swaps. In the existing system, healthcare providers had to navigate multiple manual steps to identify who was on call or to initiate a shift change, often losing several minutes per inquiry. In time-sensitive clinical environments, those minutes compound, directly affecting provider responsiveness and communication.

The problem wasn’t isolated. With an estimated 300,000 to 400,000 end users across approximately 170 client organizations, the scheduling challenge represented a widespread operational hurdle. TigerConnect needed a solution that could not only streamline this process but also scale across its client base. The stakes were high—not just in terms of user experience and operational efficiency, but because this initiative marked TigerConnect’s first AI-driven product feature. Success would serve as a proving ground for broader AI integration and innovation across its platform.

The Solution

TigerConnect partnered with Tribe to build a modular LLM-powered scheduling assistant designed to intelligently handle user queries related to on-call scheduling, contact lookups, and shift swaps—all via a natural language interface.

The solution uses an agentic architecture: an LLM router identifies user intent and passes the request to one of three child agents, each purpose-built for a specific type of query. Function calls are orchestrated dynamically by the LLM, combining natural language understanding with deterministic execution.

This system was implemented as a proof of concept, validated against real-world interaction patterns, and designed with future extensibility in mind.

Key Features

  • LLM Router Agent: Determines user intent and dispatches the query to the correct child agent (On-Call, Contact Info, or Swap).
  • Modular agent structure: Designed for flexibility and future expansion into other workflows.
  • Natural language interface: Empowers clinical staff to ask questions like "Who’s on call tonight?" or "Can I swap Friday with someone?" and get actionable answers.
  • Real-time data integration: LLM outputs are used as parameters for function calls (e.g. database lookups), ensuring accurate, context-aware results.
  • Prompt-level control: Fine-grained prompts ensure deterministic routing and structured outputs.
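The router-to-child-agent dispatch described above can be sketched in a few lines. This is a minimal illustration, not TigerConnect's implementation: the agent names and handlers are hypothetical, and `classify_intent` stands in for the LLM routing prompt, which in production would return one of these intent labels as structured output.

```python
from typing import Callable, Dict

# Hypothetical child-agent handlers; names are illustrative only.
def on_call_agent(query: str) -> str:
    return f"[on-call] handling: {query}"

def contact_info_agent(query: str) -> str:
    return f"[contact] handling: {query}"

def swap_agent(query: str) -> str:
    return f"[swap] handling: {query}"

AGENTS: Dict[str, Callable[[str], str]] = {
    "on_call": on_call_agent,
    "contact_info": contact_info_agent,
    "swap": swap_agent,
}

def classify_intent(query: str) -> str:
    """Stand-in for the LLM router. In production, a fine-grained prompt
    asks the model to emit exactly one of these labels (or 'out_of_scope')."""
    q = query.lower()
    if "swap" in q:
        return "swap"
    if "on call" in q or "on-call" in q:
        return "on_call"
    if "contact" in q or "number" in q or "reach" in q:
        return "contact_info"
    return "out_of_scope"

def route(query: str) -> str:
    handler = AGENTS.get(classify_intent(query))
    if handler is None:
        # Graceful fallback for out-of-scope queries.
        return "Sorry, I can only help with on-call lookups, contacts, and swaps."
    return handler(query)
```

The key design property is that routing is deterministic once the intent label is produced: the LLM's free-form judgment is confined to classification, while dispatch is a plain table lookup.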

How It Works

Agentic Workflow Overview

The system uses LLM function calling to combine natural language queries with backend data actions:

  • LLMs extract relevant variables (e.g. name, date, shift).
  • Those variables are passed into structured function calls.
  • Responses are returned and optionally used to continue the conversation or prompt for clarification.
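The extract-then-call pattern above can be sketched as follows. This is a simplified example in the style of OpenAI-compatible function calling; the tool schema, field names, and in-memory schedule are assumptions for illustration, not the production scheduling backend.

```python
import json

# Illustrative tool schema the LLM is given; field names are assumptions.
ONCALL_TOOL = {
    "name": "lookup_on_call",
    "parameters": {
        "type": "object",
        "properties": {
            "date": {"type": "string", "description": "ISO date, e.g. 2024-05-17"},
            "role": {"type": "string", "description": "Job role, e.g. 'cardiologist'"},
        },
        "required": ["date", "role"],
    },
}

def lookup_on_call(date: str, role: str) -> str:
    """Stand-in for a deterministic scheduling-table query."""
    schedule = {("2024-05-17", "cardiologist"): "Dr. Patel"}
    return schedule.get((date, role), "no match")

def handle_llm_tool_call(raw_arguments: str) -> str:
    """The LLM returns a JSON blob of extracted variables; the application
    parses it and dispatches to the real backend function."""
    args = json.loads(raw_arguments)
    return lookup_on_call(**args)
```

The LLM never touches the database directly: it only fills in the schema's parameters, and the deterministic function does the lookup, which is what makes the results accurate and auditable.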

Agent Roles

  1. Router Agent
  • Classifies user intent and selects the appropriate child agent.
  • Allows for fallback or graceful failure when queries are out of scope.
  2. On-Call Agent
  • Handles queries about who is on call and when.
  • Executes a sequence of LLM queries to extract the date, job role, and relevant events.
  • Fetches data from scheduling tables and narrows results to what matters.
  3. Contact Info Agent
  • Extracts provider names from the query.
  • Uses fuzzy name matching to locate the correct contact entry in the database.
  4. Swap Agent
  • Extracts the user's shift swap intent.
  • Queries availability, identifies eligible swap options, and ranks them using a Swap Recs LLM Query.
  • Presents best-fit swap matches for confirmation.
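The fuzzy name matching used by the Contact Info Agent can be illustrated with Python's standard-library `difflib`. This is a minimal sketch under assumed data: the directory entries are invented, and production systems may well use a more robust phonetic or token-based matcher.

```python
from difflib import get_close_matches
from typing import Optional, Tuple

# Toy contact directory; entries are illustrative, not real data.
DIRECTORY = {
    "Dr. Maria Gonzalez": "555-0101",
    "Dr. Raj Patel": "555-0102",
    "Dr. Emily Chen": "555-0103",
}

def find_contact(extracted_name: str, cutoff: float = 0.6) -> Optional[Tuple[str, str]]:
    """Fuzzy-match an LLM-extracted name against directory keys.

    The LLM's extraction may contain typos or partial names, so an exact
    dictionary lookup is too brittle; get_close_matches tolerates small
    differences while the cutoff rejects spurious matches.
    """
    matches = get_close_matches(extracted_name, DIRECTORY.keys(), n=1, cutoff=cutoff)
    if not matches:
        return None
    best = matches[0]
    return best, DIRECTORY[best]
```

A query mentioning "Dr. Maria Gonzales" (misspelled) would still resolve to the correct directory entry, while a name with no plausible match returns nothing and can trigger a clarification prompt.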

Advantages of the Architecture

  • Real-time access to backend scheduling and contact data.
  • Flexible interaction patterns, allowing expansion to other agent types.
  • Composable workflow, giving teams full control over how data is used and presented.

Impact & The Future

This LLM-powered assistant delivered measurable improvements:

  • Reduced time spent by providers on on-call tasks.
  • Improved user experience with fewer steps and natural interaction.
  • Proven scalability: The modular agent design makes it easy to scale from 2 clients to 170+.
  • Strategic value: Successfully validated TigerConnect’s ability to integrate LLMs into clinical tools.
  • Path to monetization: Foundation laid for AI-driven revenue opportunities and care coordination platform (CCP) enhancements.

By solving a real workflow pain point and demonstrating the business case for AI in healthcare communications, this initiative helped position TigerConnect to lead the next wave of intelligent, operationally aware clinical software.
