The Challenge
TigerConnect, a leading provider of communication solutions for healthcare organizations, sought to address a longstanding pain point in clinical operations: the friction and inefficiency surrounding on-call scheduling and shift swaps. In the existing workflow, healthcare providers had to navigate multiple manual steps to identify who was on call or to initiate a shift change, often losing several minutes per inquiry. In time-sensitive clinical environments, that inefficiency compounds, directly degrading provider responsiveness and communication.
The problem wasn’t isolated. With an estimated 300,000 to 400,000 end users across approximately 170 client organizations, the scheduling challenge represented a widespread operational hurdle. TigerConnect needed a solution that could not only streamline this process but also scale across its client base. The stakes were high—not just in terms of user experience and operational efficiency, but because this initiative marked TigerConnect’s first AI-driven product feature. Success would serve as a proving ground for broader AI integration and innovation across its platform.
The Solution
TigerConnect partnered with Tribe to build a modular LLM-powered scheduling assistant designed to intelligently handle user queries related to on-call scheduling, contact lookups, and shift swaps—all via a natural language interface.
The solution uses an agentic architecture: an LLM router identifies user intent and hands the request to one of three child agents, each purpose-built for a specific type of query. The LLM dynamically orchestrates function calls, combining natural language understanding with deterministic backend behavior.
The system was implemented as a proof of concept and validated against real-world interaction patterns, with future extensibility designed in from the start.
Key Features
- LLM Router Agent: Determines user intent and dispatches the query to the correct child agent (On-Call, Contact Info, or Swap).
- Modular agent structure: Designed for flexibility and future expansion into other workflows.
- Natural language interface: Empowers clinical staff to ask questions like "Who’s on call tonight?" or "Can I swap Friday with someone?" and get actionable answers.
- Real-time data integration: LLM outputs are used as parameters for function calls (e.g. database lookups), ensuring accurate, context-aware results.
- Prompt-level control: Fine-grained prompts ensure deterministic routing and structured outputs.
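The routing pattern behind these features can be sketched in a few lines. This is an illustrative stand-in, not TigerConnect's implementation: a keyword stub replaces the LLM's intent classification so the dispatch and fallback control flow is runnable, and all function names are hypothetical.

```python
# Sketch of the router-to-child-agent dispatch pattern.
# In production an LLM classifies intent; here a keyword stub
# stands in so the control flow can run end to end.

def classify_intent(query: str) -> str:
    """Stand-in for the LLM router's intent classification."""
    q = query.lower()
    if "on call" in q or "on-call" in q:
        return "on_call"
    if "swap" in q:
        return "swap"
    if "contact" in q or "number" in q or "reach" in q:
        return "contact_info"
    return "out_of_scope"

def on_call_agent(query: str) -> str:
    return "Routing to On-Call Agent"

def contact_agent(query: str) -> str:
    return "Routing to Contact Info Agent"

def swap_agent(query: str) -> str:
    return "Routing to Swap Agent"

AGENTS = {
    "on_call": on_call_agent,
    "contact_info": contact_agent,
    "swap": swap_agent,
}

def route(query: str) -> str:
    intent = classify_intent(query)
    handler = AGENTS.get(intent)
    if handler is None:
        # Graceful fallback when the query is out of scope.
        return "Sorry, I can only help with on-call lookups, contacts, and shift swaps."
    return handler(query)
```

Keeping the router's output to a small closed set of intents is what makes the behavior deterministic: every query lands in exactly one handler or the fallback.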
How It Works
Agentic Workflow Overview
The system uses LLM function calling to combine natural language queries with backend data actions:
- LLMs extract relevant variables (e.g. name, date, shift).
- Those variables are passed into structured function calls.
- Responses are returned and optionally used to continue the conversation or prompt for clarification.
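The extract-then-call flow above can be sketched as follows. The extraction step is hard-coded here; in production the LLM emits these variables as structured function-call arguments. The data model and function names are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ShiftQuery:
    """Variables the LLM extracts from a natural language query."""
    name: Optional[str]
    date: Optional[str]
    shift: Optional[str]

def lookup_schedule(params: ShiftQuery) -> str:
    """Deterministic backend lookup; parameters come from the LLM."""
    # Toy stand-in for the real scheduling tables.
    table = {("2024-06-07", "night"): "Dr. Rivera"}
    who = table.get((params.date, params.shift))
    return who if who else "no match"

# In production the LLM would produce this structure from a query
# like "Who's on the night shift Friday?"; here it is hard-coded.
extracted = ShiftQuery(name=None, date="2024-06-07", shift="night")

# If extraction left a required field empty, the assistant would
# instead prompt the user for clarification.
if extracted.date is None or extracted.shift is None:
    reply = "Which date and shift do you mean?"
else:
    reply = f"{lookup_schedule(extracted)} is on call."
```

The key property is the separation of concerns: the LLM only fills in the typed parameters, while the lookup itself is plain deterministic code.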
Agent Roles
- Router Agent
- Classifies user intent and selects the appropriate child agent.
- Allows for fallback or graceful failure when queries are out of scope.
- On-Call Agent
- Handles queries about who is on call and when.
- Executes a sequence of LLM queries to extract the date, job role, and relevant events.
- Fetches data from scheduling tables and narrows results to what matters.
- Contact Info Agent
- Extracts provider names from the query.
- Uses fuzzy name matching to locate the correct contact entry in the database.
- Swap Agent
- Extracts the user’s shift-swap intent.
- Queries availability, identifies eligible swap options, and ranks them using a Swap Recs LLM Query.
- Presents best-fit swap matches for confirmation.
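The Contact Info Agent's fuzzy name matching can be approximated with Python's standard-library difflib. This is a sketch under assumptions: the directory, names, and cutoff are illustrative, and the production matcher and data model may differ.

```python
import difflib

# Illustrative directory; the real system queries a contact database.
DIRECTORY = {
    "Sarah Chen": "555-0142",
    "Samuel Chan": "555-0178",
    "Priya Natarajan": "555-0199",
}

def find_contact(extracted_name: str) -> str:
    """Fuzzy-match an LLM-extracted provider name against the directory."""
    matches = difflib.get_close_matches(
        extracted_name, DIRECTORY.keys(), n=1, cutoff=0.6
    )
    if not matches:
        return "No matching provider found."
    best = matches[0]
    return f"{best}: {DIRECTORY[best]}"
```

Fuzzy matching matters here because the LLM extracts names as users typed or spoke them, so "Sara Chen" still resolves to the correct directory entry.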
Advantages of the Architecture
- Real-time access to backend scheduling and contact data.
- Flexible interaction patterns, allowing expansion to other agent types.
- Composable workflow, giving teams full control over how data is used and presented.
Impact & The Future
This LLM-powered assistant delivered measurable improvements:
- Reduced time spent by providers on on-call tasks.
- Improved user experience with fewer steps and natural interaction.
- Proven scalability: The modular agent design makes it easy to scale from 2 clients to 170+.
- Strategic value: Successfully validated TigerConnect’s ability to integrate LLMs into clinical tools.
- Path to monetization: Foundation laid for AI-driven revenue opportunities and care coordination platform (CCP) enhancements.
By solving a real workflow pain point and demonstrating the business case for AI in healthcare communications, this initiative helped position TigerConnect to lead the next wave of intelligent, operationally aware clinical software.