Generative AI is rapidly transforming enterprise operations, with most forward-thinking organizations now exploring both customer-facing applications and internal productivity enhancements in parallel. While customer applications and content generation tools capture headlines, the most significant business impact often lies in modernizing internal operations—particularly in support functions where traditional automation falls short.
Across industries, companies are under immense pressure to reduce costs, improve efficiency, and scale operations without significantly increasing headcount. Legacy support models, particularly in IT and internal operations, often struggle to keep pace with employee demands, resulting in inefficiencies, bottlenecks, and rising service costs. Traditional rule-based automation tools have provided some relief, but they lack the adaptability required to handle complex, evolving queries.
Unlike static automation solutions, GenAI can process natural language, learn from interactions, and dynamically generate responses, enabling enterprises to modernize internal operations, accelerate problem resolution, and improve service experiences for employees.
Despite its promise, the challenge for many organizations isn’t just understanding AI’s potential—it’s knowing how to implement it in a way that delivers tangible business value. Without a structured adoption strategy, AI projects often fail due to poor integration with existing workflows, low user adoption, and unclear ROI metrics.
A Fortune 150 SaaS company exemplifies this strategic shift. Facing a 70% escalation rate in their IT Service Desk, leadership recognized an opportunity to leverage GenAI’s natural language processing and dynamic response capabilities to revolutionize employee support. This case study examines how their IT Director moved beyond theoretical AI potential to deliver tangible business value through a structured, results-driven implementation of a GenAI-powered Support Copilot.
The initiative demonstrates how enterprises can overcome common AI adoption challenges—poor workflow integration, low user adoption, and unclear ROI metrics—by applying disciplined product strategy to AI deployment. From business case development through scaled implementation, their approach provides executives and product leaders with a practical blueprint for driving meaningful AI transformation in enterprise operations.
The Challenge: IT Support Inefficiencies
The IT Service Desk faced mounting inefficiencies, including high ticket volumes, long resolution times, and limited self-service capabilities. Despite having an AI-powered virtual assistant, most IT support interactions still required human intervention, increasing operational costs and delaying response times. Employees struggled to resolve common IT issues independently, while support teams sought to optimize service efficiency and cost-effectiveness.
A key challenge was choosing the right service delivery method to balance support quality and savings opportunities. Over-relying on agent involvement increased costs, while excessive automation risked frustrating users if responses lacked depth or accuracy. The team needed a strategic mix of immediate answers, workflow automation, proactive notifications, agent support, and ticket creation to ensure efficiency without compromising user satisfaction.
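To make the delivery-method mix concrete, the routing decision can be sketched as a simple policy. The five categories below come from the case study; the keyword rules and confidence thresholds are illustrative assumptions, not the team’s actual logic:

```python
def choose_delivery_method(query: str, kb_confidence: float) -> str:
    """Pick a service delivery method for an incoming support query.

    The categories mirror the case study; the thresholds and keyword
    checks are invented for this sketch.
    """
    q = query.lower()
    if "outage" in q or "incident" in q:
        return "proactive notification"   # broadcast known issues
    if "install" in q or "provision" in q:
        return "workflow automation"      # trigger an ITSM workflow
    if kb_confidence >= 0.7:
        return "immediate answer"         # AI answers from the knowledge base
    if kb_confidence >= 0.3:
        return "ticket creation"          # log a ticket with context attached
    return "agent support"                # hand off to a human agent
```

In practice the thresholds would be tuned against deflection-rate and satisfaction metrics rather than fixed up front.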
The Solution: AI-Powered Support Copilot
To address these inefficiencies, the team developed a GenAI-driven Support Copilot capable of resolving routine IT issues without escalating to human agents. Unlike traditional rule-based chatbots, this solution leverages natural language processing and retrieval-augmented generation (RAG) to deliver context-aware responses and continuously improve through feedback.
By integrating seamlessly into existing ITSM workflows, the AI-driven Copilot aimed to reduce ticket volumes, accelerate resolution times, and enhance employee self-service capabilities. More importantly, the carefully designed service delivery strategy ensured that automation was applied where it maximized efficiency while agent support remained available for complex cases, creating an optimal balance between cost savings and high-quality IT support service.
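The article does not disclose the Copilot’s implementation, but the retrieval-augmented generation pattern it describes can be sketched in a few lines. The toy knowledge base, the bag-of-words similarity (a stand-in for a real embedding model), and all function names below are assumptions for illustration:

```python
from collections import Counter
from math import sqrt

# Toy knowledge base standing in for the company's IT knowledge articles.
KNOWLEDGE_BASE = [
    "To reset your VPN password, open the self-service portal and choose Reset VPN.",
    "Printer issues are usually fixed by reinstalling the driver from the IT portal.",
    "New laptops are provisioned through the asset management request form.",
]

def _vector(text: str) -> Counter:
    """Bag-of-words term counts; a stand-in for a real embedding model."""
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k knowledge-base articles most similar to the query."""
    q = _vector(query)
    ranked = sorted(KNOWLEDGE_BASE, key=lambda d: _cosine(q, _vector(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    """Ground the LLM prompt in retrieved context (the RAG step)."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In production, the bag-of-words vectors would be replaced by embeddings, and the grounded prompt would be sent to the internal LLM rather than returned to the caller.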
Image Title: Sample Understanding Various Support Delivery Methods
Strategic Alignment and Roadmap Development
With a clear problem statement and well-defined objectives, the next step was to align strategy with engineering execution. The Service Desk Chat AI Copilot was designed to enhance IT support efficiency while ensuring a seamless user experience.
Before defining the solution, the team applied end-user-centric product design principles, focusing on who the solution was being built for, their roles, and the specific pain points in their workflow. By analyzing service desk data and gathering insights from IT support agents and employees, the team identified recurring issues that required AI-driven assistance. This user-first approach ensured that the AI Copilot was tailored to real-world needs rather than being driven solely by technological capabilities.
To minimize risk and maximize impact, the roadmap prioritized an MVP focused on high-value use cases. The initial phase centered on information retrieval within the company’s primary communication platform, providing immediate user benefits. Subsequent phases introduced capabilities such as request-based inquiries, asset provisioning, and advanced troubleshooting, progressively increasing AI’s role in IT support.
Selecting the Right Tech Stack & AI Model
Given security, scalability, and integration requirements, the Engineering team selected the company’s internal LLM with an advanced retrieval-augmented generation (RAG) mechanism. While external GPT-based models were considered, the internal solution provided greater control, improved security, and domain-specific accuracy.
To enhance AI performance, the team optimized retrieval mechanisms to handle ambiguous IT support queries effectively. The knowledge retrieval system was fine-tuned to reduce fallback rates to human agents, significantly improving response accuracy. Additionally, custom APIs were developed to enable seamless integration with ITSM workflows, allowing real-time interactions with ticketing and asset management systems.
Image Title: Sample User Persona of an IT Support Copilot End-User
Agile Product Development Process
The Engineering team worked in iterative sprints, ensuring continuous improvements throughout development. The Product Manager collaborated closely with Engineering to conduct feasibility and impact assessments for each proposed use case.
To align technical execution with user expectations, the Product Manager provided detailed UX flows, ensuring clarity in AI responses, expected interactions, and integration within ITSM workflows. Regular feedback loops allowed for rapid iteration, resolving engineering challenges while refining AI performance.
This collaborative and agile approach enabled the team to move quickly, ensuring the AI-powered Copilot delivered measurable impact from early-stage deployment.
With the MVP ready for production, the focus shifted to minimizing friction, validating product performance, and gathering real-world insights. The team launched a controlled pilot, targeting a subset of users who were engaged and likely to provide valuable feedback.
The two-week pilot phase allowed the team to monitor system stability, track AI accuracy, and refine the experience based on user feedback. Users who encountered issues were followed up with directly, ensuring the AI model could quickly adapt and improve before full deployment across the entire organization.
Performance Metrics and Expansion
Early results demonstrated the AI solution’s effectiveness, with escalation rates to human agents dropping by 85% compared to the legacy system. Encouraged by this outcome, the team expanded the rollout across additional departments to further validate performance.
As adoption scaled, KPIs were closely tracked. Although escalation rates were slightly higher in broader deployments than in the pilot, the AI Copilot still far outperformed the legacy system. With approximately 40% of IT support case volume resolved by AI, even in its MVP state, the solution presents a meaningful opportunity to drive further efficiency. As adoption grows and capabilities expand, the department is well positioned to realize up to a 30% reduction in cost per ticket, freeing capacity, reducing operational overhead, and enabling the IT Service Desk to focus on higher-value, more complex support needs.
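As a rough worked example of how a ~40% AI resolution rate can translate into roughly a 30% reduction in blended cost per ticket, consider the arithmetic below. All dollar figures are illustrative assumptions, not numbers from the case study:

```python
# Illustrative figures only; not from the case study.
monthly_tickets = 10_000
cost_per_agent_ticket = 25.0   # assumed fully loaded cost of an agent-handled ticket
ai_resolution_rate = 0.40      # ~40% of case volume resolved by AI (per the article)
cost_per_ai_ticket = 6.25      # assumed marginal cost of an AI-resolved case

baseline_cost = monthly_tickets * cost_per_agent_ticket
blended_cost = (monthly_tickets * ai_resolution_rate * cost_per_ai_ticket
                + monthly_tickets * (1 - ai_resolution_rate) * cost_per_agent_ticket)

# With these assumptions, savings_pct comes out to roughly 30%.
savings_pct = 100 * (baseline_cost - blended_cost) / baseline_cost
```

The realized savings depend heavily on the assumed AI cost per case and on how escalated cases are counted, which is why the article frames 30% as an upper bound.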
Image Title: Product & AI Metrics Aligned to the User Journey
Validating Long-Term Success
The team continuously monitored AI accuracy, resolution speed, and user satisfaction, refining the model based on performance data. Customer Satisfaction (CSAT) scores consistently exceeded 4.6, significantly outperforming other GenAI applications deployed within the company.
With penetration testing completed and system stability confirmed, the organization fully decommissioned the legacy solution one week after the global rollout, marking a successful transition to AI-powered IT support at scale.
This initiative demonstrated the potential business impact of AI-driven IT support, setting the stage for future iterations and expansion into additional use cases. The success of this deployment also provided a blueprint for accelerating AI adoption across other business functions.
Image Title: CSAT, Deflection, and Cost Reduction Metrics
What Worked Well?
One of the most critical factors behind the success of this GenAI initiative was the alignment between strategy, engineering, and execution. From the outset, the IT Service Desk, Engineering, and Product teams worked in lockstep, fostering trust and transparency through open communication, shared accountability, and clear goal alignment. This ensured that the vision for the AI-powered Copilot was clearly communicated and executed against well-defined objectives. By maintaining a collaborative and transparent approach, the team was able to address challenges proactively and make informed decisions that kept development on track.
Once the MVP was released to a pilot department, the continuous iteration process became another key success factor. Real-world feedback from users enabled the team to refine responses, optimize AI interactions, and improve the product’s UX/UI. The ability to make data-driven enhancements early on ensured that the solution was not only functional, but also intuitive and effective in real-world support scenarios.
Challenges & How We Overcame Them
Managing Stakeholder Expectations
As a high-profile AI initiative, the project attracted significant executive attention and expectations. Managing this required a well-defined business case and PRD, which provided a structured rationale for decision-making, ensuring strategic alignment and continued buy-in throughout development and deployment.
Handling AI Bias & Hallucinations
Another challenge was ensuring that LLM-generated responses were accurate, relevant, and free from bias or hallucinations—a common issue in AI-powered applications. Since reliable AI outputs were essential for maintaining trust in the system, the team adopted a two-pronged testing strategy:
1. Golden Data Set Approach: The IT Service Desk team curated a golden dataset of expected responses, allowing engineers to manually track AI accuracy by comparing generated outputs against validated answers.
2. Leveraging Product Analytics: As the solution matured, product analytics were leveraged to monitor whether users were successfully resolving issues with AI-generated answers. This helped the team identify patterns of failure, enabling targeted fine-tuning of the model.
This proactive testing and monitoring approach allowed the team to mitigate AI-related risks and ensure that the Copilot provided reliable, high-quality responses to users.
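The golden-dataset check described above can be sketched as follows. The dataset, the keyword-matching heuristic, and the accuracy metric are illustrative stand-ins; the team’s actual validation pipeline is not published:

```python
# Illustrative golden dataset: query -> validated answer keywords.
GOLDEN_SET = {
    "reset vpn password": {"self-service", "portal"},
    "printer not working": {"driver", "reinstall"},
}

def matches_golden(query: str, ai_answer: str) -> bool:
    """Pass if the AI answer contains every validated keyword for the query."""
    expected = GOLDEN_SET[query]
    answer_text = ai_answer.lower()
    return all(keyword in answer_text for keyword in expected)

def accuracy(predictions: dict[str, str]) -> float:
    """Share of golden queries whose AI-generated answer passes the check."""
    hits = sum(matches_golden(q, a) for q, a in predictions.items())
    return hits / len(predictions)
```

A keyword check like this catches obvious hallucinations cheaply; more mature pipelines typically layer on semantic similarity scoring or human review for borderline cases.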
Best Practices for Business Leaders
For business leaders looking to drive successful GenAI implementations, three key principles emerged from this initiative:
1. Clearly defining the problem, opportunity, and expected impact secures early buy-in and aligns the product strategy with business objectives.
2. Early and ongoing collaboration between Product and Engineering ensures that technical feasibility, model performance, and user experience are considered holistically, leading to a more effective solution.
3. AI deployment isn’t just about launching a system; it’s about ensuring users understand, trust, and benefit from it. Leveraging product analytics and user feedback loops enables continuous refinement, increasing engagement and long-term success.
By following these best practices, organizations can maximize GenAI’s business impact while ensuring strong adoption and sustained value.
The successful deployment of the GenAI-powered IT Support Copilot demonstrates that effective AI implementation requires more than cutting-edge technology—it demands strategic vision, disciplined execution, and continuous refinement. This case study reveals a clear path forward: an 85% reduction in escalations, dramatically improved resolution times, and CSAT scores consistently above 4.6 all point to measurable business impact that extends far beyond IT.
For executive leaders, the strategic implications are clear:
1. Act with urgency, but execute with precision. The window for competitive advantage is narrowing as GenAI capabilities mature. Begin by conducting a comprehensive assessment of your enterprise workflows to identify high-value, low-risk opportunities for AI augmentation.
2. Build for scale from day one. While starting small is prudent, architect your AI initiatives with enterprise-wide deployment in mind. Ensure your technology stack can accommodate growing data volumes, expanding use cases, and increasing user expectations.
3. Integrate AI into your talent strategy. The most successful organizations are redefining roles to leverage AI-enhanced productivity. Invest in upskilling programs that enable your workforce to collaborate effectively with AI systems rather than merely responding to automation.
4. Establish cross-functional AI governance. Form a dedicated team spanning IT, legal, HR, and business units to address emerging questions of data privacy, accuracy standards, and appropriate AI use cases.
The time for theoretical discussions about AI’s potential has passed. Organizations that systematically implement GenAI solutions today will create substantial operational advantages that compound over time. By applying the structured approach demonstrated in this case study—clear business case development, collaborative engineering partnership, and metrics-driven refinement—you can transform GenAI from an experimental technology into a core driver of enterprise productivity and innovation.
The question is no longer whether to adopt GenAI, but how quickly you can scale it effectively across your organization.
This article was written by Leila Shaban, Research Associate at Metis Strategy
Thank you to everyone who attended and participated in the 17th Metis Strategy Digital Symposium. Highlights from the event are below. Check out Metis Strategy’s YouTube channel and Technovation podcast in the coming weeks for recordings of each conversation.
Companies continue to make progress in their AI journeys, deploying the technology to drive efficiency, productivity and innovation. Technology leaders are focused now on driving adoption, generating buy-in for new initiatives, and rolling out new training programs to ensure teams across the enterprise are able to take advantage of what AI has to offer. Below are a few highlights from the event:
Building a foundation for AI at scale
Nearly all CIOs on stage said scalable infrastructure and high-quality, accessible data are key to driving value from AI initiatives. Over the past few years, many organizations have focused on building data platforms, shifting to cloud and rethinking ways of working in order to deliver AI at scale. “Having a really good data infrastructure is foundational to taking advantage of any of these generative AI capabilities,” Priceline CTO Marty Brodbeck said. Many speakers noted their current efforts to get reliable data into the hands of more teams across their organizations.
Nearly half of MSDS attendees said that the rapid evolution of AI, among other macro issues, will have the biggest impact on their organizations in the year ahead
Exploring new use cases
Many organizations continue to train generative AI on internal knowledge bases to streamline processes and enable more self service. CIOs also see potential around developer productivity.
Bristol Myers Squibb receives thousands of calls from physicians and nurse practitioners each day requesting information about specific, often technical, topics, Chief Digital and Technology Officer Greg Meyers said. MDs on the other side of the call often find those answers in internal documents. Now, an AI chatbot trained on the company’s knowledge base can search through the documents to retrieve answers to these questions much faster. With enough fine tuning, Meyers noted the chatbot could constrain search results to trusted documents and help agents provide near-immediate answers to customer queries.
At UPS, Chief Digital and Technology Officer Bala Subramanian recently launched an internal AI tool for email which can process the tens of thousands of customer emails UPS receives on a daily basis, connect relevant information across internal policies and procedures, and generate responses for contact center employees. This ultimately improves worker productivity and reduces response time. UPS also launched an AI chatbot to help employees answer HR questions. Subramanian noted that the company is proceeding slowly due to the sensitive information and personal data in HR systems, and emphasized the critical role of risk management and governance.
At AstraZeneca, AI is significantly reducing the amount of time it takes to conduct research. Cindy Hoots, Chief Digital and Information Officer, described a generative AI-enabled research assistant that quickly searches both internal and external data to answer complex scientific questions. The assistant has helped reduce the time it takes to conduct a literature review from months to minutes, she said. Hoots is now focused on scaling AI adoption. About 15,000 employees use the research assistant, she said, while roughly 5,000 use Copilot solutions and almost 80,000 have access to AstraZeneca’s internal ChatGPT.
At KB Home, employees evaluate a number of land deals across 35 markets every week. Aggregating property data from different sources to determine whether to make an acquisition used to take 30-90 days, CIO Greg Moore said. With AI, KB Home can now complete the process in less than two weeks. The faster turnaround enables the company to make more evaluations and manage more potential deals in the pipeline.
Developer productivity is another area of rapid experimentation. Many of the tools offered by major vendors are in their early days and have room to grow, said Brodbeck of Priceline. The team is exploring solutions that can learn from Priceline’s codebase and provide a richer and more contextual experience. Whether for code generation or another use case, Brodbeck said companies will likely need to deploy retrieval-augmented generation (RAG) to deliver more productivity.
At Augment, CEO Scott Dietzen is thinking about how to retrieve knowledge from internal codebases in a way that protects intellectual property and reduces the risk of leaking sensitive information. The team started with basic engineering tasks that can make developers more productive rather than trying to replace them altogether. Demand for these kinds of tools will last for at least a decade as organizations produce more software, Dietzen said.
The top use cases for digital assistants/copilots that are driving the most value for MSDS attendees are code generation, self-service chatbots, and enterprise search/knowledge management
Bringing the organization along on the AI journey
To drive a common understanding and widespread adoption of AI, CIOs have increased their focus on storytelling and talent development.
At Wilson Sonsini, Chief Information Officer Michael Lucas is focused on cascading AI communications across the firm. His team started with a general awareness campaign. That included employee town halls to communicate the broader strategy as well as AI-centric briefings to partners. Given the sea of media coverage about AI, Lucas encouraged leaders to develop their own elevator pitch to help their organizations clearly understand the company’s AI strategy. Driving a common understanding across the firm is key to driving adoption. “We feel like we need to learn, understand, enrich, and then apply and operationalize,” Lucas said.
At Liberty Mutual, Global Chief Information Officer Monica Caldas is delivering customized employee training and connecting it to the company’s capacity demands across 27 countries. It’s part of a workforce strategy plan called “skills to fuel our future.” First, the company surveyed more than 5,000 employees to determine their skill level around topics like data, data engineering and software engineering. Next, the company mapped over 150 skills, connected them to 18 domains, and assessed how and where to invest in training.
Now, Caldas and her team are helping employees apply that training to a variety of career paths. Instead of a traditional career development ladder, Liberty Mutual is evaluating how to map skills to different jobs and create a “jungle gym” or “lattice of opportunities.” The focus on specific skills, Caldas said, “will help you position your capabilities as a tech organization not just for today, but also plan out where it’s going.”
Education at the executive level is also critical. To bring executives along on the journey, Caldas introduced a program called Executech that helps improve organizational data literacy and elevates the digital IQ of decision makers. Enhancing teams’ tech acumen gives leaders the confidence to start conversations early about important technology topics like API integration.
AI adoption may not be uniform, and there is still lots to learn about how it will impact specific roles. At Eli Lilly, employees who have incorporated AI tools into their workflow are reluctant to give them up, said Diogo Rau, Chief Information and Digital Officer. However, widespread adoption is a continuous and sometimes challenging process, “a lot harder than anyone would guess,” Rau said.
Rau often gets more questions about the risks of AI than how it can be used to improve products and services. Another challenge is that teams excited about creating AI bots aren’t always excited about maintaining or training them. “There are lots of good firefighters, but not every firefighter wants to be a fire inspector,” he said.
62% of technology executives who attended the Metis Strategy Digital Symposium anticipate that AI’s most significant impact on talent will be increased productivity
Leveraging ecosystem partners
Achieving the transformative potential of generative AI will require collaborating with networks of vendors, startups, peers, and academics. In addition to providing technology solutions, these ecosystem partners can help upskill employees, explore emerging challenges, and prototype new use cases.
Amir Kazmi, Chief Information and Digital Officer at WestRock, draws learnings from both established technology partners and startups. He also brings in academics and peers from other companies to share wins and lessons learned about generative AI.
Regal Rexnord’s Tim Dickson, Chief Digital and Information Officer, uses hackathons and internal events with vendor partners to increase the company’s digital IQ. The company also offers self-paced training from about 10 partners that includes pathways to certification. In the past seven months, more than 100 employees have received training on GenAI fundamentals from Databricks and robotic process automation from UiPath, as well as Microsoft Copilot certifications. Even if employees don’t use these tools every day, increasing the number of people with technical skills means more individuals “can at least help, or even lead, these initiatives across the organization,” Dickson said.
CommScope CIO Praveen Jonnala, like many other technology executives, is thinking about how to drive a cultural shift around AI. He spends about 80% of his time on organizational change management and culture. He is also leaning into existing partnerships to take advantage of new AI solutions and educate teams. For example, he took business teams to Microsoft for a full day to learn more about the technology and its ability to unlock new business opportunities.