Automation in IT support using AI and language models

Advanced AI technologies can not only speed up request processing but also significantly improve its quality. I work on generative artificial intelligence at NLMK, and in this article I will describe how we, together with the IT vendor Axenix, transformed our approach to handling IT user support requests through a project for intelligent classification and routing.

Background

The company receives several tens of thousands of requests every month through various channels: email, telephone, a self-service portal and others. Under this load, the limitations of traditional processing methods became apparent: a high burden on operators, manual routing, slow request handling, and errors caused by the human factor. This prompted the search for new solutions. In addition, the rapid development and adoption of LLM technologies created an extra incentive to explore innovative approaches to request processing.

Solution

Similar automation projects are already being implemented successfully around the world, including in Russia; they are especially relevant where requests are standard and repetitive.

Problem statement and expected results

In the initial phase of the project, key tasks and expected effects were clearly defined.

  1. Automation of primary classification and routing of requests arriving by email (more than half of all requests each month): the goal is to reduce the load on operators by automating the classification and routing of incoming requests.

  2. Acceleration of request processing: thanks to fast and accurate routing of requests, the service will reduce response time to user requests.

  3. Testing the hypothesis that classification quality can be improved by having the LLM interactively ask users clarifying questions about their requests.

  4. Reduced operational costs: The use of advanced artificial intelligence technologies and large language models is designed to minimize routine tasks and, as a result, reduce manual labor costs.

Description of the business process

At the heart of the user support process is the ITSM System, which manages the flow of requests. User requests received from various channels are first manually classified by content, priority and other parameters. Contact center operators are responsible for this. Once classified, requests are forwarded to the appropriate service groups on the second line for further processing or resolution.

The role of AI‑ICS in request routing

To automate the classification and routing process, the first version of the Incident Classification System (AI‑ICS) service was developed.

  • AI‑ICS analyzes and classifies requests based on their content, determining the required categories: service, category, request type and target group.

  • Cases classified by AI‑ICS are automatically sent either directly to the appropriate departments or for additional verification if the system is unsure of the classification.

Key participants are AI‑ICS, ITSM System and NLMK Contact Center operators, working together to ensure efficiency and accuracy in processing user requests.
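To make the classification output concrete, here is a minimal sketch of the attributes AI‑ICS determines for each request (the class and field names are illustrative assumptions, not taken from the actual system):

```python
from dataclasses import dataclass

@dataclass
class ClassificationResult:
    """Illustrative shape of an AI-ICS prediction: the four routing
    attributes named in the article plus a model confidence score."""
    service: str
    category: str
    request_type: str
    group: str
    confidence: float  # model confidence in [0, 1], used for routing

# Example prediction for an incoming email request.
result = ClassificationResult(
    service="Workplace IT",
    category="Email",
    request_type="Incident",
    group="L2-Messaging",
    confidence=0.93,
)
print(result.confidence >= 0.8)  # True: confident enough to auto-route
```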

Figure 1. Platform components, business process participants and their interaction


Training data

Scope and sources of data

The project used an extensive dataset of nearly a million requests collected over the past two years.

Structure of requests

Approximately 50% of requests fell into 7 services, 14 categories and 15 groups. The remaining 50% were spread across 50 services, 107 categories and 83 groups, yielding a total of about 5,300 unique combinations.

Data suitability

Of the entire dataset, about 79% of requests were suitable for training and testing the model "as is": short queries not exceeding 500 characters. Longer texts required preprocessing, including summarization and removal of redundant information.
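The 500-character split can be sketched with pandas (the document's data tool); the column names and toy data below are assumptions for illustration:

```python
import pandas as pd

# Toy stand-in for the historical request log (column names are assumed).
df = pd.DataFrame({
    "request_id": [1, 2, 3],
    "text": ["Printer offline on floor 3", "x" * 600, "Password reset needed"],
})

MAX_LEN = 500  # threshold from the project: short requests usable "as is"

df["usable_as_is"] = df["text"].str.len() <= MAX_LEN
short = df[df["usable_as_is"]]    # goes straight into training/testing
long = df[~df["usable_as_is"]]    # needs summarization / cleanup first

print(len(short), len(long))  # 2 1
```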

Attachment integration

A significant portion of requests (34%) contained graphic attachments. However, tests showed that adding data extracted from these attachments did not improve the accuracy of the model.

Approach to implementing AI‑ICS

Development and deployment

The project included several key stages in the development and implementation of the AI system.

  1. Selecting an ML model: determining the appropriate model for classifying and routing requests.

  2. Service development: creation of a Docker container for the ML model and development of an API for integration with the ITSM system.

  3. Model retraining: continuous training of the model based on new data from the support team.

  4. Reporting and Evaluation: Implementation of a reporting system to analyze the effectiveness of AI‑ICS.

  5. Testing and quality assessment: conducting tests and assessing the effectiveness of the system.

  6. System administration: development of tools for managing model parameters and types of requests.

Machine learning tools

The AI‑ICS service uses a number of advanced machine learning technologies.

  • PyTorch: the main framework for working with models.

  • Pandas: a tool for data manipulation and analysis.

  • MLflow: platform for machine learning management and pipeline automation.

  • Matplotlib and Seaborn: libraries for data visualization.

  • LaBSE: a transformer model producing language-agnostic sentence embeddings, well suited to classifying and routing requests in multiple languages.
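As a sketch of how a classifier can sit on top of LaBSE sentence embeddings, here is a minimal PyTorch module. The class and head names are assumptions; the output sizes are derived from the totals reported above (7 + 50 = 57 services, 14 + 107 = 121 categories, 15 + 83 = 98 groups), and a random tensor stands in for real LaBSE embeddings (LaBSE is BERT-base-sized, producing 768-dimensional sentence embeddings):

```python
import torch
import torch.nn as nn

class MultiHeadClassifier(nn.Module):
    """Sketch of a multi-attribute classifier over LaBSE embeddings.
    One shared embedding feeds separate heads, one per attribute."""
    def __init__(self, embed_dim=768, n_services=57, n_categories=121, n_groups=98):
        super().__init__()
        self.service_head = nn.Linear(embed_dim, n_services)
        self.category_head = nn.Linear(embed_dim, n_categories)
        self.group_head = nn.Linear(embed_dim, n_groups)

    def forward(self, emb):
        # Softmax turns logits into per-class confidences used for routing.
        return {
            "service": torch.softmax(self.service_head(emb), dim=-1),
            "category": torch.softmax(self.category_head(emb), dim=-1),
            "group": torch.softmax(self.group_head(emb), dim=-1),
        }

model = MultiHeadClassifier()
emb = torch.randn(4, 768)  # stand-in for a batch of LaBSE embeddings
out = model(emb)
print(out["service"].shape)  # torch.Size([4, 57])
```

A single shared embedding with multiple heads matches the article's note that one model can predict several related attributes at once.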

Classifier architecture based on the LaBSE model

Introduction to Large Language Models

Large language models such as GPT and BERT have revolutionized natural language processing by providing capabilities for translation, summarization, and question answering. Optimized for multilingual use, LaBSE is ideal for global companies with multilingual needs.

Model selection

The selection of the LaBSE model was the result of careful research and experimentation. During the development of AI‑ICS, we tested GPT2, BERT and other transformer models. Each stage included testing hypotheses, analyzing performance and meeting our requirements for accuracy and speed of data processing. LaBSE proved to be the most effective due to its ability to handle multilingual tasks, its lightweight nature that allowed for high processing speed, and its ability to classify and route queries most accurately. In addition, LaBSE demonstrated resilience to data changes and flexibility in integration – these were key factors in choosing a model for the project.

Role of LaBSE in request routing

At the company, the classifier architecture based on the LaBSE transformer model plays a key role in the intelligent routing of IT support requests. Here's how it works.

  1. Data preparation: the AI‑ICS service begins by receiving request data, which includes the text of the request itself, information about the employee who sent it, and a special "isSpecial" flag. This flag identifies users whose queries may require special attention or manual classification, although they are still included in the overall analysis for statistical purposes.

  2. Tokenization and embedding: the message text and associated data are combined and processed using the LaBSE tokenizer, which converts the text into a vector representation. These vectors or embeddings help encode the semantic content of the text into a machine-processable format.

  3. Classification of features: embeddings are analyzed using classification models that determine the key attributes of each request (service type, request category, etc.). In some cases, one model can be used to define several related attributes—for example, categories and services simultaneously.

  4. Returning results: classification models return values for each of the features along with parameters that reflect the model's confidence in each prediction. These confidence values drive the further routing of requests.

  5. Case routing: AI‑ICS makes decisions about the direction of the case based on the attributes and confidence level of the model. If confidence is high, the request is sent directly to the appropriate department. If the model expresses insufficient confidence or if the request is subject to additional rules (for example, requests from VIP users), it may be sent for additional manual review.
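The routing rule in steps 4 and 5 can be sketched as a small function (the function and queue names are assumptions; the threshold value comes from the prototype results reported below):

```python
CONFIDENCE_THRESHOLD = 0.8  # CT value used in the prototype

def route_request(predictions, is_special=False):
    """Sketch of the routing decision described above.

    predictions: dict mapping attribute -> (predicted label, confidence).
    Returns the target queue: a service group or manual review.
    """
    # Rule-based exceptions (e.g. VIP users) always go to an operator.
    if is_special:
        return "manual_review"
    # Every attribute must clear the threshold for automatic routing.
    if all(conf >= CONFIDENCE_THRESHOLD for _, conf in predictions.values()):
        return predictions["group"][0]
    return "manual_review"

preds = {
    "service": ("Workplace IT", 0.95),
    "category": ("Email", 0.91),
    "group": ("L2-Messaging", 0.87),
}
print(route_request(preds))                   # L2-Messaging
print(route_request(preds, is_special=True))  # manual_review
```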

Practical benefits of using LaBSE

  • Faster processing: Classification automation reduces query processing time.

  • Improved accuracy: Accurately defining categories reduces routing errors.

  • Reduced costs: Effective automation reduces the resources required to manually triage and process requests.

Results of the AI‑ICS service prototype at NLMK

Classification efficiency

During the testing period, 61% of requests were classified by AI with a Confidence Threshold (CT) set at 0.8, and the remaining 39% were sent for manual verification (figure 2).

Figure 2. Dynamics of requests for NLMK support


Of the requests processed by AI, 84% were accurately classified across the service, category, type and group attributes, giving an overall service coverage of 51.24% (84% × 61%) (figure 3).
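The coverage figure follows directly from multiplying the two reported shares:

```python
ai_classified = 0.61  # share of requests the AI handled at CT = 0.8
ai_accuracy = 0.84    # share of those classified correctly on all attributes

coverage = ai_classified * ai_accuracy
print(f"{coverage:.2%}")  # 51.24%
```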

Figure 3. Dynamics of routing quality and AI classification


For comparison, the average daily share of requests correctly classified and routed by contact center (CC) operators is 75% (Figure 4).

Figure 4. Dynamics of routing quality and CC classification


Process optimization

The use of AI‑ICS increased classification accuracy to 85% and reduced the time and cost of processing requests by shrinking the share of requests requiring manual classification. This, in turn, led to resource savings and improved customer service.

Predicted results

It is expected that up to 80% of all electronic requests will be effectively processed by AI without the need for manual routing, and up to 84% of requests will be correctly routed. This confirms the system's potential to significantly reduce operational costs while maintaining high quality standards for request processing.

Further development of AI‑ICS at NLMK: deepening automation and integrating new models

Expanding AI capabilities

As part of the ongoing improvement of the AI‑ICS system, the company is actively working on automatic processing of standard requests. By detecting the intent of a request and generating an automatic response, the model will be able to resolve typical problems independently, without involving specialists. This significantly speeds up service and increases its efficiency.

Improved user experience

One of the new features is the AI's ability to ask clarifying questions. This allows requests to be classified more accurately and the most suitable solutions to be found, minimizing the likelihood of errors and increasing customer satisfaction.

Integration of advanced AI models

The line of tools includes the latest machine learning models, such as LLAMA 3 and Mixtral, which provide improved capabilities for analysis and query processing. These powerful models provide deeper understanding of context and are capable of processing complex queries with high accuracy.

Future plans

The company plans to further integrate new technologies to reduce dependence on the human factor in the process of processing requests. The development and testing of new features have already shown significant success, and this trend will continue – both in terms of speed of service and the quality of request processing.
