AI Assistant for Technical Support
Machine Learning
Large Language Models (LLMs)
Retrieval-Augmented Generation (RAG)
- Our client is a Swiss-based manufacturer of specialized measurement devices.
- They needed an AI solution to efficiently organize their knowledge base and make technical resources easily accessible to technicians in real time.
- We conducted in-depth interviews to understand workflows, data sources, and user needs, forming the basis for a well-defined project scope and system architecture.
- In the data processing part of the system, technical documents are uploaded, split into smaller pieces, enriched with metadata, and converted into a searchable format for the AI to understand efficiently.
- When a technician submits a query, the AI searches the processed data, retrieves the most relevant content, and prepares it for response generation.
- The AI assistant delivers context-specific answers, along with the original documents, allowing technicians to verify details, including diagrams or images.

Our client is a global manufacturer of specialized chemical analysis devices. One of the core values they pursue relentlessly is a commitment to uncompromising quality.
While this is a trademark characteristic, it also brings its own set of challenges. With a strong commitment to long-term product support, they needed an efficient way to manage their technical documentation, internal knowledge bases, and informal "tribal knowledge": the kind of know-how that's often held by specific teams or individuals.
The number of PDFs alone has exceeded 7,000.
After careful consideration, they chose an AI-based solution for its ability to handle documents written in natural language.
We've partnered to develop an AI-powered assistant that leverages Retrieval-Augmented Generation (RAG: a method where the AI looks up relevant documents before answering a question, so the response is grounded in real information, not just what the model "knows") to organize and deliver technical resources like manuals, service logs, and historical repair data to technicians in real time.
Design process
Our collaboration started with a set of workshops and meetings with stakeholders and end users to specify the use case details so we could satisfy all the data- and usability-related outcomes.
The process was smooth, and we quickly aligned on:
- The dataset used for PoC development and its extension in upcoming phases
- User journey map for system flow design
- System architecture document to create a comprehensive development roadmap
The initial PoC phase confirmed that the system was technically sound and easy to use.

Building The AI Assistant
The system is a custom RAG-based solution fully integrated with the client's IT infrastructure.
There are two core components of the system:
- Data ingestion pipeline (the part of the system that collects and prepares documents so the AI can use them): all provided documents (manuals, guidelines, tickets, and work orders) are split into smaller chunks, and metadata is added so the LLM can retrieve only the relevant chunks. The chunk database is then vectorized (text is turned into sets of numbers) so the system can search and process it natively. This is a crucial part of the solution, as most of a RAG system's quality comes from good retrieval. Due to the sheer volume of documents, we had to design and implement a custom processing and retrieval method.
- Generation pipeline, where the user interacts with a chatbot by providing the task context (e.g., the identified malfunction, instrument ID, and steps already taken). The LLM processes the user query, retrieves the relevant chunks, and displays troubleshooting instructions alongside a view of the source document. This way, the technician can also access any diagrams or images. Because even the best AI may hallucinate, this view also lets the technician double-check whether the generated answer is correct or the solution has already been tried.
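The ingestion steps described above (chunking, metadata enrichment, vectorization) can be sketched in a few lines of Python. This is a toy illustration, not the client's implementation: the fixed-size chunking, the hashing-based `embed` stand-in, and the in-memory index are all assumptions made for the sketch; the real system uses a custom chunking and retrieval method, a proper embedding model, and a vector database.

```python
import hashlib
import math

def chunk_document(text, doc_id, chunk_size=200, overlap=50):
    """Split a document into overlapping character chunks with metadata."""
    chunks = []
    step = chunk_size - overlap
    for i, start in enumerate(range(0, max(len(text) - overlap, 1), step)):
        chunks.append({
            "doc_id": doc_id,              # metadata: which document it came from
            "chunk_id": f"{doc_id}-{i}",   # metadata: position within the document
            "text": text[start:start + chunk_size],
        })
    return chunks

def embed(text, dim=64):
    """Toy hashing-based embedding: a stand-in for a real embedding model."""
    vec = [0.0] * dim
    for token in text.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def ingest(documents):
    """Build an in-memory vector index from {doc_id: text} documents."""
    index = []
    for doc_id, text in documents.items():
        for chunk in chunk_document(text, doc_id):
            chunk["vector"] = embed(chunk["text"])
            index.append(chunk)
    return index
```

In practice the chunking boundaries and metadata schema matter a great deal, since retrieval quality depends on chunks being coherent, well-labeled units.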
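The generation side can be sketched in a similarly simplified, self-contained way. The word-overlap `score` function below is a toy stand-in for vector similarity search, and `build_prompt` is a hypothetical template showing how retrieved chunks and their source document IDs might be assembled into a grounded prompt for the LLM, so the answer can be traced back to its source.

```python
def score(query, chunk_text):
    """Toy relevance score: word overlap between query and chunk."""
    q = set(query.lower().split())
    c = set(chunk_text.lower().split())
    return len(q & c) / (len(q) or 1)

def retrieve(query, chunks, top_k=3):
    """Return the top_k most relevant chunks, each carrying its source doc."""
    ranked = sorted(chunks, key=lambda ch: score(query, ch["text"]), reverse=True)
    return ranked[:top_k]

def build_prompt(query, context_chunks):
    """Assemble the grounded prompt an LLM would receive, with source IDs."""
    context = "\n".join(f"[{ch['doc_id']}] {ch['text']}" for ch in context_chunks)
    return (
        "Answer using only the context below. Cite the source document.\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

# Illustrative chunk database (in reality, the output of the ingestion pipeline).
chunks = [
    {"doc_id": "manual-7", "text": "If the pump pressure drops, check the inlet seal."},
    {"doc_id": "ticket-42", "text": "Replacing the inlet seal fixed the pressure drop."},
    {"doc_id": "manual-2", "text": "Calibrate the sensor monthly."},
]
top = retrieve("pump pressure drops", chunks, top_k=2)
```

Returning the source document IDs alongside the generated text is what lets a technician open the original manual page, with its diagrams and images, and verify the answer.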
Impact and future steps
With the tool now rolling out in production, we've built a strong foundation for easy knowledge access and context-aware support.
This marks an exciting first step for our client in tackling a challenge that many companies face today: how to keep customer service outstanding as their user base continues to grow.