Customer Support
By introducing the LLM-based ThanoSQL chatbot, you can take over routine agent work and transform customer service.
Traditional rule-based chatbots only provide predetermined answers based on specific keywords in a scenario, making them unable to respond flexibly to varied user needs. Typical LLM chatbots, meanwhile, struggle or face significant limitations when it comes to accurately responding to real-time updates in a company's internal system data.
The following is an example of customer support (a billing inquiry) handled by the ThanoSQL chatbot in a company's customer service center.
When a customer uploads a photo of their meter, the AI first recognizes the meter number in the photo and searches internal systems such as ERP or CRM for the customer name registered to that meter number. Once the customer name is found, it likewise checks the customer's recent charges in the internal system.
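The flow above can be sketched in a few lines. This is only an illustration: `recognize_meter_number`, `lookup_customer`, and `fetch_recent_bill` are hypothetical stand-ins for the OCR step and the internal ERP/CRM lookups, not part of any real ThanoSQL API.

```python
# Hypothetical sketch of the billing-inquiry flow; all functions are
# illustrative stand-ins, not a real ThanoSQL API.

def recognize_meter_number(photo_path: str) -> str:
    """Stand-in for the OCR step that reads the meter number from the photo."""
    return "MTR-001234"  # dummy value for illustration

def lookup_customer(meter_number: str, erp: dict) -> str:
    """Find the customer name registered to this meter in the internal system."""
    return erp["meters"][meter_number]

def fetch_recent_bill(customer: str, erp: dict) -> int:
    """Read the customer's most recent charge from the internal system."""
    return erp["bills"][customer]

def handle_billing_inquiry(photo_path: str, erp: dict) -> dict:
    meter = recognize_meter_number(photo_path)
    customer = lookup_customer(meter, erp)
    return {"customer": customer, "recent_bill": fetch_recent_bill(customer, erp)}

# Example internal data (in practice this lives in ERP/CRM, not in the chatbot).
erp = {"meters": {"MTR-001234": "Alice Kim"},
       "bills": {"Alice Kim": 42000}}
```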
In this process, unstructured data such as photos or videos is handled by exchanging only file paths and embedding vectors, preventing data leakage at the source. Similarly, when utilizing internal systems like ERP, only the database (DB) schema is exchanged, ensuring that the actual DB data never leaves the internal environment.
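The schema-only pattern can be sketched as follows: the model sees only the table schema and the question, returns a query, and the query runs inside the internal network. `generate_sql` is a hypothetical stand-in for the LLM call; the in-memory SQLite database stands in for the internal DB.

```python
# Sketch of the schema-only exchange: only the schema and the question
# would leave the company; the query executes against internal data.
import sqlite3

SCHEMA = "CREATE TABLE bills (customer TEXT, amount INTEGER, billed_at TEXT)"

def generate_sql(schema: str, question: str) -> str:
    """Stand-in for the LLM call. In production only `schema` and
    `question` are sent out; here we return a canned query."""
    return ("SELECT amount FROM bills WHERE customer = ? "
            "ORDER BY billed_at DESC LIMIT 1")

def answer_internally(question: str, customer: str) -> int:
    conn = sqlite3.connect(":memory:")  # stands in for the internal DB
    conn.execute(SCHEMA)
    conn.execute("INSERT INTO bills VALUES (?, ?, ?)",
                 (customer, 42000, "2024-05"))
    sql = generate_sql(SCHEMA, question)  # row data never sent to the LLM
    amount = conn.execute(sql, (customer,)).fetchone()[0]
    conn.close()
    return amount
```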
After the meter photo is uploaded, the chatbot draws on the previous conversation history and context to continue responding, without requiring the user to re-enter their question, and provides additional analysis results as part of its response.
What makes ThanoSQL different?
- When generating responses, Retrieval-Augmented Generation (RAG) extends to real-time updates from internal company systems, enabling responses to personalized customer inquiries.
- Even when using public LLMs, there is no risk of corporate data being exposed externally.
- In addition to the built-in scenarios, customized scenarios according to customer requirements can be applied within a few weeks.
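The RAG pattern named above can be reduced to two steps: retrieve the freshest internal record for the customer at answer time, then compose the prompt from it. In this sketch the retrieval is a plain dictionary lookup standing in for a vector search over internal data; the function and data names are illustrative only.

```python
# Minimal RAG sketch: fetch current internal data at answer time,
# then ground the LLM prompt in it. Names are hypothetical.

def retrieve(customer: str, records: dict) -> str:
    """Stand-in for retrieval over internal systems (vector search, SQL, etc.)."""
    return records[customer]

def build_prompt(question: str, context: str) -> str:
    return f"Context: {context}\nQuestion: {question}\nAnswer using only the context."

# Internal record reflecting a real-time update, e.g. this month's bill.
records = {"Alice Kim": "2024-05 bill: 42,000 KRW"}
prompt = build_prompt("Why is my bill higher this month?",
                      retrieve("Alice Kim", records))
```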