Commercial chatbots are a pain. At best, they pick up on your keywords and point you to the right part of the website you are on. In some scenarios, they connect you with a live agent. At worst, they do not understand your query at all, and your conversation with the bot plods around in frustrating circles.
We would like to change that. Imagine ChatGPT, but not just planning travel itineraries or writing essays. Imagine ChatGPT, but bespoke: built to solve your specific business needs, such as answering your customers' questions intelligently.
A customer could request, “Give me a list of the cheapest flights from Singapore to Manila for the month of April”, and instead of the chatbot pointing the customer to the Booking section of the website, it could actually provide that list within the chat. The customer could then say, “Book me that flight on Tuesday 2 May - here is my passport number”, and your chatbot could do that, all while keeping the data secure within your own system.
ChatGPT has become all the rage, setting the record for the fastest-growing user base, beating apps like TikTok and Instagram. Having the cake is nice, but imagine if you had the flour, eggs and butter to make your own cake — just the way you like it.
Increased investment in AI solutions in Asia Pacific
Intelligent chatbots are far more helpful when the answers they generate are targeted, specific, and catered to your stakeholders. Imagine customer satisfaction and approval burgeoning, productivity rising, and operations improving, all thanks to artificial intelligence.
See also: 80% of AI projects are projected to fail. Here's how it doesn't have to be this way
In Asia Pacific, AI adoption is expected to soar, with 76% of organisations planning to adopt more AI systems, nearly double the previous year's 39%, and 73% anticipating increased tech investment in the area.
However, this increased investment is coupled with concerns about keeping customer data within your own realm. As chatbots increasingly become the first point of contact between the organisation and the customer, more confidential information will be relayed through these platforms, threatening organisational security and digital trust.
By building your own bot, you retain control over privacy and security, keep your customisation options open, and can use your data effectively rather than handing proprietary data to a third party. It also enables easier integration, supports continuous improvement, and avoids dependence on third-party platforms.
See also: Responsible AI starts with transparency
As Benjamin Franklin said, “The best investment is in the tools of one’s own trade.”
It is vital that open and transparent large language models (LLMs) are built, trained, and powered using relevant data, because the quality of the output they generate is largely determined by the content they consume.
Generic LLM-enabled bots such as ChatGPT, trained on general information, will not suffice for company-specific purposes. Instead, companies should use their own data to build customer experiences that differentiate their products and services, and openly demonstrate how their solution was built to establish trust upfront.
Leveraging your own data
While open-source LLMs are available to all, making the most of them depends on data. Most companies are idly sitting on a treasure trove of data to support these ambitions. For particular use cases, the information you need is likely already available in existing documentation and knowledge bases. These invaluable sources provide relevant and meaningful content for your customers. They include structured data such as customer relationship management (CRM) records, product databases, manuals, and policies, as well as unstructured data like survey feedback, customer complaint logs, chat transcripts, product reviews, and call recordings.
To support these use cases, it is important to leverage structured and unstructured data assets by establishing a centralised data management strategy. The key to building an intelligent chatbot is quality data. There is no intelligent chatbot without data.
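To make this concrete, here is a minimal sketch of the retrieval pattern such a chatbot typically relies on: passages from your own documentation are indexed, the passages most relevant to a customer's question are retrieved, and they are passed to an LLM as grounding context. The sample documents, the keyword-overlap scoring, and the call_llm placeholder are illustrative assumptions only, not a description of any particular product or API.

```python
# Minimal sketch of a retrieval-augmented chatbot over your own documents.
# Assumptions: the documents, the keyword-overlap scoring, and the
# call_llm placeholder are illustrative, not a specific product's API.

import re
from collections import Counter

# Company knowledge base: in practice this would be loaded from manuals,
# CRM records, FAQs, chat transcripts, and similar sources.
DOCUMENTS = {
    "refund-policy": "Refunds are issued within 14 days of purchase...",
    "flight-change": "Flights can be rebooked online up to 24 hours before departure...",
    "baggage": "Each passenger may check in one bag of up to 23 kg...",
}

def tokenize(text: str) -> Counter:
    """Lower-case the text and count word occurrences."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents with the highest keyword overlap with the question."""
    q = tokenize(question)
    scored = []
    for name, text in DOCUMENTS.items():
        d = tokenize(text)
        overlap = sum(min(q[w], d[w]) for w in q)
        scored.append((overlap, name))
    scored.sort(reverse=True)
    return [DOCUMENTS[name] for _, name in scored[:k]]

def call_llm(prompt: str) -> str:
    """Placeholder for a call to a self-hosted or open-source LLM."""
    return f"[LLM answer grounded in:\n{prompt}]"

def answer(question: str) -> str:
    """Build a prompt from retrieved context and ask the model to answer from it."""
    context = "\n---\n".join(retrieve(question))
    prompt = (
        "Answer the customer's question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)

if __name__ == "__main__":
    print(answer("Can I change my flight to Tuesday?"))
```

In a production system the keyword overlap would typically be replaced with vector search over embeddings and the placeholder with a call to your chosen model, but the shape stays the same: your data in, grounded answers out, all within your own environment.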
There is an exciting chance that we might enter a new age of AI: one that is mainstream, accessible, and full of real-life, practical uses. Not lofty technology that requires expensive hardware only some can afford, or obscure forms of currency to build upon already existing wealth. With open-source LLMs, just about anyone with a computer and some resources can use AI like this to improve jobs, lives, and online experiences. That's what technology should be: digitally secure, practical, and accessible.
Nick Eayrs is the vice president of Field Engineering for Asia Pacific and Japan at Databricks