According to OpenAI's website, since its launch in 2023 Decagon has established itself in the customer support automation sector, utilising a combination of OpenAI models, including GPT-3.5, GPT-4, GPT-4o, GPT-4 Turbo, and OpenAI o1-mini, to power its service platform.

As stated on OpenAI's website, "We know that latency has a direct impact on customer satisfaction. Every second counts when you're dealing with real-time customer support," says Ashwin Sreenivas, Decagon's co-founder and CTO.

OpenAI reports that the company employs different OpenAI models to optimise performance across various tasks. For instance, Decagon fine-tuned GPT-3.5 specifically for rewriting customer queries before they enter retrieval-augmented generation (RAG) workflows, while utilising GPT-4 for more complex decision-making processes.
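The case study does not include implementation details, but the two-stage pattern it describes is straightforward to sketch: a smaller, fine-tuned model cleans up the raw customer message into a retrieval-friendly query, retrieved documents are then passed to a larger model to produce the answer. The snippet below is a minimal illustration of that pattern, not Decagon's actual pipeline; the fine-tuned model identifier, the prompts, and the `retrieve` function are placeholders, and only the standard `openai` Python client calls are assumed.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical model IDs: Decagon's real fine-tuned rewriter is not public.
REWRITE_MODEL = "ft:gpt-3.5-turbo:example-org:query-rewriter:abc123"
ANSWER_MODEL = "gpt-4"


def rewrite_query(raw_query: str) -> str:
    """Rewrite a noisy customer message into a concise, self-contained search query."""
    response = client.chat.completions.create(
        model=REWRITE_MODEL,
        messages=[
            {"role": "system",
             "content": "Rewrite the customer's message as a concise, self-contained search query."},
            {"role": "user", "content": raw_query},
        ],
    )
    return response.choices[0].message.content


def answer_with_rag(raw_query: str, retrieve) -> str:
    """Two-stage flow: rewrite, retrieve, then answer with the larger model."""
    search_query = rewrite_query(raw_query)
    documents = retrieve(search_query)  # placeholder: returns a list of support articles
    context = "\n\n".join(documents)
    response = client.chat.completions.create(
        model=ANSWER_MODEL,
        messages=[
            {"role": "system",
             "content": "Answer the customer using only the provided support articles."},
            {"role": "user",
             "content": f"Articles:\n{context}\n\nCustomer question: {raw_query}"},
        ],
    )
    return response.choices[0].message.content
```

The division of labour reflects the latency point above: the lightweight rewriting step keeps the retrieval input clean without paying the cost of the larger model, which is reserved for the final, more complex reasoning over the retrieved material.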

In OpenAI's case study, Jesse Zhang, Decagon's co-founder and CEO, explains the advantage of their approach: "This allows us to both capture customers' business logic and create all the software surface area around the agent that just wasn't possible before LLMs."

The results have been significant. According to OpenAI's website, Sreenivas states, "For one of our largest customers, we handle 91% of all their global support, without a human being involved." The platform can also be deployed rapidly: for new customers, "core infrastructure can be up and running in days."
