Smartly, a rapidly growing digital advertising platform helping brands scale campaigns across channels, has implemented Meta's open-source Llama 3 model to automate critical aspects of its customer support operations.

The deployment has transformed Smartly's support operations by automating ticket creation, summarising technical issues, and generating detailed resolution messages. This implementation has not only improved operational efficiency but also significantly enhanced staff morale and customer satisfaction, strengthening the company's competitive position in the digital advertising technology market.

Prior to the AI deployment, Smartly's support teams faced mounting inefficiencies as they manually processed growing ticket volumes, duplicated information across platforms, and crafted detailed technical explanations for customers. These repetitive tasks consumed valuable agent time that could otherwise be dedicated to solving complex customer problems.

Smartly's technical requirements ruled out many commercial AI options. The company needed a language model capable of understanding complex technical concepts across multiple platforms while meeting stringent security requirements. Most importantly, the solution had to run entirely within Smartly's private infrastructure, as sending customer data to third-party cloud services was not acceptable.

The company deployed Llama 3 on its existing Kubernetes infrastructure, optimising the model to run efficiently within resource constraints. Rather than fine-tuning the model, which would have required significant data and expertise, Smartly's team used basic prompt engineering and few-shot learning techniques to adapt the model to their specific requirements, supplying it with clear instructions and worked examples covering ticket titles, descriptions, context, and the desired writing style.
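Smartly has not published its prompts or serving interface, but a minimal sketch of this kind of few-shot setup might look like the following, assuming the self-hosted model is exposed through an OpenAI-compatible chat endpoint inside the cluster; the endpoint URL, model name, instructions, and example ticket are illustrative placeholders rather than Smartly's actual configuration.

```python
import requests

# Hypothetical in-cluster endpoint for the self-hosted Llama 3 server;
# many self-hosted serving stacks expose an OpenAI-compatible chat API.
LLAMA_URL = "http://llama3.support.svc.cluster.local:8000/v1/chat/completions"

SYSTEM_PROMPT = """You are a support assistant for a digital advertising platform.
Given a raw support conversation, produce a ticket with:
- Title: one line, under 80 characters, imperative mood
- Description: 2-4 sentences summarising the technical issue
- Context: affected platform(s) and any identifiers mentioned
Write in a clear, professional tone."""

# Few-shot example (invented here for illustration) showing the expected format.
FEW_SHOT = [
    {"role": "user", "content": "Customer says their image ads stopped delivering "
                                "on one channel after yesterday's budget change."},
    {"role": "assistant", "content": "Title: Investigate halted image ad delivery after budget change\n"
                                     "Description: Image ads stopped delivering on one channel "
                                     "following a budget update made yesterday. Other channels are unaffected.\n"
                                     "Context: Single channel affected; budget edited shortly before the issue."},
]

def draft_ticket(conversation: str) -> str:
    """Send the raw conversation, instructions, and examples to the model."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}, *FEW_SHOT,
                {"role": "user", "content": conversation}]
    resp = requests.post(LLAMA_URL, json={
        "model": "llama-3-8b-instruct",   # assumed model name
        "messages": messages,
        "temperature": 0.2,               # low temperature for consistent formatting
    }, timeout=60)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

One practical advantage of keeping the instructions and examples in the prompt rather than in fine-tuned weights is that the ticket format can be adjusted by editing text, with no retraining or redeployment of the model.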

The implementation yielded dramatic business results. The AI automation system eliminated approximately 80% of duplicative work in ticket handling while delivering consistent, professional customer messages. Resolution drafting time was reduced by approximately half, and ticket quality improved through standardised formats and clearer technical explanations.

Beyond the immediate efficiency gains, the implementation had significant workforce impact. Support agents reported feeling less burdened by repetitive tasks and more empowered to focus on complex problem-solving and customer engagement. This shift contributed to improved team morale and confidence, addressing a critical retention challenge in customer support operations.

The implementation process was not without challenges. The Smartly team initially encountered performance bottlenecks when running the model on standard CPU resources within their Kubernetes environment. To address these limitations, they transitioned to GPU nodes, which required developing new technical capabilities within the team. The open-source nature of Llama proved valuable during this process, as the Smartly team leveraged community resources to optimise model performance and infrastructure scaling.
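The article does not describe Smartly's exact cluster configuration, but moving a model server onto GPU nodes in Kubernetes typically comes down to requesting the nvidia.com/gpu extended resource and targeting a labelled GPU node pool. The sketch below, written against the official Kubernetes Python client, shows one way this could look; the image name, namespace, and node label are assumptions, not Smartly's actual values.

```python
from kubernetes import client, config

# Hypothetical label applied to the GPU node pool; Smartly's real labels are unknown.
GPU_NODE_LABEL = {"workload/gpu": "true"}

def llama_gpu_deployment() -> client.V1Deployment:
    """Build a Deployment that schedules the model server onto GPU nodes
    and requests one GPU via the standard nvidia.com/gpu extended resource."""
    container = client.V1Container(
        name="llama3-server",
        image="registry.internal/llama3-server:latest",  # placeholder image
        resources=client.V1ResourceRequirements(
            requests={"cpu": "4", "memory": "24Gi"},
            limits={"nvidia.com/gpu": "1", "memory": "24Gi"},  # GPUs must be set as limits
        ),
    )
    pod_spec = client.V1PodSpec(
        containers=[container],
        node_selector=GPU_NODE_LABEL,  # pin the pod to the GPU node pool
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="llama3-server"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "llama3-server"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "llama3-server"}),
                spec=pod_spec,
            ),
        ),
    )

if __name__ == "__main__":
    config.load_kube_config()  # or load_incluster_config() when running in-cluster
    client.AppsV1Api().create_namespaced_deployment(
        namespace="support-ai", body=llama_gpu_deployment()  # placeholder namespace
    )
```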

Smartly's implementation demonstrates the value of deploying generative AI for customer service automation, particularly for organisations with stringent data privacy requirements. By running the model within their own infrastructure, the company maintained complete control over customer data while still achieving significant operational improvements.

Looking ahead, Smartly plans to expand its use of Llama for developing additional internal tools, including ticket categorisation systems, product support analytics, and writing suggestions for error messages. These initiatives aim to further enhance operational efficiency while maintaining the company's commitment to data privacy and security.

