Less than two weeks after the launch of Lamini Memory Tuning, Lamini AI has officially partnered with Meta.
Excited to share that @LaminiAI is partnering with @Meta 🤝 to bring you code recipes for finetuning factual LLMs & agentic workflows 🎉
— Sharon Zhou (@realSharonZhou) June 25, 2024
Recipe #1: How to Memory Tune Llama-3 into a SQL agent for your precise schema: 30% -> 95% accuracy. https://t.co/nAwJX6vWHG
Code 👇🏻
Lamini AI announced Lamini Memory Tuning on June 13; the tool has been shown to improve factual accuracy while reducing hallucinations by as much as 95%.
“Lamini Memory Tuning is a research breakthrough that overcomes a seeming paradox in the AI world: achieving precise factual accuracy (i.e. no hallucinations) while upholding the generalisation capabilities that make LLMs valuable in the first place,” the startup said.
The tuning method was used on open-source models like Llama 3 and Mistral 3. Now, the company has partnered with Meta to lift Llama 3's baseline performance on generating SQL queries.
As part of the partnership, Meta published a repository of Llama 3 Lamini recipes to help enterprises tune Llama models.
“Lamini Memory Tuning is a new tool you can use to embed facts into LLMs that improve factual accuracy and reduce hallucinations. Inspired by information retrieval, this method has set a new standard of accuracy for LLMs with less developer effort,” the repository stated.
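In practice, the recipe's flow is roughly: assemble schema-grounded question/SQL pairs as the "facts" to embed, tune the model on them, then generate. Below is a minimal sketch assuming Lamini's Python client exposes a Lamini class with train and generate methods; the exact method names and arguments are assumptions, so consult the recipes repository for the real interface.

```python
# Minimal sketch of memory-tuning Llama 3 into a SQL agent.
# ASSUMPTION: the `lamini` client exposes Lamini(model_name=...),
# .train(data=...) and .generate(prompt); check Lamini's docs and
# Meta's recipes repo for the actual interface.
from lamini import Lamini

llm = Lamini(model_name="meta-llama/Meta-Llama-3-8B-Instruct")

# Schema-grounded question -> SQL pairs act as the facts to embed.
training_data = [
    {
        "input": "How many orders were placed in 2023?",
        "output": "SELECT COUNT(*) FROM orders "
                  "WHERE strftime('%Y', created_at) = '2023';",
    },
    {
        "input": "List the top 5 customers by total spend.",
        "output": "SELECT customer_id, SUM(amount) AS total "
                  "FROM orders GROUP BY customer_id "
                  "ORDER BY total DESC LIMIT 5;",
    },
]

# Kick off a tuning job, then query the tuned model.
llm.train(data=training_data)
print(llm.generate("Which product had the most returns last month?"))
```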
According to Lamini, the memory-tuning tool also cuts response times by 50%, reduces the workload on data teams, and makes queries more reliable, which in turn lifts accuracy rates.
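For context on how an accuracy figure like the quoted 30% -> 95% jump can be scored, one common approach is execution accuracy: run the generated SQL and a hand-written reference query against the same database and compare result sets. The runnable sketch below uses Python's built-in sqlite3; the generate_sql function is a hypothetical stand-in for the tuned model, and this is one plausible scoring method, not necessarily Lamini's exact evaluation harness.

```python
import sqlite3

def generate_sql(question: str) -> str:
    # Hypothetical stand-in for the memory-tuned model's output.
    return "SELECT COUNT(*) FROM orders WHERE amount > 100;"

def execution_accuracy(cases, db_path="shop.db"):
    """Share of questions whose generated SQL returns the same
    rows as the hand-written reference query."""
    conn = sqlite3.connect(db_path)
    hits = 0
    for question, reference_sql in cases:
        try:
            got = conn.execute(generate_sql(question)).fetchall()
            want = conn.execute(reference_sql).fetchall()
            # Order-insensitive comparison of the two result sets.
            hits += sorted(map(str, got)) == sorted(map(str, want))
        except sqlite3.Error:
            pass  # invalid SQL counts as a miss
    conn.close()
    return hits / len(cases)

cases = [("How many orders exceed $100?",
          "SELECT COUNT(*) FROM orders WHERE amount > 100;")]
print(f"execution accuracy: {execution_accuracy(cases):.0%}")
```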
This is not the first attempt to improve the efficiency of SQL queries. Researchers at Nanyang Technological University, the Singapore University of Technology and Design, and Alibaba's DAMO Academy previously introduced LLM-R2, a query rewrite system that significantly boosts SQL query efficiency.
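To illustrate the kind of transformation a query rewrite system like LLM-R2 automates, here is a hand-written example: replacing a correlated EXISTS subquery with an equivalent join. The tables and data are invented for the demo, and this specific rewrite is a textbook illustration rather than one taken from the LLM-R2 paper.

```python
import sqlite3

# Tiny in-memory database invented for the demo.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER);
    INSERT INTO users VALUES (1, 'Ada'), (2, 'Grace'), (3, 'Edsger');
    INSERT INTO orders VALUES (10, 1), (11, 1), (12, 3);
""")

# Original: correlated EXISTS subquery, conceptually re-evaluated
# once per users row.
original = """
    SELECT name FROM users u
    WHERE EXISTS (SELECT 1 FROM orders o WHERE o.user_id = u.id);
"""
# Rewrite: the same semi-join expressed as a plain join, which
# planners can often execute more cheaply.
rewritten = """
    SELECT DISTINCT u.name FROM users u
    JOIN orders o ON o.user_id = u.id;
"""

# The rewrite is only valid if both forms return the same rows.
assert sorted(conn.execute(original)) == sorted(conn.execute(rewritten))
print(sorted(conn.execute(rewritten)))  # [('Ada',), ('Edsger',)]
```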