AI News - Latest Updates and Trending News in Artificial Intelligence
https://analyticsindiamag.com/ai-news-updates/
Artificial Intelligence news, conferences, courses & apps in India

TII Unveils Falcon Mamba 7B, Outperforming Llama 3.1 8B and Other SLMs
https://analyticsindiamag.com/ai-news-updates/tii-unveils-falcon-mamba-7b-outperforming-llama-3-18b-and-other-slms/
Wed, 14 Aug 2024 06:52:52 +0000


The Technology Innovation Institute (TII), the applied research arm of Abu Dhabi’s Advanced Technology Research Council (ATRC), has launched Falcon Mamba 7B, a groundbreaking addition to its Falcon series of LLMs. An open-source State Space Language Model (SSLM), Falcon Mamba 7B has been independently verified by Hugging Face to outperform comparable open models.

Marking a significant departure from previous Falcon models, which relied on transformer-based architecture, the Falcon Mamba 7B introduces SSLM technology to the Falcon lineup. This model not only outperforms Meta’s Llama 3.1 8B, Llama 3 8B, and Mistral’s 7B in new benchmarks but also claims the top spot on Hugging Face’s tougher benchmark leaderboard.


SSLMs excel at processing complex, time-evolving information, making them well suited to tasks such as book-length comprehension, estimation, forecasting, and control. Falcon Mamba 7B demonstrates strong capabilities in natural language processing, machine translation, text summarisation, computer vision, and audio processing, with significantly lower memory requirements than traditional transformer models.
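By way of background, a state space model carries a fixed-size hidden state forward token by token instead of attending over the full context. A simplified, generic form of the recurrence (not TII’s exact formulation, which uses Mamba’s input-dependent selective parameters) is:

$h_t = \bar{A}\,h_{t-1} + \bar{B}\,x_t, \qquad y_t = C\,h_t$

Because only the fixed-size state $h_t$ is carried between steps, memory use does not grow with sequence length, which is what underlies the lower memory footprint noted above compared with transformers, whose attention caches grow with context.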

H.E. Faisal Al Bannai, Secretary General of ATRC and Adviser to the UAE President for Strategic Research and Advanced Technology Affairs, said, “The Falcon Mamba 7B marks TII’s fourth consecutive top-ranked AI model, reinforcing Abu Dhabi as a global hub for AI research and development. This achievement highlights the UAE’s unwavering commitment to innovation.”


TII Continues Growth with SLMs

Earlier this year, in an exclusive interaction with AIM, Hakim Hacid, executive director and acting chief researcher at the Technology Innovation Institute (TII), discussed the institute’s focused shift towards building small language models.

“We were asking at some point the question, as to how big should we go? I think now the question is how small we could go by keeping a small model,” said Hacid, adding that the team is exploring that path.

He further said that they are making models smaller because “if we want the pillar of the deployment to succeed, we need to actually have models that can run in devices, and in infrastructure that is not highly demanding.”

With over 45 million downloads of Falcon LLMs to date, the Falcon Mamba 7B continues TII’s tradition of pioneering research and open-source contributions. The model will be released under the TII Falcon License 2.0, a permissive software licence based on Apache 2.0, emphasising the responsible use of AI.

TII continues to build on its open-source culture, even though Hacid believes not everyone will be able to sustain it. “You need a lot of funding to sustain open-source and we believe that not everyone will be able to do it,” he said.

Acer Launches New AI-Powered Chromebook Laptops in India
https://analyticsindiamag.com/ai-news-updates/acer-launches-new-ai-powered-chromebook-laptops-in-india/
Wed, 14 Aug 2024 06:07:16 +0000


Acer recently launched its latest Chromebook Plus models, the Acer Chromebook Plus 14 and 15 laptops, in India. The all-new Chromebook Plus line comes with the latest built-in Google Gemini AI features, offering robust performance, enhanced productivity, and a sleek, professional design.

These advanced laptops are specifically designed to cater to the demands of the enterprise and education sector.

The Acer Chromebook Plus offers built-in Google apps and powerful AI capabilities, including Google Photos Magic Eraser, file sync, AI wallpaper generation, AI-created video backgrounds, and Adobe Photoshop on the web, to help consumers boost their productivity.

Powered by a range of Intel® and AMD® processor variants, these Chromebooks ensure robust performance for multitasking and demanding applications. The Chromebook Plus 14 comes in two variants, one with an Intel® Core™ i3-N305 processor and another with an AMD® Ryzen® 7000 Series processor, while the Chromebook Plus 15 offers up to an Intel® 13th Gen Core™ i7-1355U processor.

All the models support up to 16GB of LPDDR5X SDRAM and up to 512GB of PCIe NVMe SSD storage, ensuring fast data access and ample space for important files and applications.

“With powerful Intel® & AMD® processors, vibrant displays, Powerful AI capabilities, and robust security features, we believe these Chromebooks will significantly enhance productivity and learning experiences. Our goal is to offer solutions that empower professionals and students to achieve more, and the Chromebook Plus 14 and 15 embody this vision perfectly,” Sudhir Goel, Chief Business Officer at Acer India, said.

Designed for durability and reliability, these Chromebooks have undergone rigorous military-grade reliability tests, including mechanical shock, transit drop, vibration, and resistance to sand, dust, humidity, and extreme temperatures.

In a previous interaction with AIM, Goel said, “At Computex 2024, we have showcased a lot of new products, which we are thrilled to introduce to the Indian market in the coming year.”

PC makers are hoping AI can help pull the market out of the slump it saw last year. Research firm Canalys predicts that the PC market will see 8% annual growth in 2024 as more AI PCs hit the market, and that AI PCs will capture 60% of the market by 2027.

Cognizant Expands to Indore, Creating 1,500 Jobs
https://analyticsindiamag.com/ai-news-updates/cognizant-expands-to-indore-creating-1500-jobs/
Wed, 14 Aug 2024 04:55:54 +0000


Cognizant has expanded its presence in India by opening its first centre in Indore, Madhya Pradesh, a move set to create over 1,500 jobs with potential growth to 20,000 in the future.

The new facility, spanning 46,000 square feet, was inaugurated by Madhya Pradesh Chief Minister Mohan Yadav, who emphasised the importance of intellectual property in the 21st century. “The 21st century is the century of intellectual property, which will make its mark in the world on the basis of information technology and artificial intelligence (AI),” Yadav stated during the ceremony.

Ravi Kumar S, CEO of Cognizant, expressed his excitement about the launch in a LinkedIn post: “This is the 2nd one after our announcement early this year of a new center in Bhubaneswar. We love the enthusiasm amongst our Cognizant associates about this strategy of expansion into smaller cities in India. We are the largest IT employer in Coimbatore and our aspirations are to be the largest in Indore and Bhubaneswar.”

Surya Gummadi, EVP of Cognizant and President of Cognizant Americas, highlighted the strategic importance of the Indore center, saying, “Indore will seamlessly integrate into our existing delivery network across India, focus on innovative solutions for our global clients, create new opportunities for local talent, and bring our offices closer to where our associates live.”

Located at Brilliant Titanium in the heart of Indore, the centre has a seating capacity for 500 and can accommodate up to 1,250 associates in a hybrid work model. Cognizant’s expansion in Indore adds to its existing presence in cities like Bengaluru, Bhubaneswar, Chennai, and others across India.

Madhya Pradesh has secured investment proposals worth INR 3,200 crore from Google, NVIDIA, and Microsoft, in a single day during an interactive session held in Bengaluru on August 7-8, 2024.

This Mumbai-Based Startup Has Released India’s Very Own Harvey AI
https://analyticsindiamag.com/ai-origins-evolution/this-mumbai-based-startup-has-released-indias-very-own-harvey-ai/
Tue, 13 Aug 2024 12:09:19 +0000


Mumbai-based legal tech company LexLegis has positioned its new tool as “India’s answer to Harvey AI,” opening it up for access this week.

LexLegis AI has been trained on one crore legal documents aggregated over 25 years. The AI tool is aimed at legal professionals, offering detailed analyses for legal research.

“It is to help simplify and demystify the legal complexities for everyone and to save time on the vast amounts of time that we’re spending on legal research. The tool enables users to efficiently navigate through thousands of pages and extract meaningful, actionable information,” said co-founder and managing director Saakar S Yadav.

The legal research company, which was founded in 1998, has reinvented itself this year with the goal of building an LLM for Indian law. The company was founded by the late S C Yadav, who served as the Chief Commissioner of Income Tax, and his son Saakar S Yadav.

Over the years, the company worked on several legal tech solutions. Shortly after its founding, the company developed and launched a search engine catered specifically towards legal professionals and tax consultants, to help in the understanding of the taxation domain.

They also built the largest database of judgments in India in 2004, followed a decade later by the National Judicial Reference System (NJRS), the world’s largest repository of appeals for the Income Tax Department.

With LexLegis AI, the company has leveraged its 25 years of experience within the industry to offer an overarching tool to help legal professionals, businesses and researchers cut down on the time used to research and find citations for relevant cases.

Speaking on the tool, Yadav stated that while it currently focuses on tax law, the aim is to eventually cover all fields of law.

While AIM has previously covered tools that assist legal professionals, this is one of the first Indian-made LLMs for law, focused solely on the Indian legal system.

CoRover.ai Joins NVIDIA Inception to Accelerate BharatGPT
https://analyticsindiamag.com/ai-news-updates/corover-ai-joins-nvidia-inception-to-accelerate-bharatgpt/
Tue, 13 Aug 2024 11:32:10 +0000


CoRover.ai, the company behind BharatGPT, announced its inclusion in NVIDIA Inception, a program designed to support startups advancing industries through technological innovation.

CoRover.ai, known for its human-centric conversational AI platform, has developed BharatGPT, India’s first indigenous generative AI platform. The platform is accessible across various channels, formats, and languages, serving 1.3 billion users. 

CoRover’s solutions include virtual assistants like chatbots, voicebots, and videobots, which are deployed across multiple sectors, including government and private organizations like IRCTC, LIC, and the Indian Navy.

The NVIDIA Inception membership will provide CoRover with access to NVIDIA’s resources, including GPUs, compute power, and software support. This partnership is expected to accelerate CoRover’s development of AI-driven customer engagement solutions.

 “As we are committed to addressing real business use cases in a B2B2C landscape, having access to NVIDIA’s technological know-how and resources through NVIDIA Inception will help CoRover effectively handle large language models and domain-specific models, automating conversational AI use cases,” said Ankush Sabharwal, CEO of CoRover.

NVIDIA Inception supports startups with benefits such as NVIDIA Deep Learning Institute credits, preferred pricing on hardware and software, and ongoing technological assistance, aiding in product development, prototyping, and deployment.

CoRover.ai recently announced a strategic partnership with AI auditing firm EthosAI.one to advance the development of responsible AI. 

The partnership aims to ensure the reliability, fairness, and accuracy of BharatGPT, reinforcing it as a trustworthy AI solution. EthosAI.one will continuously audit and enhance BharatGPT models, aligning them with the highest ethical standards.

Sakana AI Releases AI Scientist which Writes Scientific Papers for $15
https://analyticsindiamag.com/ai-news-updates/sakana-ai-releases-ai-scientist-which-writes-scientific-papers-for-15/
Tue, 13 Aug 2024 08:45:04 +0000


Japanese AI startup Sakana AI has released ‘The AI Scientist’, the first comprehensive system for fully automated scientific discovery, enabling foundation models such as LLMs to perform research independently.

Developed in collaboration with the Foerster Lab for AI Research at the University of Oxford, and with Jeff Clune and Cong Lu at the University of British Columbia, the system is capable of conducting independent scientific research and communicating its findings.

The AI Scientist harnesses frontier LLMs to generate research ideas, write code, execute experiments, visualise results, and draft scientific papers. Sakana describes the framework as a significant step toward fully automated scientific discovery.

To ensure the quality of its work, the system runs a simulated review process for evaluation, mimicking the human scientific community’s peer-review process.

Remarkably, each idea generated by the AI was developed into a full paper at a cost of less than $15. To evaluate the generated papers, the researchers designed and validated an automated reviewer.

https://x.com/SakanaAILabs/status/1823178623513239992

Sakana AI also recently introduced EvoSDXL-JP, an image generation model developed through Evolutionary Model Merge, which generates Japanese-style images 10x faster and is available on Hugging Face with a demo for research and education.

Cosine Unveils Genie, the AI Software Engineer that Beats Cognition’s Devin
https://analyticsindiamag.com/ai-news-updates/cosine-unveils-genie-the-ai-software-engineer-that-beats-cognitions-devin/
Tue, 13 Aug 2024 06:19:43 +0000


The race to build AI software engineers doesn’t stop. After Cognition’s Devin, Cosine, a self-described human reasoning lab, has introduced Genie, which it bills as the most capable AI software engineering model globally, achieving 30.08% on SWE-Bench evaluations.

Genie is designed to emulate the cognitive processes of human engineers, enabling it to solve complex problems with remarkable accuracy and efficiency. “We believe that if you want a model to behave like a software engineer, it has to be shown how a human software engineer works,” said Alistair Pullen, the founder of Cosine.

The UK-based startup has secured $2.5 million in funding from SOMA and Uphonest, with additional investment from Lakestar and Focal, and is part of the YC W23 batch.

As the first AI Software Engineering colleague, Genie is trained on data that mirrors the logic, workflow, and cognitive processes of human engineers. 

This allows it to overcome the limitations of existing AI tools, which are often extensions of foundational models with added features like web browsers or code interpreters. Unlike these, Genie can tackle unseen problems, iteratively test solutions, and proceed logically, akin to a human engineer.

Genie has set a new standard on SWE-Bench, achieving a score of 30.08%, a 57% improvement over the previous best scores held by Amazon’s Q and Code Factory. 

This milestone not only represents the highest score ever recorded but also the largest single increase in the benchmark’s history. Genie’s enhanced reasoning and planning capabilities extend beyond software engineering, positioning it as a versatile tool for various domains.

In its development, Genie was evaluated using SWE-Bench and HumanEval, with a strong focus on its ability to solve software engineering problems and retrieve the correct code for tasks. 

Genie scored 64.27% in retrieving necessary code lines, identifying 91,475 out of 142,338 required lines. This marks significant progress, though Cosine acknowledges room for improvement in this area.
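For context, the retrieval score is simply the fraction of required lines that Genie recovered: 91,475 / 142,338 ≈ 0.6427, i.e. roughly 64.27%.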

Genie’s development involved overcoming challenges related to training models with limited context windows. Early efforts using smaller models highlighted the need for a larger context model, leading to Genie’s training on billions of tokens. The training mix was carefully selected to ensure proficiency in the programming languages most relevant to users.

Cosine’s innovative approach to Genie’s development included the use of self-improvement techniques, where the model was exposed to imperfect scenarios and learned to correct its mistakes. This iterative process significantly strengthened Genie’s problem-solving abilities.

Looking ahead, Cosine plans to continue refining Genie, expanding its capabilities across more programming languages and frameworks. The company aims to develop smaller models for simpler tasks and larger ones for complex challenges, leveraging their unique dataset. Exciting future developments include fine-tuning Genie on specific codebases, enabling it to understand large, legacy systems even in less common languages.

‘It’s Extremely Important that Future of Tech is Shaped by Democracies and their Partner Countries,’ says Rajeev Chandrasekhar
https://analyticsindiamag.com/ai-news-updates/future-of-tech-is-shaped-by-democracies-and-their-partner-countries-says-rajeev-chandrasekhar/
Tue, 13 Aug 2024 05:51:44 +0000


Paul Buchheit, the creator of Gmail, recently expressed concerns about the potential dangers of AI development in countries like China. Buchheit warned that if China leads the AI race, the world could face a permanent lockdown, with even our thoughts under surveillance and censorship.

Responding to Buchheit’s remarks, Rajeev Chandrasekhar, former Union Minister of India, emphasised the importance of ensuring that democracies and their partner countries shape the future of technology.

He stated, “Its extremely important – more than critical – that future of Tech is shaped by  democracies and their partner countries” 

There is no denying that despite pressing concerns about the ethical and privacy implications of AI, China remains a central player on the global stage. 

China’s Approach to AI Development


In October 2023, Chinese tech giant Baidu unveiled Ernie 4.0, the latest version of its generative AI model, claiming capabilities comparable to OpenAI’s GPT-4. 

This significant advancement highlights China’s rapid progress in AI, driven by substantial government support and strategic initiatives like the ‘New Generation Artificial Intelligence Development Plan’.

Launched in 2017, this plan aims to position China as a global AI leader by 2030, emphasising research funding, talent recruitment, and infrastructure development. These efforts underscore China’s commitment to dominating the AI sector, often pushing the boundaries of ethical and privacy considerations.

China’s AI development is also fueled by leading companies including Tencent, Alibaba, Baidu, and SenseTime, which attract top talent and drive innovation.

Meanwhile, China’s strides in hardware and robotics, with companies like Dreame Technology and Fourier Intelligence at the forefront, reflect a comprehensive approach to AI applications. 

Additionally, China’s advancements in facial recognition technology, widely deployed in public spaces, illustrate the country’s capability to implement AI solutions at scale. These deployments often move rapidly, as ethical concerns are not always given paramount importance.

Devika Creator Launches Asterisk, YC-backed AI Agent Startup
https://analyticsindiamag.com/ai-news-updates/devika-creator-launches-asterisk-yc-backed-ai-agent-startup/
Mon, 12 Aug 2024 16:32:00 +0000


Mufeed VH, the creator of the AI software engineer Devika, has launched a new AI tool and startup, Asterisk, along with Vivek R and Asjid Kalam.

Asterisk, a Y Combinator S24-backed AI agent, automatically detects and patches security vulnerabilities in codebases. Unlike traditional static security tools, which produce nearly 95% false positives and miss critical business logic errors, Asterisk takes a different approach.

The AI agent mimics the analysis process of human security experts, identifying vulnerabilities such as unauthorised access, privilege escalation, and cost-inflating bugs. Asterisk operates autonomously, testing vulnerabilities in a sandbox environment and producing reports without any user intervention, ensuring zero false positives.

Asterisk confirms vulnerabilities by launching a sandbox environment, running the scanned software, and actively attempting to exploit the identified bugs. When Asterisk flags a vulnerability, it’s a confirmed threat.

The team has played a key role in securing major companies, including Google, Mastercard, Okta, NVIDIA, and Microsoft.

Asterisk also possesses a deep understanding of a company’s codebase, enabling it to simulate attacks the way a malicious hacker would. This allows it to devise attack scenarios, similar to what was seen in the recent CrowdStrike incident.

Mufeed earlier founded Lyminal and Stition.AI, now known as Asterisk, where the team researched security within AI models. He was also the gold medallist in cybersecurity at the IndiaSkills 2021 Nationals when he was just 19 years old.

Kalam is a silver medallist at IndiaSkills and a former security research engineer at Emirates National Bank (UAE). Vivek is a former distributed systems/platforms engineer at Chorus One, one of the largest Proof-of-Stake (PoS) validators.

Convin Launches a 7 Bn Parameter LLM Tailored for Indian Contact Centres
https://analyticsindiamag.com/ai-news-updates/convin-launches-a-7-bn-parameter-llm-tailored-for-indian-contact-centres/
Mon, 12 Aug 2024 12:16:00 +0000


Convin, an AI-powered conversation intelligence platform for call centres, recently launched its advanced large language model (LLM) with 7 billion parameters. The model is specifically designed to improve business output and resolve the unique challenges of customer-facing teams such as sales, support, and collections.

Convin’s new LLM addresses these challenges and, according to the company, outperforms leading models such as GPT-3.5 by 40% and GPT-4 Turbo by 20% in accuracy.

Trained on over 200 billion tokens and supporting 35+ Indian and South Asian languages, including codemixed variations, the model ensures precise transcriptions, zero to low hallucinations, and contextual understanding. It reassures businesses of a critical advantage in delivering high-quality, culturally sensitive customer interactions.

“Traditional language models often fail to deliver accurate results, but purpose-built models such as Convin LLM produce better results and are more accurate. By addressing major challenges such as agent inefficiencies, call centres can improve handling time, response time, and inconsistent customer experience. This streamlines processes and enhances customer satisfaction by providing precise, data-driven insights and predictive analytics. As a result, call centres realize a substantial cost reduction and new revenue generation,” Atul Shree, CTO of Convin, said.


The process begins by identifying specific objectives related to inefficiencies in the contact centre setup and selecting relevant data sources. Data is then collected and preprocessed to ensure high quality, including filtering, deduplication, and tokenization.

Pre-training on this cleaned dataset helps the model understand linguistic patterns and adapt to different languages. Finally, the model undergoes fine-tuning with task-specific labelled data, refining its parameters to predict labels accurately and deliver optimal performance.
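Convin has not published its pipeline code; the sketch below is only an illustrative Python toy of the preprocessing step described above (normalisation, filtering, deduplication, tokenization), with the `tokenize` argument standing in for whatever tokenizer the team actually uses.

```python
import re

def preprocess_corpus(raw_docs, tokenize, min_chars=5):
    """Illustrative cleaning pass: normalise whitespace, drop near-empty
    transcripts, deduplicate exact matches, then tokenize."""
    seen = set()
    cleaned = []
    for doc in raw_docs:
        text = re.sub(r"\s+", " ", doc).strip()   # collapse whitespace
        if len(text) < min_chars:                 # filter out near-empty documents
            continue
        key = text.lower()
        if key in seen:                           # exact-match deduplication
            continue
        seen.add(key)
        cleaned.append(tokenize(text))
    return cleaned

# toy usage with whitespace "tokenization"
print(preprocess_corpus(["Hello   world", "hello world", "hi"], str.split))
# [['Hello', 'world']]
```

Real pipelines typically add near-duplicate detection, language identification and quality filters on top of this skeleton, but the shape of the step is the same.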

With the enhanced capabilities and efficiencies introduced by this model, Convin anticipates a 200% increase in customer acquisition and a 3X boost in overall revenue for 2024-25.

CloudKeeper Acquires AI-Automation-Led Cloud Optimisation Startup WiseOps
https://analyticsindiamag.com/ai-news-updates/cloudkeeper-acquires-ai-automation-led-cloud-optimisation-startup-wiseops/
Mon, 12 Aug 2024 08:30:00 +0000


CloudKeeper, a leading provider of comprehensive cloud cost optimisation services, has acquired WiseOps, an AI automation platform specialising in AWS cost and usage optimisation.

While the full financial details of the deal were not disclosed, CloudKeeper made the payment in a mix of equity and cash.

WiseOps previously secured an undisclosed pre-seed investment from CORE91.VC in December 2023. With a growing customer base of 50 clients, the startup has achieved over $100,000 in revenue to date.


Founded in 2023, the startup is known for AI-driven recommendations and automated optimisations, empowering teams to significantly reduce cloud spend without compromising on performance or workflow efficiency.

By integrating WiseOps’ intelligent tools into CloudKeeper’s robust ecosystem, clients can now access a truly end-to-end cloud optimisation solution that promises enhanced savings and workflow efficiency.

Sharing the company’s journey, Praneet Chandra, co-founder of WiseOps, stated, “Fifteen months ago, Ronak and I founded WiseOps in response to companies struggling with rising costs and cloud infrastructure challenges, leading to layoffs. Our journey began with our first customer, where we reduced their cloud bill by 50%.”

“WiseOps was the missing piece of the puzzle,” said Deepak Mittal, co-founder and CEO of CloudKeeper. “By joining forces with them, CloudKeeper has become a truly comprehensive cloud cost optimization solution. It will enable us to cater to a broader range of clients, address more complex use cases, and help businesses optimize and engineer their cloud environments more effectively.”

Gupshup Boosts Workforce by 20% Amid Rising Demand for Conversational AI Solutions
https://analyticsindiamag.com/ai-news-updates/gupshup-boosts-workforce-by-20-amid-rising-demand-for-conversational-ai-solutions/
Mon, 12 Aug 2024 07:32:50 +0000


Gupshup is witnessing strong demand for its conversational AI solutions in India and international markets. To fuel this growth, the company undertook accelerated hiring in FY24, growing its workforce by 20% to 1400 people.

The hires will support Gupshup’s growth and expansion across India, Latin America, Middle East, SEA, Africa and Europe.

Additionally, the company made several senior-level hires across Marketing, GTM (go-to-market), Engineering, and Solutions. Already a profitable unicorn, Gupshup saw 40% YoY growth last year, driven by brands’ escalating demand to engage customers through conversational advertising, marketing, and support on messaging channels.

“As we continue to expand our footprint globally, we are actively seeking top talent across engineering, product development, marketing, and customer support roles to drive this conversational revolution. Our people are our greatest asset, and we are committed to building a diverse and inclusive workforce that can unlock massive value for our customers through innovative conversational experiences,” Madhuri Nandgaonkar, VP – HR, Gupshup said.

While India has been a primary market for Gupshup, geographies like Latin America, Middle East, APAC, Africa and Europe have emerged as key growth drivers for the company over the last 3 years.

The demand for Gen AI-powered conversational experiences globally has seen the company double its team size in Brazil and ramp up hiring efforts in China, the GCC region, Indonesia, Malaysia and Turkey.

In FY25, the company aims to boost its engineering and Go-To-Market (GTM) teams, followed by product development and customer support, across India and other geographies. 

According to Gupshup, it also saw a 15% increase in women in senior leadership positions this year, and women made up 22% of new hires and 33% of interns in its internship programme.

Gupshup has doubled its customer base in several international markets and works with leading global brands including L’Oréal, P&G, Grupo Carso, GoJek, Nestle, Petromin, and Netflix, among others.

Earlier this year, Gupshup further expanded its product suite with the launch of Conversation Cloud – a comprehensive suite of SaaS tools aimed at revolutionizing business.

Kyndryl Launches Global Security Operations Centre in Bengaluru
https://analyticsindiamag.com/ai-news-updates/kyndryl-launches-global-security-operations-centre-in-bengaluru/
Mon, 12 Aug 2024 06:52:45 +0000


Kyndryl has announced the launch of a Security Operations Centre (SOC) in Bengaluru, India, that offers comprehensive support and advanced protection capabilities across the entire cyber threat lifecycle, using AI, specifically machine learning and integrated automation systems.

The SOC in Bengaluru is designed to be a cyber defence hub that operates around the clock to offer cyber threat intelligence and incident response, collaborating with Kyndryl’s global network of cybersecurity experts.

Kyndryl provides a hybrid model that allows organisations to selectively outsource certain cybersecurity functions or fully outsource the end-to-end management of their cybersecurity operations to Kyndryl.

It will also be a centre of excellence for cybersecurity management with specialised skills, certifications and experience in cybersecurity platform management, and technologies to support security events, operational management and monitoring. 

Kyndryl’s SOC capabilities include multi-level incident monitoring, malware labs, threat hunting, and security information and event management (SIEM) to monitor and correlate security events.

The SOC features high-level cyber engineering that analyses evolving compromise indicators and incident impacts to provide customers with decisive insights. The SOC also helps ensure compliance with government data protection regulations and adapts to evolving cyber threats and regulatory requirements.

The SOC is underpinned by Kyndryl’s Security Operations as a platform (SOaap) capability. The SOaap is a single, unified digital platform that provides a centralized view to help monitor, detect, prevent and respond to the latest cyber threats in real-time, in a flexible delivery and collaborative approach with Kyndryl’s global partnership ecosystem.

Integrated on Kyndryl Bridge, the SOaap enables Kyndryl to provide enhanced visibility, risk and threat management to a customer’s entire IT estate to determine the impact of any threats more quickly while also streamlining the orchestration required between IT Operations and Cybersecurity Operations. 

“We are addressing the critical security challenges faced by C-Suite leaders, the need for enhanced operational efficiency, compliance with evolving security regulations, and integration with new technologies. Our focus is on managing increased workloads, responding to dynamic business needs, and defending against an expanded attack surface. With the Indian government’s strong focus on data security policies, we are committed to leading the way with innovative and responsible enterprise resilience services to make India Cyber Surakshit,” Lingraju Sawkar, President, Kyndryl India, said.

Google, NVIDIA, and Microsoft to Invest INR 3,200 Crore in Madhya Pradesh
https://analyticsindiamag.com/ai-news-updates/google-nvidia-and-microsoft-to-invest-inr-3200-crore-in-madhya-pradesh/
Fri, 09 Aug 2024 10:27:50 +0000


Madhya Pradesh has secured investment proposals worth INR 3,200 crore in a single day during an interactive session held in Bengaluru on August 7-8, 2024. The event, organised by Invest Madhya Pradesh, drew over 500 participants, including leading industrialists and investors from the IT, textiles, aerospace, and pharmaceutical sectors.

Among the significant proposals, Google Cloud announced plans to establish a startup hub and Center of Excellence in the state. Chief Minister Mohan Yadav shared that these initiatives aim to enhance the local skilled workforce. 

Meanwhile, chip giant NVIDIA suggested creating a blueprint to transform Madhya Pradesh into the “Intelligence Capital of India.”

The session also saw participation from more than 15 international diplomatic missions and major companies like Infosys, Cognizant, and TCS, further solidifying Madhya Pradesh’s reputation as a prime investment destination.

Chief Minister Yadav highlighted that the proposals could generate approximately 7,000 new jobs in the state, providing a substantial boost to the local economy.

“Discussions were held with these companies regarding the development of information technology in Madhya Pradesh and their future plans, and I am hopeful that with the kind of positive response that we have got, we will witness many IT companies setting up their campuses in Madhya Pradesh,” he said.

In February, Google had signed an MoU with the Maharashtra government to advance scalable AI solutions in sectors like agriculture and healthcare. The recent proposals in Madhya Pradesh signal a continued commitment from tech giants to invest in India’s technological growth.

Namma Yatri Launches Lifetime Zero-Commission Cab Service on ONDC in Delhi NCR
https://analyticsindiamag.com/ai-news-updates/namma-yatri-launches-lifetime-zero-commission-cab-service-on-ondc-in-delhi-ncr/
Fri, 09 Aug 2024 08:26:23 +0000


MovingTech, the company behind the Namma Yatri app, has introduced its new lifetime zero-commission cab service under the “Yatri” brand in Delhi NCR. The launch marks a significant expansion of the app, which is part of the ONDC Network and offers community-driven transportation solutions.

Yatri, India’s first open and community-driven cab and auto service app, has transitioned from a successful pilot with autos to a full-scale launch of its cab services. The app operates on a zero-commission model, aiming to provide affordable transportation while increasing driver earnings by 15-20%. This model supports better vehicle maintenance and enhances safety with advanced features.

“ONDC’s open architecture and network-centric approach are paving the way for better affordability and accessibility of transportation options. It brings in a level playing field for drivers with higher earnings. We remain committed to catalyzing such community-driven initiatives that put people at the center while driving India towards a future of smart, efficient, and inclusive mobility solutions,” said Shireesh Joshi, Chief Business Officer of ONDC.

Yatri operates as a fully open-source, open network, and open data platform. It features dynamic booking and smart dispatch processes, offering Auto, AC Mini, AC Sedan, and AC XL Cab services at low prices. The ONDC Network, which includes sectors like food, grocery, and e-commerce, aims to unify various transportation modes under a single network.

“We’re excited to expand our cab services in the heart of the nation. Yatri is more than an app, it is a movement committed to transforming the lives of drivers and citizens. Yatri fosters close collaboration between Samaaj, Sarkar, and Bazaar to create a more connected and empowered city. In our commitment to support the EV mission, we won’t charge any subscription fees or commission for Electric Autos & Cabs in Delhi NCR till Mar 2026,” said Shan M S, Co-Founder of MovingTech.

Google Backed 

Google has also invested in Moving Tech, the parent company of Namma Yatri. Namma Yatri recently added new features to its service, including rentals and instant travel.

Back in 2020, Google said it planned to invest $10 billion in India over five to seven years as the search giant looked to accelerate the adoption of digital services in the key overseas market.

Namma Yatri, initially a subsidiary of the payments company Juspay, became a separate entity called Moving Tech in April. CEO Magizhan Selvan and COO Shan M S, formerly with Juspay, now lead this new mobility business. 

In India, Namma Yatri competes with the likes of Uber, Ola and Swiggy-backed Rapido.

One Namma Yatri driver told AIM that the low subscription fee compared to Uber and Ola has helped him; within six months of switching to Namma Yatri, he was able to fund his two children’s weddings. The app has onboarded 49,000 auto drivers and 550,000 users in five months, with approximately INR 12 crore ($1.5 million) paid out to drivers. It celebrated 500 million downloads in March.

Although Namma Yatri currently lacks features like bike taxis and carpooling, with Google’s support it may soon expand to include these options. 

The founders said that Namma Yatri will leverage the new funds to grow its engineering and R&D competencies, and also include more types of transportation, including buses. 

On the other hand, Google has found an ideal partner to strengthen its presence in India’s transportation sector. Given its expertise in revolutionising online purchases, it is set to replicate the ‘Google Pay moment’ in Indian transportation soon.

Amgen is Launching Biotechnology and AI Hub in Hyderabad
https://analyticsindiamag.com/ai-news-updates/amgen-is-launching-biotechnology-and-ai-hub-in-hyderabad/
Fri, 09 Aug 2024 06:33:34 +0000


Amgen, a biotechnology company that discovers, develops, manufactures, and delivers innovative medicines, has announced plans to establish a new technology and innovation site in Hyderabad, India, to accelerate digital capabilities across its global operations.

The site, named Amgen India, will support the advancement of Amgen’s drug pipeline and is expected to be operational by Q4 2024.

Located in HITEC City, Hyderabad, the facility will span six floors of the RMZ Spire Tower 110, accommodating up to 3,000 employees. Hyderabad was selected for its strong talent pool in medicine, life sciences, data sciences, and AI.

David M. Reese, M.D., executive vice president and CTO at Amgen, highlighted the significance of the new site, stating, “At a time when a quickly aging global population needs more innovation, the convergence of biotechnology and technology is enabling Amgen to work with greater speed, confidence, and efficiency — an incredibly exciting milestone for which we have been preparing for over a decade.”

Amgen India will focus on developing new technology solutions and digital capabilities to enhance operational efficiencies across the enterprise. The site will create roles in AI, data science, life science, and other global capabilities as the operation expands.

Som Chattopadhyay has been appointed as the national executive for India, leading the expanded operations. 

Telangana Chief Minister Sri Anumula Revanth Reddy welcomed the development, stating, “We are proud to welcome a global trailblazer of the biotechnology industry. Amgen’s unwavering mission to serve patients will be incredibly inspiring for the world-class technology talent seeking to make a meaningful impact on people around the world.”

Amgen, with nearly 27,000 employees, has a global presence in approximately 100 countries and regions, including India.

With more than 40 years in the industry, Amgen continues to advance a broad pipeline of treatments for cancer, heart disease, osteoporosis, inflammatory diseases, and rare conditions.

Hugging Face Acquires XetHub to Build and Scale Millions of Large LLMs
https://analyticsindiamag.com/ai-news-updates/hugging-face-acquires-xethub-to-build-and-scale-millions-of-large-llms/
Thu, 08 Aug 2024 17:01:40 +0000



Hugging Face, an AI and machine learning platform, has acquired XetHub, a Seattle-based company focused on scaling Git for large datasets and AI models. The acquisition aims to enhance Hugging Face’s capabilities in managing and versioning large datasets and models, a critical need as the AI community scales to even larger models and datasets.

“This is the real 🍓—welcome to @xetdata. We’re just getting started!” posted Hugging Face Chief Clement Delangue on X, in reference to OpenAI’s project Strawberry.

“Big models are here to stay,” said Delangue. “What we want is to make the development of AI closer to what software engineering is — make it drastically faster,” he added.

Founded in 2021 by Yucheng Low, Ajit Banerjee, and Rajat Arya, XetHub has developed technology that enables Git to handle terabyte-scale repositories, allowing teams to work efficiently with evolving datasets and models. This acquisition aligns with Hugging Face’s long-term goal of optimising storage and versioning for AI development, moving away from the limitations of Git LFS, which was not designed to handle the immense file sizes typical in AI.

“The XetHub team will help us unlock the next 5 years of growth of HF datasets and models by switching to our own, better version of LFS as storage backend for the Hub’s repos,” said Hugging Face CTO Julien Chaumond.

He added that XetHub’s technology would unlock significant growth for the platform by enabling more efficient data management. For example, instead of re-uploading entire files, users will only need to upload the modified chunks, streamlining updates and reducing storage needs. This improvement is crucial as AI models continue to grow in size, with trillion-parameter models like BigLlama-3.1-1T already on the horizon.
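XetHub’s actual protocol is not detailed here, but the general idea of uploading only modified chunks can be illustrated with a small Python sketch. The fixed-size chunking, the `remote_index` set, and the `upload` callable below are all assumptions for illustration; production systems such as XetHub use content-defined chunking so that an insertion does not shift every later chunk boundary.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # illustrative fixed-size chunks

def iter_chunks(path):
    """Yield (offset, sha256-hex, bytes) for each chunk of a file."""
    with open(path, "rb") as f:
        offset = 0
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            yield offset, hashlib.sha256(chunk).hexdigest(), chunk
            offset += len(chunk)

def push_changed_chunks(path, remote_index, upload):
    """Upload only chunks whose hashes the remote store has not seen before."""
    pushed = 0
    for _offset, digest, chunk in iter_chunks(path):
        if digest not in remote_index:
            upload(digest, chunk)        # hypothetical network call
            remote_index.add(digest)
            pushed += 1
    return pushed
```

In this scheme, editing a few rows of a large dataset changes only the chunks that contain those rows, so an update costs a handful of chunk uploads rather than a full re-upload.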

XetHub, which started in 2021 with support from Madrona and other angel investors, was built by a team experienced in scaling AI infrastructure, including work on Apple’s internal machine learning infrastructure. 

The team will now integrate XetHub’s technology into the Hugging Face platform, aiming to make AI collaboration and development easier for its vast community of users.

In the announcement, Yucheng Low, co-founder of XetHub, highlighted the importance of data in AI’s evolution and expressed excitement about joining Hugging Face to continue their mission of enhancing AI collaboration at scale.

Hugging Face is currently handling a significant volume of data, with over 1.3 million model repositories, 450,000 datasets, and 680,000 spaces, totaling 12 petabytes of data stored in LFS. The acquisition of XetHub is expected to help manage this growing demand more efficiently.

Hugging Face’s infrastructure team is also expanding and actively hiring to support the ongoing development of its platform.

Previously, Hugging Face acquired Spain-based startup Argilla for $10 million. Argilla specialises in collaborative software for AI professionals, focusing on data annotation and enhancing NLP with human-machine collaboration. That acquisition helps Hugging Face improve its data annotation capabilities and integrate human feedback into AI model training.

Hugging Face recently announced profitability as well. Founded in 2016, the company secured $235 million at a $4.5 billion valuation in a Series D funding round last year from major players including Google, Amazon, NVIDIA, Salesforce, AMD, Intel, IBM, and Qualcomm.

Mistral Releases La Plateforme for Building AI Agents
https://analyticsindiamag.com/ai-news-updates/mistral-releases-la-plateforme-for-building-ai-agents/
Thu, 08 Aug 2024 11:16:40 +0000


Mistral has introduced two primary methods for creating custom agents, catering to both non-technical users and developers. 

Mistral has released its Agents API for building agents on top of Mistral’s AI models or fine-tuned models; the resulting agents can also be used on Le Chat.

Along with this, Mistral has also announced the La Plateforme Agent Builder, offering a user-friendly interface that allows users to easily create and configure agents.

For developers seeking to integrate agent creation into existing workflows, the Agent API provides a programmatic solution.

Users can start building their own agents by accessing the Agent Builder at La Plateforme’s console. The tool offers various customisation options, including the selection of models such as “Mistral Large 2” (mistral-large-2407), “Mistral Nemo” (open-mistral-nemo), “Codestral” (codestral-2405), or any fine-tuned models.

Additional customisation includes setting the sampling temperature, which determines the randomness of the agent’s output, and adding optional instructions and few-shot learning demonstrations to guide the agent’s behaviour. 

Once the agent is deployed, users can interact with it via API using the agent_id or enable chat functionality on Le Chat.
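As a rough sketch of what calling a deployed agent can look like with the mistralai Python client (the agent_id below is a placeholder, and the exact method names should be checked against Mistral’s current SDK documentation):

```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# agent_id comes from the Agent Builder on La Plateforme (placeholder value here)
response = client.agents.complete(
    agent_id="ag:xxxxxxxx:my-support-agent",
    messages=[{"role": "user", "content": "Summarise yesterday's failed payments."}],
)
print(response.choices[0].message.content)
```

The instructions, few-shot demonstrations and sampling temperature configured in the Agent Builder travel with the agent, so the caller only supplies the conversation messages.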

Meta CEO Mark Zuckerberg recently said there could eventually be more AI agents in the world than humans. Speaking to Rowan Cheung on a podcast, he said there are millions of small businesses in the world, and in the future all of them could have AI agents carrying out functions such as customer support and sales.

Lentra Joins AWS’ Independent Software Vendor Accelerate Programme
https://analyticsindiamag.com/ai-news-updates/lentra-joins-aws-independent-software-vendor-accelerate-programme/
Thu, 08 Aug 2024 11:05:00 +0000

The Pune-based AI startup has been a major player in offering loan origination and management services to banks and other financial institutions.

The post Lentra Joins AWS’ Independent Software Vendor Accelerate Programme appeared first on AIM.

]]>

Digital SaaS lending platform Lentra has announced that it has officially joined AWS’ Independent Software Vendor (ISV) Accelerate Programme.

The Pune-based AI startup has been a major player in offering loan origination and management services to banks and other financial institutions, making use of data analytics. Currently, it caters to several major banks operating in India, including HDFC, Citi, Standard Chartered and Federal Bank.

The programme is a co-sell initiative for AWS Partners that provide software solutions running on or integrated with AWS. As part of the programme, Lentra, along with other participating ISVs, is connected with AWS’ sales organisation to help drive new business.

“By joining forces with AWS, we’ll leverage the power of cloud computing to drive innovation and deliver unparalleled value to our clients. This partnership will streamline our ability to deliver industry-leading digital lending solutions to AWS customers,” said D Venkatesh, founder and CEO of Lentra.

As of now, the company’s platform has processed $20 billion in loans for over 30 banks since its inception in 2019, handles as many as three million loan applications monthly, and has helped its clients reduce customer acquisition costs.

Now, with its newfound partnership with AWS, the startup gains access to global AWS field sellers, co-sell support and other benefits from AWS, allowing it to meet customer needs more efficiently.

With Lentra catering to some of the largest banks in India, this is likely to improve banking experiences within the country as well. 

The post Lentra Joins AWS’ Independent Software Vendor Accelerate Programme appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/lentra-joins-aws-independent-software-vendor-accelerate-programme/feed/ 0
LG AI Research Launches EXAONE 3.0: A Bilingual Language Model for Open Research https://analyticsindiamag.com/ai-news-updates/lg-ai-research-launches-exaone-3-0-a-bilingual-language-model-for-open-research/ https://analyticsindiamag.com/ai-news-updates/lg-ai-research-launches-exaone-3-0-a-bilingual-language-model-for-open-research/#respond Thu, 08 Aug 2024 11:02:57 +0000 https://analyticsindiamag.com/?p=10131897

This model is the pioneering open-access member of the EXAONE family, designed to democratize expert-level AI capabilities.

The post LG AI Research Launches EXAONE 3.0: A Bilingual Language Model for Open Research appeared first on AIM.

]]>

LG AI Research has announced the release of EXAONE 3.0, a 7.8 billion parameter instruction-tuned language model, marking a significant advancement in the field of large language models (LLMs). This is the first open model in the EXAONE family, aimed at democratising access to expert-level artificial intelligence capabilities. 

The release is intended to foster innovation and collaboration within the AI community by providing a high-performance model for non-commercial research purposes.

EXAONE 3.0 has been extensively evaluated across a variety of benchmarks, demonstrating competitive real-world performance and instruction-following capabilities. It excels particularly in Korean, while also achieving strong results in English across general tasks and complex reasoning. 

The model’s architecture is based on a decoder-only transformer with advanced features like rotary position embeddings and grouped query attention, supporting a maximum context length of 4,096 tokens.
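
Grouped query attention, mentioned above, lets several query heads share a single key/value head, shrinking the attention cache. The snippet below is a generic, minimal sketch of that mechanism for illustration only; it is not EXAONE’s implementation, and the head counts are arbitrary.

```python
import torch

def grouped_query_attention(q, k, v):
    """Minimal sketch of grouped query attention (GQA).
    q: (batch, n_q_heads, seq, head_dim); k, v: (batch, n_kv_heads, seq, head_dim).
    Each group of query heads attends using a shared key/value head."""
    _, n_q_heads, _, head_dim = q.shape
    group = n_q_heads // k.shape[1]
    k = k.repeat_interleave(group, dim=1)   # share each KV head across its query group
    v = v.repeat_interleave(group, dim=1)
    attn = torch.softmax(q @ k.transpose(-2, -1) / head_dim ** 0.5, dim=-1)
    return attn @ v

q = torch.randn(1, 32, 16, 64)   # 32 query heads (arbitrary example sizes)
k = torch.randn(1, 8, 16, 64)    # 8 shared key/value heads
v = torch.randn(1, 8, 16, 64)
print(grouped_query_attention(q, k, v).shape)  # torch.Size([1, 32, 16, 64])
```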

The model was trained using a diverse dataset, ensuring robust performance in real-world scenarios. It features a bilingual tokeniser designed to optimise performance in both English and Korean, addressing the linguistic complexities of the Korean language. 

The training process involved extensive pre-training and post-training techniques, including supervised fine-tuning and direct preference optimisation, to enhance the model’s instruction-following capabilities.

The EXAONE 3.0 model is available on Hugging Face for research purposes, supporting the broader AI community’s efforts in developing innovative applications. LG AI Research aims to integrate advanced AI into everyday life, making expert knowledge accessible to a wider audience. The model is expected to contribute significantly to advancements in Expert AI, particularly in bilingual environments.
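
For researchers who want to experiment with the model, loading it via the Hugging Face transformers library might look roughly like the sketch below. The repository name and the trust_remote_code requirement are assumptions for illustration; consult the official EXAONE model card for the exact usage and licence terms.

```python
# Rough sketch of loading EXAONE 3.0 for non-commercial research.
# The repo id and trust_remote_code flag are assumptions -- consult
# LG AI Research's model card on Hugging Face for exact usage.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,   # custom EXAONE architecture code, if required
    device_map="auto",
)

prompt = "Explain grouped query attention in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```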

LG AI Research has emphasised the importance of responsible AI development, conducting thorough compliance reviews and ethical assessments to minimise risks associated with data usage. The model has been evaluated for potential social and ethical issues, with measures in place to ensure its safe and ethical deployment.

The post LG AI Research Launches EXAONE 3.0: A Bilingual Language Model for Open Research appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/lg-ai-research-launches-exaone-3-0-a-bilingual-language-model-for-open-research/feed/ 0
Google DeepMind’s Table Tennis Robot Competes at Human-Level https://analyticsindiamag.com/ai-news-updates/google-deepminds-table-tennis-robot-competes-at-human-level/ https://analyticsindiamag.com/ai-news-updates/google-deepminds-table-tennis-robot-competes-at-human-level/#respond Thu, 08 Aug 2024 10:44:44 +0000 https://analyticsindiamag.com/?p=10131894 Google DeepMind

A 6 DoF ABB 1100 robotic arm on linear gantries won 45% of matches against human players of various skill levels.

The post Google DeepMind’s Table Tennis Robot Competes at Human-Level appeared first on AIM.

]]>
Google DeepMind

Researchers from Google DeepMind have developed a robot capable of playing table tennis at an amateur human level. This robotic system, featuring a 6 DoF ABB 1100 arm mounted on linear gantries, has been tested against human players of varying skill levels, winning 45% of the matches overall. 

The robot’s design utilises a hierarchical and modular policy architecture, which includes low-level controllers for specific skills and a high-level controller for decision-making based on match statistics.

The robot employs advanced techniques to bridge the simulation-to-real-world gap, enabling it to adapt to unseen opponents and improve its decision-making process. It uses a combination of reinforcement learning and imitation learning to train its skills in simulation before deploying them in real-world matches. The system’s adaptability and strategic capabilities are enhanced by real-time tracking of match statistics and opponent performance.
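
As a rough mental model of that hierarchy, the toy sketch below shows a high-level controller choosing among low-level skill policies using running match statistics. The skill names and selection rule are invented for illustration and are not taken from DeepMind’s system.

```python
# Toy schematic of a hierarchical policy: a high-level controller picks a
# low-level skill using match statistics gathered during play.
# Skill names and the selection rule are invented for illustration only.
from dataclasses import dataclass, field

SKILLS = ["forehand_topspin", "backhand_block", "forehand_lob"]

@dataclass
class MatchStats:
    # Estimated rate at which each skill has won points against this opponent so far.
    win_rate: dict = field(default_factory=dict)

def choose_skill(stats: MatchStats) -> str:
    """Prefer skills that have worked against this opponent; untried skills get a neutral prior."""
    return max(SKILLS, key=lambda skill: stats.win_rate.get(skill, 0.5))

stats = MatchStats(win_rate={"forehand_topspin": 0.7, "backhand_block": 0.4})
print(choose_skill(stats))  # forehand_topspin
```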

In a user study involving 29 participants, the robot demonstrated solid amateur-level performance, winning all matches against beginners and 55% against intermediate players. However, it struggled against advanced players. Participants found the experience engaging and enjoyable, with 26 out of 29 expressing interest in playing with the robot again.

Despite its success, the robot faces challenges in handling fast and low balls, as well as accurately detecting spin. Future research aims to address these limitations by improving control algorithms and hardware optimisations, enhancing collision detection, and refining the robot’s strategic capabilities. This development marks a significant step towards achieving human-level performance in robotics, with potential applications beyond table tennis in various real-world tasks.

Google DeepMind’s robotics wing has been steadily advancing its research; a few months back, it introduced ALOHA 2, a robotics platform that brings greater dexterity to tasks performed with low-cost robots and AI. This development is likely an extension of the same line of research.

The post Google DeepMind’s Table Tennis Robot Competes at Human-Level appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/google-deepminds-table-tennis-robot-competes-at-human-level/feed/ 0
Sam Altman Confirms OpenAI’s Project Strawberry https://analyticsindiamag.com/ai-news-updates/sam-altman-confirms-openais-project-strawberry/ https://analyticsindiamag.com/ai-news-updates/sam-altman-confirms-openais-project-strawberry/#respond Thu, 08 Aug 2024 06:27:14 +0000 https://analyticsindiamag.com/?p=10131840

OpenAI's teams are working on Strawberry to improve the models' ability to perform long-horizon tasks (LHT), which require planning and executing a series of actions over an extended period. 

The post Sam Altman Confirms OpenAI’s Project Strawberry appeared first on AIM.

]]>

OpenAI chief Sam Altman has hinted in a cryptic post that the AI startup is working on a project known internally as “Project Strawberry.” On X, Altman shared a post saying, “I love summer in the garden,” accompanied by an image of a pot with strawberries.

Project Strawberry, also referred to as Q*, was recently revealed in a Reuters report, which said it would significantly enhance the reasoning capabilities of OpenAI’s AI models. “Some at OpenAI believe Q* could be a breakthrough in the startup’s search for artificial general intelligence (AGI),” said the report. 

Project Strawberry involves a novel approach that allows AI models to plan ahead and navigate the internet autonomously to perform in-depth research. This advancement could address current limitations in AI reasoning, such as common sense problems and logical fallacies, which often lead to inaccurate outputs.

An AI insider who goes by the name Jimmy Apples recently claimed that Q* hasn’t been released yet because OpenAI isn’t happy with the latency and other ‘little things’ it wants to optimise further.

OpenAI’s teams are working on Strawberry to improve the models’ ability to perform long-horizon tasks (LHT), which require planning and executing a series of actions over an extended period. 

The project involves a specialised “post-training” phase, adapting the base models for enhanced performance. This method resembles Stanford’s 2022 “Self-Taught Reasoner” (STaR), which enables AI to iteratively create its own training data to reach higher intelligence levels.

OpenAI recently announced DevDay 2024, a global developer event series scheduled to take place in San Francisco on October 1, London on October 30, and Singapore on November 21. While the company has stated that the focus will be on advancements in the API and developer tools, there is speculation that OpenAI might also preview its next frontier model.

Recently, a new model in the LMSYS Chatbot Arena showed strong performance in math. Interestingly, GPT-4o and GPT-4o Mini were also spotted in the arena a few days before their official releases.

The internal document indicates that Project Strawberry includes a “deep-research” dataset for training and evaluating the models, though the contents of this dataset remain undisclosed.

This innovation is expected to enable AI to conduct research autonomously, using a “computer-using agent” (CUA) to take actions based on its findings. Additionally, OpenAI plans to test Strawberry’s capabilities in performing tasks typically done by software and machine learning engineers.

Last year, it was reported that Jakub Pachocki and Szymon Sidor, two leading OpenAI researchers, used Ilya Sutskever’s work to develop a model called Q* (pronounced “Q-Star”) that achieved an important milestone by solving math problems it had not previously encountered.

Sutskever raised concerns among some staff that the company didn’t have proper safeguards in place to commercialise such advanced AI models. Notably, he has since left OpenAI and founded his own company, Safe Superintelligence. Following his departure, Pachocki was appointed as the new chief scientist. 

What is Q*? 

Q* is probably a combination of Q-learning and A* search. OpenAI’s Q* algorithm is considered a potential breakthrough in AI research, particularly in the development of AI systems with human-like reasoning capabilities: combining elements of Q-learning and A* (A-star) search could improve goal-oriented thinking and solution finding.

The algorithm reportedly shows impressive capabilities in solving complex mathematical problems without prior training data, and symbolises a step towards artificial general intelligence (AGI).

Q-learning is a foundational concept in the field of AI, specifically in the area of reinforcement learning. It is a model-free reinforcement learning algorithm designed to learn the value of taking a particular action in a particular state. 

The ultimate goal of Q-learning is to find an optimal policy that defines the best action to take in each state, maximising the cumulative reward over time.

Q-learning is based on the notion of a Q-function, also known as the state-action value function. This function takes two inputs, a state and an action, and returns an estimate of the total expected reward obtained by starting from that state, taking that action, and thereafter following the optimal policy. 
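
For readers unfamiliar with the mechanics, the snippet below is a generic, textbook-style implementation of tabular Q-learning on a toy chain environment. It illustrates the update rule behind the Q-function described above and makes no claim to resemble OpenAI’s Q*.

```python
import random
from collections import defaultdict

# Textbook tabular Q-learning: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
# Toy chain environment: states 0..4, reaching state 4 yields reward 1.
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
ACTIONS = [-1, +1]  # step left or right
Q = defaultdict(float)

def step(state, action):
    next_state = max(0, min(4, state + action))
    reward = 1.0 if next_state == 4 else 0.0
    return next_state, reward, next_state == 4

for _ in range(500):                      # episodes
    state, done = 0, False
    while not done:
        # epsilon-greedy action selection
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        next_state, reward, done = step(state, action)
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = next_state

# The learned policy should always step right, towards the rewarding state.
print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(5)})
```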

OpenAI has recently unveiled a five-level classification system to track progress towards achieving artificial general intelligence (AGI) and superintelligent AI. The company currently considers itself at Level 1 and anticipates reaching Level 2 in the near future.

Other tech giants like Google, Meta, and Microsoft are also exploring techniques to enhance AI reasoning. However, experts like Meta’s Yann LeCun argue that large language models may not yet be capable of human-like reasoning.

The post Sam Altman Confirms OpenAI’s Project Strawberry appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/sam-altman-confirms-openais-project-strawberry/feed/ 0
Genpact Appoints Sanjeev Vohra as First Chief Technology & Innovation Officer https://analyticsindiamag.com/ai-news-updates/genpact-appoints-sanjeev-vohra-as-first-chief-technology-innovation-officer/ https://analyticsindiamag.com/ai-news-updates/genpact-appoints-sanjeev-vohra-as-first-chief-technology-innovation-officer/#respond Thu, 08 Aug 2024 04:06:52 +0000 https://analyticsindiamag.com/?p=10131823 Genpact Appoints Sanjeev Vohra as First Chief Technology & Innovation Officer

Prior to joining Genpact, Vohra led Accenture Applied Intelligence, where he significantly expanded the company's Data and AI business and advised C-suite executives on leveraging data, advanced analytics, and AI for strategic growth.

The post Genpact Appoints Sanjeev Vohra as First Chief Technology & Innovation Officer appeared first on AIM.

]]>
Genpact Appoints Sanjeev Vohra as First Chief Technology & Innovation Officer

Genpact has named Sanjeev Vohra as its first chief technology & innovation officer, effective immediately. Vohra will report directly to President and CEO Balkrishan “BK” Kalra.

Vohra, who brings over 30 years of technology and consulting experience, is set to accelerate Genpact’s AI and advanced technology initiatives. His role will involve driving the company’s technology strategy, innovation framework, strategic partnerships, and talent development.

“Sanjeev brings tremendous AI and advanced technologies expertise to Genpact and an inclusive, people-first leadership style to our team,” said Kalra. “His visionary thinking will be invaluable as we embed AI and advanced technologies in every client conversation and deepen our internal technology expertise.”

Prior to joining Genpact, Vohra led Accenture Applied Intelligence, where he significantly expanded the company’s Data and AI business and advised C-suite executives on leveraging data, advanced analytics, and AI for strategic growth. His leadership roles at Accenture focused on digital transformation and creating new growth opportunities.

“I am excited to lead Genpact’s AI and advanced technology initiatives at such a pivotal moment,” Vohra stated. “Genpact’s deep domain expertise and commitment to scaling technology will enable us to accelerate digital transformation for our clients.”

Vohra’s appointment underscores Genpact’s commitment to enhancing its technological capabilities and delivering innovative solutions to its clients.

In April, Genpact appointed its chief human resources officer (CHRO), Piyush Mehta, as the Country Manager for India, signalling the company’s deepening commitment to innovation, growth, and talent development in the region. With his expanded responsibilities, he aims to leverage his expertise to further drive the company’s AI-first approach in India, delivering value to key stakeholders while continuing to oversee the global HR function.

The post Genpact Appoints Sanjeev Vohra as First Chief Technology & Innovation Officer appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/genpact-appoints-sanjeev-vohra-as-first-chief-technology-innovation-officer/feed/ 0
Figure AI Founder Takes a Dig at Elon Musk After Figure 02 Launch https://analyticsindiamag.com/ai-news-updates/figure-ai-founder-takes-a-dig-at-elon-musk-after-figure-02-launch/ https://analyticsindiamag.com/ai-news-updates/figure-ai-founder-takes-a-dig-at-elon-musk-after-figure-02-launch/#respond Wed, 07 Aug 2024 09:45:40 +0000 https://analyticsindiamag.com/?p=10131733 Figure 02

Interestingly, the latest video of Figure 02 is eerily similar to the demo video of Optimus Gen-2.

The post Figure AI Founder Takes a Dig at Elon Musk After Figure 02 Launch appeared first on AIM.

]]>
Figure 02

With the recent launch of Figure AI’s next-generation humanoid, Figure 02, the company’s founder and CEO Brett Adcock has taken a friendly dig at Tesla CEO Elon Musk, who is developing his own advanced humanoid, Optimus.

Sharing the famous meme from the movie Captain Phillips, Adcock replied to an older tweet in which Musk had challenged the Figure AI founder with a simple “Bring it on.” Interestingly, Musk has since deleted that tweet (the screenshot below was taken before it was removed). 

Source: X

In March, Adcock announced Figure’s new $2.6B valuation after raising $675M from prominent investors including Microsoft, OpenAI, NVIDIA, Jeff Bezos and several VC firms. Musk’s challenge came in response to Adcock’s tweet explaining the collaboration agreement with these investors to develop next-gen AI models for robots, and it is this tweet that Adcock has now replied to.

Humanoid Race

With the humanoid race gaining steam more than ever, big tech companies are spending heavily on robotics. NVIDIA has been powering the humanoid push with products including NIM microservices, OSMO, and MimicGen. 

The last update on Musk’s Optimus Gen-2 humanoid came in December last year, when the robot demonstrated more dexterous hand movements. 

Funnily enough, the latest video of Figure 02 is eerily similar to the demo video of Optimus Gen-2. At this level of competition, it will be interesting to see who emerges as the ultimate winner in the robotics race. 

The post Figure AI Founder Takes a Dig at Elon Musk After Figure 02 Launch appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/figure-ai-founder-takes-a-dig-at-elon-musk-after-figure-02-launch/feed/ 0
LLaVA-OneVision: A New Era for Multimodal AI Models https://analyticsindiamag.com/ai-news-updates/llava-onevision-a-new-era-for-multimodal-ai-models/ https://analyticsindiamag.com/ai-news-updates/llava-onevision-a-new-era-for-multimodal-ai-models/#respond Wed, 07 Aug 2024 09:33:59 +0000 https://analyticsindiamag.com/?p=10131734

LLaVA-OneVision excels in chart interpretation, visual reasoning, and real-world image comprehension, rivaling advanced commercial models like GPT-4V.

The post LLaVA-OneVision: A New Era for Multimodal AI Models appeared first on AIM.

]]>

A team of researchers has introduced LLaVA-OneVision, a new open-source large multimodal model (LMM) that demonstrates unprecedented capabilities across single-image, multi-image, and video understanding tasks. The model, developed by consolidating insights from the LLaVA-NeXT blog series, achieves state-of-the-art performance on various benchmarks and exhibits emerging capabilities through task transfer. 

Read the full paper here

LLaVA-OneVision outperforms existing open-source models and approaches the capabilities of advanced commercial models like GPT-4V in several areas. The model excels in tasks such as chart and diagram understanding, visual reasoning, and real-world image comprehension.

The model offers strong performance across various scenarios, including single-image, multi-image, and video processing. It demonstrates emerging capabilities through effective cross-scenario task transfer, enabling it to adapt and excel in different contexts. Additionally, LLaVA-OneVision achieves state-of-the-art results on numerous benchmarks, solidifying its position as a leading solution in its field. 

The researchers employed a curriculum learning approach, training the model in stages to handle increasingly complex tasks. They also curated a large collection of high-quality datasets for training, emphasising the importance of data quality over quantity.

LLaVA-OneVision’s architecture builds on previous LLaVA models, incorporating improvements in visual representations and training strategies. The team used the Qwen-2 language model and SigLIP vision encoder as core components.
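
Schematically, that design pairs a vision encoder with a language model through a small projection module that maps image features into the language model’s embedding space. The sketch below illustrates this composition with placeholder modules; the dimensions and layers are stand-ins, not the authors’ actual SigLIP or Qwen-2 components.

```python
import torch
import torch.nn as nn

class ToyLLaVAStyleModel(nn.Module):
    """Schematic of the LLaVA-style design: vision encoder -> projector -> language model.
    Every module here is a lightweight stand-in, not the real SigLIP or Qwen-2."""
    def __init__(self, vision_dim=256, llm_dim=512, vocab_size=32000):
        super().__init__()
        self.vision_encoder = nn.Linear(3 * 32 * 32, vision_dim)      # stand-in for SigLIP
        self.projector = nn.Sequential(                               # maps image features into LLM space
            nn.Linear(vision_dim, llm_dim), nn.GELU(), nn.Linear(llm_dim, llm_dim)
        )
        self.llm = nn.TransformerEncoderLayer(d_model=llm_dim, nhead=8, batch_first=True)  # stand-in for Qwen-2
        self.lm_head = nn.Linear(llm_dim, vocab_size)

    def forward(self, image, text_embeds):
        image_token = self.projector(self.vision_encoder(image.flatten(1))).unsqueeze(1)
        sequence = torch.cat([image_token, text_embeds], dim=1)       # prepend the visual token to the text
        return self.lm_head(self.llm(sequence))

model = ToyLLaVAStyleModel()
logits = model(torch.randn(1, 3, 32, 32), torch.randn(1, 16, 512))
print(logits.shape)  # torch.Size([1, 17, 32000])
```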

This breakthrough has significant implications for the development of general-purpose AI assistants capable of understanding and reasoning about visual information across various modalities. The researchers have open-sourced their model, code, and datasets to facilitate further advancements in the field.

As AI continues to evolve, LLaVA-OneVision represents a significant step towards more versatile and capable multimodal systems that can understand and interact with visual information in increasingly sophisticated ways.  

The post LLaVA-OneVision: A New Era for Multimodal AI Models appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/llava-onevision-a-new-era-for-multimodal-ai-models/feed/ 0
NatWest Group Opens New Office in Bengaluru, Will Hire 3000 Software Engineers by 2026 https://analyticsindiamag.com/ai-news-updates/natwest-group-opens-new-office-in-bengaluru-will-hire-3000-software-engineers-by-2026/ https://analyticsindiamag.com/ai-news-updates/natwest-group-opens-new-office-in-bengaluru-will-hire-3000-software-engineers-by-2026/#respond Wed, 07 Aug 2024 06:07:55 +0000 https://analyticsindiamag.com/?p=10131709

India is NatWest’s second-largest employee base outside the UK.

The post NatWest Group Opens New Office in Bengaluru, Will Hire 3000 Software Engineers by 2026 appeared first on AIM.

]]>

NatWest Group has announced the lease of a new office in Bengaluru, located at Bagmane Constellation Business Park. The new site follows the company’s announcement last year that it was looking to recruit 3,000 new software engineers in India by 2026.

The seating capacity of the new office will be three times that of the current office, with the opportunity to increase this further. The state-of-the-art facility spans over 370,000 square feet across 11 floors and is a LEED-certified (Leadership in Energy and Environmental Design) green building.

The location will serve as a hub for pioneering technology solutions and cutting-edge developments, supporting the bank as it works in simpler and smarter ways to better serve its customers.

Bengaluru is a key strategic location for NatWest Group in India, alongside Gurugram and Chennai. The expansion not only strengthens the group’s presence in India, home to its second-largest employee base outside the UK, but also enhances its colleague value proposition.

“Bengaluru is known for its vibrant technology sector and skilled talent pool, so this new office marks a significant chapter in our growth journey across India. Strengthening our global operations further positions us at the forefront of innovation as we continue to prioritise improving the customer and colleague experience,” said Scott Marcar, Group Chief Information Officer, NatWest Group.

Punit Sood, Head of India, NatWest Group, added, “Our new Bengaluru office is not just an expansion of physical space but a strategic investment in our future. With a modern office design to enhance productivity and create an inspiring environment for our employees it reflects our commitment to the vast potential of India’s talent and technology ecosystem.”

The post NatWest Group Opens New Office in Bengaluru, Will Hire 3000 Software Engineers by 2026 appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/natwest-group-opens-new-office-in-bengaluru-will-hire-3000-software-engineers-by-2026/feed/ 0
Bengaluru-based Medical AI Startup SigTuple Raises $4M, to Expand Operations https://analyticsindiamag.com/ai-news-updates/bengaluru-based-medical-ai-startup-sigtuple-raises-4m-to-expand-operations/ https://analyticsindiamag.com/ai-news-updates/bengaluru-based-medical-ai-startup-sigtuple-raises-4m-to-expand-operations/#respond Wed, 07 Aug 2024 05:33:45 +0000 https://analyticsindiamag.com/?p=10131705 sigtuple

Binny Bansal serves on the board of SigTuple.

The post Bengaluru-based Medical AI Startup SigTuple Raises $4M, to Expand Operations appeared first on AIM.

]]>
sigtuple

SigTuple, a Bengaluru-based medtech startup, has secured $4 million (INR 33 crores) in an extended Series C funding round led by SIDBI Venture Capital. The funding also saw participation from existing investors, including Endiya Partners and strategic leaders from the healthcare sector.

The fresh capital will be used to drive SigTuple’s geographical expansion, broaden its product portfolio, and support regulatory clearances. This latest investment brings the company’s total funding to $50 million since its inception in 2015.

The AI-powered startup was founded in 2015 by Rohit Kumar Pandey, Tathagato Rai Dastidar and Apurv Anand. SigTuple earlier raised $16M in Series C funding, and prior to that had raised $19M. 

AI in Healthcare

SigTuple has been making significant strides in the digital pathology space. Its flagship product, AI100, which automates manual microscopic reviews using AI and robotics, has gained traction in the Indian market and expanded into Southeast Asia, the Middle East, and North Africa. The company is now poised to enter European and American markets.

In a major milestone, AI100 received 510(k) clearance from the US Food and Drug Administration (FDA) in September 2023, making SigTuple the third company globally and the first in India to achieve this for AI-assisted digital hematology.

Tathagato Rai Dastidar, founder & CEO of SigTuple said, “While we continue to build on the success of AI100 in India and abroad, 2024 will witness two new major product launches addressing a wide segment of the diagnostic industry, which will help make SigTuple a global brand coming out of India. We are truly excited to welcome SIDBI Venture Capital on board as the lead investor in this round. Their support is going to go a long way in making our dream of going global a reality.”

The company is set to launch two new major products in 2024. One is a next-generation device that will automate all manual microscopy in clinical labs, surpassing the capabilities of AI100. Additionally, SigTuple plans to enter the point-of-care market with a device leveraging microfluidic technology and imaging to conduct essential tests within minutes.

This funding round and product expansion plans underscore SigTuple’s commitment to revolutionising the diagnostic industry by making diagnostics decentralised and fully automating microscopic reviews of diseased samples.

Interestingly, there have been a number of AI developments in healthcare in recent times, with big tech companies such as Google, Microsoft and Oracle investing heavily in the segment. These companies are bringing LLM-based diagnostic tools, among other features, to assist doctors. 

The post Bengaluru-based Medical AI Startup SigTuple Raises $4M, to Expand Operations appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/bengaluru-based-medical-ai-startup-sigtuple-raises-4m-to-expand-operations/feed/ 0
Y Combinator-Backed Indian-Origin Founders Launch Mica AI for SaaS Sales https://analyticsindiamag.com/ai-news-updates/y-combinator-backed-indian-origin-founders-launch-mica-ai-for-saas-sales/ https://analyticsindiamag.com/ai-news-updates/y-combinator-backed-indian-origin-founders-launch-mica-ai-for-saas-sales/#respond Wed, 07 Aug 2024 03:42:05 +0000 https://analyticsindiamag.com/?p=10131691 Mica AI

A 25% reduction in sales cycle duration can be achieved with Mica AI.

The post Y Combinator-Backed Indian-Origin Founders Launch Mica AI for SaaS Sales appeared first on AIM.

]]>
Mica AI

Mica AI, a Y-Combinator-backed company co-founded by Achyuta Iyengar, Jai Yarlagadda, and Bharadwaj Swaminathan, has introduced a tool that transforms sales calls into personalised video highlights. This innovation promises to significantly enhance how sales teams communicate and close deals, making the process more efficient and effective.

Headquartered in San Francisco, the startup was founded by Indian-origin graduates of the University of California, Berkeley, and looks to tap into the SaaS sales business with Mica, which converts sales calls into engaging video highlight reels. 

In a typical sales process, once a sales call ends, it’s up to the main contact person to share the key points and benefits with other decision-makers in their company. This can be difficult because they might not remember or explain everything as well as the sales rep did. This communication gap can slow down or even stop potential deals, costing companies a lot of money in lost sales every year, and this is where Mica comes in. 

Revolutionising Sales Communication

The video highlights can be shared with main contacts to keep them excited and help them share the information internally. This innovative approach ensures that decision-makers who were not present on the call can quickly grasp key product features and deal terms, leading to faster sales cycles and more closed deals.

Mica’s solution is simple yet powerful. After a sales call, Mica automatically processes the recording, identifying the key points that garnered the most interest from the prospect. It then generates short videos for each of these crucial topics. These videos are compiled into a shareable link, which can be easily included in follow-up emails or sent directly to the prospect.

By offering this service, Mica not only helps in creating more informed contacts within organisations but also accelerates the decision-making process. Decision-makers can review the essential information swiftly, making it easier for them to move forward with the deal.

The founders of Mica AI are on a mission to revolutionise sales communication, making it more streamlined and impactful. With their tool, SaaS sales teams are better equipped to close more deals, ultimately driving growth and success for their businesses.

The startup claims a 25% reduction in sales cycle duration, leading to a 30% increase in monthly deal closures. 

The post Y Combinator-Backed Indian-Origin Founders Launch Mica AI for SaaS Sales appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/y-combinator-backed-indian-origin-founders-launch-mica-ai-for-saas-sales/feed/ 0
IBM Completes Over 1,000 Generative AI Projects in One Year https://analyticsindiamag.com/ai-news-updates/ibm-completes-over-1000-generative-ai-projects-in-one-year/ https://analyticsindiamag.com/ai-news-updates/ibm-completes-over-1000-generative-ai-projects-in-one-year/#respond Tue, 06 Aug 2024 13:53:51 +0000 https://analyticsindiamag.com/?p=10131688

IBM’s AI has automated recruiting processes, cutting employee mobility processing time by 50%, and streamlined source-to-pay processes in the supply chain, reducing invoice costs by up to 50%. 

The post IBM Completes Over 1,000 Generative AI Projects in One Year appeared first on AIM.

]]>

IBM has completed over 1,000 generative AI projects in the past year, according to Armand Ruiz, VP of Product at IBM. The company’s initiatives span a wide range of functions aimed at enhancing operational efficiency across several domains.

In customer-facing functions, IBM’s AI technologies have automated customer service with a 95% accuracy rate, improved marketing by personalising content and reducing costs by up to 40%, and advanced content creation through auto-generative commentary for sports. 

AI has significantly reduced text analysis and reading tasks for knowledge workers, leading to a 90% reduction in such work and enabling higher-value tasks and better decision-making. 

The company has also focused on HR, finance, and supply chain management. IBM’s AI has automated recruiting processes, cutting employee mobility processing time by 50%, and streamlined source-to-pay processes in the supply chain, reducing invoice costs by up to 50%. 

In planning and analysis, AI-driven automation has sped up data processing by 80%, while regulatory compliance efforts are enhanced, with improved response times to regulatory changes.

In IT development and operations, IBM’s AI supports app modernisation by generating and tuning code, which accelerates development processes. It also automates IT operations, reducing mean time to repair (MTTR) by 50%, and improves application performance with AIOps, cutting support tickets by 70%. Data platform engineering has seen a redesign in integration methods, reducing integration time by 30%.

IBM’s core business operations have benefited from AI advancements as well. Threat management has become more efficient, with incident response times reduced from hours to minutes or seconds and potential threats contained eight times faster.

Asset management practices have been optimised, reducing unplanned downtime by 43%, while product development processes, such as drug discovery, have been expedited through AI interpretation of molecular structures. 

IBM’s AI supports environmental intelligence efforts, increasing manufacturing output by 25% through better management of weather and climate impacts.

In May, IBM open-sourced its Granite 13B LLM, which is ideal for enterprise use cases. The Granite code models simplify coding for developers across various industries and are built to resolve the challenges developers face in writing, testing, debugging, and shipping reliable software.

IBM released four variations of the Granite code model, ranging in size from 3 to 34 billion parameters. The models have been tested on a range of benchmarks and have outperformed other comparable models like Code Llama and Llama 3 in many tasks.

The post IBM Completes Over 1,000 Generative AI Projects in One Year appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/ibm-completes-over-1000-generative-ai-projects-in-one-year/feed/ 0
Advancements in 3D Mesh Generation: Introducing MeshAnything V2 and Adjacent Mesh Tokenisation https://analyticsindiamag.com/ai-news-updates/advancements-in-3d-mesh-generation-introducing-meshanything-v2-and-adjacent-mesh-tokenisation/ https://analyticsindiamag.com/ai-news-updates/advancements-in-3d-mesh-generation-introducing-meshanything-v2-and-adjacent-mesh-tokenisation/#respond Tue, 06 Aug 2024 11:24:06 +0000 https://analyticsindiamag.com/?p=10131650

AMT halves token sequence length, reducing computational load and doubling MeshAnything V2's mesh capacity to 1600 faces.

The post Advancements in 3D Mesh Generation: Introducing MeshAnything V2 and Adjacent Mesh Tokenisation appeared first on AIM.

]]>

Researchers from Nanyang Technological University, Tsinghua University, Imperial College London, and Westlake University introduced MeshAnything V2, an AI model that significantly improves the generation of artist-created meshes (AMs) aligned with given 3D shapes. This advancement marks a substantial leap in both performance and efficiency for 3D asset production. 

Read the full paper here

The key innovation behind MeshAnything V2 is the newly proposed Adjacent Mesh Tokenization (AMT) method. Unlike previous approaches that represent each face of a mesh with three vertices, AMT uses a single vertex whenever possible, resulting in a more compact and well-structured token sequence.

Experiments show that AMT reduces the token sequence length by approximately half on average, significantly decreasing computational load and memory usage. This efficiency gain allows MeshAnything V2 to generate meshes with up to 1600 faces, doubling the previous limit of 800.
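
To build intuition for why the token count roughly halves, the toy example below contrasts naive per-face tokenisation (three vertices per face) with an adjacency-aware scheme that emits only the new vertex when consecutive faces share an edge. It is a simplified illustration of the idea, under an assumed face ordering, not the paper’s actual AMT algorithm.

```python
# Toy illustration of the idea behind Adjacent Mesh Tokenisation (AMT):
# when consecutive faces share an edge, only the new vertex needs to be emitted.
# This is a simplification for intuition, not the paper's algorithm.

# A small triangle strip: each face shares an edge with the previous one.
faces = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (3, 4, 5)]

def naive_tokens(faces):
    """Emit all three vertices of every face."""
    return [v for face in faces for v in face]

def adjacent_tokens(faces):
    """Emit the full first face, then only the vertex not shared with the previous face."""
    tokens = list(faces[0])
    for prev, curr in zip(faces, faces[1:]):
        shared = set(prev) & set(curr)
        if len(shared) == 2:                       # faces share an edge
            tokens.extend(v for v in curr if v not in shared)
        else:                                      # no shared edge: restart with the full face
            tokens.extend(curr)
    return tokens

print(len(naive_tokens(faces)), naive_tokens(faces))        # 12 tokens
print(len(adjacent_tokens(faces)), adjacent_tokens(faces))  # 6 tokens on this strip
```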

The researchers conducted extensive tests to validate the effectiveness of AMT. Quantitative experiments demonstrated that MeshAnything V2 outperforms its predecessor in various metrics, including Chamfer Distance, Edge Chamfer Distance, and Normal Consistency.
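
For reference, Chamfer Distance measures how closely two point sets sampled from meshes match each other. The snippet below implements one common symmetric, squared-distance variant; papers (including this one) may differ in the exact normalisation used.

```python
import numpy as np

def chamfer_distance(points_a: np.ndarray, points_b: np.ndarray) -> float:
    """One common (symmetric, squared) variant of Chamfer Distance between point sets
    of shape (N, 3) and (M, 3). Exact definitions vary slightly between papers."""
    diff = points_a[:, None, :] - points_b[None, :, :]   # (N, M, 3) pairwise differences
    dist_sq = np.sum(diff ** 2, axis=-1)                 # (N, M) squared distances
    return dist_sq.min(axis=1).mean() + dist_sq.min(axis=0).mean()

a = np.random.rand(1024, 3)   # e.g. points sampled from a generated mesh
b = np.random.rand(1024, 3)   # e.g. points sampled from the reference shape
print(chamfer_distance(a, b))
```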

MeshAnything V2 is designed to be integrated with various 3D asset production pipelines, enabling highly controllable AM generation. This capability has potential applications in industries such as gaming, movies, and virtual reality, where high-quality 3D meshes are in demand.

While the advancements are significant, the researchers acknowledge that further improvements in stability and accuracy are needed before the technology is ready for industrial applications.

The development of MeshAnything V2 builds upon recent progress in 3D generation techniques, including the application of transformers and diffusion models to 3D asset creation. This research contributes to the growing field of AI-assisted 3D content generation, which aims to streamline and enhance the traditionally time-consuming process of manual mesh creation.

As the technology continues to evolve, it holds the promise of revolutionising 3D asset production, potentially reducing the reliance on manual labour and accelerating the creation of complex 3D environments for various digital media applications. 

Moreover, this technology is particularly beneficial for companies and individuals involved in creating 3D models, as it enhances accuracy and efficiency. For instance, 3DAiLY, a 3D model creation company, uses generative AI to produce ultra-realistic, production-ready assets. They are also pioneers in making these models accessible on their community platform, catering to both AAA and indie game developers.

The post Advancements in 3D Mesh Generation: Introducing MeshAnything V2 and Adjacent Mesh Tokenisation appeared first on AIM.

]]>
https://analyticsindiamag.com/ai-news-updates/advancements-in-3d-mesh-generation-introducing-meshanything-v2-and-adjacent-mesh-tokenisation/feed/ 0