Two Platform unveils ‘distillation’ framework for custom models


(Photo = Two Platform)

Artificial intelligence company Two Platform (CEO Pranav Mistry) unveiled the ‘Sutra D3’ framework on the 27th for building custom distilled models for enterprises.

Knowledge distillation is a technique in which a compact model learns from the outputs of a large language model (LLM), with the advantage of reproducing the core capabilities of an LLM at low cost.
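The article does not describe Sutra D3's internals, but the general idea of knowledge distillation can be sketched as follows: a student model is trained to match the teacher's softened output distribution, typically by minimizing a KL divergence. This is a minimal illustrative sketch in plain Python (function names and the example logits are hypothetical, not from Two Platform's framework):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature scaling: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-top classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence from the teacher's soft targets to the student's
    # predictions; the student is trained to minimize this.
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that already matches the teacher's logits incurs zero loss;
# a mismatched student incurs a positive loss it can descend on.
teacher = [3.0, 1.0, 0.2]
print(distillation_loss(teacher, teacher))        # → 0.0
print(distillation_loss(teacher, [0.1, 2.0, 1.0]) > 0)  # → True
```

In practice this loss is usually combined with a standard cross-entropy term on labeled (or, as described below, synthetic) data, and computed with an ML framework rather than by hand.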

Two Platform said that because Sutra D3 uses synthetic data generated by a large model, it is well suited to industries where data is scarce.

For example, instead of using actual patient records from a hospital, a distilled model can be trained on virtual records generated by an LLM and then used to build an AI solution that analyzes multilingual medical records.

The resulting model is lightweight and can run in on-premises and edge environments, which can reduce costs.

Two Platform supports deployment options tailored to each company's environment, including an API call service, on-device, and on-premises.

“The era of the universal LLM has passed, and the AI trend is now shifting to lightweight models optimized for practical work,” said Pranav Mistry.

By Park Soo -bin, sbin08@aitimes.com
