Supervised Fine-Tuning

MoRA: High-Rank Updating for Parameter-Efficient Fine-Tuning

Owing to its robust performance and broad applicability compared to other methods, LoRA, or Low-Rank Adaptation, is one of the most popular PEFT (Parameter-Efficient Fine-Tuning) methods for fine-tuning a large language...
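The teaser above names LoRA's core idea: freeze the pretrained weight and learn only a low-rank update. A minimal NumPy sketch (illustrative dimensions `d` and `r` are assumptions, not from the post) looks like this:

```python
import numpy as np

# Hypothetical dimensions for illustration: a d x d weight matrix
# adapted with a rank-r update, in the spirit of LoRA.
d, r = 64, 4

rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))   # frozen pretrained weight

# LoRA trains only the low-rank factors A and B; W stays frozen.
A = rng.standard_normal((r, d)) * 0.01
B = np.zeros((d, r))              # B starts at zero, so the adapted weight equals W initially

W_adapted = W + B @ A             # effective weight after adaptation

# The update B @ A has rank at most r, so trainable parameters
# drop from d*d to 2*d*r (here 4096 -> 512).
print(W_adapted.shape)            # (64, 64)
```

Because the update is the product of two thin matrices, the memory and optimizer-state cost scales with `2*d*r` rather than `d*d`, which is what makes the method parameter-efficient.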

RAFT – A Fine-Tuning and RAG Approach to Domain-Specific Question Answering

As the applications of large language models expand into specialized domains, the need for efficient and effective adaptation techniques becomes increasingly important. Enter RAFT (Retrieval Augmented Fine Tuning), a novel approach that combines...
