Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
✨✨ Latest Advances on Multimodal Large Language Models
The official GitHub page for the survey paper "A Survey of Large Language Models".
Instruction Tuning with GPT-4
Aligning pretrained language models with instruction data generated by themselves.
🦦 Otter, a multi-modal model based on OpenFlamingo (open-sourced version of DeepMind's Flamingo), trained on MIMIC-IT and showcasing improved instruction-following and in-context learning ability.
Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model
【EMNLP 2024🔥】Video-LLaVA: Learning United Visual Representation by Alignment Before Projection
A one-stop data processing system to make data higher-quality, juicier, and more digestible for (multimodal) LLMs! 🍎 🍋 🌽 ➡️ 🍸 🍹 🍷
A summary of Prompt & LLM papers, open-source datasets & models, and AIGC applications.
We unified the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use, so that researchers can quickly get started with fine-tuning large models. We welcome open-source enthusiasts to open any meaningful PR on this repo and integrate as many LLM-related technologies as possible.
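As a hedged illustration only (not the actual API of the repository above), frameworks like this typically wrap the Hugging Face Transformers + PEFT pattern: render an instruction/response pair into a prompt, attach a LoRA adapter to a frozen base model, and train with the standard causal-LM loss. The base model name, target modules, and prompt template below are placeholder assumptions.

```python
# Minimal sketch of LoRA-based instruction tuning with Transformers + PEFT.
# Model name, target_modules, and the prompt template are illustrative assumptions,
# not the interfaces of the repository described above.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "meta-llama/Llama-2-7b-hf"                      # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA freezes the pretrained weights and trains small rank-r adapters on the
# attention projections, so only a tiny fraction of parameters is updated.
lora_cfg = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()

# One instruction-tuning example rendered into a single training string; for CoT
# data the intermediate reasoning steps would go inside the response.
example = {"instruction": "Add 17 and 25.", "response": "17 + 25 = 42."}
text = (
    f"### Instruction:\n{example['instruction']}\n\n"
    f"### Response:\n{example['response']}"
)
batch = tokenizer(text, return_tensors="pt")
loss = model(**batch, labels=batch["input_ids"]).loss  # standard causal-LM loss
loss.backward()                                        # gradients reach only the LoRA params
```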
InternLM-XComposer-2.5: A Versatile Large Vision Language Model Supporting Long-Contextual Input and Output
mPLUG-Owl: The Powerful Multi-modal Large Language Model Family
Cambrian-1 is a family of multimodal LLMs with a vision-centric design.
[ECCV2024] Video Foundation Models & Data for Multimodal Understanding
An Open-Source Knowledgeable Large Language Model Framework.
A collection of open-source datasets for training instruction-following LLMs (ChatGPT, LLaMA, Alpaca).
DataDreamer: Prompt. Generate Synthetic Data. Train & Align Models. 🤖💤
[ICML2024 (Oral)] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
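The core idea behind DoRA, sketched below as an assumption-laden illustration rather than the official implementation: decompose a pretrained weight into a magnitude vector and a direction, update the direction with a LoRA-style low-rank delta, and learn the magnitude separately, i.e. W = m · (W0 + BA) / ||W0 + BA||_c with a column-wise norm.

```python
# Hedged sketch of DoRA's weight decomposition (not the official implementation):
# W = m * (W0 + B @ A) / ||W0 + B @ A||_c, where m is a learnable per-column magnitude,
# W0 is the frozen pretrained weight, and B @ A is a LoRA-style low-rank update.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoRALinearSketch(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.register_buffer("w0", base.weight.detach())       # frozen W0, shape (out, in)
        self.bias = base.bias.detach() if base.bias is not None else None
        out_f, in_f = self.w0.shape
        self.A = nn.Parameter(torch.randn(rank, in_f) * 0.01)  # low-rank factors
        self.B = nn.Parameter(torch.zeros(out_f, rank))        # B = 0 at init, so W == W0
        # magnitude initialized to the column-wise norm of W0, shape (1, in)
        self.m = nn.Parameter(self.w0.norm(dim=0, keepdim=True))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        direction = self.w0 + self.B @ self.A                  # directional (low-rank) update
        direction = direction / direction.norm(dim=0, keepdim=True)
        return F.linear(x, self.m * direction, self.bias)

# Usage: wrap an existing linear layer and train only A, B, and m.
layer = DoRALinearSketch(nn.Linear(16, 32))
y = layer(torch.randn(4, 16))
```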