
ACE: All-round Creator and Editor Following Instructions via Diffusion Transformer

Paper PDF · Project Page
Zhen Han* · Zeyinzi Jiang* · Yulin Pan* · Jingfeng Zhang* · Chaojie Mao*
Chenwei Xie · Yu Liu · Jingren Zhou
Tongyi Lab, Alibaba Group

📢 News

  • [2024.9.30] Released the ACE paper on arXiv.
  • [2024.10.31] Released the ACE checkpoint on ModelScope and HuggingFace.
  • [2024.11.1] Launched the online demo on HuggingFace.
  • [TODO] Release the FLUX-ACE model, trained on top of FLUX.1 (12B), to enhance image quality and aesthetic appeal.

🚀 Installation

Install the necessary packages with pip:

pip install -r requirements.txt

🔥 Training

We offer a demonstration training YAML that enables the end-to-end training of ACE using a toy dataset. For a comprehensive overview of the hyperparameter configurations, please consult config/ace_0.6b_512_train.yaml.

Prepare datasets

The dataset class is located in modules/data/dataset/dataset.py and is designed to facilitate end-to-end training on an open-source toy dataset. Download a dataset zip file from ModelScope, then extract its contents into the cache/datasets/ directory.

Should you wish to prepare your own datasets, we recommend consulting modules/data/dataset/dataset.py for detailed guidance on the required data format.
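The extraction step above can be sketched as a small helper. This is a minimal illustration, not part of the ACE codebase; the archive name passed in is whatever file you downloaded from ModelScope.

```python
import zipfile
from pathlib import Path

def extract_dataset(zip_path: str, target_dir: str = "cache/datasets") -> list:
    """Extract a downloaded toy-dataset archive into the dataset directory.

    Returns the list of member names that were extracted.
    """
    # Create the target directory if it does not exist yet.
    Path(target_dir).mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target_dir)
        return zf.namelist()
```

After extraction, the files should sit under cache/datasets/ in the layout expected by the dataset class.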

Prepare initial weight

The ACE checkpoint has been uploaded to both the ModelScope and HuggingFace platforms.

In the provided training YAML configuration, the ModelScope URL is set as the default checkpoint URL. To switch to Hugging Face, modify the PRETRAINED_MODEL value within the YAML file, replacing the prefix "ms://iic" with "hf://scepter-studio".
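The prefix swap is a plain string replacement, which can be sketched as below. The helper name is hypothetical, and the exact checkpoint path after the prefix depends on the YAML file; only the two prefixes come from this README.

```python
def switch_checkpoint_source(yaml_text: str) -> str:
    """Rewrite the PRETRAINED_MODEL prefix from ModelScope to Hugging Face.

    "ms://iic" is the default ModelScope prefix; "hf://scepter-studio"
    is the corresponding Hugging Face prefix.
    """
    return yaml_text.replace("ms://iic", "hf://scepter-studio")
```

Applied to the contents of config/ace_0.6b_512_train.yaml (and written back), this flips the checkpoint source without touching any other setting.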

Start training

You can start the training procedure by executing the following command:

PYTHONPATH=. python tools/run_train.py --cfg config/ace_0.6b_512_train.yaml

💬 Chat Bot

We have developed a chatbot UI using Gradio, designed to transform natural-language user input into images that align semantically with the provided instructions. Launch the chatbot app by executing the following command:

PYTHONPATH=. python chatbot/run_gradio.py --cfg chatbot/config/chatbot_ui.yaml

⚙️️ ComfyUI Workflow


We support the use of ACE in the ComfyUI Workflow through the following methods:

  1. Automatic installation directly via the ComfyUI Manager by searching for the ComfyUI-Scepter node.
  2. Manual installation: clone Scepter and copy its workflow nodes into ComfyUI's custom_nodes directory.
git clone https://github.com/modelscope/scepter.git
cd path/to/scepter
pip install -e .
cp -r path/to/scepter/workflow/ path/to/ComfyUI/custom_nodes/ComfyUI-Scepter
cd path/to/ComfyUI
python main.py

Note: You can use the nodes by dragging the sample images below into ComfyUI. Additionally, our nodes can automatically pull models from ModelScope or HuggingFace by selecting the model_source field, or you can place the already downloaded models in a local path.

ACE Workflow Examples
Control Semantic Element

📝 Citation

@article{han2024ace,
  title={ACE: All-round Creator and Editor Following Instructions via Diffusion Transformer},
  author={Han, Zhen and Jiang, Zeyinzi and Pan, Yulin and Zhang, Jingfeng and Mao, Chaojie and Xie, Chenwei and Liu, Yu and Zhou, Jingren},
  journal={arXiv preprint arXiv:2410.00086},
  year={2024}
}