Quick Start:
- Download the Amazon dataset metadata (.json.gz) and rating (.csv) files into dataset/Metadata and dataset/Ratings, respectively (the expected raw file formats are sketched after this list).
- python preprocess_amazon.py
- python run_llmrec.py
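
The preprocessing step expects the raw Amazon files in their usual distribution formats. The snippet below is only an illustrative sketch (not part of preprocess_amazon.py); the helper names and file paths are placeholders, and it assumes the JSON-lines metadata layout used by the newer Amazon dumps.

```python
import csv
import gzip
import json

def iter_metadata(path):
    """Yield one product record per line from a .json.gz metadata file."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            yield json.loads(line)

def iter_ratings(path):
    """Yield raw rows (e.g. user, item, rating, timestamp) from a rating .csv file."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            yield row

# Example usage (placeholder file name):
# for item in iter_metadata("dataset/Metadata/meta_Books.json.gz"):
#     print(item.get("title"))
```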
Note:
- The code is based on PyTorch DDP (DistributedDataParallel) and defaults to training on 4 GPUs (a generic launch and setup sketch follows below).
- The backbone defaults to bert-base-uncased.
- By default, the code downloads pre-trained models directly from the Hugging Face Hub (see the loading sketch below for a local-path alternative).
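
Since training runs under DDP across 4 GPUs, the skeleton below shows how such a run is typically launched and initialized. It is a generic illustration, not a copy of run_llmrec.py's actual setup, which may differ in details.

```python
# Typical single-node launch with 4 GPUs:
#   torchrun --nproc_per_node=4 run_llmrec.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def setup_ddp():
    # torchrun sets RANK, LOCAL_RANK and WORLD_SIZE for each worker process.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    return local_rank

def wrap_model(model, local_rank):
    # Move the model to this worker's GPU and wrap it for gradient synchronization.
    model = model.cuda(local_rank)
    return DDP(model, device_ids=[local_rank])
```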
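
Because the backbone is fetched from the Hugging Face Hub on first use, machines without direct Hub access can point the loader at a locally downloaded copy. The snippet below is a hedged example using the transformers library; the local directory path is a placeholder.

```python
from transformers import AutoModel, AutoTokenizer

# Default: downloads bert-base-uncased from the Hugging Face Hub on first use.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
backbone = AutoModel.from_pretrained("bert-base-uncased")

# Without direct Hub access, pass a local directory that already contains the
# model files (placeholder path), or set HF_HUB_OFFLINE=1 to force cached copies.
# tokenizer = AutoTokenizer.from_pretrained("/path/to/local/bert-base-uncased")
# backbone = AutoModel.from_pretrained("/path/to/local/bert-base-uncased")
```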