This repository contains code and instructions for fine-tuning the GPT-2 model to generate stories using a dataset from Kaggle.
-
Download the Dataset: Fetch the story-text dataset from Kaggle with the following commands:
pip install kaggle
kaggle datasets download -d emily2008/story-text
mkdir -p data
mv story-text.zip data/  # Move the downloaded dataset to the data folder
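The commands above leave a zip archive in data/. Below is a minimal Python sketch for unpacking it and peeking at its contents; the paths assume the commands above were run, and the file layout inside the archive may differ:

import zipfile

with zipfile.ZipFile("data/story-text.zip") as zf:
    zf.extractall("data/")        # unpack next to the archive
    print(zf.namelist()[:5])      # peek at the first few files inside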
-
Fine-tuning the Model: Follow the instructions in the notebook to fine-tune GPT-2 on the downloaded dataset.
The notebook loads the pretrained GPT-2 model from the Transformers library and fine-tunes it for story generation; a minimal sketch of the training loop follows.
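This sketch uses the Transformers Trainer API and is not the exact notebook code; the training file path data/stories.txt, the sequence length, and the hyperparameters are placeholder assumptions:

from datasets import load_dataset
from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                          GPT2TokenizerFast, Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token   # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Placeholder path: assumes the extracted stories were merged into one plain-text file.
dataset = load_dataset("text", data_files={"train": "data/stories.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)  # causal LM, no masking
args = TrainingArguments(output_dir="gpt2-stories",
                         num_train_epochs=3,
                         per_device_train_batch_size=4)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()

After training, model.save_pretrained and tokenizer.save_pretrained store the fine-tuned weights, and model.generate can be used to sample new stories.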
-
Story Text Dataset:
Emily2008. Story Text dataset. Kaggle. Available at: https://www.kaggle.com/datasets/emily2008/story-text
-
Transformers Library:
Hugging Face. Transformers: State-of-the-art Natural Language Processing. Available at: https://huggingface.co/transformers
-
PyTorch Framework:
PyTorch. An open-source machine learning framework. Available at: https://pytorch.org