Here are related papers on the fitting and generalization of deep learning:
- ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State
- Understanding deep learning requires rethinking generalization
- A Closer Look at Memorization in Deep Networks
- ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks
- Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude’s Variance Matters
- Derivative Manipulation: Example Weighting via Emphasis Density Function in the context of deep learning
- Novelty: moving from loss design to derivative design (a minimal sketch of the ProSelfLC idea follows this list)
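As a grounding illustration of these works, here is a minimal PyTorch sketch of progressive self label correction as formulated in the CVPR 2021 paper: the training target becomes a convex combination of the annotated (possibly noisy) label and the model's own prediction, where the weight on the prediction grows with training progress (global trust) and with the prediction's confidence (local trust). The function names and the scalar global_trust argument are illustrative assumptions, not this repo's API.

```python
# A minimal sketch of progressive self label correction (ProSelfLC, CVPR 2021).
# Names and the global_trust schedule are illustrative, not the repo's API.
import math
import torch
import torch.nn.functional as F

def proselflc_target(onehot_label, pred_prob, global_trust):
    """Convex combination of the annotation and the model's prediction.

    onehot_label: (N, C) one-hot, possibly noisy annotations.
    pred_prob:    (N, C) softmax outputs of the model.
    global_trust: scalar in [0, 1] that grows with training progress.
    """
    num_classes = pred_prob.shape[1]
    # Local trust: low-entropy (confident) predictions are trusted more.
    entropy = -(pred_prob * pred_prob.clamp_min(1e-12).log()).sum(dim=1)
    local_trust = 1.0 - entropy / math.log(num_classes)
    eps = (global_trust * local_trust).unsqueeze(1)  # (N, 1) per-example weight
    return (1.0 - eps) * onehot_label + eps * pred_prob

def proselflc_loss(logits, onehot_label, global_trust):
    # Detach the prediction so gradients do not flow through the moving target.
    pred_prob = F.softmax(logits, dim=1).detach()
    target = proselflc_target(onehot_label, pred_prob, global_trust)
    return -(target * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```

In the paper, the global trust itself follows a sigmoid-like schedule over training time, so self-correction only kicks in once the model has learned meaningful patterns.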
See Citation Details
@article{wang2022proselflc,
title={ProSelfLC: Progressive Self Label Correction Towards A Low-Temperature Entropy State},
author={Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Mukherjee, Sankha Subhra and Clifton, David A and Robertson, Neil M},
journal={bioRxiv},
year={2022}
}
@inproceedings{wang2021proselflc,
title={ {ProSelfLC}: Progressive Self Label Correction
for Training Robust Deep Neural Networks},
author={Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Clifton, David A and Robertson, Neil M},
booktitle={CVPR},
year={2021}
}
@phdthesis{wang2020example,
title={Example weighting for deep representation learning},
author={Wang, Xinshao},
year={2020},
school={Queen's University Belfast}
}
@article{wang2019derivative,
title={Derivative Manipulation for General Example Weighting},
author={Wang, Xinshao and Kodirov, Elyor and Hua, Yang and Robertson, Neil},
journal={arXiv preprint arXiv:1905.11233},
year={2019}
}
@article{wang2019imae,
title={{IMAE} for Noise-Robust Learning: Mean Absolute Error Does Not Treat Examples Equally and Gradient Magnitude’s Variance Matters},
author={Wang, Xinshao and Hua, Yang and Kodirov, Elyor and Robertson, Neil M},
journal={arXiv preprint arXiv:1903.12141},
year={2019}
}
- Easy to install
- Easy to use
- Easy to extend: new losses, new networks, new datasets and batch loaders
- Easy to run experiments and sink (i.e., save) the results
- Easy to put sinked results into your technical reports and academic papers
- ProSelfLC: training shufflenetv2 on CIFAR-100 with a symmetric noise rate of 40%
- ProSelfLC: training resnet18 on CIFAR-100 with a symmetric noise rate of 40%
- Label smoothing: training shufflenetv2 on CIFAR-100 with a symmetric noise rate of 40%
- ProSelfLC: training BERT transformers on the DeepLoc dataset with unknown labels
See Install Guidelines
- sudo apt update && sudo apt upgrade
- sudo apt install python3.8
- curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
- python3.8 get-pip.py
- vim ~/.bashrc, add export PATH="/home/ubuntu/.local/bin:$PATH", then source ~/.bashrc
- pip3 install pipenv
- git clone this repo
- cd into this repo
- pipenv install -e . --skip-lock
- pipenv shell
- pre-commit install
- run experiments:
- CIFAR-100:
CUDA_VISIBLE_DEVICES=0 CUBLAS_WORKSPACE_CONFIG=:4096:8 TOKENIZERS_PARALLELISM=false python -W ignore tests/convnets_cifar100/trainer_calibration_vision_cifar100_covnets_proselflc.py
- Protein classification:
CUDA_VISIBLE_DEVICES=0 CUBLAS_WORKSPACE_CONFIG=:4096:8 TOKENIZERS_PARALLELISM=false python -W ignore tests/protbertbfd_deeploc/MS-with-unknown/test_trainer_2MSwithunknown_proselflc.py
- The results are well sinked and organised (a loading sketch follows these paths), e.g.:
- CIFAR-100: experiments_records/cifar100_symmetric_noise_rate_0.4/shufflenetv2
- Protein classification: experiments_records/deeploc_prottrans_symmetric_noise_rate_0.0/Rostlab_prot_bert_bfd_seq/MS-with-unknown
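Below is a minimal sketch of turning sinked results into report-ready numbers. It assumes each run directory contains a per-epoch metrics CSV named accuracy.csv with a test_accuracy column; those file and column names are assumptions for illustration, not the repo's guaranteed output format.

```python
# Hypothetical loader for sinked results: file/column names are assumptions.
import csv
from pathlib import Path

def best_test_accuracy(run_dir: Path, csv_name: str = "accuracy.csv") -> float:
    """Return the best test accuracy recorded in one run directory."""
    with open(run_dir / csv_name) as f:
        return max(float(row["test_accuracy"]) for row in csv.DictReader(f))

if __name__ == "__main__":
    root = Path("experiments_records/cifar100_symmetric_noise_rate_0.4/shufflenetv2")
    for run in sorted(p for p in root.iterdir() if p.is_dir()):
        print(f"{run.name}: {best_test_accuracy(run):.4f}")
```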
See Sinked Results
- Add dataset and dataloader: see examples in src/proselflc/slices/datain
- Add losses: see examples in src/proselflc/slices/losses
- Add networks: see examples in src/proselflc/slices/networks
- Add optimisers: see examples in src/proselflc/optim
- Extend the slicegetter: src/proselflc/slicegetter (a minimal registration sketch follows this list)
- Write run scripts: see examples in tests/
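As a grounding illustration of the extension flow above, here is a minimal sketch of the getter pattern: a new loss class is mapped to a config name so run scripts can request it by string. The LOSS_REGISTRY and get_loss names are illustrative assumptions; the actual API lives in src/proselflc/slicegetter.

```python
# Sketch of a slicegetter-style registry; names are assumptions, not the repo's API.
import torch
import torch.nn as nn

class ExampleNewLoss(nn.Module):
    """A new loss to be registered; plain cross entropy for illustration."""
    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return nn.functional.cross_entropy(logits, target)

# Registry: config string -> loss class, so run scripts can stay generic.
LOSS_REGISTRY = {
    "example_new_loss": ExampleNewLoss,
}

def get_loss(params: dict) -> nn.Module:
    """Build a loss from a params dict, mirroring how a getter resolves names."""
    loss_name = params["loss_name"]
    if loss_name not in LOSS_REGISTRY:
        raise ValueError(f"Unknown loss: {loss_name}")
    return LOSS_REGISTRY[loss_name]()

# Usage: a run script only needs the config dict.
criterion = get_loss({"loss_name": "example_new_loss"})
```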
Invited talks:
- 23rd Dec 2022, Shanghai DianJi University
- 12th Aug 2022, Southern University of Science and Technology
- 17th May 2022, Loughborough University
- For any specific research discussion or potential future collaboration, please feel free to contact me.
- This work is a personal research project, still in progress, carried out in personal time.