TLlib is an open-source, well-documented library for Transfer Learning. It is based on pure PyTorch, with high performance and a friendly API. Our code is pythonic, and the design is consistent with torchvision. You can easily develop new algorithms or readily apply existing ones.
Our API is organized by method category (a usage sketch follows this list), which includes:
domain alignment methods (tllib.alignment)
domain translation methods (tllib.translation)
self-training methods (tllib.self_training)
regularization methods (tllib.regularization)
data reweighting/resampling methods (tllib.reweight)
model ranking/selection methods (tllib.ranking)
normalization-based methods (tllib.normalization)
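As a minimal sketch of how these modules compose with a plain PyTorch training script, the snippet below builds a domain adversarial (DANN-style) alignment loss. The class and module paths (DomainAdversarialLoss, DomainDiscriminator) follow the repository layout but may differ between versions, so treat them as assumptions and check the API documentation.

# Sketch: composing a domain alignment loss from tllib (paths assumed).
import torch
from tllib.alignment.dann import DomainAdversarialLoss
from tllib.modules.domain_discriminator import DomainDiscriminator

feature_dim = 2048  # e.g. ResNet-50 pooled features (assumed backbone)
discriminator = DomainDiscriminator(in_feature=feature_dim, hidden_size=1024)
dann_loss = DomainAdversarialLoss(discriminator)

# f_s, f_t: features of a source batch and a target batch from your backbone
f_s, f_t = torch.randn(32, feature_dim), torch.randn(32, feature_dim)
transfer_loss = dann_loss(f_s, f_t)  # add to the task loss before backward()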
We provide many examples in the directory examples, organized by learning setup. Currently, the supported learning setups include:
DA (domain adaptation)
TA (task adaptation, also known as fine-tuning)
OOD (out-of-distribution generalization, also known as DG / domain generalization)
SSL (semi-supervised learning)
Model Selection
Our supported tasks include: classification, regression, object detection, segmentation, keypoint detection, and so on.
H-Score - An Information-theoretic Approach to Transferability in Task Transfer Learning [ICIP 2019][Code]
NCE - Negative Conditional Entropy in Transferability and Hardness of Supervised Classification Tasks [ICCV 2019][Code]
LEEP - LEEP: A New Measure to Evaluate Transferability of Learned Representations [ICML 2020][Code]
LogME - Log Maximum Evidence in LogME: Practical Assessment of Pre-trained Models for Transfer Learning [ICML 2021][Code]
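These ranking methods all follow the same workflow: extract features of the target dataset with a frozen pre-trained model, then compute a transferability score against the target labels. A hedged sketch using LogME is shown below; the function name and location (tllib.ranking.logme.log_maximum_evidence) are assumptions based on the module list above, so verify the exact signature in the API documentation.

# Sketch: scoring a pre-trained model's transferability with LogME (assumed API).
import numpy as np
from tllib.ranking.logme import log_maximum_evidence  # assumed location

# features: (n_samples, feature_dim) extracted by the frozen pre-trained model
# labels:   (n_samples,) ground-truth labels of the target task
features = np.random.randn(1000, 2048).astype(np.float32)
labels = np.random.randint(0, 31, size=1000)

score = log_maximum_evidence(features, labels)
print(f"LogME score: {score:.4f}")  # higher usually indicates better transfer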
Semi-Supervised Learning for Classification [Code]
Pseudo Label - Pseudo-Label: The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks [ICML 2013][Code]
Pi Model - Temporal Ensembling for Semi-Supervised Learning [ICLR 2017][Code]
Mean Teacher - Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results [NIPS 2017][Code]
DebiasMatch - Debiased Learning From Naturally Imbalanced Pseudo-Labels [CVPR 2022][Code]
DST - Debiased Self-Training for Semi-Supervised Learning [arXiv][Code]
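The common idea behind several of the methods above (Pseudo Label, DebiasMatch, DST) is to train on model predictions for unlabeled data once they are confident enough. The following is a generic, library-agnostic sketch of confidence-thresholded pseudo-labeling, not the exact implementation used in the example scripts.

# Sketch: confidence-thresholded pseudo-labeling loss (illustrative only).
import torch
import torch.nn.functional as F

def pseudo_label_loss(model, unlabeled_batch, threshold=0.95):
    # predict on unlabeled data and keep only confident predictions
    logits = model(unlabeled_batch)
    probs = F.softmax(logits.detach(), dim=1)
    confidence, pseudo_labels = probs.max(dim=1)
    mask = (confidence >= threshold).float()
    # cross-entropy against the pseudo labels, masked by confidence
    return (F.cross_entropy(logits, pseudo_labels, reduction='none') * mask).mean()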
Installation
To use tllib in other projects, you need to install TLlib first:
python setup.py install
Note that we do not support pip install currently.
For flexible use and modification of TLlib, please git clone the library and make sure you have installed all of the dependencies:
pip install -r requirements.txt
It's recommended to use pytorch==1.7.1 and torchvision==0.8.2 in order to better reproduce the benchmark results.
Documentation
You can find the API documentation on the website: Documentation.
Usage
You can find examples in the directory examples. A typical usage is
# Train DANN on the Office-31 Amazon -> Webcam task using ResNet-50.
# Assume you have put the datasets under the path `data/office31`,
# or agree to have them downloaded automatically from the Internet to this path.
python dann.py data/office31 -d Office31 -s A -t W -a resnet50 --epochs 20
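The example scripts share a similar training loop. Below is a simplified sketch of one source/target training step; the variable names and the assumption that the classifier returns both predictions and features are illustrative, not the actual dann.py code.

# Sketch: one DANN-style training step (illustrative, assumed interfaces).
import torch.nn.functional as F

def train_step(classifier, dann_loss, optimizer, source_batch, target_batch, trade_off=1.0):
    (x_s, labels_s), (x_t, _) = source_batch, target_batch
    y_s, f_s = classifier(x_s)   # predictions and pooled features on source (assumed output)
    _, f_t = classifier(x_t)     # features on target; target labels are unused
    loss = F.cross_entropy(y_s, labels_s) + trade_off * dann_loss(f_s, f_t)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()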
Contributing
We appreciate all contributions. If you are planning to contribute back bug-fixes, please do so without any further discussion. If you plan to contribute new features, utility functions or extensions, please first open an issue and discuss the feature with us.
Disclaimer on Datasets
This is a utility library that downloads and prepares public datasets. We do not host or distribute these datasets, vouch for their quality or fairness, or claim that you have licenses to use the dataset. It is your responsibility to determine whether you have permission to use the dataset under the dataset's license.
If you're a dataset owner and wish to update any part of it (description, citation, etc.), or do not want your dataset to be included in this library, please get in touch through a GitHub issue. Thanks for your contribution to the ML community!
Contact
If you have any problem with our code or any suggestions, including future features, feel free to contact us.
For Q&A in Chinese, you can ask questions in the Transfer Learning Library Q&A forum (迁移学习算法库答疑专区) before sending an email.
Citation
If you use this toolbox or benchmark in your research, please cite this project.
@misc{jiang2022transferability,
title={Transferability in Deep Learning: A Survey},
author={Junguang Jiang and Yang Shu and Jianmin Wang and Mingsheng Long},
year={2022},
eprint={2201.05867},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
@misc{tllib,
author = {Junguang Jiang and Baixu Chen and Bo Fu and Mingsheng Long},
title = {Transfer-Learning-library},
year = {2020},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/thuml/Transfer-Learning-Library}},
}
Acknowledgment
We would like to thank the School of Software, Tsinghua University, and the National Engineering Laboratory for Big Data Software for providing such an excellent ML research platform.