
Jiakui/awesome-bert: BERT NLP papers, applications, and GitHub resources


Open-source project name:

Jiakui/awesome-bert

Repository URL:

https://github.com/Jiakui/awesome-bert

Project description:

This repository collects BERT-related resources.

AD: a companion repository for graph convolutional networks at https://github.com/Jiakui/awesome-gcn (resources for graph convolutional networks).

Papers:

  1. arXiv:1810.04805, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Authors: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova

  2. arXiv:1812.06705, Conditional BERT Contextual Augmentation, Authors: Xing Wu, Shangwen Lv, Liangjun Zang, Jizhong Han, Songlin Hu

  3. arXiv:1812.03593, SDNet: Contextualized Attention-based Deep Network for Conversational Question Answering, Authors: Chenguang Zhu, Michael Zeng, Xuedong Huang

  4. arXiv:1901.02860, Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context, Authors: Zihang Dai, Zhilin Yang, Yiming Yang, William W. Cohen, Jaime Carbonell, Quoc V. Le, Ruslan Salakhutdinov

  5. arXiv:1901.04085, Passage Re-ranking with BERT, Authors: Rodrigo Nogueira, Kyunghyun Cho

  6. arXiv:1902.02671, BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning, Authors: Asa Cooper Stickland, Iain Murray

  7. arXiv:1904.02232, BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis, Authors: Hu Xu, Bing Liu, Lei Shu, Philip S. Yu, [code]

GitHub Repositories:

official implementation:

  1. google-research/bert, official TensorFlow code and pre-trained models for BERT,

implementations of BERT besides TensorFlow:

  1. codertimo/BERT-pytorch, Google AI 2018 BERT PyTorch implementation,

  2. huggingface/pytorch-pretrained-BERT, a PyTorch implementation of Google AI's BERT model with a script to load Google's pre-trained models (see the usage sketch after this list),

  3. dmlc/gluon-nlp, Gluon + MXNet implementation that reproduces BERT pre-training and fine-tuning on the GLUE benchmark, SQuAD, etc.,

  4. dbiir/UER-py, UER-py is a toolkit for pre-training on general-domain corpora and fine-tuning on downstream tasks. UER-py maintains model modularity and supports research extensibility. It facilitates the use of different pre-training models (e.g. BERT) and provides interfaces for users to extend them further.

  5. BrikerMan/Kashgari, a simple, Keras-powered multilingual NLP framework that lets you build models in 5 minutes for named entity recognition (NER), part-of-speech tagging (PoS), and text classification tasks. Includes BERT, GPT-2, and word2vec embeddings.

  6. kaushaltrivedi/fast-bert, a super-easy library for BERT-based NLP models,

  7. Separius/BERT-keras, Keras implementation of BERT with pre-trained weights,

  8. soskek/bert-chainer, Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",

  9. innodatalabs/tbert, PyTorch port of the BERT ML model,

  10. guotong1988/BERT-tensorflow, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding,

  11. dreamgonfly/BERT-pytorch, PyTorch implementation of BERT in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",

  12. CyberZHG/keras-bert, implementation of BERT that can load the official pre-trained models for feature extraction and prediction,

  13. MaZhiyuanBUAA/bert-tf1.4.0, BERT ported to TensorFlow 1.4.0,

  14. dhlee347/pytorchic-bert, PyTorch implementation of Google BERT,

  15. kpot/keras-transformer, Keras library for building (Universal) Transformers, facilitating BERT and GPT models,

  16. miroozyx/BERT_with_keras, a Keras version of Google's BERT model,

  17. conda-forge/pytorch-pretrained-bert-feedstock, a conda-smithy repository for pytorch-pretrained-bert,

  18. Rshcaroline/BERT_Pytorch_fastNLP, a PyTorch & fastNLP implementation of Google AI's BERT model,

  19. nghuyong/ERNIE-Pytorch, ERNIE in PyTorch,
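
Most of the ports above expose a one-line loading API. As a minimal sketch of feature extraction with huggingface/pytorch-pretrained-BERT (assuming the pytorch-pretrained-bert pip package is installed; this follows its README-style usage, not an official recipe from this list):

    import torch
    from pytorch_pretrained_bert import BertTokenizer, BertModel

    # downloads and caches the Google-released weights on first use
    tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
    model = BertModel.from_pretrained('bert-base-uncased')
    model.eval()

    # BERT expects [CLS]/[SEP] markers around the input text
    tokens = tokenizer.tokenize('[CLS] BERT is a bidirectional transformer . [SEP]')
    input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

    with torch.no_grad():
        # returns one hidden-state tensor per layer, plus the pooled [CLS] output
        encoded_layers, pooled_output = model(input_ids)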

Pretrained BERT weights:

  1. brightmart/roberta_zh, RoBERTa for Chinese (Chinese pre-trained RoBERTa models),

  2. ymcui/Chinese-BERT-wwm, Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm pre-trained models) https://arxiv.org/abs/1906.08101 (see the loading sketch after this list),

  3. thunlp/OpenCLaP, Open Chinese Language Pre-trained Model Zoo, a multi-domain collection of open-source Chinese pre-trained language models,

  4. ymcui/Chinese-PreTrained-XLNet, pre-trained Chinese XLNet models,

  5. brightmart/xlnet_zh, pre-trained Chinese XLNet_Large model,
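
The repositories in this section ship standard TensorFlow checkpoints, so most of the implementations listed earlier can load them. A minimal sketch with CyberZHG/keras-bert; the checkpoint directory name below is hypothetical, so point the paths at whichever release you unpacked:

    from keras_bert import load_trained_model_from_checkpoint

    # hypothetical paths into an unpacked checkpoint archive
    config_path = 'chinese_wwm_L-12_H-768_A-12/bert_config.json'
    checkpoint_path = 'chinese_wwm_L-12_H-768_A-12/bert_model.ckpt'

    # training=False builds the encoder only, without the MLM/NSP pre-training heads
    model = load_trained_model_from_checkpoint(config_path, checkpoint_path,
                                               training=False, seq_len=128)
    model.summary()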

improvements over BERT:

  1. thunlp/ERNIE, source code and dataset for the ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities"; improves BERT with heterogeneous information fusion.

  2. PaddlePaddle/LARK, LAnguage Representations Kit, a PaddlePaddle implementation of BERT. It also contains ERNIE, an improved version of BERT for Chinese NLP tasks,

  3. zihangdai/xlnet, XLNet: Generalized Autoregressive Pretraining for Language Understanding,

  4. kimiyoung/transformer-xl, Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context, This repository contains the code in both PyTorch and TensorFlow for our paper.

  5. GaoPeng97/transformer-xl-chinese, Transformer-XL applied to Chinese text generation,

  6. PaddlePaddle/ERNIE, an implementation of ERNIE for language understanding (including pre-training models and fine-tuning tools); ERNIE is an improved version of BERT for Chinese,

  7. pytorch/fairseq, Facebook AI Research Sequence-to-Sequence Toolkit written in Python; includes RoBERTa: A Robustly Optimized BERT Pretraining Approach (see the loading sketch after this list),

  8. facebookresearch/SpanBERT, code and models for the paper "SpanBERT: Improving Pre-training by Representing and Predicting Spans",

  9. brightmart/albert_zh, large-scale Chinese pre-trained ALBERT models, A Lite BERT for Self-Supervised Learning of Language Representations https://arxiv.org/pdf/1909.11942.pdf,

  10. lonePatient/albert_pytorch, A Lite BERT for Self-Supervised Learning of Language Representations,

  11. kpe/bert-for-tf2, A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT. https://github.com/kpe/bert-for-tf2,
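
As a loading sketch for the RoBERTa weights shipped with pytorch/fairseq, following the torch.hub usage documented in the fairseq README (entry-point names may differ across releases):

    import torch

    # fetches the roberta.base release via torch.hub on first use
    roberta = torch.hub.load('pytorch/fairseq', 'roberta.base')
    roberta.eval()

    tokens = roberta.encode('Hello world!')      # BPE-encode a sentence to an id tensor
    features = roberta.extract_features(tokens)  # last-layer representations
    print(features.shape)                        # (1, number_of_tokens, 768)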

other resources for BERT:

  1. brightmart/bert_language_understanding, Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN,

  2. Y1ran/NLP-BERT--ChineseVersion, Google's NLP model BERT: paper analysis and Python code,

  3. yangbisheng2009/cn-bert, BERT applied to Chinese NLP, grammar checking,

  4. JayYip/bert-multiple-gpu, a multiple-GPU support version of BERT,

  5. HighCWu/keras-bert-tpu, implementation of BERT that can load the official pre-trained models for feature extraction and prediction on TPU,

  6. Willyoung2017/Bert_Attempt, PyTorch pre-trained BERT,

  7. Pydataman/bert_examples, some examples of BERT: run_classifier.py implements the Quora Insincere Questions Classification binary-classification competition on top of Google's BERT; run_ner.py is a named-entity-recognition example built on BERT using data from season one of the Ruijin Hospital AI contest,

  8. guotong1988/BERT-chinese, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, for Chinese,

  9. zhongyunuestc/bert_multitask, multi-task BERT,

  10. Microsoft/AzureML-BERT, end-to-end walkthrough for fine-tuning BERT using Azure Machine Learning,

  11. bigboNed3/bert_serving, export a BERT model for serving,

  12. yoheikikuta/bert-japanese, BERT with SentencePiece for Japanese text,

  13. whqwill/seq2seq-keyphrase-bert, add BERT to the encoder part of https://github.com/memray/seq2seq-keyphrase-pytorch,

  14. algteam/bert-examples, bert-demo,

  15. cedrickchee/awesome-bert-nlp, a curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning,

  16. cnfive/cnbert, BERT code annotated with Chinese comments,

  17. brightmart/bert_customized, BERT with customized features,

  18. JayYip/bert-multitask-learning, BERT for multi-task learning,

  19. yuanxiaosc/BERT_Paper_Chinese_Translation, Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". https://yuanxiaosc.github.io/2018/12/…,

  20. yaserkl/BERTvsULMFIT, comparing text-classification results using BERT embeddings and ULMFiT embeddings,

  21. 1234560o/Bert-model-code-interpretation, a walkthrough of the data flow in modeling.py of the TensorFlow BERT implementation,

  22. cdathuraliya/bert-inference, a helper class for Google BERT (Devlin et al., 2018) to support online prediction and model pipelining,

  23. gameofdimension/java-bert-predict, turn a BERT pre-training checkpoint into a SavedModel for a feature-extraction demo in Java,

domain-specific BERT:

  1. allenai/scibert, a BERT model for scientific text (see the loading sketch after this list). https://arxiv.org/abs/1903.10676,

  2. MeRajat/SolvingAlmostAnythingWithBert, BioBERT in PyTorch,

  3. kexinhuang12345/clinicalBERT, ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission https://arxiv.org/abs/1904.05342,

  4. EmilyAlsentzer/clinicalBERT, repository for publicly available clinical BERT embeddings,
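
The domain-specific models above are distributed as ordinary BERT weights, so they load through the same APIs as the stock models. A sketch assuming a PyTorch release of SciBERT has been downloaded and unpacked into a local directory (the path below is hypothetical):

    from pytorch_pretrained_bert import BertTokenizer, BertModel

    # hypothetical local directory holding bert_config.json,
    # pytorch_model.bin and vocab.txt from the SciBERT release
    weights_dir = './scibert_scivocab_uncased/'

    tokenizer = BertTokenizer.from_pretrained(weights_dir)
    model = BertModel.from_pretrained(weights_dir)
    model.eval()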

BERT Deploy Tricks:

