🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
🔍 AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) into pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search, or conversational agent chatbots.
Natural Language Processing Tutorial for Deep Learning Researchers
Hung-yi Lee's Deep Learning Tutorial (recommended by Prof. Hung-yi Lee 👍, known as the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting wide-range of NLP tasks from research to industrial applications, including 🗂Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis etc.
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
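The idea behind whole-word masking is that when any WordPiece sub-token of a word is selected for masking, every sub-token of that word is masked together, so the model must predict the whole word from context. A minimal stdlib-only sketch of the grouping-and-masking step (the function name and the simplified per-word masking rule are illustrative, not the repository's actual code):

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Illustrative whole-word masking over WordPiece tokens.

    A token starting with "##" continues the previous word, so sub-token
    indices are first grouped into whole words; each word is then masked
    as a unit with probability mask_prob.
    """
    rng = random.Random(seed)
    words = []  # each entry: list of token indices forming one word
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    out = list(tokens)
    for idxs in words:
        if rng.random() < mask_prob:
            for i in idxs:  # mask every piece of the chosen word
                out[i] = mask_token
    return out
```

For example, with `mask_prob=1.0` the input `["play", "##ing", "foot", "##ball"]` becomes four `[MASK]` tokens; a plain per-token scheme could instead mask `##ing` while leaving `play` visible, which makes the prediction task trivially easy.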
Large-scale Chinese corpus for NLP
This repository contains demos I made with the Transformers library by HuggingFace.
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
PyTorch implementation of Google AI's 2018 BERT
Leveraging BERT and c-TF-IDF to create easily interpretable topics.
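c-TF-IDF is a class-based variant of TF-IDF: documents in each topic are concatenated into one pseudo-document, and a term's weight is its in-class frequency scaled by log(1 + A / f), where A is the average token count per class and f is the term's total frequency across all classes. A stdlib-only sketch of that scoring (the function name and input format are assumptions for illustration; the actual library implements this over sparse matrices):

```python
import math
from collections import Counter

def c_tf_idf(classes):
    """Illustrative class-based TF-IDF.

    classes: dict mapping class name -> list of tokens (all documents of
    the class concatenated). Returns, per class, a dict of term scores
    tf * log(1 + A / f), so terms frequent in one class but rare overall
    score highest and serve as interpretable topic keywords.
    """
    counts = {c: Counter(toks) for c, toks in classes.items()}
    total_freq = Counter()  # f: term frequency across all classes
    for cnt in counts.values():
        total_freq.update(cnt)
    avg_len = sum(len(t) for t in classes.values()) / len(classes)  # A
    scores = {}
    for c, cnt in counts.items():
        n = sum(cnt.values())
        scores[c] = {w: (tf / n) * math.log(1 + avg_len / total_freq[w])
                     for w, tf in cnt.items()}
    return scores
```

Taking the top-scoring terms of each class then yields a short, human-readable keyword description of that topic.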
Transformer-related optimizations, including BERT and GPT
TensorFlow solution for the NER task using a BiLSTM-CRF model with Google BERT fine-tuning, plus private server services
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Must-read papers on prompt-based tuning for pre-trained language models.