**Texar** is an open-source toolkit based on TensorFlow that aims to support a broad set of machine learning tasks, especially **text generation tasks** such as machine translation, dialog, summarization, content manipulation, and language modeling. Texar is designed for both researchers and practitioners for fast prototyping and experimentation.
*If you work with PyTorch, be sure to check out **[Texar-PyTorch](https://github.com/asyml/texar-pytorch)**, which has (mostly) the **same functionalities and interfaces**.*
With the design goals of **modularity, versatility, and extensibility** in mind, Texar extracts the common patterns underlying the diverse tasks and methodologies, creates a library of highly reusable modules and functionalities, and facilitates **arbitrary model architectures and algorithmic paradigms**, e.g.,
* encoder(s) to decoder(s), sequential- and self-attentions, memory, hierarchical models, classifiers...
Users can construct their own models at a high conceptual level just like assembling building blocks.
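This building-block style can be sketched in plain Python. The classes below are a hypothetical illustration only, not Texar's actual API: the point is that modules are callable objects that can be freely composed into an encoder-decoder pipeline.

```python
# Illustrative sketch only: hypothetical "building block" modules, not
# Texar's real classes. Each module is a callable, so composition is
# just function application.

class Embedder:
    def __init__(self, vocab_size, dim):
        self.vocab_size, self.dim = vocab_size, dim

    def __call__(self, token_ids):
        # Stand-in for an embedding lookup: one dim-sized vector per token.
        return [[float(t)] * self.dim for t in token_ids]

class Encoder:
    def __call__(self, embeddings):
        # Stand-in for sequence encoding: mean-pool the token vectors.
        n = len(embeddings)
        return [sum(col) / n for col in zip(*embeddings)]

class Decoder:
    def __call__(self, state, max_len):
        # Stand-in for autoregressive decoding from the encoded state.
        return [round(sum(state))] * max_len

# Assemble the blocks into a model pipeline.
embedder = Embedder(vocab_size=100, dim=4)
encoder, decoder = Encoder(), Decoder()
outputs = decoder(encoder(embedder([3, 5, 7])), max_len=2)
```

Swapping in a different encoder or decoder changes the architecture without touching the rest of the pipeline, which is the reuse pattern the toolkit is built around.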
* **Extensibility**. It is straightforward to integrate any user-customized, external modules. Texar is also fully compatible with the native TensorFlow interfaces and can take advantage of the rich TensorFlow features and resources from the vibrant open-source community.
* Interfaces with different functionality levels. Users can customize a model through 1) simple **Python/YAML configuration files** of provided model templates/examples; 2) programming with **Python Library APIs** for maximal customizability.
* Easy-to-use APIs: 1) Convenient automatic variable re-use---no worry about the complicated TF variable scopes; 2) PyTorch-like callable modules; 3) Rich configuration options for each module, all with default values; ...
* **Pretrained Models** such as **BERT**, **GPT2**, and more!
* Well-structured high-quality code of uniform design patterns and consistent styles.
* Clean, detailed [documentation](https://texar.readthedocs.io) and rich [examples](./examples).
* **Distributed model training** with multiple GPUs.
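The configuration interface mentioned above can be sketched as follows. The keys and defaults here are hypothetical, not a real Texar hparams schema; the idea is that every module ships with complete defaults, and a user configuration only needs to override the options that matter.

```python
# Sketch of configuring a module by overriding defaults (hypothetical
# keys and values, not an actual Texar hparams schema).

DEFAULT_HPARAMS = {"num_units": 256, "dropout": 0.1, "activation": "relu"}

def make_hparams(user_config=None):
    """Merge user-supplied options over the module's defaults."""
    hparams = dict(DEFAULT_HPARAMS)  # copy, so defaults stay untouched
    hparams.update(user_config or {})
    return hparams

# Only the options you care about need to be specified.
hparams = make_hparams({"num_units": 512})
```

The same override-the-defaults pattern applies whether the user configuration comes from Python code or from a YAML configuration file, as in the interfaces described above.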
To install Texar from source:

```
pip install -e .
```
* [Documentation](https://texar.readthedocs.io)
### Reference
If you use Texar, please cite the [tech report](https://arxiv.org/abs/1809.00794) with the following BibTeX entry:
```
@article{hu2018texar,
  title={Texar: A Modularized, Versatile, and Extensible Toolkit for Text Generation},
  author={Hu, Zhiting and Shi, Haoran and Tan, Bowen and Wang, Wentao and Yang, Zichao and Zhao, Tiancheng and He, Junxian and Qin, Lianhui and Wang, Di and others},
  journal={arXiv preprint arXiv:1809.00794},
  year={2018}
}
```