This repository was archived by the owner on Nov 8, 2022. It is now read-only.
+ * [Knowledge Distillation using Transformers](https://intellabs.github.io/nlp-architect/transformers_distillation.html)
+ * [Sparse and Quantized Neural Machine Translation (GNMT)](https://intellabs.github.io/nlp-architect/sparse_gnmt.html)

Solutions (End-to-end applications) using one or more models:

- * [Term Set expansion](http://nlp_architect.nervanasys.com/term_set_expansion.html) - uses the included word chunker as a noun phrase extractor and NP2Vec to create semantic term sets
- * [Topics and trend analysis](http://nlp_architect.nervanasys.com/trend_analysis.html) - analyzing trending phrases in temporal corpora
- * [Aspect Based Sentiment Analysis (ABSA)](http://nlp_architect.nervanasys.com/absa_solution.html)
+ * [Term Set expansion](https://intellabs.github.io/nlp-architect/term_set_expansion.html) - uses the included word chunker as a noun phrase extractor and NP2Vec to create semantic term sets
+ * [Topics and trend analysis](https://intellabs.github.io/nlp-architect/trend_analysis.html) - analyzing trending phrases in temporal corpora
+ * [Aspect Based Sentiment Analysis (ABSA)](https://intellabs.github.io/nlp-architect/absa_solution.html)

## Documentation

- Full library [documentation](http://nlp_architect.nervanasys.com/) of NLP models, algorithms, solutions and instructions
- on how to run each model can be found on our [website](http://nlp_architect.nervanasys.com/).
+ Full library [documentation](https://intellabs.github.io/nlp-architect/) of NLP models, algorithms, solutions and instructions
+ on how to run each model can be found on our [website](https://intellabs.github.io/nlp-architect/).

## NLP Architect library design philosophy

@@ -176,7 +176,7 @@ and NLP research communities are more than welcome.

Contact the NLP Architect development team through Github issues or
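The Term Set expansion entry above chains the library's word chunker (as a noun-phrase extractor) with NP2Vec, which trains embeddings over a corpus in which each noun phrase is collapsed into a single token. A minimal, library-free sketch of that preprocessing step is below; the helper name and phrase lists are illustrative, not the library's actual API:

```python
def merge_noun_phrases(tokens, noun_phrases):
    """Collapse each extracted noun phrase into one underscore-joined
    token, so a downstream embedding model treats the whole phrase as
    a single vocabulary item (the idea behind NP2Vec preprocessing)."""
    phrases = {tuple(p.lower().split()) for p in noun_phrases}
    max_len = max((len(p) for p in phrases), default=1)
    out, i = [], 0
    while i < len(tokens):
        for n in range(max_len, 1, -1):  # try longest phrases first
            cand = tuple(t.lower() for t in tokens[i:i + n])
            if len(cand) == n and cand in phrases:
                out.append("_".join(cand))
                i += n
                break
        else:
            out.append(tokens[i].lower())
            i += 1
    return out

sent = "The quick brown fox jumped over the lazy dog".split()
print(merge_noun_phrases(sent, ["quick brown fox", "lazy dog"]))
# ['the', 'quick_brown_fox', 'jumped', 'over', 'the', 'lazy_dog']
```

After this merge, an off-the-shelf word2vec implementation can be trained on the transformed corpus, and nearest neighbors of a seed phrase token give candidate expansions of the term set.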
docs-source/source/absa_solution.rst (+1 -1)
@@ -36,7 +36,7 @@ For more details see :doc:`ABSA <absa>`.

Installing and Running
======================
- For instructions on running the end-to-end solution UI, see here: `ABSA Solution README <https://github.com/NervanaSystems/nlp-architect/tree/master/solutions/absa_solution#setup>`__
+ For instructions on running the end-to-end solution UI, see here: `ABSA Solution README <https://github.com/IntelLabs/nlp-architect/tree/master/solutions/absa_solution#setup>`__
docs-source/source/tutorials.rst (+7 -7)
@@ -45,10 +45,10 @@ If you don't have Jupyter installed, follow these instructions to install:

List of tutorials
-----------------
- - `Natural Language Question/Answering Systems <https://github.com/NervanaSystems/nlp-architect/blob/master/tutorials/Question_Answering/Natural_Language_Question_Answer_Systems.ipynb>`_
- - `Term set expansion <https://github.com/NervanaSystems/private-nlp-architect/blob/master/tutorials/Term_Set_Expansion/term_set_expansion.ipynb>`_
+ - `Natural Language Question/Answering Systems <https://github.com/IntelLabs/nlp-architect/blob/master/tutorials/Question_Answering/Natural_Language_Question_Answer_Systems.ipynb>`_
examples/chunker/README.md (+1 -1)
@@ -15,7 +15,7 @@ In this example the sentence can be divided into 4 phrases, `The quick brown fox

## Documentation

This model is based on the paper: [Deep multi-task learning with low level tasks supervised at lower layers](http://anthology.aclweb.org/P16-2038). \
- Full documentation of this example and the neural network model can be found here: [http://nlp_architect.nervanasys.com/chunker.html](http://nlp_architect.nervanasys.com/chunker.html)
+ Full documentation of this example and the neural network model can be found here: [https://intellabs.github.io/nlp-architect/chunker.html](https://intellabs.github.io/nlp-architect/chunker.html)
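The chunker README's running example ("The quick brown fox jumped over the lazy dog", divided into 4 phrases) comes from a model that emits one BIO chunk tag per token. A small sketch of decoding such tags back into phrases; the tag sequence here is hand-written to match the example, not actual model output:

```python
def bio_to_phrases(tokens, tags):
    """Group tokens into phrases from BIO chunk tags: B-X begins a
    chunk of type X, I-X continues it, O is outside any chunk."""
    phrases, current, ctype = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                phrases.append((" ".join(current), ctype))
            current, ctype = [tok], tag[2:]
        elif tag.startswith("I-") and current:
            current.append(tok)
        else:  # "O", or a stray I- tag with no open chunk
            if current:
                phrases.append((" ".join(current), ctype))
            current, ctype = [], None
    if current:
        phrases.append((" ".join(current), ctype))
    return phrases

tokens = "The quick brown fox jumped over the lazy dog".split()
tags = ["B-NP", "I-NP", "I-NP", "I-NP", "B-VP", "B-PP", "B-NP", "I-NP", "I-NP"]
print(bio_to_phrases(tokens, tags))
# [('The quick brown fox', 'NP'), ('jumped', 'VP'), ('over', 'PP'), ('the lazy dog', 'NP')]
```

Decoding these tags yields exactly the 4 phrases the README describes: a noun phrase, a verb phrase, a prepositional phrase, and a final noun phrase.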
examples/sparse_gnmt/README.md (+1 -1)
@@ -26,7 +26,7 @@ You can use the script [wmt16_en_de.sh](https://github.com/tensorflow/nmt/blob/m

## Results & Pre-Trained Models

The following table presents some of our experiments and results.
- Furthermore, some of our pre-trained models are offered in the form of checkpoints in our [Model Zoo](http://nlp_architect.nervanasys.com/model_zoo.html).
+ Furthermore, some of our pre-trained models are offered in the form of checkpoints in our [Model Zoo](https://intellabs.github.io/nlp-architect/model_zoo.html).
You can use these models to [Run Inference with Pre-Trained Model](#run-inference-with-pre-trained-model) and evaluate them.

| Model | Sparsity | BLEU | Non-Zero Parameters | Data Type |
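The table's Sparsity and Non-Zero Parameters columns report the same count two ways: sparsity is the fraction of model weights that pruning has driven to zero. A minimal sketch of that metric, using a tiny stand-in weight list rather than real model checkpoints:

```python
def sparsity_stats(weights, eps=1e-12):
    """Return (sparsity, non_zero): sparsity is the fraction of
    weights whose magnitude is effectively zero after pruning."""
    total = len(weights)
    non_zero = sum(1 for w in weights if abs(w) > eps)
    return (total - non_zero) / total, non_zero

weights = [0.0, 0.31, 0.0, 0.0, 0.0, 0.0, 0.12, 0.0]
sparsity, nnz = sparsity_stats(weights)
print(f"sparsity={sparsity:.0%}, non-zero={nnz}")
# sparsity=75%, non-zero=2
```

For a real checkpoint the same count would run over every weight tensor; a 90%-sparse model keeps only one tenth of its parameters non-zero, which is what makes the compressed checkpoints in the Model Zoo attractive.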