This repository was archived by the owner on Jan 15, 2024. It is now read-only.

Problems not being imported from colab #1596

@stratosphere1492

Description


!pip install mxnet gluonnlp pandas tqdm
!pip install sentencepiece
!pip install transformers
!pip install torch
!pip install git+https://git@github.com/SKTBrain/KoBERT.git@master
import gluonnlp as nlp

Error Message


ImportError Traceback (most recent call last)
in <cell line: 6>()
4 import torch.optim as optim
5 from torch.utils.data import Dataset, DataLoader
----> 6 import gluonnlp as nlp
7 import numpy as np
8 from tqdm import tqdm, tqdm_notebook

6 frames
/usr/local/lib/python3.10/dist-packages/gluonnlp/__init__.py in
23
24 from . import loss
---> 25 from . import data
26 from . import embedding
27 from . import model

/usr/local/lib/python3.10/dist-packages/gluonnlp/data/__init__.py in
21 import os
22
---> 23 from . import (batchify, candidate_sampler, conll, corpora, dataloader,
24 dataset, question_answering, registry, sampler, sentiment,
25 stream, super_glue, transforms, translation, utils,

/usr/local/lib/python3.10/dist-packages/gluonnlp/data/corpora/__init__.py in
19 """Corpora."""
20
---> 21 from . import (google_billion_word, large_text_compression_benchmark, wikitext)
22
23 from .google_billion_word import *

/usr/local/lib/python3.10/dist-packages/gluonnlp/data/corpora/google_billion_word.py in
32 from ..._constants import EOS_TOKEN
33 from ...base import get_home_dir
---> 34 from ...vocab import Vocab
35 from ..dataset import CorpusDataset
36 from ..stream import SimpleDatasetStream

/usr/local/lib/python3.10/dist-packages/gluonnlp/vocab/__init__.py in
19 """Vocabulary."""
20
---> 21 from . import bert, elmo, subwords, vocab
22 from .bert import *
23 from .elmo import *

/usr/local/lib/python3.10/dist-packages/gluonnlp/vocab/bert.py in
22 import os
23
---> 24 from ..data.transforms import SentencepieceTokenizer
25 from ..data.utils import count_tokens
26 from .vocab import Vocab

/usr/local/lib/python3.10/dist-packages/gluonnlp/data/transforms.py in
46 from ..vocab.vocab import Vocab
47 from .utils import _extract_archive
---> 48 from .fast_bert_tokenizer import is_control, is_punctuation, is_whitespace
49 from .fast_bert_tokenizer import BasicTokenizer, WordpieceTokenizer
50

ImportError: /usr/local/lib/python3.10/dist-packages/gluonnlp/data/fast_bert_tokenizer.cpython-310-x86_64-linux-gnu.so: undefined symbol: _PyGen_Send


NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.

To view examples of installing some common dependencies, click the
"Open Examples" button below.

To Reproduce

Run the install and import commands listed above in a Colab notebook.

Steps to reproduce

(The commands are pasted under Description above.)

What have you tried to solve it?

Environment

Metadata

Labels: bug (Something isn't working)