 

AttributeError: type object 'spacy.syntax.nn_parser.array' has no attribute '__reduce_cython__' (adding paths to virtual environments)

Overall problem

I am working on an NLP project and want to use spaCy. But when I try to load the language for an nlp object, I keep running into this error:

AttributeError: type object 'spacy.syntax.nn_parser.array' has no attribute '__reduce_cython__'

Code:

    import spacy

    nlp = spacy.load('en')
    test = nlp('many people like laughing while they are running')
    for word in test:
        print(word.text, word.lemma)

I am not sure, but the problem could have something to do with the virtual environment I am working in. One solution I found suggested to "add the spaCy path to PYTHONPATH in virtualenv".

So my two actual questions are: 1) Where do you think my problem is? 2) If you think the problem has something to do with the virtual environment, how do I add the spaCy path to PYTHONPATH in virtualenv?

Thank you in advance for the help

Background Info:

I am a beginner, so I don't know much about Stack Overflow, virtual environments, or what information you need to understand my problem. This is what I can give you:

I am following this tutorial: https://github.com/bhargavvader/personal/tree/master/notebooks/text_analysis_tutorial

My environment:

Operating System: Linux Mint 19.1 Cinnamon
Python Version Used: Python 3.7.1
spaCy Version Used: 2.1.3

I am using Python through Anaconda.
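In case it helps, this is the kind of check I can run to see which interpreter and which spaCy installation the notebook actually uses (just a minimal sketch with the standard library; it assumes that import spacy itself still works):

    import sys

    # which Python the notebook kernel is running, and the root of its environment
    print(sys.executable)
    print(sys.prefix)

    import spacy

    # where spaCy is imported from and which version it is
    print(spacy.__file__)
    print(spacy.__version__)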

What I have done so far

1) I uninstalled and reinstalled spaCy.

2) I checked out the spaCy files.

As far as I understand it, this is the part of the error log where the mistake occurs:

----> 4 from .pipes import Tagger, DependencyParser, EntityRecognizer

So I looked through my spaCy folder to check out the pipes module, but I couldn't find a place where Tagger, DependencyParser, or EntityRecognizer refer to '__reduce_cython__'.
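For reference, this is roughly how I located those installed files (a small sketch; the exact paths will of course differ on other machines, and it again assumes import spacy works):

    import os
    import spacy

    # folder that the running interpreter imports spaCy from
    spacy_dir = os.path.dirname(spacy.__file__)
    print(spacy_dir)

    # the two places named in the traceback
    print(os.listdir(os.path.join(spacy_dir, "pipeline")))  # contains the compiled pipes module
    print(os.listdir(os.path.join(spacy_dir, "syntax")))    # contains nn_parser.cpython-37m-...so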

3) I searched for the error message on the internet:

To my understanding, the similar questions that have already been asked did not help with my problem.

The only question that was similar to my problem is the following: https://github.com/explosion/spaCy/issues/2439

Their solution was "adding spaCy path to PYTHONPATH in virtualenv"

So I searched for how to add paths to the PYTHONPATH and found this question: How do I add a path to PYTHONPATH in virtualenv

Yet I don't quite understand the answers there, and I am still not sure that this is even the problem. So if you know the answer, or could give me some guidance on how to continue figuring this out, I'd be relieved.
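For what it's worth, this is how I currently understand that suggestion: the in-code equivalent of putting a directory on PYTHONPATH is to put it on sys.path before importing spaCy. The path below is only a placeholder, not my real one, and I am not sure this is actually the right fix:

    import sys

    # make the interpreter search an extra directory first;
    # this does in code what adding the directory to PYTHONPATH does in the shell
    sys.path.insert(0, "/path/to/venv/lib/python3.7/site-packages")  # placeholder path

    import spacy
    nlp = spacy.load('en')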

Further information:

In case it is of importance: when following the tutorial I mentioned earlier, I ran into the problem of not being able to install the requirements. This is what my terminal gave me:

Could not open requirements file: [Errno 2] No such file or directory: 'REQUIREMENTS_1.txt'

I ignored it because everything worked smoothly at first.

Error log

    AttributeError                            Traceback (most recent call last)
    <ipython-input> in <module>
    ----> 1 nlp = spacy.load('en')
          2 
          3 test = nlp('many people like laughing while they are running')
          4 for word in test:
          5     print(word.text,word.lemma)

    ~/anaconda3/lib/python3.7/site-packages/spacy/__init__.py in load(name, **overrides)
         13 from .glossary import explain
         14 from .about import __version__
    ---> 15 from .errors import Errors, Warnings, deprecation_warning
         16 from . import util
         17 

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model(name, **overrides)
        110     """
        111     if isinstance(path, basestring_):
    --> 112         return Path(path)
        113     else:
        114         return path

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model_from_link(name, **overrides)
        127     if Path(name).exists():  # path to model data directory
        128         return load_model_from_path(Path(name), **overrides)
    --> 129     elif hasattr(name, "exists"):  # Path or Path-like to model data
        130         return load_model_from_path(name, **overrides)
        131     raise IOError(Errors.E050.format(name=name))

    ~/anaconda3/lib/python3.7/site-packages/spacy/data/en/__init__.py in load(**overrides)
         10 
         11 def load(**overrides):
    ---> 12     return load_model_from_init_py(__file__, **overrides)

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model_from_init_py(init_file, **overrides)
        171 def load_model_from_init_py(init_file, **overrides):
        172     """Helper function to use in the load() method of a model package's
    --> 173     __init__.py.
        174 
        175     init_file (unicode): Path to model's __init__.py, i.e. __file__.

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in load_model_from_path(model_path, meta, **overrides)
        141         return cls.load(**overrides)
        142 
    --> 143 
        144 def load_model_from_package(name, **overrides):
        145     """Load a model from an installed package."""

    ~/anaconda3/lib/python3.7/site-packages/spacy/util.py in get_lang_class(lang)
         48     """
         49     global LANGUAGES
    ---> 50     return lang in LANGUAGES
         51 
         52 

    ~/anaconda3/lib/python3.7/importlib/__init__.py in import_module(name, package)
        125             break
        126         level += 1
    --> 127     return _bootstrap._gcd_import(name[level:], package, level)
        128 
        129 

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _gcd_import(name, package, level)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _find_and_load(name, import_)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _load_unlocked(spec)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap_external.py in exec_module(self, module)

    ~/anaconda3/lib/python3.7/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

    ~/anaconda3/lib/python3.7/site-packages/spacy/lang/en/__init__.py in <module>
         13 from ..tokenizer_exceptions import BASE_EXCEPTIONS
         14 from ..norm_exceptions import BASE_NORMS
    ---> 15 from ...language import Language
         16 from ...attrs import LANG, NORM
         17 from ...util import update_exc, add_lookups

    ~/anaconda3/lib/python3.7/site-packages/spacy/language.py in <module>
         15 from .vocab import Vocab
         16 from .lemmatizer import Lemmatizer
    ---> 17 from .pipeline import DependencyParser, Tensorizer, Tagger, EntityRecognizer
         18 from .pipeline import SimilarityHook, TextCategorizer, Sentencizer
         19 from .pipeline import merge_noun_chunks, merge_entities, merge_subtokens

    ~/anaconda3/lib/python3.7/site-packages/spacy/pipeline/__init__.py in <module>
          2 from __future__ import unicode_literals
          3 
    ----> 4 from .pipes import Tagger, DependencyParser, EntityRecognizer
          5 from .pipes import TextCategorizer, Tensorizer, Pipe, Sentencizer
          6 from .entityruler import EntityRuler

    pipes.pyx in init spacy.pipeline.pipes()

    ~/anaconda3/lib/python3.7/site-packages/spacy/syntax/nn_parser.cpython-37m-x86_64-linux-gnu.so in init spacy.syntax.nn_parser()

    AttributeError: type object 'spacy.syntax.nn_parser.array' has no attribute '__reduce_cython__'

asked Apr 10 '19 by schedoozle


1 Answer

If you're running your code on Google Colab, change the runtime to GPU and then try to install spaCy again.
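For example, the reinstall plus a quick check could look roughly like this from inside the notebook (a sketch only: it runs pip through the notebook's own interpreter, assumes the 'en' model or shortcut link is installed, and the runtime should be restarted after the install before re-importing spaCy):

    import subprocess
    import sys

    # reinstall spaCy with the same interpreter that runs this notebook / Colab kernel
    subprocess.run(
        [sys.executable, "-m", "pip", "install", "--force-reinstall", "spacy"],
        check=True,
    )

    # after restarting the runtime: verify that loading English no longer raises the AttributeError
    import spacy
    print(spacy.__version__)
    nlp = spacy.load('en')
    print([word.text for word in nlp('many people like laughing while they are running')])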

answered Nov 19 '22 by Toshali Narula