
SystemError: ffi_prep_closure(): bad user_data (it seems that the version of the libffi library seen at runtime is different from the 'ffi.h' file seen at compile-time) #154

Open
natea opened this issue Apr 14, 2023 · 1 comment


natea commented Apr 14, 2023

I'm getting this error when I try to run the command scrapy crawl article_spider -a id=1 -a do_action=yes

2023-04-13 11:14:39 [dds] INFO:
2023-04-13 11:14:39 [dds] INFO: ======================================================================================
2023-04-13 11:14:39 [dds] INFO: Scraping data from page 1(0).
2023-04-13 11:14:39 [dds] INFO: URL     : http://en.wikinews.org/wiki/Main_Page
2023-04-13 11:14:39 [dds] INFO: ======================================================================================
2023-04-13 11:14:39 [scrapy.core.scraper] ERROR: Error downloading <GET https://en.wikinews.org/wiki/Main_Page>
Traceback (most recent call last):
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 1658, in _inlineCallbacks
    cast(Failure, result).throwExceptionIntoGenerator, gen
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 63, in run
    return f(*args, **kwargs)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/python/failure.py", line 500, in throwExceptionIntoGenerator
    return g.throw(self.type, self.value, self.tb)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
    defer.returnValue((yield download_func(request=request,spider=spider)))
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/utils/defer.py", line 45, in mustbe_deferred
    result = f(*args, **kw)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/core/downloader/handlers/__init__.py", line 65, in download_request
    return handler.download_request(request, spider)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/core/downloader/handlers/http11.py", line 67, in download_request
    return agent.download_request(request)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/core/downloader/handlers/http11.py", line 331, in download_request
    method, to_bytes(url, encoding='ascii'), headers, bodyproducer)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/web/client.py", line 1753, in request
    endpoint = self._getEndpoint(parsedURI)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/web/client.py", line 1737, in _getEndpoint
    return self._endpointFactory.endpointForURI(uri)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/web/client.py", line 1609, in endpointForURI
    uri.host, uri.port
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/core/downloader/contextfactory.py", line 59, in creatorForNetloc
    return ScrapyClientTLSOptions(hostname.decode("ascii"), self.getContext())
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/scrapy/core/downloader/contextfactory.py", line 56, in getContext
    return self.getCertificateOptions().getContext()
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/internet/_sslverify.py", line 1632, in getContext
    self._context = self._makeContext()
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/twisted/internet/_sslverify.py", line 1663, in _makeContext
    ctx.set_verify(verifyFlags, _verifyCallback)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/OpenSSL/SSL.py", line 1128, in set_verify
    self._verify_helper = _VerifyHelper(callback)
  File "/Users/nateaune/.pyenv/versions/3.6.15/envs/venv/lib/python3.6/site-packages/OpenSSL/SSL.py", line 360, in __init__
    "int (*)(int, X509_STORE_CTX *)", wrapper
SystemError: ffi_prep_closure(): bad user_data (it seems that the version of the libffi library seen at runtime is different from the 'ffi.h' file seen at compile-time)
2023-04-13 11:14:39 [scrapy.core.engine] INFO: Closing spider (finished)
2023-04-13 11:14:39 [dds] INFO: Closing Django DB connection.
2023-04-13 11:14:39 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/exception_count': 1,
 'downloader/exception_type_count/builtins.SystemError': 1,
 'downloader/request_bytes': 487,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 2,
 'downloader/response_bytes': 527,
 'downloader/response_count': 1,
 'downloader/response_status_count/301': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2023, 4, 13, 16, 14, 39, 547975),
 'log_count/ERROR': 1,
 'log_count/INFO': 16,
 'memusage/max': 80887808,
 'memusage/startup': 80883712,
 'scheduler/dequeued': 2,
 'scheduler/dequeued/memory': 2,
 'scheduler/enqueued': 2,
 'scheduler/enqueued/memory': 2,
 'start_time': datetime.datetime(2023, 4, 13, 16, 14, 39, 142477)}
2023-04-13 11:14:39 [scrapy.core.engine] INFO: Spider closed (finished)

Previously I was getting SSL errors, and the recommendations on Stack Overflow were to pin the following packages:

$ pip install pyopenssl==22.0.0
$ pip install "cryptography<38"

(The quotes keep the shell from treating <38 as an output redirection.)
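Before force-reinstalling anything, it can help to confirm which versions of these packages the active interpreter actually sees. A sketch using only the standard library (note: importlib.metadata needs Python 3.8+, so on the 3.6 environment above `pip show cffi` gives the same information):

```python
# Report the installed versions of the packages involved in this error.
from importlib.metadata import version, PackageNotFoundError

def installed_version(name):
    """Return the installed version string for `name`, or None if absent."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None

for pkg in ("cffi", "cryptography", "pyOpenSSL"):
    print(pkg, "->", installed_version(pkg))
```

If the printed versions don't match what you think you installed, the spider may be running under a different virtualenv than the one you patched.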

The recommended fix on Stack Overflow for the ffi_prep_closure() SystemError was to reinstall cffi from source rather than from a prebuilt wheel:

$ pip install --force-reinstall --no-binary :all: cffi

But I tried this, and I'm still getting the error. Any ideas of other things to try?
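The error message says the libffi seen at runtime differs from the ffi.h seen when cffi was compiled. One way to see which libffi the dynamic loader resolves at runtime (a diagnostic sketch, not something from this thread):

```python
# Ask the platform's dynamic loader which libffi it would load at runtime.
# On macOS this is typically a full path; on Linux a soname like
# "libffi.so.8"; None means the loader could not locate it by that name.
import ctypes.util

runtime_libffi = ctypes.util.find_library("ffi")
print("libffi resolved at runtime:", runtime_libffi)
```

If this points at a Homebrew- or system-provided libffi that differs from the headers cffi was compiled against, one hedged next step would be to point the build at matching headers (e.g. via CFLAGS/LDFLAGS) before repeating the --no-binary reinstall.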


dataknower commented Apr 14, 2023 via email
