Running pytest gives me this:
============================= test session starts ==============================
platform darwin -- Python 3.7.4, pytest-5.0.1, py-1.8.0, pluggy-0.12.0
rootdir: /Users/abe/Documents/superconductive/tools/great_expectations, inifile: pytest.ini
plugins: cov-2.7.1
collected 0 items / 68 errors
==================================== ERRORS ====================================
__________________ ERROR collecting tests/test_autoinspect.py __________________
tests/conftest.py:54: in build_test_backends_list
from pyspark.sql import SparkSession
E ModuleNotFoundError: No module named 'pyspark'
During handling of the above exception, another exception occurred:
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/hooks.py:289: in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:87: in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:81: in <lambda>
firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/_pytest/python.py:225: in pytest_pycollect_makeitem
res = list(collector._genfunctions(name, obj))
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/_pytest/python.py:401: in _genfunctions
self.ihook.pytest_generate_tests(metafunc=metafunc)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/hooks.py:289: in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:87: in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:81: in <lambda>
firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
tests/conftest.py:95: in pytest_generate_tests
test_backends = build_test_backends_list(metafunc)
tests/conftest.py:56: in build_test_backends_list
raise ValueError("spark tests are requested, but pyspark is not installed")
E ValueError: spark tests are requested, but pyspark is not installed
____________________ ERROR collecting tests/test_configs.py ____________________
tests/conftest.py:54: in build_test_backends_list
from pyspark.sql import SparkSession
E ModuleNotFoundError: No module named 'pyspark'
During handling of the above exception, another exception occurred:
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/hooks.py:289: in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:87: in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:81: in <lambda>
firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/_pytest/python.py:225: in pytest_pycollect_makeitem
res = list(collector._genfunctions(name, obj))
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/_pytest/python.py:401: in _genfunctions
self.ihook.pytest_generate_tests(metafunc=metafunc)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/hooks.py:289: in __call__
return self._hookexec(self, self.get_hookimpls(), kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:87: in _hookexec
return self._inner_hookexec(hook, methods, kwargs)
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/pluggy/manager.py:81: in <lambda>
firstresult=hook.spec.opts.get("firstresult") if hook.spec else False,
tests/conftest.py:95: in pytest_generate_tests
test_backends = build_test_backends_list(metafunc)
tests/conftest.py:56: in build_test_backends_list
raise ValueError("spark tests are requested, but pyspark is not installed")
E ValueError: spark tests are requested, but pyspark is not installed
___________________ ERROR collecting tests/test_ge_utils.py ____________________
...
...
...
_________ ERROR collecting tests/test_definitions/test_expectations.py _________
tests/conftest.py:54: in build_test_backends_list
from pyspark.sql import SparkSession
E ModuleNotFoundError: No module named 'pyspark'
During handling of the above exception, another exception occurred:
tests/test_definitions/test_expectations.py:34: in pytest_generate_tests
for c in build_test_backends_list(metafunc):
tests/conftest.py:56: in build_test_backends_list
raise ValueError("spark tests are requested, but pyspark is not installed")
E ValueError: spark tests are requested, but pyspark is not installed
=============================== warnings summary ===============================
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/boto/plugin.py:40
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/boto/plugin.py:40: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/jose/jws.py:6
/Users/abe/anaconda2/envs/py3/lib/python3.7/site-packages/jose/jws.py:6: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated, and in 3.8 it will stop working
from collections import Mapping, Iterable
-- Docs: https://docs.pytest.org/en/latest/warnings.html
!!!!!!!!!!!!!!!!!!! Interrupted: 68 errors during collection !!!!!!!!!!!!!!!!!!!
==================== 2 warnings, 68 error in 11.69 seconds =====================
In the past, I've toggled the spark tests by commenting out parts of tests/conftest.py. Is there a better way to do this?
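One pattern that avoids editing conftest.py is to register a command-line flag and branch on it when building the backend list. The sketch below is illustrative, not Great Expectations' actual API: the `--no-spark` flag name, the backend identifiers, and the `test_backend` fixture name are all assumptions.

```python
# Sketch of a conftest.py pattern for toggling Spark-backed tests from the CLI.
# The --no-spark flag and the backend names ("pandas", "spark") are
# illustrative assumptions, not the project's real option names.


def pytest_addoption(parser):
    # Registers a flag so Spark tests can be switched off without code edits:
    #   pytest --no-spark
    parser.addoption(
        "--no-spark",
        action="store_true",
        help="Skip backends that require pyspark",
    )


def build_test_backends_list(include_spark=True):
    # Always include the pandas backend; only attempt the pyspark import
    # when Spark tests are actually requested.
    backends = ["pandas"]
    if include_spark:
        try:
            from pyspark.sql import SparkSession  # noqa: F401
        except ModuleNotFoundError:
            raise ValueError(
                "spark tests are requested, but pyspark is not installed "
                "(run with --no-spark to skip them)"
            )
        backends.append("spark")
    return backends


def pytest_generate_tests(metafunc):
    # Read the flag once at collection time and parametrize accordingly,
    # so collection no longer fails when pyspark is absent.
    if "test_backend" in metafunc.fixturenames:
        include_spark = not metafunc.config.getoption("--no-spark")
        metafunc.parametrize("test_backend", build_test_backends_list(include_spark))
```

With something like this in place, `pytest --no-spark` would collect and run everything except the Spark-backed parametrizations, and a plain `pytest` run would still fail loudly (rather than silently skipping) when Spark tests are requested but pyspark is missing.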