Mock Version: 3.5
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -bs  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'], chrootPath='/var/lib/mock/dlrn-centos9-master-x86_64-5/root', env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}, shell=False, logger=<mockbuild.trace_decorator.getLog object at 0x7f82c40f6908>, timeout=0, uid=1032, gid=135, user='mockbuild', nspawn_args=[], unshare_net=True, printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: x86_64
Building for target x86_64
Wrote: /builddir/build/SRPMS/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.src.rpm
Child return code was: 0
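
The ENTER/Executing pair above is mock's trace of one command run inside the buildroot: rpmbuild -bs produces the source RPM, executed as the unprivileged mockbuild user (uid=1032, gid=135) with the network unshared. As a rough approximation (mock drives the chroot through its own plumbing, systemd-nspawn when nspawn_args are used; this is only a sketch), the call amounts to:

    chroot /var/lib/mock/dlrn-centos9-master-x86_64-5/root \
        env TERM=vt100 SHELL=/bin/bash HOME=/builddir HOSTNAME=mock \
            PATH=/usr/bin:/bin:/usr/sbin:/sbin LANG=C.UTF-8 \
        bash --login -c '/usr/bin/rpmbuild -bs --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'
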
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'], chrootPath='/var/lib/mock/dlrn-centos9-master-x86_64-5/root', env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}, shell=False, logger=<mockbuild.trace_decorator.getLog object at 0x7f82c40f6908>, timeout=0, uid=1032, gid=135, user='mockbuild', nspawn_args=[], unshare_net=True, raiseExc=False, printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.SuFJX2
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf sahara-plugin-spark-10.0.0
+ /usr/bin/tar -xof -
+ /usr/bin/gzip -dc /builddir/build/SOURCES/sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.tar.gz
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd sahara-plugin-spark-10.0.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ /usr/bin/git init -q
+ /usr/bin/git config user.name rpm-build
+ /usr/bin/git config user.email '<rpm-build>'
+ /usr/bin/git config gc.auto 0
+ /usr/bin/git add --force .
+ /usr/bin/git commit -q --allow-empty -a --author 'rpm-build <rpm-build>' -m 'python-sahara-plugin-spark-10.0.0 base'
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/topology.sh
+ sed -i '/^[[:space:]]*-c{env:.*_CONSTRAINTS_FILE.*/d' tox.ini
+ sed -i 's/^deps = -c{env:.*_CONSTRAINTS_FILE.*/deps =/' tox.ini
+ sed -i '/^minversion.*/d' tox.ini
+ sed -i '/^requires.*virtualenv.*/d' tox.ini
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^doc8.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^doc8.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bandit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bandit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pre-commit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pre-commit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^hacking.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^hacking.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bashate.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bashate.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pylint.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pylint.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^whereto.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^whereto.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' test-requirements.txt
+ RPM_EC=0
++ jobs -p
+ exit 0
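
The repetitive '+ for pkg ...' / '+ sed -i ...' run above is bash xtrace output: the same nested loop is echoed once per iteration with its variables already expanded. Reconstructed from the trace (the spec may spell this through macros, but the logic is exactly what the trace shows), the %prep fragment strips linters and doc tools that are not wanted at RPM build time:

    for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref; do
        for reqfile in doc/requirements.txt test-requirements.txt; do
            if [ -f "$reqfile" ]; then
                # delete any requirement line starting with the package name
                sed -i "/^${pkg}.*/d" "$reqfile"
            fi
        done
    done
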
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.1HjKHa
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(pip) >= 19'
+ echo 'python3dist(packaging)'
+ '[' -f pyproject.toml ']'
+ '[' -f setup.py ']'
+ echo 'python3dist(setuptools) >= 40.8'
+ echo 'python3dist(wheel)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m64 -march=x86-64-v2 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed  -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ RPM_TOXENV=py39,docs
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir --output /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires -t -e py39,docs
Handling setuptools >= 40.8 from default build backend
Requirement satisfied: setuptools >= 40.8
   (installed: setuptools 57.4.0)
Handling wheel from default build backend
Requirement not satisfied: wheel
Exiting dependency generation pass: build backend
+ cat /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires
+ rm -rfv '*.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
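
Everything %generate_buildrequires writes to stdout is collected as a BuildRequires of an interim package, which is why the scriptlet is mostly echo statements followed by the pyproject_buildrequires.py generator. Judging from the generator's '-t -e py39,docs' arguments, the spec most likely carries something like this (an assumption; the exact macro line is not in this log):

    %generate_buildrequires
    %pyproject_buildrequires -t -e py39,docs

where -t pulls test dependencies from tox and -e names the tox environments to inspect.
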
Wrote: /builddir/build/SRPMS/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.buildreqs.nosrc.rpm
Child return code was: 11
Dynamic buildrequires detected
Going to install missing buildrequires. See root.log for details.
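
Return code 11 is not a failure here: with -br, rpmbuild stops after %generate_buildrequires and exits with RPMRC_MISSINGBUILDREQUIRES (11), recording the generated requirements in the *.buildreqs.nosrc.rpm written above; in this log even the final, fully satisfied pass returns 11. Mock then installs whatever is newly required (details in root.log) and retries. The cycle runs three times below: the first pass stops at wheel, the second at tox-current-env, the third at the remaining tox deps (coverage, oslotest, stestr, openstackdocstheme, reno, sphinxcontrib-httpdomain); the fourth finds everything satisfied and mock moves on to the real build. A sketch of the loop, not mock's actual code:

    spec=/builddir/build/SPECS/python-sahara-plugin-spark.spec
    new_deps=1
    while [ "$new_deps" -eq 1 ]; do
        rpmbuild -br --target x86_64 --nodeps "$spec"     # exits 11 after %generate_buildrequires
        dnf builddep -y /builddir/build/SRPMS/*.buildreqs.nosrc.rpm \
            | grep -q Installing || new_deps=0            # nothing new to install => done
    done
    rpmbuild -ba --noprep --target x86_64 --nodeps "$spec"  # full build; %prep already ran
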
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'], chrootPath='/var/lib/mock/dlrn-centos9-master-x86_64-5/root', env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}, shell=False, logger=<mockbuild.trace_decorator.getLog object at 0x7f82c40f6908>, timeout=0, uid=1032, gid=135, user='mockbuild', nspawn_args=[], unshare_net=True, raiseExc=False, printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.AKfjlk
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf sahara-plugin-spark-10.0.0
+ /usr/bin/gzip -dc /builddir/build/SOURCES/sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.tar.gz
+ /usr/bin/tar -xof -
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd sahara-plugin-spark-10.0.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ /usr/bin/git init -q
+ /usr/bin/git config user.name rpm-build
+ /usr/bin/git config user.email '<rpm-build>'
+ /usr/bin/git config gc.auto 0
+ /usr/bin/git add --force .
+ /usr/bin/git commit -q --allow-empty -a --author 'rpm-build <rpm-build>' -m 'python-sahara-plugin-spark-10.0.0 base'
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/topology.sh
+ sed -i '/^[[:space:]]*-c{env:.*_CONSTRAINTS_FILE.*/d' tox.ini
+ sed -i 's/^deps = -c{env:.*_CONSTRAINTS_FILE.*/deps =/' tox.ini
+ sed -i '/^minversion.*/d' tox.ini
+ sed -i '/^requires.*virtualenv.*/d' tox.ini
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^doc8.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^doc8.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bandit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bandit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pre-commit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pre-commit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^hacking.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^hacking.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bashate.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bashate.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pylint.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pylint.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^whereto.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^whereto.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' test-requirements.txt
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.yyUaPn
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(pip) >= 19'
+ echo 'python3dist(packaging)'
+ '[' -f pyproject.toml ']'
+ '[' -f setup.py ']'
+ echo 'python3dist(setuptools) >= 40.8'
+ echo 'python3dist(wheel)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m64 -march=x86-64-v2 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed  -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ RPM_TOXENV=py39,docs
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir --output /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires -t -e py39,docs
Handling setuptools >= 40.8 from default build backend
Requirement satisfied: setuptools >= 40.8
   (installed: setuptools 57.4.0)
Handling wheel from default build backend
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
Handling wheel from get_requires_for_build_wheel
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
Handling pbr>=2.0.0 from get_requires_for_build_wheel
Requirement satisfied: pbr>=2.0.0
   (installed: pbr 5.11.1)
Handling tox-current-env >= 0.0.6 from tox itself
Requirement not satisfied: tox-current-env >= 0.0.6
Exiting dependency generation pass: tox itself
+ cat /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires
+ rm -rfv '*.dist-info/'
+ RPM_EC=0
++ jobs -p
Wrote: /builddir/build/SRPMS/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.buildreqs.nosrc.rpm
+ exit 0
Child return code was: 11
Dynamic buildrequires detected
Going to install missing buildrequires. See root.log for details.
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'], chrootPath='/var/lib/mock/dlrn-centos9-master-x86_64-5/root', env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}, shell=False, logger=<mockbuild.trace_decorator.getLog object at 0x7f82c40f6908>, timeout=0, uid=1032, gid=135, user='mockbuild', nspawn_args=[], unshare_net=True, raiseExc=False, printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.lsFZtb
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf sahara-plugin-spark-10.0.0
+ /usr/bin/gzip -dc /builddir/build/SOURCES/sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.tar.gz
+ /usr/bin/tar -xof -
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd sahara-plugin-spark-10.0.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ /usr/bin/git init -q
+ /usr/bin/git config user.name rpm-build
+ /usr/bin/git config user.email '<rpm-build>'
+ /usr/bin/git config gc.auto 0
+ /usr/bin/git add --force .
+ /usr/bin/git commit -q --allow-empty -a --author 'rpm-build <rpm-build>' -m 'python-sahara-plugin-spark-10.0.0 base'
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/topology.sh
+ sed -i '/^[[:space:]]*-c{env:.*_CONSTRAINTS_FILE.*/d' tox.ini
+ sed -i 's/^deps = -c{env:.*_CONSTRAINTS_FILE.*/deps =/' tox.ini
+ sed -i '/^minversion.*/d' tox.ini
+ sed -i '/^requires.*virtualenv.*/d' tox.ini
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^doc8.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^doc8.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bandit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bandit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pre-commit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pre-commit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^hacking.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^hacking.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bashate.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bashate.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pylint.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pylint.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^whereto.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^whereto.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' test-requirements.txt
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.ENPwg7
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(pip) >= 19'
+ echo 'python3dist(packaging)'
+ '[' -f pyproject.toml ']'
+ '[' -f setup.py ']'
+ echo 'python3dist(setuptools) >= 40.8'
+ echo 'python3dist(wheel)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m64 -march=x86-64-v2 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed  -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ RPM_TOXENV=py39,docs
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir --output /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires -t -e py39,docs
Handling setuptools >= 40.8 from default build backend
Requirement satisfied: setuptools >= 40.8
   (installed: setuptools 57.4.0)
Handling wheel from default build backend
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
Handling wheel from get_requires_for_build_wheel
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
Handling pbr>=2.0.0 from get_requires_for_build_wheel
Requirement satisfied: pbr>=2.0.0
   (installed: pbr 5.11.1)
Handling tox-current-env >= 0.0.6 from tox itself
Requirement satisfied: tox-current-env >= 0.0.6
   (installed: tox-current-env 0.0.8)
___________________________________ summary ____________________________________
  py39: commands succeeded
  docs: commands succeeded
  congratulations :)
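
The summary block above is ordinary tox output: because the macro ran with -t, the generator provisions the requested environments through the tox-current-env plugin and asks tox for their dependencies instead of running their commands, roughly:

    RPM_TOXENV=py39,docs tox --print-deps-only -e py39,docs

Each printed requirement is then checked against what is installed in the buildroot, which produces the 'Handling ... from tox --print-deps-only: py39,docs' lines that follow.
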
Handling pbr!=2.1.0,>=2.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: pbr!=2.1.0,>=2.0.0
   (installed: pbr 5.11.1)
Handling Babel!=2.4.0,>=2.3.4 from tox --print-deps-only: py39,docs
Requirement satisfied: Babel!=2.4.0,>=2.3.4
   (installed: Babel 2.9.1)
Handling eventlet>=0.26.0 from tox --print-deps-only: py39,docs
Requirement satisfied: eventlet>=0.26.0
   (installed: eventlet 0.33.3)
Handling oslo.i18n>=3.15.3 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.i18n>=3.15.3
   (installed: oslo.i18n 6.2.0.dev2)
Handling oslo.log>=3.36.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.log>=3.36.0
   (installed: oslo.log 5.4.0.dev4)
Handling oslo.serialization!=2.19.1,>=2.18.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.serialization!=2.19.1,>=2.18.0
   (installed: oslo.serialization 5.3.0.dev1)
Handling oslo.utils>=3.33.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.utils>=3.33.0
   (installed: oslo.utils 6.3.0.dev2)
Handling requests>=2.14.2 from tox --print-deps-only: py39,docs
Requirement satisfied: requests>=2.14.2
   (installed: requests 2.25.1)
Handling sahara>=10.0.0.0b1 from tox --print-deps-only: py39,docs
Requirement satisfied: sahara>=10.0.0.0b1
   (installed: sahara 17.1.0.dev4)
Handling six>=1.10.0 from tox --print-deps-only: py39,docs
Requirement satisfied: six>=1.10.0
   (installed: six 1.15.0)
Handling coverage!=4.4,>=4.0 from tox --print-deps-only: py39,docs
Requirement not satisfied: coverage!=4.4,>=4.0
Handling fixtures>=3.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: fixtures>=3.0.0
   (installed: fixtures 4.0.1)
Handling oslotest>=3.2.0 from tox --print-deps-only: py39,docs
Requirement not satisfied: oslotest>=3.2.0
Handling stestr>=1.0.0 from tox --print-deps-only: py39,docs
Requirement not satisfied: stestr>=1.0.0
Handling testscenarios>=0.4 from tox --print-deps-only: py39,docs
Requirement satisfied: testscenarios>=0.4
   (installed: testscenarios 0.5.0)
Handling testtools>=2.4.0 from tox --print-deps-only: py39,docs
Requirement satisfied: testtools>=2.4.0
   (installed: testtools 2.6.0)
Handling openstackdocstheme>=2.2.1 from tox --print-deps-only: py39,docs
Requirement not satisfied: openstackdocstheme>=2.2.1
Handling reno>=3.1.0 from tox --print-deps-only: py39,docs
Requirement not satisfied: reno>=3.1.0
Handling sphinx>=2.0.0,!=2.1.0 from tox --print-deps-only: py39,docs
Requirement satisfied: sphinx>=2.0.0,!=2.1.0
   (installed: sphinx 3.4.3)
Handling sphinxcontrib-httpdomain>=1.3.0 from tox --print-deps-only: py39,docs
Requirement not satisfied: sphinxcontrib-httpdomain>=1.3.0
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
running dist_info
writing sahara_plugin_spark.egg-info/PKG-INFO
writing dependency_links to sahara_plugin_spark.egg-info/dependency_links.txt
writing entry points to sahara_plugin_spark.egg-info/entry_points.txt
writing requirements to sahara_plugin_spark.egg-info/requires.txt
writing top-level names to sahara_plugin_spark.egg-info/top_level.txt
writing pbr to sahara_plugin_spark.egg-info/pbr.json
[pbr] Processing SOURCES.txt
[pbr] In git context, generating filelist from git
warning: no previously-included files found matching '.gitignore'
warning: no previously-included files found matching '.gitreview'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS'
writing manifest file 'sahara_plugin_spark.egg-info/SOURCES.txt'
creating '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark.dist-info'
adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
adding license file "AUTHORS" (matched pattern "AUTHORS*")
Handling pbr (!=2.1.0,>=2.0.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: pbr (!=2.1.0,>=2.0.0)
   (installed: pbr 5.11.1)
Handling Babel (!=2.4.0,>=2.3.4) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: Babel (!=2.4.0,>=2.3.4)
   (installed: Babel 2.9.1)
Handling eventlet (>=0.26.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: eventlet (>=0.26.0)
   (installed: eventlet 0.33.3)
Handling oslo.i18n (>=3.15.3) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.i18n (>=3.15.3)
   (installed: oslo.i18n 6.2.0.dev2)
Handling oslo.log (>=3.36.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.log (>=3.36.0)
   (installed: oslo.log 5.4.0.dev4)
Handling oslo.serialization (!=2.19.1,>=2.18.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.serialization (!=2.19.1,>=2.18.0)
   (installed: oslo.serialization 5.3.0.dev1)
Handling oslo.utils (>=3.33.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.utils (>=3.33.0)
   (installed: oslo.utils 6.3.0.dev2)
Handling requests (>=2.14.2) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: requests (>=2.14.2)
   (installed: requests 2.25.1)
Handling sahara (>=10.0.0.0b1) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: sahara (>=10.0.0.0b1)
   (installed: sahara 17.1.0.dev4)
Handling six (>=1.10.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: six (>=1.10.0)
   (installed: six 1.15.0)
+ cat /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires
+ rm -rfv sahara_plugin_spark.dist-info/
removed 'sahara_plugin_spark.dist-info/AUTHORS'
removed 'sahara_plugin_spark.dist-info/LICENSE'
removed 'sahara_plugin_spark.dist-info/METADATA'
removed 'sahara_plugin_spark.dist-info/entry_points.txt'
removed 'sahara_plugin_spark.dist-info/pbr.json'
removed 'sahara_plugin_spark.dist-info/top_level.txt'
removed directory 'sahara_plugin_spark.dist-info/'
+ RPM_EC=0
++ jobs -p
Wrote: /builddir/build/SRPMS/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.buildreqs.nosrc.rpm
+ exit 0
Child return code was: 11
Dynamic buildrequires detected
Going to install missing buildrequires. See root.log for details.
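
New in this pass: with tox satisfied, the generator went on to build the project metadata through the PEP 517 prepare_metadata_for_build_wheel hook; that is the source of the 'running dist_info' section, the pbr manifest warnings, and the sahara_plugin_spark.dist-info directory that is created and then removed. The 'Handling ... from hook generated metadata: Requires-Dist' lines are read straight out of the generated METADATA, so the package's runtime requirements become BuildRequires too. A hedged shell equivalent of that step:

    # assumes the CWD is the unpacked source tree, as in the log
    python3 -c 'from setuptools.build_meta import prepare_metadata_for_build_wheel as prep; print(prep("."))'
    grep '^Requires-Dist:' sahara_plugin_spark.dist-info/METADATA
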
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'], chrootPath='/var/lib/mock/dlrn-centos9-master-x86_64-5/root', env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}, shell=False, logger=<mockbuild.trace_decorator.getLog object at 0x7f82c40f6908>, timeout=0, uid=1032, gid=135, user='mockbuild', nspawn_args=[], unshare_net=True, raiseExc=False, printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -br  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: x86_64
Building for target x86_64
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.MncKgx
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf sahara-plugin-spark-10.0.0
+ /usr/bin/gzip -dc /builddir/build/SOURCES/sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.tar.gz
+ /usr/bin/tar -xof -
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd sahara-plugin-spark-10.0.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ /usr/bin/git init -q
+ /usr/bin/git config user.name rpm-build
+ /usr/bin/git config user.email '<rpm-build>'
+ /usr/bin/git config gc.auto 0
+ /usr/bin/git add --force .
+ /usr/bin/git commit -q --allow-empty -a --author 'rpm-build <rpm-build>' -m 'python-sahara-plugin-spark-10.0.0 base'
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template
+ chmod a+x sahara_plugin_spark/plugins/spark/resources/topology.sh
+ sed -i '/^[[:space:]]*-c{env:.*_CONSTRAINTS_FILE.*/d' tox.ini
+ sed -i 's/^deps = -c{env:.*_CONSTRAINTS_FILE.*/deps =/' tox.ini
+ sed -i '/^minversion.*/d' tox.ini
+ sed -i '/^requires.*virtualenv.*/d' tox.ini
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^doc8.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^doc8.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bandit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bandit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pre-commit.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pre-commit.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^hacking.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^hacking.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^flake8-import-order.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^bashate.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^bashate.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^pylint.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^pylint.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^whereto.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^whereto.*/d' test-requirements.txt
+ for pkg in doc8 bandit pre-commit hacking flake8-import-order bashate pylint whereto os-api-ref
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f doc/requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' doc/requirements.txt
+ for reqfile in doc/requirements.txt test-requirements.txt
+ '[' -f test-requirements.txt ']'
+ sed -i '/^os-api-ref.*/d' test-requirements.txt
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.Xw7DPu
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(pip) >= 19'
+ echo 'python3dist(packaging)'
+ '[' -f pyproject.toml ']'
+ '[' -f setup.py ']'
+ echo 'python3dist(setuptools) >= 40.8'
+ echo 'python3dist(wheel)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m64 -march=x86-64-v2 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed  -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ RPM_TOXENV=py39,docs
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir --output /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires -t -e py39,docs
Handling setuptools >= 40.8 from default build backend
Requirement satisfied: setuptools >= 40.8
   (installed: setuptools 57.4.0)
Handling wheel from default build backend
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
Handling wheel from get_requires_for_build_wheel
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
Handling pbr>=2.0.0 from get_requires_for_build_wheel
Requirement satisfied: pbr>=2.0.0
   (installed: pbr 5.11.1)
Handling tox-current-env >= 0.0.6 from tox itself
Requirement satisfied: tox-current-env >= 0.0.6
   (installed: tox-current-env 0.0.8)
___________________________________ summary ____________________________________
  py39: commands succeeded
  docs: commands succeeded
  congratulations :)
Handling pbr!=2.1.0,>=2.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: pbr!=2.1.0,>=2.0.0
   (installed: pbr 5.11.1)
Handling Babel!=2.4.0,>=2.3.4 from tox --print-deps-only: py39,docs
Requirement satisfied: Babel!=2.4.0,>=2.3.4
   (installed: Babel 2.9.1)
Handling eventlet>=0.26.0 from tox --print-deps-only: py39,docs
Requirement satisfied: eventlet>=0.26.0
   (installed: eventlet 0.33.3)
Handling oslo.i18n>=3.15.3 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.i18n>=3.15.3
   (installed: oslo.i18n 6.2.0.dev2)
Handling oslo.log>=3.36.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.log>=3.36.0
   (installed: oslo.log 5.4.0.dev4)
Handling oslo.serialization!=2.19.1,>=2.18.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.serialization!=2.19.1,>=2.18.0
   (installed: oslo.serialization 5.3.0.dev1)
Handling oslo.utils>=3.33.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.utils>=3.33.0
   (installed: oslo.utils 6.3.0.dev2)
Handling requests>=2.14.2 from tox --print-deps-only: py39,docs
Requirement satisfied: requests>=2.14.2
   (installed: requests 2.25.1)
Handling sahara>=10.0.0.0b1 from tox --print-deps-only: py39,docs
Requirement satisfied: sahara>=10.0.0.0b1
   (installed: sahara 17.1.0.dev4)
Handling six>=1.10.0 from tox --print-deps-only: py39,docs
Requirement satisfied: six>=1.10.0
   (installed: six 1.15.0)
Handling coverage!=4.4,>=4.0 from tox --print-deps-only: py39,docs
Requirement satisfied: coverage!=4.4,>=4.0
   (installed: coverage 6.4.2)
Handling fixtures>=3.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: fixtures>=3.0.0
   (installed: fixtures 4.0.1)
Handling oslotest>=3.2.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslotest>=3.2.0
   (installed: oslotest 4.5.1.dev7)
Handling stestr>=1.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: stestr>=1.0.0
   (installed: stestr 4.0.1)
Handling testscenarios>=0.4 from tox --print-deps-only: py39,docs
Requirement satisfied: testscenarios>=0.4
   (installed: testscenarios 0.5.0)
Handling testtools>=2.4.0 from tox --print-deps-only: py39,docs
Requirement satisfied: testtools>=2.4.0
   (installed: testtools 2.6.0)
Handling openstackdocstheme>=2.2.1 from tox --print-deps-only: py39,docs
Requirement satisfied: openstackdocstheme>=2.2.1
   (installed: openstackdocstheme 3.0.0)
Handling reno>=3.1.0 from tox --print-deps-only: py39,docs
Requirement satisfied: reno>=3.1.0
   (installed: reno 4.0.1.dev2)
Handling sphinx>=2.0.0,!=2.1.0 from tox --print-deps-only: py39,docs
Requirement satisfied: sphinx>=2.0.0,!=2.1.0
   (installed: sphinx 3.4.3)
Handling sphinxcontrib-httpdomain>=1.3.0 from tox --print-deps-only: py39,docs
Requirement satisfied: sphinxcontrib-httpdomain>=1.3.0
   (installed: sphinxcontrib-httpdomain 1.7.0)
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
running dist_info
writing sahara_plugin_spark.egg-info/PKG-INFO
writing dependency_links to sahara_plugin_spark.egg-info/dependency_links.txt
writing entry points to sahara_plugin_spark.egg-info/entry_points.txt
writing requirements to sahara_plugin_spark.egg-info/requires.txt
writing top-level names to sahara_plugin_spark.egg-info/top_level.txt
writing pbr to sahara_plugin_spark.egg-info/pbr.json
[pbr] Processing SOURCES.txt
[pbr] In git context, generating filelist from git
warning: no previously-included files found matching '.gitignore'
warning: no previously-included files found matching '.gitreview'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS'
writing manifest file 'sahara_plugin_spark.egg-info/SOURCES.txt'
creating '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark.dist-info'
adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
adding license file "AUTHORS" (matched pattern "AUTHORS*")
Handling pbr (!=2.1.0,>=2.0.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: pbr (!=2.1.0,>=2.0.0)
   (installed: pbr 5.11.1)
Handling Babel (!=2.4.0,>=2.3.4) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: Babel (!=2.4.0,>=2.3.4)
   (installed: Babel 2.9.1)
Handling eventlet (>=0.26.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: eventlet (>=0.26.0)
   (installed: eventlet 0.33.3)
Handling oslo.i18n (>=3.15.3) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.i18n (>=3.15.3)
   (installed: oslo.i18n 6.2.0.dev2)
Handling oslo.log (>=3.36.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.log (>=3.36.0)
   (installed: oslo.log 5.4.0.dev4)
Handling oslo.serialization (!=2.19.1,>=2.18.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.serialization (!=2.19.1,>=2.18.0)
   (installed: oslo.serialization 5.3.0.dev1)
Handling oslo.utils (>=3.33.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.utils (>=3.33.0)
   (installed: oslo.utils 6.3.0.dev2)
Handling requests (>=2.14.2) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: requests (>=2.14.2)
   (installed: requests 2.25.1)
Handling sahara (>=10.0.0.0b1) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: sahara (>=10.0.0.0b1)
   (installed: sahara 17.1.0.dev4)
Handling six (>=1.10.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: six (>=1.10.0)
   (installed: six 1.15.0)
+ cat /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires
+ rm -rfv sahara_plugin_spark.dist-info/
removed 'sahara_plugin_spark.dist-info/AUTHORS'
removed 'sahara_plugin_spark.dist-info/LICENSE'
removed 'sahara_plugin_spark.dist-info/METADATA'
removed 'sahara_plugin_spark.dist-info/entry_points.txt'
removed 'sahara_plugin_spark.dist-info/pbr.json'
removed 'sahara_plugin_spark.dist-info/top_level.txt'
removed directory 'sahara_plugin_spark.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
Wrote: /builddir/build/SRPMS/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.buildreqs.nosrc.rpm
Child return code was: 11
Dynamic buildrequires detected
Going to install missing buildrequires. See root.log for details.
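
This fourth -br pass found every generated requirement already installed, so nothing remains to add; mock now switches to the final rpmbuild -ba, which builds the source and binary packages in full, passing --noprep because %prep already ran and committed the prepared tree to git in the earlier passes.
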
ENTER ['do_with_status'](['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'], chrootPath='/var/lib/mock/dlrn-centos9-master-x86_64-5/root', env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'}, shell=False, logger=<mockbuild.trace_decorator.getLog object at 0x7f82c40f6908>, timeout=0, uid=1032, gid=135, user='mockbuild', nspawn_args=[], unshare_net=True, printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -ba --noprep  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'C.UTF-8'} and shell False
Building target platforms: x86_64
Building for target x86_64
Executing(%generate_buildrequires): /bin/sh -e /var/tmp/rpm-tmp.9eH7Ff
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ echo pyproject-rpm-macros
+ echo python3-devel
+ echo 'python3dist(pip) >= 19'
+ echo 'python3dist(packaging)'
+ '[' -f pyproject.toml ']'
+ '[' -f setup.py ']'
+ echo 'python3dist(setuptools) >= 40.8'
+ echo 'python3dist(wheel)'
+ rm -rfv '*.dist-info/'
+ '[' -f /usr/bin/python3 ']'
+ mkdir -p /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ echo -n
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m64 -march=x86-64-v2 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed  -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ RPM_TOXENV=py39,docs
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_buildrequires.py --generate-extras --python3_pkgversion 3 --wheeldir /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir --output /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires -t -e py39,docs
Handling setuptools >= 40.8 from default build backend
Requirement satisfied: setuptools >= 40.8
   (installed: setuptools 57.4.0)
Handling wheel from default build backend
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
Handling wheel from get_requires_for_build_wheel
Requirement satisfied: wheel
   (installed: wheel 0.36.2)
Handling pbr>=2.0.0 from get_requires_for_build_wheel
Requirement satisfied: pbr>=2.0.0
   (installed: pbr 5.11.1)
Handling tox-current-env >= 0.0.6 from tox itself
Requirement satisfied: tox-current-env >= 0.0.6
   (installed: tox-current-env 0.0.8)
___________________________________ summary ____________________________________
  py39: commands succeeded
  docs: commands succeeded
  congratulations :)
Handling pbr!=2.1.0,>=2.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: pbr!=2.1.0,>=2.0.0
   (installed: pbr 5.11.1)
Handling Babel!=2.4.0,>=2.3.4 from tox --print-deps-only: py39,docs
Requirement satisfied: Babel!=2.4.0,>=2.3.4
   (installed: Babel 2.9.1)
Handling eventlet>=0.26.0 from tox --print-deps-only: py39,docs
Requirement satisfied: eventlet>=0.26.0
   (installed: eventlet 0.33.3)
Handling oslo.i18n>=3.15.3 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.i18n>=3.15.3
   (installed: oslo.i18n 6.2.0.dev2)
Handling oslo.log>=3.36.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.log>=3.36.0
   (installed: oslo.log 5.4.0.dev4)
Handling oslo.serialization!=2.19.1,>=2.18.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.serialization!=2.19.1,>=2.18.0
   (installed: oslo.serialization 5.3.0.dev1)
Handling oslo.utils>=3.33.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslo.utils>=3.33.0
   (installed: oslo.utils 6.3.0.dev2)
Handling requests>=2.14.2 from tox --print-deps-only: py39,docs
Requirement satisfied: requests>=2.14.2
   (installed: requests 2.25.1)
Handling sahara>=10.0.0.0b1 from tox --print-deps-only: py39,docs
Requirement satisfied: sahara>=10.0.0.0b1
   (installed: sahara 17.1.0.dev4)
Handling six>=1.10.0 from tox --print-deps-only: py39,docs
Requirement satisfied: six>=1.10.0
   (installed: six 1.15.0)
Handling coverage!=4.4,>=4.0 from tox --print-deps-only: py39,docs
Requirement satisfied: coverage!=4.4,>=4.0
   (installed: coverage 6.4.2)
Handling fixtures>=3.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: fixtures>=3.0.0
   (installed: fixtures 4.0.1)
Handling oslotest>=3.2.0 from tox --print-deps-only: py39,docs
Requirement satisfied: oslotest>=3.2.0
   (installed: oslotest 4.5.1.dev7)
Handling stestr>=1.0.0 from tox --print-deps-only: py39,docs
Requirement satisfied: stestr>=1.0.0
   (installed: stestr 4.0.1)
Handling testscenarios>=0.4 from tox --print-deps-only: py39,docs
Requirement satisfied: testscenarios>=0.4
   (installed: testscenarios 0.5.0)
Handling testtools>=2.4.0 from tox --print-deps-only: py39,docs
Requirement satisfied: testtools>=2.4.0
   (installed: testtools 2.6.0)
Handling openstackdocstheme>=2.2.1 from tox --print-deps-only: py39,docs
Requirement satisfied: openstackdocstheme>=2.2.1
   (installed: openstackdocstheme 3.0.0)
Handling reno>=3.1.0 from tox --print-deps-only: py39,docs
Requirement satisfied: reno>=3.1.0
   (installed: reno 4.0.1.dev2)
Handling sphinx>=2.0.0,!=2.1.0 from tox --print-deps-only: py39,docs
Requirement satisfied: sphinx>=2.0.0,!=2.1.0
   (installed: sphinx 3.4.3)
Handling sphinxcontrib-httpdomain>=1.3.0 from tox --print-deps-only: py39,docs
Requirement satisfied: sphinxcontrib-httpdomain>=1.3.0
   (installed: sphinxcontrib-httpdomain 1.7.0)
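Note: each "Requirement satisfied" line above is a PEP 440 specifier check of the installed version against the requirement reported by tox. One way to express that check, using the 'packaging' library (illustrative only; the macro's actual implementation may differ):

    from packaging.requirements import Requirement
    from packaging.version import Version

    # Specifier and installed version taken from the log lines above.
    req = Requirement("pbr!=2.1.0,>=2.0.0")
    installed = Version("5.11.1")
    # prereleases=True because several satisfied versions above are .dev releases.
    print(req.specifier.contains(installed, prereleases=True))  # True -> satisfied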
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
  warnings.warn(
/usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
  warnings.warn(
running dist_info
creating sahara_plugin_spark.egg-info
writing sahara_plugin_spark.egg-info/PKG-INFO
writing dependency_links to sahara_plugin_spark.egg-info/dependency_links.txt
writing entry points to sahara_plugin_spark.egg-info/entry_points.txt
writing requirements to sahara_plugin_spark.egg-info/requires.txt
writing top-level names to sahara_plugin_spark.egg-info/top_level.txt
writing pbr to sahara_plugin_spark.egg-info/pbr.json
[pbr] Processing SOURCES.txt
writing manifest file 'sahara_plugin_spark.egg-info/SOURCES.txt'
[pbr] In git context, generating filelist from git
warning: no previously-included files found matching '.gitignore'
warning: no previously-included files found matching '.gitreview'
warning: no previously-included files matching '*.pyc' found anywhere in distribution
adding license file 'LICENSE'
adding license file 'AUTHORS'
writing manifest file 'sahara_plugin_spark.egg-info/SOURCES.txt'
creating '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark.dist-info'
adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
adding license file "AUTHORS" (matched pattern "AUTHORS*")
Handling pbr (!=2.1.0,>=2.0.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: pbr (!=2.1.0,>=2.0.0)
   (installed: pbr 5.11.1)
Handling Babel (!=2.4.0,>=2.3.4) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: Babel (!=2.4.0,>=2.3.4)
   (installed: Babel 2.9.1)
Handling eventlet (>=0.26.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: eventlet (>=0.26.0)
   (installed: eventlet 0.33.3)
Handling oslo.i18n (>=3.15.3) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.i18n (>=3.15.3)
   (installed: oslo.i18n 6.2.0.dev2)
Handling oslo.log (>=3.36.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.log (>=3.36.0)
   (installed: oslo.log 5.4.0.dev4)
Handling oslo.serialization (!=2.19.1,>=2.18.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.serialization (!=2.19.1,>=2.18.0)
   (installed: oslo.serialization 5.3.0.dev1)
Handling oslo.utils (>=3.33.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: oslo.utils (>=3.33.0)
   (installed: oslo.utils 6.3.0.dev2)
Handling requests (>=2.14.2) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: requests (>=2.14.2)
   (installed: requests 2.25.1)
Handling sahara (>=10.0.0.0b1) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: sahara (>=10.0.0.0b1)
   (installed: sahara 17.1.0.dev4)
Handling six (>=1.10.0) from hook generated metadata: Requires-Dist (sahara-plugin-spark)
Requirement satisfied: six (>=1.10.0)
   (installed: six 1.15.0)
+ cat /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-buildrequires
+ rm -rfv sahara_plugin_spark.dist-info/
removed 'sahara_plugin_spark.dist-info/AUTHORS'
removed 'sahara_plugin_spark.dist-info/LICENSE'
removed 'sahara_plugin_spark.dist-info/METADATA'
removed 'sahara_plugin_spark.dist-info/entry_points.txt'
removed 'sahara_plugin_spark.dist-info/top_level.txt'
removed 'sahara_plugin_spark.dist-info/pbr.json'
removed directory 'sahara_plugin_spark.dist-info/'
+ RPM_EC=0
++ jobs -p
+ exit 0
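Note: the %generate_buildrequires scriptlet that just exited resolves build dependencies by querying the PEP 517 backend directly. A hedged sketch of the hook behind the "Handling wheel from get_requires_for_build_wheel" lines, run from the unpacked sdist (for a pbr project this typically reports 'wheel' plus the pbr pin seen above):

    from setuptools import build_meta

    # The standard PEP 517 hook that pyproject_buildrequires.py consults.
    print(build_meta.get_requires_for_build_wheel())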
Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.K9PNMT
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ mkdir -p /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ CFLAGS='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1  -m64 -march=x86-64-v2 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,--as-needed  -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 '
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ /usr/bin/python3 -Bs /usr/lib/rpm/redhat/pyproject_wheel.py /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir
Processing /builddir/build/BUILD/sahara-plugin-spark-10.0.0
  DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
   pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
    Preparing wheel metadata: started
    Running command /usr/bin/python3 /usr/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py prepare_metadata_for_build_wheel /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/tmp9vq5ozx4
    /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
      warnings.warn(
    /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
      warnings.warn(
    /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
      warnings.warn(
    /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
      warnings.warn(
    running dist_info
    creating /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info
    writing /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/PKG-INFO
    writing dependency_links to /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/dependency_links.txt
    writing entry points to /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/entry_points.txt
    writing requirements to /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/requires.txt
    writing top-level names to /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/top_level.txt
    writing pbr to /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/pbr.json
    [pbr] Processing SOURCES.txt
    writing manifest file '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/SOURCES.txt'
    [pbr] In git context, generating filelist from git
    warning: no previously-included files found matching '.gitignore'
    warning: no previously-included files found matching '.gitreview'
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    adding license file 'LICENSE'
    adding license file 'AUTHORS'
    writing manifest file '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.egg-info/SOURCES.txt'
    creating '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-modern-metadata-jqxwbt6i/sahara_plugin_spark.dist-info'
    adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
    adding license file "AUTHORS" (matched pattern "AUTHORS*")
    Preparing wheel metadata: finished with status 'done'
Building wheels for collected packages: sahara-plugin-spark
  Building wheel for sahara-plugin-spark (PEP 517): started
  Running command /usr/bin/python3 /usr/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py build_wheel /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/tmpeftn7elj
  /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
    warnings.warn(
  /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
    warnings.warn(
  /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
    warnings.warn(
  /usr/lib/python3.9/site-packages/setuptools/dist.py:697: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
    warnings.warn(
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/sahara_plugin_spark
  copying sahara_plugin_spark/__init__.py -> build/lib/sahara_plugin_spark
  copying sahara_plugin_spark/i18n.py -> build/lib/sahara_plugin_spark
  creating build/lib/sahara_plugin_spark/tests
  creating build/lib/sahara_plugin_spark/tests/unit
  copying sahara_plugin_spark/tests/unit/__init__.py -> build/lib/sahara_plugin_spark/tests/unit
  copying sahara_plugin_spark/tests/unit/base.py -> build/lib/sahara_plugin_spark/tests/unit
  creating build/lib/sahara_plugin_spark/plugins
  creating build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/__init__.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/config_helper.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/edp_engine.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/images.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/plugin.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/run_scripts.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/scaling.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/spark/shell_engine.py -> build/lib/sahara_plugin_spark/plugins/spark
  copying sahara_plugin_spark/plugins/__init__.py -> build/lib/sahara_plugin_spark/plugins
  creating build/lib/sahara_plugin_spark/tests/unit/plugins
  copying sahara_plugin_spark/tests/unit/plugins/__init__.py -> build/lib/sahara_plugin_spark/tests/unit/plugins
  creating build/lib/sahara_plugin_spark/utils
  copying sahara_plugin_spark/utils/__init__.py -> build/lib/sahara_plugin_spark/utils
  copying sahara_plugin_spark/utils/patches.py -> build/lib/sahara_plugin_spark/utils
  copying sahara_plugin_spark/tests/__init__.py -> build/lib/sahara_plugin_spark/tests
  creating build/lib/sahara_plugin_spark/tests/unit/plugins/spark
  copying sahara_plugin_spark/tests/unit/plugins/spark/__init__.py -> build/lib/sahara_plugin_spark/tests/unit/plugins/spark
  copying sahara_plugin_spark/tests/unit/plugins/spark/test_config_helper.py -> build/lib/sahara_plugin_spark/tests/unit/plugins/spark
  copying sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py -> build/lib/sahara_plugin_spark/tests/unit/plugins/spark
  running egg_info
  creating sahara_plugin_spark.egg-info
  writing sahara_plugin_spark.egg-info/PKG-INFO
  writing dependency_links to sahara_plugin_spark.egg-info/dependency_links.txt
  writing entry points to sahara_plugin_spark.egg-info/entry_points.txt
  writing requirements to sahara_plugin_spark.egg-info/requires.txt
  writing top-level names to sahara_plugin_spark.egg-info/top_level.txt
  writing pbr to sahara_plugin_spark.egg-info/pbr.json
  [pbr] Processing SOURCES.txt
  writing manifest file 'sahara_plugin_spark.egg-info/SOURCES.txt'
  [pbr] In git context, generating filelist from git
  warning: no previously-included files found matching '.gitignore'
  warning: no previously-included files found matching '.gitreview'
  warning: no previously-included files matching '*.pyc' found anywhere in distribution
  adding license file 'LICENSE'
  adding license file 'AUTHORS'
  writing manifest file 'sahara_plugin_spark.egg-info/SOURCES.txt'
  creating build/lib/sahara_plugin_spark/locale
  creating build/lib/sahara_plugin_spark/locale/de
  creating build/lib/sahara_plugin_spark/locale/de/LC_MESSAGES
  copying sahara_plugin_spark/locale/de/LC_MESSAGES/sahara_plugin_spark.po -> build/lib/sahara_plugin_spark/locale/de/LC_MESSAGES
  creating build/lib/sahara_plugin_spark/locale/en_GB
  creating build/lib/sahara_plugin_spark/locale/en_GB/LC_MESSAGES
  copying sahara_plugin_spark/locale/en_GB/LC_MESSAGES/sahara_plugin_spark.po -> build/lib/sahara_plugin_spark/locale/en_GB/LC_MESSAGES
  creating build/lib/sahara_plugin_spark/locale/id
  creating build/lib/sahara_plugin_spark/locale/id/LC_MESSAGES
  copying sahara_plugin_spark/locale/id/LC_MESSAGES/sahara_plugin_spark.po -> build/lib/sahara_plugin_spark/locale/id/LC_MESSAGES
  creating build/lib/sahara_plugin_spark/locale/ne
  creating build/lib/sahara_plugin_spark/locale/ne/LC_MESSAGES
  copying sahara_plugin_spark/locale/ne/LC_MESSAGES/sahara_plugin_spark.po -> build/lib/sahara_plugin_spark/locale/ne/LC_MESSAGES
  creating build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/README.rst -> build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/core-default.xml -> build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/hdfs-default.xml -> build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/spark-cleanup.cron -> build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template -> build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template -> build/lib/sahara_plugin_spark/plugins/spark/resources
  copying sahara_plugin_spark/plugins/spark/resources/topology.sh -> build/lib/sahara_plugin_spark/plugins/spark/resources
  creating build/lib/sahara_plugin_spark/plugins/spark/resources/images
  copying sahara_plugin_spark/plugins/spark/resources/images/image.yaml -> build/lib/sahara_plugin_spark/plugins/spark/resources/images
  creating build/lib/sahara_plugin_spark/plugins/spark/resources/images/centos
  copying sahara_plugin_spark/plugins/spark/resources/images/centos/turn_off_services -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/centos
  copying sahara_plugin_spark/plugins/spark/resources/images/centos/wget_cdh_repo -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/centos
  creating build/lib/sahara_plugin_spark/plugins/spark/resources/images/common
  copying sahara_plugin_spark/plugins/spark/resources/images/common/add_jar -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/common
  copying sahara_plugin_spark/plugins/spark/resources/images/common/install_extjs -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/common
  copying sahara_plugin_spark/plugins/spark/resources/images/common/install_spark -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/common
  copying sahara_plugin_spark/plugins/spark/resources/images/common/manipulate_s3 -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/common
  creating build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  copying sahara_plugin_spark/plugins/spark/resources/images/ubuntu/config_spark -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  copying sahara_plugin_spark/plugins/spark/resources/images/ubuntu/turn_off_services -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  copying sahara_plugin_spark/plugins/spark/resources/images/ubuntu/wget_cdh_repo -> build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  installing to build/bdist.linux-x86_64/wheel
  running install
  [pbr] Writing ChangeLog
  [pbr] Generating ChangeLog
  [pbr] ChangeLog complete (0.0s)
  [pbr] Generating AUTHORS
  [pbr] AUTHORS complete (0.1s)
  running install_lib
  creating build/bdist.linux-x86_64
  creating build/bdist.linux-x86_64/wheel
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/ne
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/ne/LC_MESSAGES
  copying build/lib/sahara_plugin_spark/locale/ne/LC_MESSAGES/sahara_plugin_spark.po -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/ne/LC_MESSAGES
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/id
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/id/LC_MESSAGES
  copying build/lib/sahara_plugin_spark/locale/id/LC_MESSAGES/sahara_plugin_spark.po -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/id/LC_MESSAGES
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/en_GB
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/en_GB/LC_MESSAGES
  copying build/lib/sahara_plugin_spark/locale/en_GB/LC_MESSAGES/sahara_plugin_spark.po -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/en_GB/LC_MESSAGES
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/de
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/de/LC_MESSAGES
  copying build/lib/sahara_plugin_spark/locale/de/LC_MESSAGES/sahara_plugin_spark.po -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/locale/de/LC_MESSAGES
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/utils
  copying build/lib/sahara_plugin_spark/utils/patches.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/utils
  copying build/lib/sahara_plugin_spark/utils/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/utils
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins
  copying build/lib/sahara_plugin_spark/plugins/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu/wget_cdh_repo -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu/turn_off_services -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/ubuntu/config_spark -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/ubuntu
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/common
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/common/manipulate_s3 -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/common
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/common/install_spark -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/common
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/common/install_extjs -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/common
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/common/add_jar -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/common
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/centos
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/centos/wget_cdh_repo -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/centos
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/centos/turn_off_services -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images/centos
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/images/image.yaml -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources/images
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/topology.sh -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/spark-cleanup.cron -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/hdfs-default.xml -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/core-default.xml -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/resources/README.rst -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark/resources
  copying build/lib/sahara_plugin_spark/plugins/spark/shell_engine.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/scaling.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/run_scripts.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/plugin.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/images.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/edp_engine.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/config_helper.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  copying build/lib/sahara_plugin_spark/plugins/spark/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/plugins/spark
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests
  copying build/lib/sahara_plugin_spark/tests/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit/plugins
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit/plugins/spark
  copying build/lib/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit/plugins/spark
  copying build/lib/sahara_plugin_spark/tests/unit/plugins/spark/test_config_helper.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit/plugins/spark
  copying build/lib/sahara_plugin_spark/tests/unit/plugins/spark/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit/plugins/spark
  copying build/lib/sahara_plugin_spark/tests/unit/plugins/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit/plugins
  copying build/lib/sahara_plugin_spark/tests/unit/base.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit
  copying build/lib/sahara_plugin_spark/tests/unit/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark/tests/unit
  copying build/lib/sahara_plugin_spark/i18n.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark
  copying build/lib/sahara_plugin_spark/__init__.py -> build/bdist.linux-x86_64/wheel/sahara_plugin_spark
  running install_egg_info
  Copying sahara_plugin_spark.egg-info to build/bdist.linux-x86_64/wheel/sahara_plugin_spark-10.0.0-py3.9.egg-info
  running install_scripts
  adding license file "LICENSE" (matched pattern "LICEN[CS]E*")
  adding license file "AUTHORS" (matched pattern "AUTHORS*")
  creating build/bdist.linux-x86_64/wheel/sahara_plugin_spark-10.0.0.dist-info/WHEEL
  creating '/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir/pip-wheel-i0hep9jr/tmpy12vdq10/sahara_plugin_spark-10.0.0-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
  adding 'sahara_plugin_spark/__init__.py'
  adding 'sahara_plugin_spark/i18n.py'
  adding 'sahara_plugin_spark/locale/de/LC_MESSAGES/sahara_plugin_spark.po'
  adding 'sahara_plugin_spark/locale/en_GB/LC_MESSAGES/sahara_plugin_spark.po'
  adding 'sahara_plugin_spark/locale/id/LC_MESSAGES/sahara_plugin_spark.po'
  adding 'sahara_plugin_spark/locale/ne/LC_MESSAGES/sahara_plugin_spark.po'
  adding 'sahara_plugin_spark/plugins/__init__.py'
  adding 'sahara_plugin_spark/plugins/spark/__init__.py'
  adding 'sahara_plugin_spark/plugins/spark/config_helper.py'
  adding 'sahara_plugin_spark/plugins/spark/edp_engine.py'
  adding 'sahara_plugin_spark/plugins/spark/images.py'
  adding 'sahara_plugin_spark/plugins/spark/plugin.py'
  adding 'sahara_plugin_spark/plugins/spark/run_scripts.py'
  adding 'sahara_plugin_spark/plugins/spark/scaling.py'
  adding 'sahara_plugin_spark/plugins/spark/shell_engine.py'
  adding 'sahara_plugin_spark/plugins/spark/resources/README.rst'
  adding 'sahara_plugin_spark/plugins/spark/resources/core-default.xml'
  adding 'sahara_plugin_spark/plugins/spark/resources/hdfs-default.xml'
  adding 'sahara_plugin_spark/plugins/spark/resources/spark-cleanup.cron'
  adding 'sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template'
  adding 'sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template'
  adding 'sahara_plugin_spark/plugins/spark/resources/topology.sh'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/image.yaml'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/centos/turn_off_services'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/centos/wget_cdh_repo'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/common/add_jar'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/common/install_extjs'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/common/install_spark'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/common/manipulate_s3'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/ubuntu/config_spark'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/ubuntu/turn_off_services'
  adding 'sahara_plugin_spark/plugins/spark/resources/images/ubuntu/wget_cdh_repo'
  adding 'sahara_plugin_spark/tests/__init__.py'
  adding 'sahara_plugin_spark/tests/unit/__init__.py'
  adding 'sahara_plugin_spark/tests/unit/base.py'
  adding 'sahara_plugin_spark/tests/unit/plugins/__init__.py'
  adding 'sahara_plugin_spark/tests/unit/plugins/spark/__init__.py'
  adding 'sahara_plugin_spark/tests/unit/plugins/spark/test_config_helper.py'
  adding 'sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py'
  adding 'sahara_plugin_spark/utils/__init__.py'
  adding 'sahara_plugin_spark/utils/patches.py'
  adding 'sahara_plugin_spark-10.0.0.dist-info/AUTHORS'
  adding 'sahara_plugin_spark-10.0.0.dist-info/LICENSE'
  adding 'sahara_plugin_spark-10.0.0.dist-info/METADATA'
  adding 'sahara_plugin_spark-10.0.0.dist-info/WHEEL'
  adding 'sahara_plugin_spark-10.0.0.dist-info/entry_points.txt'
  adding 'sahara_plugin_spark-10.0.0.dist-info/pbr.json'
  adding 'sahara_plugin_spark-10.0.0.dist-info/top_level.txt'
  adding 'sahara_plugin_spark-10.0.0.dist-info/RECORD'
  removing build/bdist.linux-x86_64/wheel
  Building wheel for sahara-plugin-spark (PEP 517): finished with status 'done'
  Created wheel for sahara-plugin-spark: filename=sahara_plugin_spark-10.0.0-py3-none-any.whl size=57802 sha256=7c00d3a846186c9757503f5040148a75aea4e5f2d1baeb2e7550fe2169df7d46
  Stored in directory: /builddir/.cache/pip/wheels/8c/0c/70/e2bd7036e3d8ab20fa313ec37ee1d156a3e24812d104b5525d
Successfully built sahara-plugin-spark
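Note: the "Building wheel for sahara-plugin-spark (PEP 517)" step above runs pip's in-process hook runner (_in_process.py), which ultimately calls the backend's build_wheel hook. A hedged equivalent, run from the project root:

    import os
    from setuptools import build_meta

    # The output directory must exist before the hook writes into it.
    os.makedirs("pyproject-wheeldir", exist_ok=True)
    # Returns the basename of the wheel it wrote, e.g.
    # sahara_plugin_spark-10.0.0-py3-none-any.whl (matching the log above).
    wheel_name = build_meta.build_wheel(wheel_directory="pyproject-wheeldir")
    print(wheel_name)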
+ TOX_TESTENV_PASSENV='*'
+ PYTHONDONTWRITEBYTECODE=1
+ PATH=/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/bin:/builddir/.local/bin:/builddir/bin:/usr/share/Modules/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin
+ PYTHONPATH=/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib64/python3.9/site-packages:/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages
+ PYTEST_ADDOPTS=' --ignore=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir'
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -m tox --current-env -q --recreate -e docs
Running Sphinx v3.4.3
[openstackdocstheme] version: 3.0.0
[openstackdocstheme] connecting html-page-context event handler
making output directory... done
[openstackdocstheme] overriding configured project name (Python) with name extracted from the package (sahara-plugin-spark); you can disable this behavior with the 'openstackdocs_auto_name' option
[openstackdocstheme] using theme from /usr/lib/python3.9/site-packages/openstackdocstheme/theme
[openstackdocstheme] no /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.gitreview found
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 5 source files that are out of date
updating environment: [new config] 5 added, 0 changed, 0 removed
reading sources... [ 20%] contributor/contributing
reading sources... [ 40%] contributor/index
reading sources... [ 60%] index
reading sources... [ 80%] user/index
reading sources... [100%] user/spark-plugin
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [ 20%] contributor/contributing
writing output... [ 40%] contributor/index
writing output... [ 60%] index
writing output... [ 80%] user/index
writing output... [100%] user/spark-plugin
generating indices... genindex done
writing additional pages... search done
copying static files... done
copying extra files... done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded.
The HTML pages are in doc/build/html.
___________________________________ summary ____________________________________
  docs: commands succeeded
  congratulations :)
+ rm -rf doc/build/html/.doctrees doc/build/html/.buildinfo
+ sphinx-build -W -b man doc/source doc/build/man
Running Sphinx v3.4.3
[openstackdocstheme] version: 3.0.0
[openstackdocstheme] connecting html-page-context event handler
making output directory... done
[openstackdocstheme] overriding configured project name (Python) with name extracted from the package (sahara-plugin-spark); you can disable this behavior with the 'openstackdocs_auto_name' option
[openstackdocstheme] using theme from /usr/lib/python3.9/site-packages/openstackdocstheme/theme
[openstackdocstheme] no /builddir/build/BUILD/sahara-plugin-spark-10.0.0/.gitreview found
building [mo]: targets for 0 po files that are out of date
building [man]: all manpages
updating environment: [new config] 5 added, 0 changed, 0 removed
reading sources... [ 20%] contributor/contributing
reading sources... [ 40%] contributor/index
reading sources... [ 60%] index
reading sources... [ 80%] user/index
reading sources... [100%] user/spark-plugin
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
writing... sahara-plugin-spark.1 { user/index user/spark-plugin contributor/index contributor/contributing } done
build succeeded.
The manual pages are in doc/build/man.
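Note: the man-page build above invokes sphinx-build directly rather than through tox. One programmatic equivalent of that command, for reference (sphinx.cmd.build.main mirrors the CLI, including -W to turn warnings into errors):

    from sphinx.cmd.build import main

    rc = main(["-W", "-b", "man", "doc/source", "doc/build/man"])
    print("build succeeded" if rc == 0 else f"build failed (rc={rc})")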
+ RPM_EC=0
++ jobs -p
+ exit 0
Executing(%install): /bin/sh -e /var/tmp/rpm-tmp.mpFYq9
+ umask 022
+ cd /builddir/build/BUILD
+ '[' /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64 '!=' / ']'
+ rm -rf /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64
++ dirname /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64
+ mkdir -p /builddir/build/BUILDROOT
+ mkdir /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64
+ cd sahara-plugin-spark-10.0.0
++ sed -E 's/([^-]+)-([^-]+)-.+\.whl/\1==\2/'
++ ls /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir/sahara_plugin_spark-10.0.0-py3-none-any.whl
++ xargs basename --multiple
+ specifier=sahara_plugin_spark==10.0.0
+ TMPDIR=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir
+ /usr/bin/python3 -m pip install --root /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64 --prefix /usr --no-deps --disable-pip-version-check --progress-bar off --verbose --ignore-installed --no-warn-script-location --no-index --no-cache-dir --find-links /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir sahara_plugin_spark==10.0.0
Using pip 21.2.3 from /usr/lib/python3.9/site-packages/pip (python 3.9)
Looking in links: /builddir/build/BUILD/sahara-plugin-spark-10.0.0/pyproject-wheeldir
Processing ./pyproject-wheeldir/sahara_plugin_spark-10.0.0-py3-none-any.whl
Installing collected packages: sahara-plugin-spark
Successfully installed sahara-plugin-spark-10.0.0
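Note: the sed pipeline above ('s/([^-]+)-([^-]+)-.+\.whl/\1==\2/') derives the pip requirement specifier from the wheel filename. A hedged Python equivalent of that mapping:

    import re

    wheel = "sahara_plugin_spark-10.0.0-py3-none-any.whl"
    m = re.match(r"([^-]+)-([^-]+)-.+\.whl", wheel)
    print(f"{m.group(1)}=={m.group(2)}")  # sahara_plugin_spark==10.0.0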
+ '[' -d /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/bin ']'
+ rm -f /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-ghost-distinfo
+ site_dirs=()
+ '[' -d /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages ']'
+ site_dirs+=("/usr/lib/python3.9/site-packages")
+ '[' /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib64/python3.9/site-packages '!=' /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages ']'
+ '[' -d /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib64/python3.9/site-packages ']'
+ for site_dir in ${site_dirs[@]}
+ for distinfo in /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64$site_dir/*.dist-info
+ echo '%ghost /usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info'
+ sed -i s/pip/rpm/ /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info/INSTALLER
+ PYTHONPATH=/usr/lib/rpm/redhat
+ /usr/bin/python3 -B /usr/lib/rpm/redhat/pyproject_preprocess_record.py --buildroot /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64 --record /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info/RECORD --output /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-record
+ rm -fv /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info/RECORD
removed '/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info/RECORD'
+ rm -fv /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info/REQUESTED
removed '/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages/sahara_plugin_spark-10.0.0.dist-info/REQUESTED'
++ wc -l /builddir/build/BUILD/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64-pyproject-ghost-distinfo
++ cut -f1 '-d '
+ lines=1
+ '[' 1 -ne 1 ']'
+ mkdir -p /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/share/man/man1
+ install -p -D -m 644 doc/build/man/sahara-plugin-spark.1 /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/share/man/man1/
+ /usr/bin/find-debuginfo -j4 --strict-build-id -m -i --build-id-seed 10.0.0-0.20231010183737.c68cb5e.el9 --unique-debug-suffix -10.0.0-0.20231010183737.c68cb5e.el9.x86_64 --unique-debug-src-base python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64 --run-dwz --dwz-low-mem-die-limit 10000000 --dwz-max-die-limit 110000000 --remove-section .gnu.build.attributes -S debugsourcefiles.list /builddir/build/BUILD/sahara-plugin-spark-10.0.0
find: 'debug': No such file or directory
+ /usr/lib/rpm/check-buildroot
+ /usr/lib/rpm/redhat/brp-ldconfig
+ /usr/lib/rpm/brp-compress
+ /usr/lib/rpm/redhat/brp-strip-lto /usr/bin/strip
+ /usr/lib/rpm/brp-strip-static-archive /usr/bin/strip
+ /usr/lib/rpm/redhat/brp-python-bytecompile '' 1 0
Bytecompiling .py files below /builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9 using python3.9
mangling shebang in /usr/lib/python3.9/site-packages/sahara_plugin_spark/plugins/spark/resources/topology.sh from /bin/bash to #!/usr/bin/bash
mangling shebang in /usr/lib/python3.9/site-packages/sahara_plugin_spark/plugins/spark/resources/tmp-cleanup.sh.template from /bin/sh to #!/usr/bin/sh
mangling shebang in /usr/lib/python3.9/site-packages/sahara_plugin_spark/plugins/spark/resources/spark-env.sh.template from /usr/bin/env bash to #!/usr/bin/bash
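Note: brp-mangle-shebangs canonicalizes interpreter lines to absolute /usr/bin paths, as reported above. The real tool is a shell script; the following is only a hedged illustration of the two rewrites it just logged (interpreter arguments are ignored here for brevity):

    def mangle(shebang: str) -> str:
        interp = shebang.removeprefix("#!").strip()
        if interp.startswith("/usr/bin/env "):
            interp = interp.split()[-1]       # '/usr/bin/env bash' -> 'bash'
        return "#!/usr/bin/" + interp.rsplit("/", 1)[-1]

    print(mangle("#!/bin/bash"))              # #!/usr/bin/bash
    print(mangle("#!/usr/bin/env bash"))      # #!/usr/bin/bash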
Executing(%check): /bin/sh -e /var/tmp/rpm-tmp.AqDNev
+ /usr/lib/rpm/brp-python-hardlink
+ /usr/lib/rpm/redhat/brp-mangle-shebangs
+ umask 022
+ cd /builddir/build/BUILD
+ cd sahara-plugin-spark-10.0.0
+ TOX_TESTENV_PASSENV='*'
+ PYTHONDONTWRITEBYTECODE=1
+ PATH=/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/bin:/builddir/.local/bin:/builddir/bin:/usr/share/Modules/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin
+ PYTHONPATH=/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib64/python3.9/site-packages:/builddir/build/BUILDROOT/python-sahara-plugin-spark-10.0.0-0.20231010183737.c68cb5e.el9.x86_64/usr/lib/python3.9/site-packages
+ PYTEST_ADDOPTS=' --ignore=/builddir/build/BUILD/sahara-plugin-spark-10.0.0/.pyproject-builddir'
+ HOSTNAME=rpmbuild
+ /usr/bin/python3 -m tox --current-env -q --recreate -e py39
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{0} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_cleanup_configs [0.057187s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
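Note: all nine failures in this run share the root cause shown in the traceback above: sahara's _create_facade_lazily() still passes autocommit to oslo.db's EngineFacade.from_config(), but the installed oslo.db no longer accepts that keyword (dropped, likely as part of the SQLAlchemy 2.0 migration), so every test that reaches setup_db() fails in setUp. A minimal, self-contained reproduction of the failure pattern (class and method names are borrowed from the traceback for clarity; this is not oslo.db's actual code):

    class EngineFacade:
        @classmethod
        def from_config(cls, conf):       # newer signature: no 'autocommit'
            return cls()

    # The caller still passes the removed keyword, as sahara does above:
    EngineFacade.from_config(conf=None, autocommit=True)
    # TypeError: from_config() got an unexpected keyword argument 'autocommit'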
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{0} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin21_shell_engine [0.015153s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{0} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin22_edp_engine [0.009023s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
{0} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkProviderTest.test_supported_job_types [0.015794s] ... ok
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{1} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin12_shell_engine [0.057275s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
{1} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin22_shell_engine [0.001815s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
{1} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin23_edp_engine [0.001293s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
{1} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin23_shell_engine [0.009775s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
{3} sahara_plugin_spark.tests.unit.plugins.spark.test_config_helper.ConfigHelperUtilsTest.test_make_hadoop_path [0.163602s] ... ok
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{3} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin11_edp_engine [0.033521s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
{3} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin21_edp_engine [0.001613s] ... FAILED
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{2} sahara_plugin_spark.tests.unit.plugins.spark.test_config_helper.ConfigHelperUtilsTest.test_cleanup_configs [0.096773s] ... ok
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{2} sahara_plugin_spark.tests.unit.plugins.spark.test_config_helper.ConfigHelperUtilsTest.test_generate_xml_configs [0.010436s] ... ok
Arguments dropped when creating context: {'user': None, 'tenant': 'tenant_1'}
{2} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkProviderTest.test_edp_config_hints [0.001741s] ... ok
{2} sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkValidationTest.test_validate [1.858649s] ... ok
==============================
Failed 9 tests - output below:
==============================
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_cleanup_configs
---------------------------------------------------------------------------------------------
Captured traceback:
~~~~~~~~~~~~~~~~~~~
    Traceback (most recent call last):
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/plugins/spark/test_plugin.py", line 32, in setUp
    super(SparkPluginTest, self).setUp()
      File "/builddir/build/BUILD/sahara-plugin-spark-10.0.0/sahara_plugin_spark/tests/unit/base.py", line 52, in setUp
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/plugins/db.py", line 21, in setup_db
    db_api.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/api.py", line 56, in setup_db
    return IMPL.setup_db()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 268, in setup_db
    engine = get_engine()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 61, in get_engine
    facade = _create_facade_lazily()
      File "/usr/lib/python3.9/site-packages/sahara/db/sqlalchemy/api.py", line 54, in _create_facade_lazily
    _FACADE = db_session.EngineFacade.from_config(CONF,
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin21_shell_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin22_edp_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin12_shell_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin22_shell_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin23_edp_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin23_shell_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin11_edp_engine
sahara_plugin_spark.tests.unit.plugins.spark.test_plugin.SparkPluginTest.test_plugin21_edp_engine
---------------------------------------------------------------------------------------------------
Captured traceback: identical for all eight tests above and for
test_cleanup_configs. Each SparkPluginTest.setUp() call fails in
db_api.setup_db() with:
    TypeError: from_config() got an unexpected keyword argument 'autocommit'
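Before patching, it is worth confirming that the oslo.db installed in the
build root is a release where the keyword was removed. A quick check with
the chroot's python3 (assumes only that oslo.db is importable):

    import inspect
    from oslo_db.sqlalchemy import session as db_session

    # Prints False on releases where 'autocommit' was removed.
    params = inspect.signature(db_session.EngineFacade.from_config).parameters
    print('autocommit' in params)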
======
Totals
======
Ran: 15 tests in 5.2462 sec.
 - Passed: 6
 - Skipped: 0
 - Expected Fail: 0
 - Unexpected Success: 0
 - Failed: 9
Sum of execute time for each test: 2.3337 sec.
==============
Worker Balance
==============
 - Worker 0 (4 tests) => 0:00:00.103863
 - Worker 1 (4 tests) => 0:00:00.089626
 - Worker 2 (4 tests) => 0:00:01.968213
 - Worker 3 (3 tests) => 0:00:00.212327
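Since every failure traces to the same facade call, a single change in
sahara itself (not in this plugin) clears all nine tests. Longer term, the
deprecated legacy EngineFacade can be replaced with oslo.db's enginefacade
module, which has no autocommit knob at all; a sketch of that pattern
(illustrative names, not sahara's actual code):

    from oslo_db.sqlalchemy import enginefacade

    # Picks up cfg.CONF's [database] options on first use; an explicit
    # _context_manager.configure(...) call is also possible.
    _context_manager = enginefacade.transaction_context()

    def get_engine():
        return _context_manager.writer.get_engine()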
ERROR: InvocationError for command /usr/bin/stestr run (exited with code 1)
___________________________________ summary ____________________________________
ERROR:   py39: commands failed
RPM build errors:
error: Bad exit status from /var/tmp/rpm-tmp.AqDNev (%check)
    Bad exit status from /var/tmp/rpm-tmp.AqDNev (%check)
Child return code was: 1
EXCEPTION: [Error('Command failed: \n # bash --login -c /usr/bin/rpmbuild -ba --noprep  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec\n', 1)]
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/mockbuild/trace_decorator.py", line 93, in trace
    result = func(*args, **kw)
  File "/usr/lib/python3.6/site-packages/mockbuild/util.py", line 598, in do_with_status
    raise exception.Error("Command failed: \n # %s\n%s" % (command, output), child.returncode)
mockbuild.exception.Error: Command failed: 
 # bash --login -c /usr/bin/rpmbuild -ba --noprep  --target x86_64 --nodeps /builddir/build/SPECS/python-sahara-plugin-spark.spec