I have also tried creating the file with the echo command, and it is on GitHub as well, but for some reason it won't be detected.
Here is what I have inside the Procfile: `worker: python main.py`
Here is the build log:
-----> Python app detected
-----> Installing python-3.8.5
-----> Installing pip 20.1.1, setuptools 47.1.1 and wheel 0.34.2
-----> Installing SQLite3
-----> Installing requirements with pip
Collecting certifi==2020.6.20
Downloading certifi-2020.6.20-py2.py3-none-any.whl (156 kB)
Collecting chardet==3.0.4
Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna==2.10
Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting oauthlib==3.1.0
Downloading oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting PySocks==1.7.1
Downloading PySocks-1.7.1-py3-none-any.whl (16 kB)
Collecting requests==2.24.0
Downloading requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting requests-oauthlib==1.3.0
Downloading requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting six==1.15.0
Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting tweepy==3.9.0
Downloading tweepy-3.9.0-py2.py3-none-any.whl (30 kB)
Collecting urllib3==1.25.10
Downloading urllib3-1.25.10-py2.py3-none-any.whl (127 kB)
Installing collected packages: certifi, chardet, idna, oauthlib, PySocks, urllib3, requests, requests-oauthlib, six, tweepy
Successfully installed PySocks-1.7.1 certifi-2020.6.20 chardet-3.0.4 idna-2.10 oauthlib-3.1.0 requests-2.24.0 requests-oauthlib-1.3.0 six-1.15.0 tweepy-3.9.0 urllib3-1.25.10
-----> Discovering process types
-----> Compressing...
Done: 53.2M
-----> Launching...
Released v3
https://twitterbot1k.herokuapp.com/ deployed to Heroku
Have you added a requirements.txt file to your root directory? I faced a similar issue when I forgot to add requirements.txt. Create an empty file if you don't need any extra packages; its presence is what tells Heroku that your application is written in Python.
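For reference, here is a minimal sketch of what the repository root needs to contain for the buildpack to register the worker (main.py is assumed to be the bot's entry point):
Procfile            containing exactly: worker: python main.py
requirements.txt    the pinned packages from the build log above, or an empty file
main.py             the script the worker runs
Note that the Procfile must be named exactly Procfile (capital P, no extension) and sit in the repository root. In the build log above, "Discovering process types" is not followed by a "Procfile declares types -> worker" line, which is the step to compare against a working deploy.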
I am getting AttributeError: 'PosixPath' object has no attribute 'read_text' when trying to deploy a Django app to Heroku. The problem occurs while pip is installing the pathlib library, after running git push heroku main on the command line. Is there something I am doing wrong?
The following is the log generated on Heroku, which is similar to what I saw on the command line:
-----> Building on the Heroku-20 stack
-----> Using buildpacks:
1. https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest.git
2. heroku/python
-----> ffmpeg app detected
-----> Installing ffmpeg
Variable FFMPEG_DOWNLOAD_URL isn't set, using default value
Downloading https://johnvansickle.com/ffmpeg/builds/ffmpeg-git-amd64-static.tar.xz
Unpacking the archive
Installation successful
-----> Python app detected
-----> Using Python version specified in runtime.txt
! Python has released a security update! Please consider upgrading to python-3.7.13
Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Python version has changed from python-3.9.10 to python-3.7.10, clearing cache
-----> Requirements file has been changed, clearing cached dependencies
-----> Installing python-3.7.10
-----> Installing pip 22.0.4, setuptools 60.10.0 and wheel 0.37.1
-----> Installing SQLite3
-----> Installing requirements with pip
Collecting asgiref==3.5.0
Downloading asgiref-3.5.0-py3-none-any.whl (22 kB)
Collecting attrs==21.4.0
Downloading attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting autobahn==22.2.2
Downloading autobahn-22.2.2.tar.gz (375 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting Automat==20.2.0
Downloading Automat-20.2.0-py2.py3-none-any.whl (31 kB)
Collecting blinker==1.4
Downloading blinker-1.4.tar.gz (111 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting cachetools==5.0.0
Downloading cachetools-5.0.0-py3-none-any.whl (9.1 kB)
Collecting certifi==2021.10.8
Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting cffi==1.15.0
Downloading cffi-1.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (427 kB)
Collecting channels==3.0.4
Downloading channels-3.0.4-py3-none-any.whl (38 kB)
Collecting charset-normalizer==2.0.12
Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting constantly==15.1.0
Downloading constantly-15.1.0-py2.py3-none-any.whl (7.9 kB)
Collecting cryptography==36.0.1
Downloading cryptography-36.0.1-cp36-abi3-manylinux_2_24_x86_64.whl (3.6 MB)
Collecting daphne==3.0.2
Downloading daphne-3.0.2-py3-none-any.whl (26 kB)
Collecting Django==3.2.12
Downloading Django-3.2.12-py3-none-any.whl (7.9 MB)
Collecting djangorestframework==3.13.1
Downloading djangorestframework-3.13.1-py3-none-any.whl (958 kB)
Collecting ffmpeg-python==0.2.0
Downloading ffmpeg_python-0.2.0-py3-none-any.whl (25 kB)
Collecting future==0.18.2
Downloading future-0.18.2.tar.gz (829 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting google-api-core==2.7.2
Downloading google_api_core-2.7.2-py3-none-any.whl (114 kB)
Collecting google-api-python-client==2.44.0
Downloading google_api_python_client-2.44.0-py2.py3-none-any.whl (8.3 MB)
Collecting google-auth==2.6.4
Downloading google_auth-2.6.4-py2.py3-none-any.whl (156 kB)
Collecting google-auth-httplib2==0.1.0
Downloading google_auth_httplib2-0.1.0-py2.py3-none-any.whl (9.3 kB)
Collecting google-auth-oauthlib==0.5.1
Downloading google_auth_oauthlib-0.5.1-py2.py3-none-any.whl (19 kB)
Collecting googleapis-common-protos==1.56.0
Downloading googleapis_common_protos-1.56.0-py2.py3-none-any.whl (241 kB)
Collecting httplib2==0.20.4
Downloading httplib2-0.20.4-py3-none-any.whl (96 kB)
Collecting hyperlink==21.0.0
Downloading hyperlink-21.0.0-py2.py3-none-any.whl (74 kB)
Collecting idna==3.3
Downloading idna-3.3-py3-none-any.whl (61 kB)
Collecting incremental==21.3.0
Downloading incremental-21.3.0-py2.py3-none-any.whl (15 kB)
Collecting mega.py==1.0.8
Downloading mega.py-1.0.8-py2.py3-none-any.whl (19 kB)
Collecting oauthlib==3.2.0
Downloading oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting protobuf==3.20.0
Downloading protobuf-3.20.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting pyasn1==0.4.8
Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting pyasn1-modules==0.2.8
Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting pycparser==2.21
Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting pycryptodome==3.14.1
Downloading pycryptodome-3.14.1-cp35-abi3-manylinux2010_x86_64.whl (2.0 MB)
Collecting pyOpenSSL==22.0.0
Downloading pyOpenSSL-22.0.0-py2.py3-none-any.whl (55 kB)
Collecting pyparsing==3.0.8
Downloading pyparsing-3.0.8-py3-none-any.whl (98 kB)
Collecting pytz==2022.1
Downloading pytz-2022.1-py2.py3-none-any.whl (503 kB)
Collecting requests==2.27.1
Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting requests-oauthlib==1.3.1
Downloading requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting rsa==4.8
Downloading rsa-4.8-py3-none-any.whl (39 kB)
Collecting service-identity==21.1.0
Downloading service_identity-21.1.0-py2.py3-none-any.whl (12 kB)
Collecting six==1.16.0
Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting sqlparse==0.4.2
Downloading sqlparse-0.4.2-py3-none-any.whl (42 kB)
Collecting tenacity==5.1.5
Downloading tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting Twisted==22.2.0
Downloading Twisted-22.2.0-py3-none-any.whl (3.1 MB)
Collecting txaio==22.2.1
Downloading txaio-22.2.1-py2.py3-none-any.whl (30 kB)
Collecting typing_extensions==4.1.1
Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
Collecting uritemplate==4.1.1
Downloading uritemplate-4.1.1-py2.py3-none-any.whl (10 kB)
Collecting urllib3==1.26.9
Downloading urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting zope.interface==5.4.0
Downloading zope.interface-5.4.0-cp37-cp37m-manylinux2010_x86_64.whl (251 kB)
Collecting twisted[tls]>=18.7
Downloading Twisted-22.4.0-py3-none-any.whl (3.1 MB)
Collecting pathlib==1.0.1
Downloading pathlib-1.0.1.tar.gz (49 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'error'
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "<string>", line 36, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "/tmp/pip-install-fwxp6f5v/pathlib_5dc29fdd1ec44473b4d7032f8a49d94d/setup.py", line 30, in <module>
url='https://pathlib.readthedocs.org/',
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/_distutils/core.py", line 109, in setup
_setup_distribution = dist = klass(attrs)
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/dist.py", line 457, in __init__
for ep in metadata.entry_points(group='distutils.setup_keywords'):
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/_vendor/importlib_metadata/__init__.py", line 999, in entry_points
return SelectableGroups.load(eps).select(**params)
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/_vendor/importlib_metadata/__init__.py", line 449, in load
ordered = sorted(eps, key=by_group)
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/_vendor/importlib_metadata/__init__.py", line 997, in <genexpr>
dist.entry_points for dist in unique(distributions())
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/_vendor/importlib_metadata/__init__.py", line 609, in entry_points
return EntryPoints._from_text_for(self.read_text('entry_points.txt'), self)
File "/app/.heroku/python/lib/python3.7/site-packages/setuptools/_vendor/importlib_metadata/__init__.py", line 917, in read_text
return self._path.joinpath(filename).read_text(encoding='utf-8')
AttributeError: 'PosixPath' object has no attribute 'read_text'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
! Push rejected, failed to compile Python app.
! Push failed
I found a solution that worked for me: I removed the mega.py and pathlib packages from the requirements.txt file.
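For background, pathlib has been part of the Python standard library since 3.4, so the PyPI backport is not needed here; the backport's own pathlib.py appears to shadow the standard-library module during the metadata step, and its PosixPath has no read_text, which matches the AttributeError in the traceback above. A minimal sketch of relying on the standard library instead, with no requirements.txt entry at all:
from pathlib import Path  # standard library on Python 3.4+; drop "pathlib" from requirements.txt

manage_py = Path("manage.py")
if manage_py.exists():
    # read_text exists on the stdlib Path, unlike the old backport's PosixPath
    print(manage_py.read_text(encoding="utf-8").splitlines()[0])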
I am trying to install PySuperTuxKart onto my Windows 10 computer via Anaconda.
However, every time I try, PySuperTuxKart itself installs fine but PySuperTuxKartData seems to hang. I have let it sit for five minutes with no progress being made; my only metric is that the loading animation stays stuck the whole time. I have tried this on Python versions 3.7 through 3.9. It installs fine in a venv running 3.10, but I need a package that is only available up to Python 3.9. I enabled the verbose flag on the pip install, and this is the output:
Using pip 22.0.4 from C:\Users\<user>\Anaconda3\envs\test\lib\site-packages\pip (python 3.7)
Collecting PySuperTuxKart
Using cached PySuperTuxKart-1.1.2-cp37-cp37m-win_amd64.whl (2.7 MB)
Collecting PySuperTuxKartData
Using cached PySuperTuxKartData-1.0.0.tar.gz (2.6 kB)
Installing build dependencies ... Running command pip subprocess to install build dependencies
Collecting setuptools>=42
Using cached setuptools-62.1.0-py3-none-any.whl (1.1 MB)
Collecting requests
Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting wheel
Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting idna<4,>=2.5
Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
Collecting charset-normalizer~=2.0.0
Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Installing collected packages: certifi, wheel, urllib3, setuptools, idna, charset-normalizer, requests
Successfully installed certifi-2021.10.8 charset-normalizer-2.0.12 idna-3.3 requests-2.27.1 setuptools-62.1.0 urllib3-1.26.9 wheel-0.37.1
done
Getting requirements to build wheel ... Running command Getting requirements to build wheel
running egg_info
writing PySuperTuxKartData.egg-info\PKG-INFO
writing dependency_links to PySuperTuxKartData.egg-info\dependency_links.txt
writing requirements to PySuperTuxKartData.egg-info\requires.txt
writing top-level names to PySuperTuxKartData.egg-info\top_level.txt
reading manifest file 'PySuperTuxKartData.egg-info\SOURCES.txt'
writing manifest file 'PySuperTuxKartData.egg-info\SOURCES.txt'
done
Preparing metadata (pyproject.toml) ... Running command Preparing metadata (pyproject.toml)
running dist_info
creating C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info
writing C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\PKG-INFO
writing dependency_links to C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\dependency_links.txt
writing requirements to C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\requires.txt
writing top-level names to C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\top_level.txt
writing manifest file 'C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\SOURCES.txt'
reading manifest file 'C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\SOURCES.txt'
writing manifest file 'C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData.egg-info\SOURCES.txt'
creating 'C:\Users\<user>\AppData\Local\Temp\pip-modern-metadata-lz0lblhq\PySuperTuxKartData-1.0.0.dist-info'
done
Collecting requests
Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting urllib3<1.27,>=1.21.1
Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\<user>\anaconda3\envs\test\lib\site-packages (from requests->PySuperTuxKartData->PySuperTuxKart) (2021.10.8)
Collecting idna<4,>=2.5
Using cached idna-3.3-py3-none-any.whl (61 kB)
Building wheels for collected packages: PySuperTuxKartData
Building wheel for PySuperTuxKartData (pyproject.toml) ... Running command Building wheel for PySuperTuxKartData (pyproject.toml)
running bdist_wheel
running build
running build_py
creating build
creating build\lib
creating build\lib\pystk_data
copying .\setup.py -> build\lib\pystk_data
copying .\__init__.py -> build\lib\pystk_data
running fetch_data
It hangs on the running fetch_data step. Any suggestions?
I recently deployed a Heroku app successfully (see the log below for details), but when I try to access it through a browser it returns "An error occurred in the application and your page could not be served. If you are the application owner, check your logs for details. You can do this from the Heroku CLI with the command heroku logs --tail." Furthermore, the app is a Discord.py bot, and the bot was not online like it should be.
Logs:
-----> Building on the Heroku-20 stack
-----> Using buildpack: heroku/python
-----> Python app detected
-----> No Python version was specified. Using the same version as the last build: python-3.9.9
To use a different version, see: https://devcenter.heroku.com/articles/python-runtimes
-----> Requirements file has been changed, clearing cached dependencies
-----> Installing python-3.9.9
-----> Installing pip 21.3.1, setuptools 57.5.0 and wheel 0.37.0
-----> Installing SQLite3
-----> Installing requirements with pip
Collecting discord.py==1.6.0
Downloading discord.py-1.6.0-py3-none-any.whl (779 kB)
Collecting Flask==2.0.2
Downloading Flask-2.0.2-py3-none-any.whl (95 kB)
Collecting dnspython==1.16.0
Downloading dnspython-1.16.0-py2.py3-none-any.whl (188 kB)
Collecting PyNaCl==1.3.0
Downloading PyNaCl-1.3.0-cp34-abi3-manylinux1_x86_64.whl (759 kB)
Collecting async-timeout==3.0.1
Downloading async_timeout-3.0.1-py3-none-any.whl (8.2 kB)
Collecting aiohttp<3.8.0,>=3.6.0
Downloading aiohttp-3.7.4.post0-cp39-cp39-manylinux2014_x86_64.whl (1.4 MB)
Collecting Jinja2>=3.0
Downloading Jinja2-3.0.3-py3-none-any.whl (133 kB)
Collecting itsdangerous>=2.0
Downloading itsdangerous-2.0.1-py3-none-any.whl (18 kB)
Collecting Werkzeug>=2.0
Downloading Werkzeug-2.0.2-py3-none-any.whl (288 kB)
Collecting click>=7.1.2
Downloading click-8.0.3-py3-none-any.whl (97 kB)
Collecting cffi>=1.4.1
Downloading cffi-1.15.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (444 kB)
Collecting six
Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting yarl<2.0,>=1.0
Downloading yarl-1.7.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (304 kB)
Collecting attrs>=17.3.0
Downloading attrs-21.4.0-py2.py3-none-any.whl (60 kB)
Collecting chardet<5.0,>=2.0
Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
Collecting typing-extensions>=3.6.5
Downloading typing_extensions-4.0.1-py3-none-any.whl (22 kB)
Collecting multidict<7.0,>=4.5
Downloading multidict-5.2.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (174 kB)
Collecting pycparser
Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting MarkupSafe>=2.0
Downloading MarkupSafe-2.0.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (30 kB)
Collecting idna>=2.0
Downloading idna-3.3-py3-none-any.whl (61 kB)
Installing collected packages: multidict, idna, yarl, typing-extensions, pycparser, MarkupSafe, chardet, attrs, async-timeout, Werkzeug, six, Jinja2, itsdangerous, click, cffi, aiohttp, PyNaCl, Flask, dnspython, discord.py
Successfully installed Flask-2.0.2 Jinja2-3.0.3 MarkupSafe-2.0.1 PyNaCl-1.3.0 Werkzeug-2.0.2 aiohttp-3.7.4.post0 async-timeout-3.0.1 attrs-21.4.0 cffi-1.15.0 chardet-4.0.0 click-8.0.3 discord.py-1.6.0 dnspython-1.16.0 idna-3.3 itsdangerous-2.0.1 multidict-5.2.0 pycparser-2.21 six-1.16.0 typing-extensions-4.0.1 yarl-1.7.2
-----> Discovering process types
Procfile declares types -> worker
-----> Compressing...
Done: 62M
-----> Launching...
Released v5
https://app.herokuapp.com/ deployed to Heroku
Your application is a worker (Procfile declares types -> worker), so it does not respond to HTTP requests.
Change to web and make sure you bind the PORT provided by Heroku.
By the way, bots can run as workers (without handling incoming requests) if they poll the server for messages/updates. That is a viable option too (keep it as a worker), but make sure the bot logs what it does at startup.
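If you do switch it to a web process, a minimal sketch of binding Heroku's port looks like this (the file name keep_alive.py and the route are assumptions; Flask is already in the requirements shown in the build log):
# keep_alive.py (hypothetical name): binds the $PORT Heroku assigns to web dynos
# so the boot check passes; the Discord bot itself would be started separately.
import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Bot is alive"

if __name__ == "__main__":
    # Heroku sets PORT for web dynos; not binding it within the boot window
    # leads to the "application error" page shown when visiting the app URL.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 5000)))
The Procfile entry would then be web: python keep_alive.py instead of the worker line; if you keep the bot as a worker, no port binding is needed, and the app URL simply won't serve anything.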
Objective
In a GitLab runner, run:
some JMeter tests
a Python application that uses WMI to collect server metrics
Problem
The JMeter commands worked fine, using the alpine/jmeter image.
default:
  image:
    name: alpine/jmeter:5.4.1
    entrypoint: [""]
No problem there.
But I want to run a Python program I wrote that uses WMI to get Windows performance counters.
Problem: when doing pip install -r requirements.txt, everything gets installed until it reaches pywin32, which fails:
$ pip install -r ../../requirements.txt
Collecting certifi==2021.5.30
Downloading certifi-2021.5.30-py2.py3-none-any.whl (145 kB)
Collecting cffi==1.14.5
Downloading cffi-1.14.5-cp38-cp38-manylinux1_x86_64.whl (411 kB)
Collecting chardet==4.0.0
Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
Collecting click==8.0.1
Downloading click-8.0.1-py3-none-any.whl (97 kB)
Collecting colorama==0.4.4
Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Collecting ConfigArgParse==1.5
Downloading ConfigArgParse-1.5-py3-none-any.whl (19 kB)
Collecting crypto==1.4.1
Downloading crypto-1.4.1-py2.py3-none-any.whl (18 kB)
Collecting Flask==1.1.2
Downloading Flask-1.1.2-py2.py3-none-any.whl (94 kB)
Collecting Flask-BasicAuth==0.2.0
Downloading Flask-BasicAuth-0.2.0.tar.gz (16 kB)
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting gevent==21.1.2
Downloading gevent-21.1.2-cp38-cp38-manylinux2010_x86_64.whl (6.3 MB)
Collecting geventhttpclient==1.4.4
Downloading geventhttpclient-1.4.4-cp38-cp38-manylinux2010_x86_64.whl (77 kB)
Collecting greenlet==1.1.0
Downloading greenlet-1.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (164 kB)
Collecting idna==2.10
Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting influxdb==5.3.1
Downloading influxdb-5.3.1-py2.py3-none-any.whl (77 kB)
Collecting influxdb-client==1.17.0
Downloading influxdb_client-1.17.0-py3-none-any.whl (450 kB)
Collecting itsdangerous==2.0.1
Downloading itsdangerous-2.0.1-py3-none-any.whl (18 kB)
Collecting Jinja2==3.0.1
Downloading Jinja2-3.0.1-py3-none-any.whl (133 kB)
Collecting locust==1.5.3
Downloading locust-1.5.3-py3-none-any.whl (765 kB)
Collecting locust-influxdb-listener==0.0.5
Downloading locust_influxdb_listener-0.0.5-py3-none-any.whl (7.6 kB)
Collecting MarkupSafe==2.0.1
Downloading MarkupSafe-2.0.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (30 kB)
Collecting msgpack==1.0.2
Downloading msgpack-1.0.2-cp38-cp38-manylinux1_x86_64.whl (302 kB)
Collecting multipledispatch==0.6.0
Downloading multipledispatch-0.6.0-py3-none-any.whl (11 kB)
Collecting Naked==0.1.31
Downloading Naked-0.1.31-py2.py3-none-any.whl (590 kB)
Collecting numpy==1.21.2
Downloading numpy-1.21.2-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.8 MB)
Collecting pandas==1.3.2
Downloading pandas-1.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.5 MB)
Collecting psutil==5.8.0
Downloading psutil-5.8.0-cp38-cp38-manylinux2010_x86_64.whl (296 kB)
Collecting pycparser==2.20
Downloading pycparser-2.20-py2.py3-none-any.whl (112 kB)
Collecting pycryptodome==3.10.1
Downloading pycryptodome-3.10.1-cp35-abi3-manylinux2010_x86_64.whl (1.9 MB)
Collecting python-dateutil==2.8.1
Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz==2021.1
Downloading pytz-2021.1-py2.py3-none-any.whl (510 kB)
ERROR: Could not find a version that satisfies the requirement pywin32==300 (from versions: none)
ERROR: No matching distribution found for pywin32==300
Cleaning up file based variables
00:01
ERROR: Job failed: exit code 1
What I've Tried
Replaced pywin32==301 with pypywin32.
Ensured that the Python version is 3.8.
Tried using python:3.8-windowsservercore, but that doesn't even get off the ground because it conflicts with the default JMeter image, which is Linux-based.
I tried adding pywin32==300 to requirements.txt.
The result then is just
ERROR: Could not find a version that satisfies the requirement pywin32==300 (from versions: none)
ERROR: No matching distribution found for pywin32==300
default:
  image:
    name: alpine/jmeter:5.4.1
    entrypoint: [""]

api test:
  image:
    name: python:3.8
  stage: test
  script:
    - |
      git --version
      set -e
      cd ../..
    - rm -rf engine-load-tests
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}#gitlab.company.net/qa/engine-automation/engine-load-tests.git
    - cd engine-load-tests/src/win_perf_counters/
    - PYTHONPATH=`pwd`./:$PYTHONPATH
    - python3 -m venv .venv
    - source .venv/bin/activate
    - python -V
    - pip install --upgrade pip
    - pip install -r ../../requirements.txt
    - python ./main.py ../../load_test.conf csv
Am I trying to do something impossible?
Any idea how I can get the Python application to run?
No: the wmi package relies on pywin32, which can only be used on Windows because it depends on Windows binaries (DLL files) that only run on Windows.
Therefore, you must run this Python app on Windows, not Linux.
JMeter, however, works on both Windows and Linux.
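If it helps, one way to make that constraint explicit in the collector itself is a platform guard before the wmi import (a minimal sketch; the exit message is an assumption):
# Fail fast with a clear error when the script is launched on a non-Windows runner;
# wmi and pywin32 wrap Windows DLLs and cannot be installed or imported on Linux.
import platform
import sys

if platform.system() != "Windows":
    sys.exit("win_perf_counters requires a Windows runner: wmi/pywin32 are Windows-only")

import wmi  # noqa: E402  (deliberately imported after the platform check)
In the pipeline, that usually means running this job on a runner tagged for Windows and keeping pywin32/wmi in a Windows-only requirements file, separate from the Linux JMeter job.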
I am making an API for my deep learning model. The API works properly on localhost. I deployed it on Heroku; the deployment succeeds, but when I open the link it shows an "Application error" page.
Release log:
2021-02-23 05:47:36.841582: I tensorflow/compiler/jit/xla_cpu_device.cc:41] Not creating XLA devices, tf_xla_enable_xla_devices not set
Operations to perform:
Apply all migrations: admin, auth, contenttypes, sessions
Running migrations:
Applying contenttypes.0001_initial... OK
Applying auth.0001_initial... OK
Applying admin.0001_initial... OK
Applying admin.0002_logentry_remove_auto_add... OK
Applying admin.0003_logentry_add_action_flag_choices... OK
Applying contenttypes.0002_remove_content_type_name... OK
Applying auth.0002_alter_permission_name_max_length... OK
Applying auth.0003_alter_user_email_max_length... OK
Applying auth.0004_alter_user_username_opts... OK
Applying auth.0005_alter_user_last_login_null... OK
Applying auth.0006_require_contenttypes_0002... OK
Applying auth.0007_alter_validators_add_error_messages... OK
Applying auth.0008_alter_user_username_max_length... OK
Applying auth.0009_alter_user_last_name_max_length... OK
Applying auth.0010_alter_group_name_max_length... OK
Applying auth.0011_update_proxy_permissions... OK
Applying auth.0012_alter_user_first_name_max_length... OK
Applying sessions.0001_initial... OK
Build log:
-----> Building on the Heroku-20 stack
-----> Python app detected
! Python has released a security update! Please consider upgrading to python-3.7.10
Learn More: https://devcenter.heroku.com/articles/python-runtimes
-----> Requirements file has been changed, clearing cached dependencies
-----> Installing python-3.7.8
-----> Installing pip 20.1.1, setuptools 47.1.1 and wheel 0.34.2
-----> Installing SQLite3
-----> Installing requirements with pip
Collecting absl-py==0.11.0
Downloading absl_py-0.11.0-py3-none-any.whl (127 kB)
Collecting asgiref==3.3.1
Downloading asgiref-3.3.1-py3-none-any.whl (19 kB)
Collecting astunparse==1.6.3
Downloading astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Collecting cachetools==4.2.1
Downloading cachetools-4.2.1-py3-none-any.whl (12 kB)
Collecting certifi==2020.12.5
Downloading certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
Collecting chardet==4.0.0
Downloading chardet-4.0.0-py2.py3-none-any.whl (178 kB)
Collecting dj-database-url==0.5.0
Downloading dj_database_url-0.5.0-py2.py3-none-any.whl (5.5 kB)
Collecting Django==3.1.7
Downloading Django-3.1.7-py3-none-any.whl (7.8 MB)
Collecting djangorestframework==3.12.2
Downloading djangorestframework-3.12.2-py3-none-any.whl (957 kB)
Collecting flatbuffers==1.12
Downloading flatbuffers-1.12-py2.py3-none-any.whl (15 kB)
Collecting gast==0.3.3
Downloading gast-0.3.3-py2.py3-none-any.whl (9.7 kB)
Collecting google-auth==1.27.0
Downloading google_auth-1.27.0-py2.py3-none-any.whl (135 kB)
Collecting google-auth-oauthlib==0.4.2
Downloading google_auth_oauthlib-0.4.2-py2.py3-none-any.whl (18 kB)
Collecting google-pasta==0.2.0
Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB)
Collecting grpcio==1.32.0
Downloading grpcio-1.32.0-cp37-cp37m-manylinux2014_x86_64.whl (3.8 MB)
Collecting gunicorn==20.0.4
Downloading gunicorn-20.0.4-py2.py3-none-any.whl (77 kB)
Collecting h5py==2.10.0
Downloading h5py-2.10.0-cp37-cp37m-manylinux1_x86_64.whl (2.9 MB)
Collecting idna==2.10
Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting Keras==2.4.3
Downloading Keras-2.4.3-py2.py3-none-any.whl (36 kB)
Collecting Keras-Preprocessing==1.1.2
Downloading Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB)
Collecting Markdown==3.3.3
Downloading Markdown-3.3.3-py3-none-any.whl (96 kB)
Collecting numpy==1.19.5
Downloading numpy-1.19.5-cp37-cp37m-manylinux2010_x86_64.whl (14.8 MB)
Collecting oauthlib==3.1.0
Downloading oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting opencv-python-headless==4.5.1.48
Downloading opencv_python_headless-4.5.1.48-cp37-cp37m-manylinux2014_x86_64.whl (37.6 MB)
Collecting opt-einsum==3.3.0
Downloading opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting Pillow==8.1.0
Downloading Pillow-8.1.0-cp37-cp37m-manylinux1_x86_64.whl (2.2 MB)
Collecting protobuf==3.15.1
Downloading protobuf-3.15.1-cp37-cp37m-manylinux1_x86_64.whl (1.0 MB)
Collecting psycopg2==2.8.6
Downloading psycopg2-2.8.6.tar.gz (383 kB)
Collecting pyasn1==0.4.8
Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting pyasn1-modules==0.2.8
Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting pytz==2021.1
Downloading pytz-2021.1-py2.py3-none-any.whl (510 kB)
Collecting PyYAML==5.4.1
Downloading PyYAML-5.4.1-cp37-cp37m-manylinux1_x86_64.whl (636 kB)
Collecting requests==2.25.1
Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
Collecting requests-oauthlib==1.3.0
Downloading requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting rsa==4.7.1
Downloading rsa-4.7.1-py3-none-any.whl (36 kB)
Collecting scipy==1.6.1
Downloading scipy-1.6.1-cp37-cp37m-manylinux1_x86_64.whl (27.4 MB)
Collecting six==1.15.0
Downloading six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting sqlparse==0.4.1
Downloading sqlparse-0.4.1-py3-none-any.whl (42 kB)
Collecting tensorboard==2.4.1
Downloading tensorboard-2.4.1-py3-none-any.whl (10.6 MB)
Collecting tensorboard-plugin-wit==1.8.0
Downloading tensorboard_plugin_wit-1.8.0-py3-none-any.whl (781 kB)
Collecting tensorflow-cpu==2.4.1
Downloading tensorflow_cpu-2.4.1-cp37-cp37m-manylinux2010_x86_64.whl (144.1 MB)
Collecting tensorflow-estimator==2.4.0
Downloading tensorflow_estimator-2.4.0-py2.py3-none-any.whl (462 kB)
Collecting termcolor==1.1.0
Downloading termcolor-1.1.0.tar.gz (3.9 kB)
Collecting typing-extensions==3.7.4.3
Downloading typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)
Collecting urllib3==1.26.3
Downloading urllib3-1.26.3-py2.py3-none-any.whl (137 kB)
Collecting Werkzeug==1.0.1
Downloading Werkzeug-1.0.1-py2.py3-none-any.whl (298 kB)
Collecting whitenoise==5.2.0
Downloading whitenoise-5.2.0-py2.py3-none-any.whl (19 kB)
Collecting wrapt==1.12.1
Downloading wrapt-1.12.1.tar.gz (27 kB)
Collecting importlib-metadata; python_version < "3.8"
Downloading importlib_metadata-3.4.0-py3-none-any.whl (10 kB)
Collecting zipp>=0.5
Downloading zipp-3.4.0-py3-none-any.whl (5.2 kB)
Building wheels for collected packages: psycopg2, termcolor, wrapt
Building wheel for psycopg2 (setup.py): started
Building wheel for psycopg2 (setup.py): finished with status 'done'
Created wheel for psycopg2: filename=psycopg2-2.8.6-cp37-cp37m-linux_x86_64.whl size=501629 sha256=d8931f1a9c43a4fda009cc1dd73998b6850c5cc8cd0b7262297232221c4cf24d
Stored in directory: /tmp/pip-ephem-wheel-cache-f0cb0bho/wheels/25/78/75/9c0323f7e1fb42143cbd2439302beb7850a1034abb961cb281
Building wheel for termcolor (setup.py): started
Building wheel for termcolor (setup.py): finished with status 'done'
Created wheel for termcolor: filename=termcolor-1.1.0-py3-none-any.whl size=4830 sha256=ba9857e440ee6b55aede8e87e2b021b3a0083e509349a35a86d794ec3e6a7d15
Stored in directory: /tmp/pip-ephem-wheel-cache-f0cb0bho/wheels/3f/e3/ec/8a8336ff196023622fbcb36de0c5a5c218cbb24111d1d4c7f2
Building wheel for wrapt (setup.py): started
Building wheel for wrapt (setup.py): finished with status 'done'
Created wheel for wrapt: filename=wrapt-1.12.1-cp37-cp37m-linux_x86_64.whl size=77160 sha256=d9035ca277353f42fff12ec8f3a38fea5ee667bac754005cdd9b12f528fd3847
Stored in directory: /tmp/pip-ephem-wheel-cache-f0cb0bho/wheels/62/76/4c/aa25851149f3f6d9785f6c869387ad82b3fd37582fa8147ac6
Successfully built psycopg2 termcolor wrapt
ERROR: tensorflow-cpu 2.4.1 has requirement wheel~=0.35, but you'll have wheel 0.34.2 which is incompatible.
Installing collected packages: six, absl-py, asgiref, astunparse, cachetools, certifi, chardet, dj-database-url, pytz, sqlparse, Django, djangorestframework, flatbuffers, gast, pyasn1, rsa, pyasn1-modules, google-auth, oauthlib, urllib3, idna, requests, requests-oauthlib, google-auth-oauthlib, google-pasta, grpcio, gunicorn, numpy, h5py, scipy, PyYAML, Keras, Keras-Preprocessing, typing-extensions, zipp, importlib-metadata, Markdown, opencv-python-headless, opt-einsum, Pillow, protobuf, psycopg2, Werkzeug, tensorboard-plugin-wit, tensorboard, tensorflow-estimator, wrapt, termcolor, tensorflow-cpu, whitenoise
Successfully installed Django-3.1.7 Keras-2.4.3 Keras-Preprocessing-1.1.2 Markdown-3.3.3 Pillow-8.1.0 PyYAML-5.4.1 Werkzeug-1.0.1 absl-py-0.11.0 asgiref-3.3.1 astunparse-1.6.3 cachetools-4.2.1 certifi-2020.12.5 chardet-4.0.0 dj-database-url-0.5.0 djangorestframework-3.12.2 flatbuffers-1.12 gast-0.3.3 google-auth-1.27.0 google-auth-oauthlib-0.4.2 google-pasta-0.2.0 grpcio-1.32.0 gunicorn-20.0.4 h5py-2.10.0 idna-2.10 importlib-metadata-3.4.0 numpy-1.19.5 oauthlib-3.1.0 opencv-python-headless-4.5.1.48 opt-einsum-3.3.0 protobuf-3.15.1 psycopg2-2.8.6 pyasn1-0.4.8 pyasn1-modules-0.2.8 pytz-2021.1 requests-2.25.1 requests-oauthlib-1.3.0 rsa-4.7.1 scipy-1.6.1 six-1.15.0 sqlparse-0.4.1 tensorboard-2.4.1 tensorboard-plugin-wit-1.8.0 tensorflow-cpu-2.4.1 tensorflow-estimator-2.4.0 termcolor-1.1.0 typing-extensions-3.7.4.3 urllib3-1.26.3 whitenoise-5.2.0 wrapt-1.12.1 zipp-3.4.0
-----> Discovering process types
Procfile declares types -> release, web
-----> Compressing...
Done: 305.3M
-----> Launching...
! Warning: Your slug size (305 MB) exceeds our soft limit (300 MB) which may affect boot time.
Released v17
https://damp-bayou-18222.herokuapp.com/ deployed to Heroku
Output of heroku logs --dyno router
2021-02-23T06:06:58.983807+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/" host=damp-bayou-18222.herokuapp.com request_id=9d41bcd3-8555-4a03-a247-43ff200d5061 fwd="27.61.161.75" dyno= connect= service= status=503 bytes= protocol=http
2021-02-23T06:06:59.611890+00:00 heroku[router]: at=error code=H10 desc="App crashed" method=GET path="/favicon.ico" host=damp-bayou-18222.herokuapp.com request_id=c4a3e114-c3af-4230-8081-63dd4401f113 fwd="27.61.161.75" dyno= connect= service= status=503 bytes= protocol
Procfile
release: python manage.py makemigrations --no-input
release: python manage.py migrate --no-input
web: gunicorn digitrec.wsgi
When I run the command heroku ps:scale web=1 it gives the below output:
Scaling dynos... done, now running web at 1:Free
I have tried many solutions from different Stack Overflow answers, but nothing has worked. Please help me out!
Add the --log-file flag to the web process in your Procfile:
web: gunicorn digitrec.wsgi --log-file -
Also note this warning from the build log: "Your slug size (305 MB) exceeds our soft limit (300 MB) which may affect boot time."
Your app must boot within Heroku's time limit or the dyno will be killed.
Using the release phase in the Procfile works, but the release phase is not part of the build. That means the makemigrations command creates its files in an ephemeral filesystem, so your next push will generate brand-new migrations again, which can cause issues with the database.
Finally, if your app serves any static files and runs in production with DEBUG set to False, you will also have to run collectstatic (again, as part of the build, not the release phase).
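Putting those points together, a Procfile along these lines is a reasonable sketch, assuming the migration files are committed to the repository rather than generated during release, and that collectstatic runs during the build as Heroku normally does for Django apps:
release: python manage.py migrate --no-input
web: gunicorn digitrec.wsgi --log-file -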