Trying to create a deployment file for AWS Lambda - Python

I am trying to create an Alexa skill that tells me when the next train arrives at my train station. I want to use a package called pygtfs, a library that models information stored in Google's General Transit Feed Specification (GTFS) format. The files that process the JSON object sent from Alexa work fine. What I am having trouble with is making a zip deployment package to upload to AWS Lambda. I have a Python script that creates the deployment package, but it fails: it does not add the correct files to the deployment package.
def _copy_deployment_files(deployment_dir):
    for deployment_file in deployment_files:
        if os.path.exists(deployment_file):
            cmd = "cp {0} {1}".format(deployment_file, deployment_dir).split()
            return_code = subprocess.call(cmd, shell=False)  # <--- the failing call
        else:
            raise NameError("Deployment file not found [{0}]".format(deployment_file))
In the _copy_deployment_files function I have tried setting the subprocess call's shell parameter to True
return_code = subprocess.call(cmd, shell=True)
but the package still doesn't include AlexaBaseHandler.py, AlexaDeploymentHandler.py and main.py.
My question is: how do I get create_deployment to add AlexaBaseHandler.py, AlexaDeploymentHandler.py and main.py to the deployment package, and how do I ensure the pygtfs package is installed into the deployment package as well?
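For the pygtfs half of the question: if pygtfs==0.1.3 is listed in requirements.txt, installing each line with pip's -t (target) flag vendors the package into the deployment directory, which is what the script's _install_requirements function does. A minimal sketch of that command construction (the helper name here is mine, not from the script):

```python
import sys

def pip_install_cmd(package_spec, target_dir):
    """Build the pip invocation that vendors a requirement into target_dir."""
    return [sys.executable, "-m", "pip", "install", package_spec, "-t", target_dir]

# Usage (would actually run pip):
#   subprocess.call(pip_install_cmd("pygtfs==0.1.3", "./deployments/deployment_16"), shell=False)
```

Using `sys.executable -m pip` instead of a bare `pip` string avoids picking up a pip from a different Python installation.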
Here is the output when return_code = subprocess.call(cmd, shell=True):
C:\ProgramData\Anaconda3\python.exe C:/Users/Owner/PycharmProjects/PatcoSchedule/create_deployment.py
Collecting requests==2.8.1
Using cached requests-2.8.1-py2.py3-none-any.whl
Installing collected packages: requests
Successfully installed requests-2.8.1
Collecting sseclient==0.0.11
Collecting requests>=2.0.0 (from sseclient==0.0.11)
Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting six (from sseclient==0.0.11)
Using cached six-1.11.0-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.0.0->sseclient==0.0.11)
Using cached chardet-3.0.4-py2.py3-none-any.whl
Collecting urllib3<1.23,>=1.21.1 (from requests>=2.0.0->sseclient==0.0.11)
Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.0.0->sseclient==0.0.11)
Using cached certifi-2017.7.27.1-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=2.0.0->sseclient==0.0.11)
Using cached idna-2.6-py2.py3-none-any.whl
Installing collected packages: chardet, urllib3, certifi, idna, requests, six, sseclient
Successfully installed certifi-2017.7.27.1 chardet-3.0.4 idna-2.6 requests-2.18.4 six-1.11.0 sseclient-0.0.11 urllib3-1.22
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\requests already exists. Specify --upgrade to force replacement.
Collecting hammock==0.2.4
Collecting requests>=1.1.0 (from hammock==0.2.4)
Using cached requests-2.18.4-py2.py3-none-any.whl
Collecting idna<2.7,>=2.5 (from requests>=1.1.0->hammock==0.2.4)
Using cached idna-2.6-py2.py3-none-any.whl
Collecting urllib3<1.23,>=1.21.1 (from requests>=1.1.0->hammock==0.2.4)
Using cached urllib3-1.22-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=1.1.0->hammock==0.2.4)
Using cached certifi-2017.7.27.1-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=1.1.0->hammock==0.2.4)
Using cached chardet-3.0.4-py2.py3-none-any.whl
Installing collected packages: idna, urllib3, certifi, chardet, requests, hammock
Successfully installed certifi-2017.7.27.1 chardet-3.0.4 hammock-0.2.4 idna-2.6 requests-2.18.4 urllib3-1.22
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\certifi already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\certifi-2017.7.27.1.dist-info already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\chardet already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\chardet-3.0.4.dist-info already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\idna already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\idna-2.6.dist-info already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\requests already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\requests-2.18.4.dist-info already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\urllib3 already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\urllib3-1.22.dist-info already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\__pycache__ already exists. Specify --upgrade to force replacement.
Collecting pygtfs==0.1.3
Using cached pygtfs-0.1.3-py2.py3-none-any.whl
Collecting sqlalchemy>=0.7.8 (from pygtfs==0.1.3)
Using cached SQLAlchemy-1.1.14.tar.gz
Collecting six (from pygtfs==0.1.3)
Using cached six-1.11.0-py2.py3-none-any.whl
Collecting pytz>=2012d (from pygtfs==0.1.3)
Using cached pytz-2017.2-py2.py3-none-any.whl
Collecting docopt (from pygtfs==0.1.3)
Installing collected packages: sqlalchemy, six, pytz, docopt, pygtfs
Running setup.py install for sqlalchemy: started
Running setup.py install for sqlalchemy: finished with status 'done'
Successfully installed docopt-0.6.2 pygtfs-0.1.3 pytz-2017.2 six-1.11.0 sqlalchemy-1.1.14
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\six-1.11.0.dist-info already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\six.py already exists. Specify --upgrade to force replacement.
Target directory C:\Users\Owner\PycharmProjects\PatcoSchedule\deployments\deployment_15\__pycache__ already exists. Specify --upgrade to force replacement.
Process finished with exit code 0
Here is the create_deployment file; the code was copied from https://github.com/youngsoul/AlexaDeploymentSample:
import os
import subprocess
import zipfile

"""
Script will create an AWS Lambda function deployment.
It expects there to be a deployments directory and it will create a
deployment of the form:
    deployment_n
where n is incremented for each deployment based on the existing deployment
directories
"""

root_deployments_dir = "./deployments"

# List of files that should be included in the deployment.
# Only the files listed here, and the libraries in the requirements.txt
# file, will be included in the deployment.
deployment_files = ['AlexaBaseHandler.py', 'AlexaDeploymentHandler.py', 'main.py']


def _read_requirements():
    with open("./requirements.txt", 'r') as f:
        install_requirements = f.readlines()
    return install_requirements


def _get_immediate_subdirectories(a_dir):
    return [name for name in os.listdir(a_dir)
            if os.path.isdir(os.path.join(a_dir, name))]


def _make_deployment_dir():
    all_deployment_directories = _get_immediate_subdirectories(root_deployments_dir)
    max_deployment_number = -1
    for deployment_dir in all_deployment_directories:
        dir_name_elements = deployment_dir.split("_")
        if len(dir_name_elements) == 2:
            if int(dir_name_elements[1]) > max_deployment_number:
                max_deployment_number = int(dir_name_elements[1])
    if max_deployment_number == -1:
        max_deployment_number = 0
    deployment_name = "deployment_{0}".format(max_deployment_number + 1)
    new_deployment_dir_path = "{0}/{1}".format(root_deployments_dir, deployment_name)
    if not os.path.exists(new_deployment_dir_path):
        os.mkdir(new_deployment_dir_path)
    return (new_deployment_dir_path, deployment_name)


def _install_requirements(deployment_requirements, deployment_dir):
    """
    pip install <requirements line> -t <deployment_dir>
    :param deployment_requirements:
    :param deployment_dir:
    :return:
    """
    if os.path.exists(deployment_dir):
        for requirement in deployment_requirements:
            cmd = "pip install {0} -t {1}".format(requirement, deployment_dir).split()
            return_code = subprocess.call(cmd, shell=False)


def _copy_deployment_files(deployment_dir):
    for deployment_file in deployment_files:
        if os.path.exists(deployment_file):
            cmd = "cp {0} {1}".format(deployment_file, deployment_dir).split()
            return_code = subprocess.call(cmd, shell=False)
        else:
            raise NameError("Deployment file not found [{0}]".format(deployment_file))
def zipdir(dirPath=None, zipFilePath=None, includeDirInZip=False):
    """
    Attribution: I wish I could remember where I found this on the
    web. To the unknown sharer of knowledge - thank you.

    Create a zip archive from a directory.

    Note that this function is designed to put files in the zip archive with
    either no parent directory or just one parent directory, so it will trim any
    leading directories in the filesystem paths and not include them inside the
    zip archive paths. This is generally the case when you want to just take a
    directory and make it into a zip file that can be extracted in different
    locations.

    Keyword arguments:
    dirPath -- string path to the directory to archive. This is the only
    required argument. It can be absolute or relative, but only one or zero
    leading directories will be included in the zip archive.

    zipFilePath -- string path to the output zip file. This can be an absolute
    or relative path. If the zip file already exists, it will be updated. If
    not, it will be created. If you want to replace it from scratch, delete it
    prior to calling this function. (default is computed as dirPath + ".zip")

    includeDirInZip -- boolean indicating whether the top level directory should
    be included in the archive or omitted. (default True)
    """
    if not zipFilePath:
        zipFilePath = dirPath + ".zip"
    if not os.path.isdir(dirPath):
        raise OSError("dirPath argument must point to a directory. "
                      "'%s' does not." % dirPath)
    parentDir, dirToZip = os.path.split(dirPath)

    # Little nested function to prepare the proper archive path
    def trimPath(path):
        archivePath = path.replace(parentDir, "", 1)
        if parentDir:
            archivePath = archivePath.replace(os.path.sep, "", 1)
        if not includeDirInZip:
            archivePath = archivePath.replace(dirToZip + os.path.sep, "", 1)
        return os.path.normcase(archivePath)

    outFile = zipfile.ZipFile(zipFilePath, "w",
                              compression=zipfile.ZIP_DEFLATED)
    for (archiveDirPath, dirNames, fileNames) in os.walk(dirPath):
        for fileName in fileNames:
            filePath = os.path.join(archiveDirPath, fileName)
            outFile.write(filePath, trimPath(filePath))
        # Make sure we get empty directories as well
        if not fileNames and not dirNames:
            zipInfo = zipfile.ZipInfo(trimPath(archiveDirPath) + "/")
            # some web sites suggest doing
            # zipInfo.external_attr = 16
            # or
            # zipInfo.external_attr = 48
            # Here to allow for inserting an empty directory. Still TBD/TODO.
            outFile.writestr(zipInfo, "")
    outFile.close()


if __name__ == "__main__":
    (deployment_dir, deployment_name) = _make_deployment_dir()
    _copy_deployment_files(deployment_dir)
    install_requirements = _read_requirements()
    _install_requirements(install_requirements, deployment_dir)
    zipdir(deployment_dir, "{0}/{1}.zip".format(root_deployments_dir, deployment_name))
UPDATE
NVM, I figured it out. The code I copied used cp as the copy command in the subprocess call. I looked up the equivalent command for my OS (Windows 10), which is COPY, not cp. Here is the new call to subprocess:
cmd = "COPY {0} {1}".format(deployment_file, os.path.abspath(deployment_dir)).split()
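A more portable option than branching between cp and COPY is to skip the shell entirely and use the standard-library shutil module, which behaves the same on Windows and Unix. A sketch of that rewrite (mine, not the original script's code):

```python
import os
import shutil

def copy_deployment_files(deployment_files, deployment_dir):
    """Copy each handler file into the deployment directory without shelling out."""
    for deployment_file in deployment_files:
        if not os.path.exists(deployment_file):
            raise NameError("Deployment file not found [{0}]".format(deployment_file))
        # copy2 also preserves file metadata such as modification times
        shutil.copy2(deployment_file, deployment_dir)
```

This removes the subprocess call entirely, so there is no cp/COPY difference and no shell quoting to worry about on paths containing spaces.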


Related

pip install from private repo but requirements from PyPi only installs private package

Unlike pip install from private repo but requirements from PyPi, I am able to install my package (daisy) from our private Artifactory instance with:
pip3 install -i https://our-artifactory/pypi/simple daisy
The output is:
Looking in indexes: https://our-artifactory/api/pypi/simple
Collecting daisy
Downloading https://our-artifactory/artifactory/api/pypi/my-repo/daisy/0.0.2/daisy-0.0.2-py3-none-any.whl (4.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.8/4.8 MB 12.1 MB/s eta 0:00:00
ERROR: Could not find a version that satisfies the requirement pandas<2.0.0,>=1.5.2 (from daisy) (from versions: none)
ERROR: No matching distribution found for pandas<2.0.0,>=1.5.2
When I then try to install pandas by itself it works:
pip3 install pandas
Collecting pandas
Downloading pandas-1.5.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 12.2/12.2 MB 16.0 MB/s eta 0:00:00
Collecting python-dateutil>=2.8.1
Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting pytz>=2020.1
Downloading pytz-2022.7-py2.py3-none-any.whl (499 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 499.4/499.4 kB 4.1 MB/s eta 0:00:00
Collecting numpy>=1.20.3
Downloading numpy-1.24.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 17.3/17.3 MB 18.9 MB/s eta 0:00:00
Collecting six>=1.5
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: pytz, six, numpy, python-dateutil, pandas
Successfully installed numpy-1.24.1 pandas-1.5.2 python-dateutil-2.8.2 pytz-2022.7 six-1.16.0
It is even the right version. I suspect the first command tries to get all the dependencies from our private repo as well. Is there a way to get the package from the private repo and its dependencies from PyPI?
Btw, I'm working from a conda (miniforge) Python3.9 environment.
Edit: I got a bit further using:
pip3 install -i https://our-artifactory/artifactory/api/pypi/dl-innersource-pypi/simple daisy --extra-index-url https://pypi.org/simple
however, it installs daisy from PyPI. I guess it's unfortunate that I picked an already existing name...
Edit: I can get it to work by specifying my daisy version, like this:
pip install --index-url https://my-artifactory/artifactory/api/pypi/dl-common-pypi/simple daisy==0.0.2
However, leaving out the version number reverts to fetching the PyPI version of daisy. Should this be considered a bug since I explicitly tell pip to look at my-artifactory first and then at public PyPI?
You can probably solve this with a tool like simpleindex, which lets you define rules based on the project name to redirect pip to one index or another.
Maybe something like this (untested):
[routes."daisy"]
source = "http"
to = "https://my-artifactory/artifactory/api/pypi/dl-common-pypi/simple/daisy/"
# Otherwise use PyPI.
[routes."{project}"]
source = "http"
to = "https://pypi.org/simple/{project}/"
[server]
host = "127.0.0.1"
port = 8000
But I guess some other index proxy software might be able to do this as well.
Some more on the topic in this discussion.
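The routing rule simpleindex applies can be pictured as a small lookup: exact project names are sent to the private index, and everything else falls through to PyPI. A toy Python illustration of that logic (not simpleindex's actual code; the URLs are the hypothetical ones from the config above):

```python
# Toy illustration of per-project index routing; not simpleindex's implementation.
PRIVATE_ROUTES = {
    "daisy": "https://my-artifactory/artifactory/api/pypi/dl-common-pypi/simple/daisy/",
}
PYPI_TEMPLATE = "https://pypi.org/simple/{project}/"

def route(project):
    """Return the index URL that should serve the given project name."""
    return PRIVATE_ROUTES.get(project, PYPI_TEMPLATE.format(project=project))
```

The point is that pip only ever talks to the proxy, so the "same name exists on PyPI" ambiguity is resolved on the server side rather than by pip's index-priority behavior.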

Removing and reinstalling Python-Sphinx on Windows

I am trying to use Python (3.8) and Sphinx (3.3.1) to build HTML documentation. However, the sphinx-build command gives me the following error:
C:\Users\Me\Dropbox\Kuchen>sphinx-build -b html source build
Running Sphinx v3.3.1
loading pickled environment... done
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 1 source files that are out of date
updating environment: 0 added, 1 changed, 0 removed
reading sources... [100%] kaesekuchen
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
preparing documents... done
writing output... [100%] kaesekuchen
generating indices... genindex done
writing additional pages... search done
copying static files... WARNING: Failed to copy a file in html_static_file: c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx\themes\basic\static/jquery-3.5.1.js: PermissionError(13, 'Permission denied')
WARNING: Failed to copy a file in html_static_file: c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx\themes\basic\static/jquery.js: PermissionError(13, 'Permission denied')
done
copying extra files... done
dumping search index in English (code: en)... done
dumping object inventory... done
build succeeded, 2 warnings.
However,
The HTML file kaesekuchen in build is not updated/changed.
The folder c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx does not exist.
The latter is my fault, because I deleted it in file explorer, but only because I encountered the exact same error before, and hoped that deleting and re-installing Sphinx would solve it.
Instead, the commands pip uninstall sphinx and a subsequent pip install -U sphinx do not change anything in that folder, and the latter gives only the following upbeat output:
Microsoft Windows [Version 10.0.18363.1198]
(c) 2019 Microsoft Corporation. All rights reserved.
C:\Users\me>pip uninstall sphinx
Found existing installation: Sphinx 3.3.1
Uninstalling Sphinx-3.3.1:
Would remove:
c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx-3.3.1.dist-info\*
c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx\*
c:\users\me\appdata\local\programs\python\python38\scripts\sphinx-apidoc.exe
c:\users\me\appdata\local\programs\python\python38\scripts\sphinx-autogen.exe
c:\users\me\appdata\local\programs\python\python38\scripts\sphinx-build.exe
c:\users\me\appdata\local\programs\python\python38\scripts\sphinx-quickstart.exe
Proceed (y/n)? y
Successfully uninstalled Sphinx-3.3.1
C:\Users\me>pip install -U sphinx
Collecting sphinx
Using cached Sphinx-3.3.1-py3-none-any.whl (2.9 MB)
Requirement already satisfied, skipping upgrade: docutils>=0.12 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (0.16)
Requirement already satisfied, skipping upgrade: sphinxcontrib-serializinghtml in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.1.4)
Requirement already satisfied, skipping upgrade: snowballstemmer>=1.1 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (2.0.0)
Requirement already satisfied, skipping upgrade: alabaster<0.8,>=0.7 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (0.7.12)
Requirement already satisfied, skipping upgrade: setuptools in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (41.2.0)
Requirement already satisfied, skipping upgrade: colorama>=0.3.5; sys_platform == "win32" in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (0.4.4)
Requirement already satisfied, skipping upgrade: sphinxcontrib-jsmath in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.0.1)
Requirement already satisfied, skipping upgrade: babel>=1.3 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (2.9.0)
Requirement already satisfied, skipping upgrade: imagesize in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.2.0)
Requirement already satisfied, skipping upgrade: sphinxcontrib-devhelp in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.0.2)
Requirement already satisfied, skipping upgrade: sphinxcontrib-qthelp in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.0.3)
Requirement already satisfied, skipping upgrade: Jinja2>=2.3 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (2.11.2)
Requirement already satisfied, skipping upgrade: sphinxcontrib-applehelp in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.0.2)
Requirement already satisfied, skipping upgrade: requests>=2.5.0 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (2.25.0)
Requirement already satisfied, skipping upgrade: packaging in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (20.4)
Requirement already satisfied, skipping upgrade: sphinxcontrib-htmlhelp in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (1.0.3)
Requirement already satisfied, skipping upgrade: Pygments>=2.0 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from sphinx) (2.7.2)
Requirement already satisfied, skipping upgrade: pytz>=2015.7 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from babel>=1.3->sphinx) (2020.4)
Requirement already satisfied, skipping upgrade: MarkupSafe>=0.23 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from Jinja2>=2.3->sphinx) (1.1.1)
Requirement already satisfied, skipping upgrade: urllib3<1.27,>=1.21.1 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from requests>=2.5.0->sphinx) (1.26.2)
Requirement already satisfied, skipping upgrade: idna<3,>=2.5 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from requests>=2.5.0->sphinx) (2.10)
Requirement already satisfied, skipping upgrade: certifi>=2017.4.17 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from requests>=2.5.0->sphinx) (2020.11.8)
Requirement already satisfied, skipping upgrade: chardet<4,>=3.0.2 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from requests>=2.5.0->sphinx) (3.0.4)
Requirement already satisfied, skipping upgrade: six in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from packaging->sphinx) (1.15.0)
Requirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in c:\users\me\appdata\local\programs\python\python38\lib\site-packages (from packaging->sphinx) (2.4.7)
Installing collected packages: sphinx
Successfully installed sphinx-3.3.1
But the folder c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx\ is still not there.
I even tried to run a new Sphinx project from scratch, using sphinx-quickstart:
For a list of supported codes, see
https://www.sphinx-doc.org/en/master/usage/configuration.html#confval-language.
> Project language [en]: en
Creating file C:\Users\me\Dropbox\Kuchentest\source\conf.py.
Creating file C:\Users\me\Dropbox\Kuchentest\source\index.rst.
Creating file C:\Users\me\Dropbox\Kuchentest\Makefile.
Creating file C:\Users\me\Dropbox\Kuchentest\make.bat.
Finished: An initial directory structure has been created.
You should now populate your master file C:\Users\me\Dropbox\Kuchentest\source\index.rst and create other documentation
source files. Use the Makefile to build the docs, like so:
make builder
where "builder" is one of the supported builders, e.g. html, latex or linkcheck.
But despite this output, there are no such files or source folder being created.
What can I do to cleanly reset my Sphinx installation and get my documentation to run again?
Solving this requires a somewhat awkward explanation that is simultaneously dependent on: operating system (Windows), your particular installation, and how you are executing Sphinx.
On Windows you can have several Python installations in different places (depending...):
One usual location is C:\Program Files\Python3x.
The pre-configured default path is C:\Users\me\AppData\Local\Programs\Python\Python3.x\. I find this inconvenient because it's located deep away from the root.
The currently prevalent way of extending a Python installation is using a virtual environment (venv).
Your venv, wherever you decided to place it. (Using a venv is considered the "de facto" best practice.)
At one point in time, you set PYTHONPATH as an environment variable on Windows and it's in those paths that Windows will look for your Python installations. Notice the rules for Module Search Path. The problem now becomes that if you have more than one Python installation set on your Path, Windows will also look for libraries in other installations...
(A general note about Python installations on Windows is necessary. Sometime in 2019 Microsoft included Python with Windows - as noted by a prominent SO user in this answer, and referred in the documentation. Around that time there was a Windows bug that required environment variables to be set with the administrator account - I can't find a reference but it's mentioned somewhere on SO. Meaning it's advisable to make your separate installation of Python and set environment variables as admin.)
Having said that, the problem you are describing has several aspects (take special notice of the terminal you are using):
The first warning in your sphinx-build indicates Sphinx is trying to read files from your user account installation (point 2 above). The problem is the terminal where you are executing sphinx-build does not have permission to read from the user account installation directories, because the terminal is being run under a different user account or because the account installation paths aren't set with read permission... Having said that, reconsider the warnings:
copying static files... WARNING: Failed to copy a file in html_static_file: c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx\themes\basic\static/jquery-3.5.1.js: PermissionError(13, 'Permission denied')
WARNING: Failed to copy a file in html_static_file: c:\users\me\appdata\local\programs\python\python38\lib\site-packages\sphinx\themes\basic\static/jquery.js: PermissionError(13, 'Permission denied')
It may also be the case you deleted Sphinx from your account installation and the files/paths simply aren't there.
Next when you try to reinstall Sphinx using pip it is not entirely clear if it's an outdated cache issue, or if pip is finding Sphinx in another installation on your PYTHONPATH... It may be the case that Sphinx is installed and the terminal simply hasn't read/write permission (depends what user account invoked the terminal), or the directory may be hidden in the file explorer...
What can I do to cleanly reset my Sphinx installation and get my documentation to run again?
Your Python base installations (points 1 and 2 above) are only supposed to be written to for system or user wide changes (not for a particular project change).
It is strongly recommended that you use a venv. (If you haven't before, this would be the right time to consider doing so, because it's the easiest and cleanest solution.) This may initially seem confusing because historically there have been several virtual-environment tools for Python. Currently venv is the most commonly cited solution, and using it is simple; your IDE should have a built-in UI to help you create one with a couple of clicks.
A venv is a Python environment that extends your base installation. It avoids the need to change your base installation when you make project-specific changes (Sphinx, for example, should ideally be installed in the venv, not in the base installation).
Finally, when you run Sphinx from the terminal, it's advisable to activate your venv first; otherwise, which Python installation gets executed may depend on the user account that invoked the terminal.
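As a minimal sketch of the recommendation above (the `.venv` directory name and project layout are assumptions, not anything from your setup), a project-local environment can be created with Python's own venv module:

```python
import sys
import venv
from pathlib import Path

# Create a project-local environment; ".venv" is a conventional, assumed name.
env_dir = Path(".venv")
venv.EnvBuilder(with_pip=True).create(env_dir)

# The interpreter inside the venv (layout differs between Windows and POSIX).
bindir = "Scripts" if sys.platform == "win32" else "bin"
python = env_dir / bindir / "python"
print("venv interpreter:", python)

# Next steps (need network access, so not run here):
#   <venv python> -m pip install sphinx
#   <venv python> -m sphinx -b html docs docs/_build
```

Installing Sphinx with the venv's own interpreter (`python -m pip install sphinx`) guarantees the `sphinx-build` you run afterwards reads from the venv, not from the user-account installation that is raising the permission errors.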

Deploying an Rmarkdown file containing Python code (enabled by reticulate) to shinyapps.io

I have a problem that I have been trying to fix for a really long time now, and I would really appreciate any help with it. I wrote an R Markdown document with both R and Python code. Everything runs fine locally: the document knits nicely (it also displays a Shiny app I created inside it). The problem comes when I try to deploy it to shinyapps.io. I get a warning and an error when I do that.
Warning:
using reticulate but python was not specified; will use python at /usr/bin/python. Did you forget to set the RETICULATE_PYTHON environment variable in your .Rprofile before publishing?
Error:
Error in py_call_impl: ImportError: No module named spacy. When I install NumPy first instead, it gives the same kind of error, saying it can't find NumPy.
(I got these from the log in shinyapp.io. I have put the entire log at the bottom)
Here is the relevant part of my code:
reticulate::virtualenv_create(envname = 'test', python = "python")
reticulate::virtualenv_install(envname = 'test', packages = c('spacy', 'numpy'))
reticulate::use_virtualenv(virtualenv = 'test', required = FALSE)
And I also have a .Rprofile file that contains the line Sys.setenv(RETICULATE_PYTHON = ".venv/bin/python"). This file is in the same directory as the Rmd file I am deploying.
I have searched for a solution everywhere and everything I have found hasn't been useful. Some similar Github issues are still open and some similar questions on Stack Overflow don't have answers. The most useful thing I have found is this and I have tried everything they suggest there. Here is what I have tried so far:
I have downgraded rsconnect. Initially, I couldn't even deploy the Rmd. I was able to do that after downgrading rsconnect, but this didn't solve the two problems that came next.
I read that I could change the file name .Rprofile to .rsconnect_profile but this hasn't helped at all.
I have tried using Python 3.6 by specifying it when creating the virtual environment (virtualenv_create(envname = 'test', python = "python3.6")). This made things worse. Moreover, shinyapps.io keeps overriding my choice of Python; it always defaults to Python 2.7.
What is it that I am missing here? Did I specify the virtual environments wrong? What is the difference between the virtual environment in .Rprofile and test, the one that I create in my code and install Python packages into? How are they related? At some point, test was actually named RETICULATE_PYTHON, and that didn't make a difference. Having read about all of this, I am actually even more confused.
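One way to narrow this down is a tiny diagnostic Python chunk in the Rmd, run through reticulate before anything else. This is only a sketch for debugging; it reports facts about the deployed interpreter and changes nothing:

```python
import importlib.util
import sys

# Which interpreter is actually executing the Python chunks?
print("interpreter:", sys.executable)
print("version:", sys.version.split()[0])

# Is each package importable from this interpreter?
# find_spec returns None when the module cannot be found.
for name in ("spacy", "numpy"):
    spec = importlib.util.find_spec(name)
    print(name, "->", "found" if spec else "MISSING")
```

If this prints /usr/bin/python rather than a path under ~/.virtualenvs/test, the packages were installed into one environment while the chunks execute in another, which would produce exactly the ImportError in the log.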
This is the rest of the log:
Server version: 1.7.6-6
LANG: en_US.UTF-8
R version: 3.6.1
shiny version: 1.3.2
httpuv version: 1.5.1
rmarkdown version: 1.14
knitr version: 1.23
jsonlite version: 1.6
RJSONIO version: (none)
htmltools version: 0.3.6
warning: using reticulate but python was not specified; will use python at /usr/bin/python
Did you forget to set the RETICULATE_PYTHON environment variable in your .Rprofile before publishing?
Using pandoc at /opt/connect/ext/pandoc2
Using jsonlite for JSON processing
Starting R with process ID: '23'
Listening on http://127.0.0.1:46584
processing file: index.Rmd
|
| | 0%
|
|... | 5%
ordinary text without R code
|
|...... | 10%
label: setup (with options)
List of 1
$ include: logi FALSE
New python executable in /home/shiny/.virtualenvs/test/bin/python2
Also creating executable in /home/shiny/.virtualenvs/test/bin/python
Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting spacy
Downloading https://files.pythonhosted.org/packages/0f/bf/f76eed483f5bcb9772869a002b0029eb2888dd662f75538b3bcc197bb655/spacy-2.1.6-cp27-cp27mu-manylinux1_x86_64.whl (30.8MB)
Collecting numpy
Downloading https://files.pythonhosted.org/packages/1f/c7/198496417c9c2f6226616cff7dedf2115a4f4d0276613bab842ec8ac1e23/numpy-1.16.4-cp27-cp27mu-manylinux1_x86_64.whl (17.0MB)
Collecting blis<0.3.0,>=0.2.2 (from spacy)
Downloading https://files.pythonhosted.org/packages/61/b7/6f32b1e2506937525802d94136eb73dec2cacd4a21c9bec9c90549e2b413/blis-0.2.4-cp27-cp27mu-manylinux1_x86_64.whl (3.2MB)
Collecting thinc<7.1.0,>=7.0.8 (from spacy)
Downloading https://files.pythonhosted.org/packages/c4/d9/944e0d409e8af994d8d09268a3b7fb9eacbdd08f9bc72b9b0b66c405c05a/thinc-7.0.8-cp27-cp27mu-manylinux1_x86_64.whl (2.1MB)
Downloading https://files.pythonhosted.org/packages/ac/aa/9b065a76b9af472437a0059f77e8f962fe350438b927cb80184c32f075eb/pathlib-1.0.1.tar.gz (49kB)
Collecting pathlib==1.0.1; python_version < "3.4" (from spacy)
Downloading https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl (57kB)
Collecting requests<3.0.0,>=2.13.0 (from spacy)
Collecting preshed<2.1.0,>=2.0.1 (from spacy)
Downloading https://files.pythonhosted.org/packages/df/b1/4ff2cbd423184bd68e85f1daa6692753cd7710b0ba68552eb64542906a57/cymem-2.0.2-cp27-cp27mu-manylinux1_x86_64.whl
Collecting cymem<2.1.0,>=2.0.2 (from spacy)
Collecting plac<1.0.0,>=0.9.6 (from spacy)
Collecting murmurhash<1.1.0,>=0.28.0 (from spacy)
Downloading https://files.pythonhosted.org/packages/ed/31/247b34db5ab06afaf5512481e77860fb4cd7a0c0ddff9d2566651c8c2f07/murmurhash-1.0.2-cp27-cp27mu-manylinux1_x86_64.whl
Downloading https://files.pythonhosted.org/packages/9e/9b/62c60d2f5bc135d2aa1d8c8a86aaf84edb719a59c7f11a4316259e61a298/plac-0.9.6-py2.py3-none-any.whl
Collecting srsly<1.1.0,>=0.0.6 (from spacy)
Downloading https://files.pythonhosted.org/packages/3d/17/e003b2e9122500762a9ba6f3dfe4db912604e6be840c7d3041ae72787ae3/srsly-0.0.7-cp27-cp27mu-manylinux1_x86_64.whl (175kB)
Collecting wasabi<1.1.0,>=0.2.0 (from spacy)
Downloading https://files.pythonhosted.org/packages/25/b1/9098d07e70b960001a8a9b99435c6987006d0d7bcbf20523adce9272f66e/preshed-2.0.1-cp27-cp27mu-manylinux1_x86_64.whl (80kB)
Collecting tqdm<5.0.0,>=4.10.0 (from thinc<7.1.0,>=7.0.8->spacy)
Downloading https://files.pythonhosted.org/packages/be/ba/08c53c55cc97f62310ed83e1a4d91e424f221645c88c2dddd41f179fd1f7/wasabi-0.2.2.tar.gz
Downloading https://files.pythonhosted.org/packages/9f/3d/7a6b68b631d2ab54975f3a4863f3c4e9b26445353264ef01f465dc9b0208/tqdm-4.32.2-py2.py3-none-any.whl (50kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests<3.0.0,>=2.13.0->spacy)
Downloading https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl (150kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.13.0->spacy)
Downloading https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl (157kB)
Collecting chardet<3.1.0,>=3.0.2 (from requests<3.0.0,>=2.13.0->spacy)
Downloading https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl (133kB)
Collecting idna<2.9,>=2.5 (from requests<3.0.0,>=2.13.0->spacy)
Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)
Building wheels for collected packages: pathlib, wasabi
Building wheel for pathlib (setup.py): started
Building wheel for pathlib (setup.py): finished with status 'done'
Stored in directory: /home/shiny/.cache/pip/wheels/f9/b2/4a/68efdfe5093638a9918bd1bb734af625526e849487200aa171
Building wheel for wasabi (setup.py): started
Building wheel for wasabi (setup.py): finished with status 'done'
Stored in directory: /home/shiny/.cache/pip/wheels/b3/2c/d1/78fd1255da73ff77b372ecc56bcdb15115ab0882bb6f67af17
Successfully built pathlib wasabi
Installing collected packages: numpy, blis, tqdm, wasabi, pathlib, plac, cymem, preshed, murmurhash, srsly, thinc, urllib3, certifi, chardet, idna, requests, spacy
Successfully installed blis-0.2.4 certifi-2019.6.16 chardet-3.0.4 cymem-2.0.2 idna-2.8 murmurhash-1.0.2 numpy-1.16.4 pathlib-1.0.1 plac-0.9.6 preshed-2.0.1 requests-2.22.0 spacy-2.1.6 srsly-0.0.7 thinc-7.0.8 tqdm-4.32.2 urllib3-1.25.3 wasabi-0.2.2
|
|......... | 14%
ordinary text without R code
label: load_packages (with options)
List of 2
$ include: logi FALSE
|
|............ | 19%
$ engine : chr "python"
Quitting from lines 22-29 (index.Rmd)
Warning: Error in py_call_impl: ImportError: No module named spacy
File "<string>", line 1, in <module>
Detailed traceback:
149: <Anonymous>
Have you tried:
reticulate::use_virtualenv(virtualenv = 'test', required = TRUE)
With required = TRUE, reticulate errors out if it cannot use that environment instead of silently falling back to the system Python, so it seems logical to enforce it here.
I also came across this one:
https://github.com/ranikay/shiny-reticulate-app

Where is symfit?

I pip'ed symfit (https://pythonhosted.org/symfit/), and it appears to have been installed, to wit:
C:\>pip install symfit
Collecting symfit
Using cached https://files.pythonhosted.org/packages/6e/58/0a58f7a7e39c052afe790ae0989c070e1a2d1c7d472ae3cfd4ee785f1c55/symfit-0.4.5-py2.py3-none-any.whl
Collecting numpy>=1.12 (from symfit)
Downloading https://files.pythonhosted.org/packages/d2/9a/e377ff2dabf66493ac607f6b45b4efeda898ad3fbc43b418bd7dba4a1d67/numpy-1.15.3-cp34-none-win_amd64.whl (13.5MB)
100% |████████████████████████████████| 13.5MB 975kB/s
Collecting sympy<=1.1.1 (from symfit)
Collecting scipy>=1.0 (from symfit)
Using cached https://files.pythonhosted.org/packages/6f/ee/cfce56ea456a809b983ac4089876dbffd15233c17df7bca1e35e84c3ce95/scipy-1.1.0-cp34-none-win_amd64.whl
Collecting mpmath>=0.19 (from sympy<=1.1.1->symfit)
Installing collected packages: numpy, mpmath, sympy, scipy, symfit
Found existing installation: numpy 1.8.1
Cannot uninstall 'numpy'. It is a distutils installed project and thus we cannot
accurately determine which files belong to it which would lead to only a partial uninstall.
But it is not found when I try to import it into a Python program, and a
dir /s symfit
from the root directory of the drive fails to find symfit.
Where did pip put it, and how can I access it?
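A likely explanation is that pip installed into one Python installation while your program imports from another. A hedged sketch for locating where a given interpreter actually keeps its packages (nothing here is specific to symfit; the module name is a stand-in you would replace):

```python
import importlib.util
import sys
import sysconfig

# Where this interpreter lives, and where it installs third-party packages.
print("interpreter:   ", sys.executable)
print("site-packages: ", sysconfig.get_paths()["purelib"])

# Where (if anywhere) this interpreter would find a given module.
# "email" is a stdlib stand-in here; you would ask about "symfit".
spec = importlib.util.find_spec("email")
print("module origin: ", spec.origin if spec else "not importable")
```

Running pip as `python -m pip install symfit`, with the same `python` you use to run your program, ensures both point at the same site-packages directory. (The "Cannot uninstall 'numpy'" message in your log also suggests pip hit a distutils-installed numpy and may have aborted before finishing.)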

Pip wheel is building a new wheel when one is already present

I'm trying to build a wheel for pandas at 0.17.1. I want it to use numpy version 1.9.2. I have a wheel for that version of numpy already built in $PWD/wheelhouse, and a few other pandas dependencies as well:
ls wheelhouse/
numpy-1.9.2-cp34-cp34m-linux_x86_64.whl python_dateutil-2.4.2-py2.py3-none-any.whl pytz-2015.7-py2.py3-none-any.whl six-1.10.0-py2.py3-none-any.whl
However, when I tell pandas to build, even though I inform it of the wheelhouse folder with --find-links, it still builds a new wheel of numpy:
$ pip --version
pip 6.0.8 from /home/me/.pyenv/versions/3.4.3/lib/python3.4/site-packages (python 3.4)
$ pip wheel pandas==0.17.1 --find-links=$PWD/wheelhouse
Collecting pandas==0.17.1
Using cached pandas-0.17.1.tar.gz
[... snipped, building stuff ...]
Collecting python-dateutil>=2 (from pandas==0.17.1)
File was already downloaded /home/me/rebuild_numpy_py3/wheelhouse/python_dateutil-2.4.2-py2.py3-none-any.whl
Collecting pytz>=2011k (from pandas==0.17.1)
File was already downloaded /home/me/rebuild_numpy_py3/wheelhouse/pytz-2015.7-py2.py3-none-any.whl
Collecting numpy>=1.7.0 (from pandas==0.17.1)
Using cached numpy-1.10.2.tar.gz
Running from numpy source directory.
Collecting six>=1.5 (from python-dateutil>=2->pandas==0.17.1)
File was already downloaded /home/me/rebuild_numpy_py3/wheelhouse/six-1.10.0-py2.py3-none-any.whl
Skipping python-dateutil, due to already being wheel.
Skipping pytz, due to already being wheel.
Skipping six, due to already being wheel.
Building wheels for collected packages: pandas, numpy
Running setup.py bdist_wheel for pandas
Destination directory: /home/me/rebuild_numpy_py3/wheelhouse
Running setup.py bdist_wheel for numpy
Destination directory: /home/me/rebuild_numpy_py3/wheelhouse
Successfully built pandas numpy
$ ls wheelhouse/
numpy-1.10.2-cp34-cp34m-linux_x86_64.whl numpy-1.9.2-cp34-cp34m-linux_x86_64.whl pandas-0.17.1-cp34-cp34m-linux_x86_64.whl python_dateutil-2.4.2-py2.py3-none-any.whl pytz-2015.7-py2.py3-none-any.whl six-1.10.0-py2.py3-none-any.whl
The version bound pandas places on numpy is >=1.7.0, so the wheel that's already there should certainly satisfy it. So why is it building a new wheel? How can I force it to use the existing one?
That's not how pip wheel works. pandas requires numpy >= 1.7.0, and you're trying to force it to use 1.9.2 even though a newer version of numpy exists. Even if you already have 1.9.2 in your wheelhouse directory, pip will check PyPI for the latest numpy that satisfies the specifier stated in pandas' setup.py.
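A simplified model of what the resolver is doing (real pip version handling is much richer; this sketch only understands plain X.Y.Z strings): every candidate satisfying the bound is acceptable, and the newest acceptable one wins, regardless of what already sits in the wheelhouse.

```python
def as_tuple(version):
    # Naive parse; assumes plain dotted integers like "1.9.2".
    return tuple(int(part) for part in version.split("."))

def resolve(candidates, minimum):
    # pip-like behavior: of all candidates meeting the bound, take the newest.
    acceptable = [v for v in candidates if as_tuple(v) >= as_tuple(minimum)]
    return max(acceptable, key=as_tuple)

candidates = ["1.8.1", "1.9.2", "1.10.2"]
print(resolve(candidates, "1.7.0"))  # the newest wins -> 1.10.2, not the 1.9.2 wheel
print(resolve(["1.9.2"], "1.7.0"))   # pinning leaves only 1.9.2
```

So pinning numpy explicitly on the command line, e.g. `pip wheel pandas==0.17.1 numpy==1.9.2 --find-links=$PWD/wheelhouse`, should narrow the acceptable set to the wheel you already built and make pip skip rebuilding it.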
