I would like to use my own package, which is located in a private repository.
Currently, I use a requirements.txt file to install dependencies in my Python App Engine app.
However, I don't understand where I should add my private dependency.
Thanks
You have to add the private package to requirements.txt using the git+ssh protocol. Better instructions can be found here: Is it possible to use pip to install a package from a private github repository?
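For example (a sketch; the org, package name, and tag below are placeholders), a requirements.txt entry for a private GitHub repository could look like:

git+ssh://git@github.com/yourorg/yourprivatepackage.git@v1.0#egg=yourprivatepackage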
However, I am not sure how GAE handles the private keys required to access the package. If GAE simply copies the lib directory from the App Engine project repository on deployment, without any further processing, it should work. If App Engine instead only reads the dependency names from the lib directory and resolves them itself, you may be out of luck, unless there is a way to give App Engine credentials for the deployment.
I am unable to test this now, but will update as soon as I have.
Related
I'm trying to install a private Python package in a Google Cloud Composer environment. I usually install the package using a personal access token, which works with regular pip:
pip install git+https://$TOKEN@github.com/org/repo.git@main
works as expected. However, trying to use this in Cloud Composer tells me the following:
PyPI package name must follow the format of 'identifier' specified in PEP-508. (https://peps.python.org/pep-0508/)
I'm not sure what I should do here. Does anyone have some experience with this?
If the repo hosting your private Python packages accepts external calls, you can create a pip.conf file with the parameters needed to access this repo:
[global]
extra-index-url = https://username:password@yourrepo.com/simple
Then you have to copy this file to the Cloud Composer bucket: {composer_bucket_name}/config/pip/pip.conf
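For example (assuming you have gsutil configured and the bucket name at hand), the copy could be done like this:

gsutil cp pip.conf gs://{composer_bucket_name}/config/pip/pip.conf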
You can also host your private Python packages in GCP Artifact Registry, generate a pip.conf file with access to the Python repo created in Artifact Registry, and copy this file to the same location in the Composer bucket: {composer_bucket_name}/config/pip/pip.conf
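As a sketch (the region, project, and repository names below are placeholders), a pip.conf pointing at an Artifact Registry Python repository would look something like:

[global]
extra-index-url = https://us-central1-python.pkg.dev/my-project/my-python-repo/simple/

Authentication to Artifact Registry is handled separately, for example through the keyring helper (keyrings.google-artifactregistry-auth) described in the GCP documentation.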
After this you can add your private Python packages to the PyPI packages list in Cloud Composer, just like public packages.
You can also check the official documentation: https://cloud.google.com/composer/docs/how-to/using/installing-python-dependencies
I'm using the command func azure functionapp publish to publish my Python function app to Azure. As best I can tell, the command only packages up the source code and transfers it to Azure; then, on a remote machine in Azure, the function app is "built" and deployed. The build phase includes the collection of dependencies from PyPI. Is there a way to override where it looks for these dependencies? I'd like to point it to my own PyPI server, or alternatively, provide the wheels locally in my source tree and have it use those. I have a few questions/problems:
Are my assumptions correct?
Assuming they are, is this possible, and how?
I've tried a few things: read some docs, looked at the various --help options in the CLI tool, and set up a pip.conf file that I've verified works for local pip usage. I then deliberately broke that file and checked whether the publish would fail (it did not), which leads me to believe it either ignores pip.conf, or the build and collection of dependencies happens on the remote end. I'm at a loss, and any tips, pointers, or answers are appreciated!
You can add an additional pip source pointing to your own PyPI server. Check https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python#remote-build-with-extra-index-url
Remote build with extra index URL:
When your packages are available from an accessible custom package index, use a remote build. Before publishing, make sure to create an app setting named PIP_EXTRA_INDEX_URL. The value for this setting is the URL of your custom package index. Using this setting tells the remote build to run pip install using the --extra-index-url option. To learn more, see the Python pip install documentation.
You can also use basic authentication credentials with your extra package index URLs. To learn more, see Basic authentication credentials in Python documentation.
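For example (the app and resource group names below are placeholders), the app setting can be created with the Azure CLI before publishing:

az functionapp config appsettings set --name <APP_NAME> --resource-group <RESOURCE_GROUP> --settings PIP_EXTRA_INDEX_URL=https://pypi.mycompany.example/simple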
And referencing local packages is also possible. Check https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python#install-local-packages
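In rough outline (per that documentation page; the app name is a placeholder), the local-packages route installs the dependencies into the project tree and then publishes without a remote build:

pip install --target=".python_packages/lib/site-packages" -r requirements.txt
func azure functionapp publish <APP_NAME> --no-build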
I hope both of your questions are answered now.
I noticed that the Flask tutorial involves use of pip. It looks like it's only used to create a wheel locally that will make setup on a server easier, but as a web dev newbie I'm curious: Does anyone actually go all the way to uploading their websites to a public repository like PyPI? What are the implications (security-related or otherwise) of doing so?
No, you should not upload private web projects to PyPI (the Python Package Index)! PyPI is for public, published projects intended to be shared.
Creating a package for your web project has advantages when deploying to your production servers, but that doesn't require the package to be available on PyPI. The pip command-line tool can find and install packages from other sources, including private Git, Mercurial, or SVN repositories, private package indexes, and the local filesystem.
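A few illustrative pip invocations (all names and URLs here are placeholders): installing from a private Git repository, from a local wheel, and from a private index, respectively.

pip install git+ssh://git@github.com/yourorg/internal-lib.git
pip install ./dist/internal_lib-1.0-py3-none-any.whl
pip install internal-lib --index-url https://pypi.internal.example/simple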
For the record: I haven't bothered with creating packages for any of the recently deployed Flask projects I (helped) develop. These were put into production on cloud-hosted hardware and/or in Docker containers, directly from their GitHub repositories. Dependencies are installed with pip (driven by the Pipenv tool in all cases), but the project code itself is loaded directly from the checkout.
That said, if those projects start using continuous integration down the line, then it may make sense to use the resulting tested code, packaged as wheels, in production too. Publish those wheels to a private index or server; there are several projects and even a few SaaS services already available that let you manage a private package index.
If you do publish to PyPI, then anyone can download your package and analyse how your website works. It'd make it trivial for black-hat hackers to find and exploit security issues in your project that way.
Here is the scenario I am dealing with:
I want to / have set up a CircleCI build for my project with unit tests etc.
In this project I use another one of my libraries, which needs to be installed on the build container in CircleCI; otherwise my tests fail.
I need to find a way to either:
pull the git repository of the external dependency and install it
or download it as a zip
or some other way?
Happy to add more explanation if needed.
From the section Using Resources External to Your Repository:
CircleCI supports git submodule, and has advanced SSH key management to let you access multiple repositories from a single test suite. From your project’s Project Settings > Checkout SSH keys page, you can add a “user key” with one click, allowing you to access code from multiple repositories in your test suite. Git submodules can be easily set up in your circle.yml file (see example 1).
CircleCI’s VMs are connected to the internet. You can download dependencies directly while setting up your project, using curl or wget.
(Or just using git clone without submodules.)
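As a sketch (assuming a circle.yml-based configuration; the org and library names are placeholders), installing the private dependency before the tests run could look like:

dependencies:
  pre:
    - pip install git+ssh://git@github.com/yourorg/yourlib.git

This relies on the user key mentioned above so that the build container can authenticate to the second repository over SSH.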
On a Node.js project I previously worked on, I ran a private npm registry and used PayPal's Kappa to proxy between it and the public npm registry. I also used an .npmrc file in each project to define the registry URL. When developers check out the projects and run npm install, the request goes to Kappa, which first looks in the local registry and, if it doesn't find the matching module, forwards the request to the public registry.
Is there any way to achieve this with Python projects? Ideally I'd like the developers to simply check out a service and use pip install -r requirements.txt without knowing whether the dependencies come from the public or a private PyPI.
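For what it's worth, the pip-side equivalent of the .npmrc file is a pip.conf (pip.ini on Windows) pointing at a single internal index that proxies to the public PyPI; with that in place, pip install -r requirements.txt works unchanged. A minimal sketch, assuming a hypothetical internal proxy URL:

[global]
index-url = https://pypi-proxy.internal.example/simple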