I am trying to clone a GCP Cloud Source Repository (CSR) using Cloud Build.
When I try to clone the repo in my Dockerfile, I get this error:
fatal: could not read Username for 'https://source.developers.google.com': No such device or address
This appears to be an authentication error with git and CSR.
What I've tried: I found https://cloud.google.com/build/docs/build-config-file-schema#network and this GitHub issue: https://github.com/GoogleCloudPlatform/cloud-builders/issues/343. I may not have put them in the right place in the config file, though.
I'm aware that I could just use an SSH key, but I'd like to use the "inherited" authentication if possible; by "inherited" I mean using the Cloud Build service account.
If anyone can help with the specific issue, that would be great. And if you can enlighten me as to how authentication works from the build to other GCP services, that would also be welcome.
P.S. I'm actually installing via pip (just like the person in the GitHub issue linked above), but pip needs to clone the repo first, and that's where the error occurs.
You can use the https://source.developers.google.com URL with a PAT (Personal Access Token) in the Dockerfile.
For that you need to generate a PAT.
After generating the token, use it in the Dockerfile as follows:
FROM gcr.io/cloud-builders/git
RUN git clone https://<PAT>@source.developers.google.com/p/<PROJECT_ID>/r/<REPO_NAME>
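If you'd rather not hardcode the token into the image, here is a minimal sketch of passing it in as a build argument instead (the ARG name and the _CSR_PAT substitution are placeholders I've made up, not anything Cloud Build defines):

# Dockerfile
FROM gcr.io/cloud-builders/git
ARG CSR_PAT
RUN git clone https://${CSR_PAT}@source.developers.google.com/p/<PROJECT_ID>/r/<REPO_NAME>

# cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '--build-arg', 'CSR_PAT=${_CSR_PAT}', '.']

The _CSR_PAT value would be supplied at build time (e.g. via --substitutions on gcloud builds submit), so the token never lives in the Dockerfile. And if you specifically want the "inherited" service-account authentication instead, one route worth trying is to do the clone in a separate gcr.io/cloud-builders/gcloud build step (gcloud source repos clone <REPO_NAME>) before the docker build, so the clone runs with the build service account's credentials and the Dockerfile only needs to COPY the already-cloned source.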
Related
I need to have private Python packages in GCP usable in multiple projects. I haven't tried the Artifact Registry since that's still in alpha, so right now I've been trying with simple repositories, but I'm open to alternatives.
I have a Python package's source code in a GCP repository in Project A, and I have a cloud function in a repository also in Project A. In this cloud function I import the mentioned package by adding git+https://source.developers.google.com/p/project-a/r/my-python-package to my requirements.txt file.
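That is, the relevant line in my requirements.txt is:

git+https://source.developers.google.com/p/project-a/r/my-python-package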
If I deploy this cloud function in Project A via gcloud functions in my terminal, specifying --source=https://source.developers.google.com/projects/project-a/repos/my-cloud-function and --project=project-a, it works fine, and the function can successfully import the elements from the package when I call it. But if I deploy this function in Project B instead, I get the following error:
Deploying function (may take a while - up to 2 minutes)...failed.
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Build failed: `pip_download_wheels` had stderr output:
Running command git clone -q https://source.developers.google.com/p/project-a/r/my-python-package /tmp/pip-req-build-f_bcp4y9
remote: PERMISSION_DENIED: The caller does not have permission
remote: [type.googleapis.com/google.rpc.RequestInfo]
remote: request_id: "abe4(...)"
fatal: unable to access 'https://source.developers.google.com/p/project-a/r/my-python-package/': The requested URL returned error: 403
ERROR: Command errored out with exit status 128: git clone -q https://source.developers.google.com/p/project-a/r/my-python-package /tmp/pip-req-build-f_bcp4y9 Check the logs for full command output.
This seems like a permissions issue. However, if I remove the package dependency from requirements.txt, it deploys fine, which means that Project B does have access to repos in Project A, so it seems like an issue inside pip. However, pip has no problem if I deploy to Project A, so I'm a little lost.
Many thanks in advance.
Artifact Registry is now GA; it has been out of Alpha/Beta since last year.
I replicated your issue. The error is indeed due to permissions; it doesn't happen when you remove the line from requirements.txt, probably because your own credentials have access to both projects.
In order to make the deployment work, you have to grant permissions on the repository to the service account that performs the deployment (the Cloud Function's service account). It can be found under Cloud Functions - (select your Cloud Function) - Details, and should look something like project@appspot.gserviceaccount.com.
Once you have located the service account, add it to the Cloud Source Repository by clicking Settings - Permissions, and grant it at least the Source Repository Reader role.
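If you prefer the command line, here is a rough sketch of the equivalent grant (this assumes the function in Project B runs as the default App Engine service account, and it grants read access at the project level rather than per repository):

# Grant the deploying function's service account read access to Project A's repos
gcloud projects add-iam-policy-binding project-a \
    --member="serviceAccount:project-b@appspot.gserviceaccount.com" \
    --role="roles/source.reader"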
I am using GitPython to clone a GitHub repository, but it is a private repo, so I also need to authorize with my token. I am unable to find any example of how to pass the token with the clone request.
import git
git.Git('/local/file/path').clone('git@github.com:sample-repo.git', token=(mytoken))
This gives an error
"GitCommandError: Cmd('git') failed due to: exit code(129)"
cmdline: git clone --token=mytoken git@github.com:sample-repo.git
stderr: 'error: unknown option token=mytoken'
It works fine without the token when I clone a public repository, so the only issue here is how to pass the token with the above request. Is that possible, or is there another way to authorise git clone in a Python script? My objective is to automate a process that clones a GitHub repository, generates some files using some API calls, and then adds and commits those files to the repository, all within the same Python script.
Thanks to the comments, I was able to clone my repository using the HTTPS URL instead of the SSH URL, and it worked without the need for a token.
import git
git.Git('/local/file/path').clone('https://github.com/sample-repo')
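If the repo is private, one common approach is to embed the token in the HTTPS URL itself, since git has no --token option. A sketch covering the rest of the stated objective too (the env var, paths, and file names are placeholders):

import os
import git  # GitPython

token = os.environ["GITHUB_TOKEN"]  # read the token from the environment instead of hardcoding it
local_path = "/local/file/path"

# The token rides along in the URL; this is standard git-over-HTTPS auth.
repo = git.Repo.clone_from(f"https://{token}@github.com/owner/sample-repo.git", local_path)

# ... generate files via API calls here ...

repo.index.add(["generated_file.txt"])  # placeholder file name
repo.index.commit("Add generated files")
repo.remote(name="origin").push()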
I'm using Azure DevOps to build a Python project whose requirements.txt references private git repositories.
These repos are under the same GitHub account that our DevOps account has access to, but we are unable to fetch them using pip.
I notice that the 'Get sources' step adds an extra header to git commands:
[command]git -c http.extraheader="AUTHORIZATION: basic ***" fetch --tags --prune --progress --no-recurse-submodules origin
Is there any way to get this header into other steps, so that we can use the same authentication for private pip installs, or otherwise access private git repositories that DevOps already has access to?
I don't want to use personal access tokens or deploy keys, as that is just another layer of security to maintain and manage. Azure DevOps already has authentication to these repositories, and we should be able to use it.
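For what it's worth, the closest thing I've found is replaying that header myself in a later step, roughly like the sketch below - though as far as I can tell $(System.AccessToken) is only honored by Azure Repos, not by github.com, so it may not cover GitHub-hosted repos:

steps:
- script: |
    git config --global http.extraheader "AUTHORIZATION: bearer $SYSTEM_ACCESSTOKEN"
    pip install -r requirements.txt
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)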
Is there a way to check whether files have been updated on the server using Google App Engine, and then download only the updated files locally so they match the server?
It may be possible if you're talking about the standard environment and you always follow a disciplined app deployment procedure:
deploy from a git repository (unsure if other VCS systems work)
don't deploy when you have uncommitted changes in the repository from which you deploy
If you meet these requirements, then you can access the source code for a particular deployed version of a service via StackDriver, as described in Google Cloud DataStore automatic indexing.
At least in my case, among the files in the root directory I found an automatically generated file called source-context.json, containing the git URL and revision ID of the repository from which the deployment was made, which you can use to selectively update your local repo.
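For illustration, a rough sketch of that selective update (this assumes source-context.json really carries a plain git URL and revision ID as described above - verify the actual schema of your file first):

import json
import subprocess

# Read the URL and revision recorded at deployment time.
with open("source-context.json") as f:
    ctx = json.load(f)["git"]  # assumed shape: {"git": {"url": ..., "revisionId": ...}}

# Fetch from the recorded URL, then check out the exact deployed revision.
subprocess.run(["git", "fetch", ctx["url"]], check=True)
subprocess.run(["git", "checkout", ctx["revisionId"]], check=True)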
Another approach requires the same deployment discipline mentioned above, plus always deploying from the same repository, which you'll treat as the unique master copy of your code or as one of its mirrors (needed due to git's distributed nature). Then you only need to compare your local repo against the version deployed from this master copy repo (or its mirror).
You might want to check out Google Cloud Source Repositories as a possible master copy repo, or mirror a public master copy repo (see Connecting a Hosted Repository). The advantage would be some convenience and integration with other Google Cloud Platform tools, for example:
gcloud source repos (example invocations after this list)
the Source Browser
deployments from the cloud shell, see for example Google Cloud: How to deploy mirrored Repository
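For example, a couple of gcloud source repos invocations against such a master copy repo (project and repo names are placeholders):

gcloud source repos list --project=my-project
gcloud source repos clone my-repo --project=my-project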
Otherwise, direct downloading of the app code from GAE is monolithic - you can't select just specific files; see Downloading Your Source Code.
I am trying to run through the creation of a Flask web app in Azure using this instruction page.
Creating Web apps with Flask in Azure
In the "Application Overveiw" section, it lists some FlaskWebProjectfiles saying.
Here's an overview of the files you'll find in the initial Git repository
\FlaskWebProject\__init__.py
\FlaskWebProject\views.py
\FlaskWebProject\static\content\
\FlaskWebProject\static\fonts\
\FlaskWebProject\static\scripts\
\FlaskWebProject\templates\about.html
\FlaskWebProject\templates\contact.html
\FlaskWebProject\templates\index.html
\FlaskWebProject\templates\layout.html
The problem is that I don't get these files when I connect Azure to a GitHub repository. I know they exist because my Azure app renders this default Flask web app, and the files exist in /wwwroot.
I am sure that I am missing something obvious here, so if anyone has followed the most recent Flask setup instructions for Azure and had success, their input would be great.
Your initial GitHub repository is empty, so you need to clone the repository.
The process is described in the same article you mentioned, but a little later.
Basically:
1) Go to Deployment Source and configure the deployment source - for example, Local Git.
2) Go to Settings => Properties. Here you should have the Git URL where your files are placed.
3) Go to your workstation and execute:
git clone https://yourdeploymentusername@todeleteflask.scm.azurewebsites.net:443/todeleteflask.git
Enter your password when prompted.
You should be all set now. If you make changes, you can push them to the repository and they will arrive on the site.
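From there the usual git cycle applies (commit message and branch name are only examples; older Azure remotes default to master):

git add .
git commit -m "Update site"
git push origin master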