I am trying to make a Python script that checks for changes in my local git working folder and automatically pushes them to the online repo. Currently I only do this manually with git. I want to know what a script would need in order to do this without manual intervention.
The commands I'd type in my shell are:
#for checking the status, and determining if there are untracked files
git status
#if there are untracked files...add them
git add .
#add my commit message
git commit -m "7/8/2012 3:25am"
#push it to my online repo
git push origin master
#check if changes came on remote
git diff origin/master
#merge my repo with origin
git merge origin/master
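For reference, a rough GitPython sketch of this same sequence might look like the following (the repository path is a placeholder, and it assumes the push needs no interactive authentication, which is what the rest of this question is about):
# Hypothetical sketch of the manual workflow above using GitPython (pip install GitPython)
from datetime import datetime
from git import Repo
repo = Repo("/path/to/local/working/folder")  # placeholder path
# "git status": are there untracked or modified files?
if repo.untracked_files or repo.is_dirty():
    repo.git.add(A=True)                                            # "git add ."
    repo.index.commit(datetime.now().strftime("%m/%d/%Y %I:%M%p"))  # timestamped commit message
    repo.remotes.origin.push("master")                              # "git push origin master"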
When doing the git push, you always have to enter a username/password. I know git can get around this with SSH keys, but I assume GitPython handles this somehow, either by letting you pass the username/password through code or by relying on SSH keys. So what are my options regarding authentication when using GitPython?
Edit: There are apps which actually generate SSH keys, e.g. GitHub's Windows application. How is the Windows app doing this? My assumption is that there is surely some git API for it...
If you are authenticating with SSH keys, use ssh-agent to load your key once; after that you can keep using the key without having to enter the passphrase every time.
Alternatively, you could simply generate a key without a passphrase, if you don't care about key security.
I have looked through the code to be sure: there is no way to define a username/password combination for communication.
This has to be the case because ssh does not let you provide a password beforehand; it intentionally prompts the user. The only way to automate this is to use SSH keys.
However, if you really want to push the limits, there is an open-source app for non-interactive ssh communication without SSH keys: http://sourceforge.net/projects/sshpass/
If you compile and install it and point an ssh:// communication protocol at this app, it may work. However, I don't think you should; just use keys, they're great =)
I am just messing around, but I have a blank project in GitLab. I have a Python script (well, an IPython notebook). I would like the Python script to simply push any files in "C:/users/files" to the GitLab project.
I cannot find any instructions on how to do this on the API docs page here: https://python-gitlab.readthedocs.io/en/stable/gl_objects/commits.html.
It looks to me like this is impossible, but surely it can be done.
Push is not possible with the API; it is only possible over the Git protocol. You can use GitPython (which depends on git, as it runs git under the hood) or dulwich (which doesn't depend on git, as it implements the Git protocol in pure Python).
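For illustration, a rough GitPython sketch of that approach might look like the following; the project URL and local paths are placeholders, and it assumes credentials for the remote are already configured:
# Hypothetical sketch: push the contents of a local folder to a GitLab project with GitPython
import shutil
from pathlib import Path
from git import Repo
project_url = "https://gitlab.com/your-user/your-project.git"  # placeholder
clone_dir = Path("C:/temp/your-project")                       # placeholder
source_dir = Path("C:/users/files")
repo = Repo.clone_from(project_url, clone_dir)  # clone the (possibly empty) project
for src in source_dir.iterdir():                # copy the files into the working tree
    if src.is_file():
        shutil.copy(src, clone_dir / src.name)
repo.git.add(A=True)
repo.index.commit("Add files from C:/users/files")
repo.remotes.origin.push(repo.active_branch.name)  # push whatever the default branch is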
I would like to be able to install Python packages on OpenShift, but those packages live in my private repositories on Bitbucket.
How can I create an SSH key for OpenShift, and how do I make OpenShift use it when installing packages (after adding the corresponding public key to Bitbucket as a Deploy Key)?
What I've tried:
I used ssh-keygen to create a key in ~/.openshift_ssh/. The key was created, but I'm not sure it is being used.
I also tried adding the public key at <jenkins_dir>/app-root/data/.ssh/jenkins_id_rsa.pub, but the result is always the same. In the Jenkins console output of the build job:
Doing git clone from ssh://git@bitbucket.org/jpimentel/zed.git to /tmp/easy_install-FpEKam/zed.git
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
Is there anything that can be done?
So, at this time OpenShift does not offer a simple mechanism to do this. I would urge developers to look at https://gondor.io/support/dependencies/ for an effective solution to the problem.
That said, I was finally able to find an acceptable (at least for me) workaround that works on both scalable and non-scalable apps, with the following procedure:
create a deploy/ directory in the repository
put a copy of your private deploy key in said directory
create a bash script deploy/wrapper.sh that will run ssh with the provided key:
#!/bin/sh
ssh -o StrictHostKeyChecking=no -i $OPENSHIFT_REPO_DIR/deploy/id_deploy "$@"
note the option passed to disable host key check; cloning will fail without it.
install dependencies in the build hook (.openshift/action_hooks/build). In my case I added something like
echo "Cloning private repo..."
source $VIRTUAL_ENV/bin/activate
GIT_SSH=$OPENSHIFT_REPO_DIR/deploy/wrapper.sh pip install git+ssh://git@bitbucket.org/team/reponame.git#egg=reponame
commit everything and push it to OpenShift.
profit!
If you want to deploy your own custom Python modules, the recommended way is to create a libs directory in the root of the application source code and push the modules to your application's git repository. OpenShift will automatically pick them up.
Problem:
I have a passwords.py that I need to push to Heroku for my app to work, but I can't commit it to my public git repo because then anyone would be able to view my passwords.
The passwords are tokens / secret keys / other API keys that allow my app to authenticate its requests to third-party APIs. I'm storing them base64-encoded in passwords.py, but if I push the encoded file to git, anyone could easily recover the passwords with b64decode().
How can I push my passwords file to Heroku with out including it in my public git repo?
or
How can I securely store my passwords in my public git repo?
What I've tried:
git push only one file to Heroku
Hiding a password in a python script (insecure obfuscation only)
Pushing a single file with git doesn't seem to be an option, and using any similar encode/decode method for the passwords would only give me a false sense of security. Any ideas on how to solve this? Thanks!
Use environment variables! You can access them from your Python scripts, and Heroku lets you easily set them for your app.
Here is some information about setting config vars on Heroku.
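For illustration, reading the secrets from the environment instead of passwords.py could look roughly like this; the variable names are made up, and the values would be set once with heroku config:set (and exported in your local shell for development):
# Hypothetical sketch: read secrets from environment variables instead of passwords.py.
# Set them once with e.g.: heroku config:set THIRD_PARTY_API_KEY=... THIRD_PARTY_API_SECRET=...
import os
API_KEY = os.environ["THIRD_PARTY_API_KEY"]                # raises KeyError if unset
API_SECRET = os.environ.get("THIRD_PARTY_API_SECRET", "")  # or fall back to a default
def build_auth_headers():
    # Build request headers for the third-party API from the environment values.
    return {"Authorization": "Bearer " + API_KEY}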
Create a second branch containing the file. Do not track it on your public repository.
Whenever you need to push to Heroku, rebase that branch onto master and then push that branch to Heroku.
I have a server deployed and I am writing a crontab task. It uses a Python script that needs to check whether there has been a new push to the repository. If it finds one, it should pull to update the server code and then restart the server.
My problem is: how do I make the Python script know whether there is a new commit in the repository?
I know you can use
git rev-list deployment..origin/deployment
to check whether any new commits are available on the remote server.
But how do I implement this in the Python script and have it decide whether it needs to pull?
Thanks
Nick
You will need to contact the server in any case, so you might as well pull.
If you are on the branch in question when you pull, and want to detect whether it has changed, you can always do the following (note: written in shell):
BEFORE=$(git rev-parse HEAD)
# git pull here
AFTER=$(git rev-parse HEAD)
# Changes if $BEFORE is different from $AFTER
fge is correct. You cannot see if someone else pushed without first either doing a git pull origin or a git fetch origin. Your command would have worked with a git fetch origin first.
git fetch origin
git rev-list deployment..origin/deployment
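Wired into the cron-driven Python script, that check might look roughly like the sketch below; the repository path and branch names are placeholders:
# Hypothetical sketch: fetch, then use git rev-list to decide whether to pull.
import subprocess
REPO_DIR = "/path/to/deployed/repo"  # placeholder
def run_git(*args):
    # Run a git command inside the repo and return its stdout as text.
    return subprocess.run(["git"] + list(args), cwd=REPO_DIR, check=True,
                          capture_output=True, text=True).stdout
run_git("fetch", "origin")
new_commits = run_git("rev-list", "deployment..origin/deployment").strip()
if new_commits:  # non-empty output means the remote branch is ahead
    run_git("pull", "origin", "deployment")
    # restart the server here (e.g. via subprocess or your service manager)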
I am currently using GitHub to develop a Python application and am looking to deploy it on EC2.
Is there a good way to automatically handle the messiness this entails (setting up SSH key pairs on the EC2 instance for GitHub, pulling from the GitHub repository every time a commit is pushed to the master branch, etc.) without a bunch of custom scripts? Alternatively, is there an open-source project that focuses on this?
I wrote a simple python script to do this once. I also posted about it on my blog.
You set up mappings of your repositories and branches to local folders that already contain a checkout of that repo and branch. Then you enable GitHub's post-receive hooks to hit the script, which will automatically trigger a git pull in the appropriate folder.
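A stripped-down sketch of that idea (not the author's actual script) might look like this, using only the standard library; the repo-to-folder mapping and port are made up, and it assumes the hook is configured to POST JSON:
# Hypothetical sketch of a post-receive webhook receiver: GitHub POSTs here,
# and we run "git pull" in the folder mapped to the pushed repository.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
REPO_PATHS = {"youruser/yourapp": "/srv/yourapp"}  # made-up mapping
class HookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        path = REPO_PATHS.get(payload.get("repository", {}).get("full_name", ""))
        if path:
            subprocess.run(["git", "pull"], cwd=path, check=False)
        self.send_response(200)
        self.end_headers()
if __name__ == "__main__":
    HTTPServer(("", 8080), HookHandler).serve_forever()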