Git pre-pushed object on remote server? git ls-tree - python

I have an Atlassian Stash server for Git.
I want to write a script that runs a Java code formatter as a pre-receive hook (before the changes are accepted into the repository).
What I am looking to do is NOT to do the work on the Stash server itself, but rather to perform the work on another server and send the status (0 or 1) back to the Stash server.
I have written a Python script that calls a CGI (Python) script on the remote server, passing "ref oldrev newrev" as HTTP GET parameters. Once I have the stdin values (ref oldrev newrev) on the remote server, I create a directory, run git init, git remote add origin URL, and git fetch (I even tried git pull) to get the latest contents/objects of the repository, hoping to get the object that has not yet been pushed to the repository but is in a pre-push staging state.
The hash (SHA, or "newrev") of the object in that pre-push state is: 36ac63fe7b15049c132c310e1ee153e044b236b7
Now, when I run 'git ls-tree 36ac63fe7b15049c132c310e1ee153e044b236b7 Test.java' inside the directory I created above, it gives me an error:
'fatal: not a tree object'
Now, my questions are:
How do I get that object on a remote server?
What git command will give me that object in that state?
Is there any other way of doing this?
Does what I've asked above make sense? Let me know if I am not clear and I will try to clear things up.
Thanks very much in advance for any and all help!

java code formatter as a pre-receive hook
Don't do it. You would effectively be running the equivalent of git filter-branch behind your developers' backs. Don't do it.
Is there any other way of doing this?
If you want inbound code formatted in a particular way, validate the inbound files. If any aren't formatted correctly, list them and reject the push.
How to get that object on a remote server?
You can't fetch arbitrary objects; you can only fetch by ref (branch or tag) name. The pre-receive hook runs before any refs have been updated, so no ref names the inbound commits yet.
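A minimal sketch of that validate-and-reject approach, run as a pre-receive hook on the Stash server itself. Inside the hook the pushed objects are already in the repository's object store, so git cat-file can read them even though no ref points at them yet. is_formatted is a hypothetical stand-in for your real formatter check:

```python
#!/usr/bin/env python
# Sketch of a pre-receive hook that rejects badly formatted .java files.
# A pre-receive hook receives lines of "<oldrev> <newrev> <refname>" on stdin.
import subprocess
import sys

NULL_SHA = "0" * 40  # oldrev when the pushed ref is newly created
EMPTY_TREE = "4b825dc642cb6eb9a060e54bf8d69288fbee4904"  # git's empty tree

def parse_update(line):
    """Split one stdin line into (oldrev, newrev, refname)."""
    oldrev, newrev, refname = line.split()
    return oldrev, newrev, refname

def java_paths(paths):
    """Keep only the .java files from a list of changed paths."""
    return [p for p in paths if p.endswith(".java")]

def is_formatted(blob):
    return True  # placeholder: run the real formatter check here

def main():
    bad = []
    for line in sys.stdin:
        oldrev, newrev, _ = parse_update(line)
        base = EMPTY_TREE if oldrev == NULL_SHA else oldrev
        changed = subprocess.check_output(
            ["git", "diff", "--name-only", base, newrev]
        ).decode().splitlines()
        for path in java_paths(changed):
            # Read the pushed file content straight from the object store.
            blob = subprocess.check_output(
                ["git", "cat-file", "-p", "%s:%s" % (newrev, path)])
            if not is_formatted(blob):
                bad.append(path)
    if bad:
        sys.stderr.write("Rejected, badly formatted: %s\n" % ", ".join(bad))
        sys.exit(1)

# To install: save as hooks/pre-receive (executable) and call main() there.
```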

Related

How to see changes on a remote repo which are not in the local repo?

I'm executing 'git push' and getting an error.
My remote repository is ahead of my local one.
I want to find out what changes have occurred on the remote repository.
Assuming you don't want to pull those changes yet, you can accomplish this with git log. First run git fetch to update your local remote-tracking refs. Then run git log my-branch..origin/my-branch. This will show you the commits on origin/my-branch (i.e. the remote) which do not exist in your local branch.
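Since the question mentions scripting, here is a sketch of the same fetch-then-log sequence driven from Python via subprocess (assuming git is on PATH and the remote is named origin):

```python
import subprocess

def log_range(branch):
    """Revision range selecting commits on origin/<branch> missing locally."""
    return "%s..origin/%s" % (branch, branch)

def commits_ahead(branch, repo="."):
    """Fetch, then list one-line summaries of commits the remote has
    that the local branch does not."""
    subprocess.check_call(["git", "-C", repo, "fetch", "origin"])
    out = subprocess.check_output(
        ["git", "-C", repo, "log", "--oneline", log_range(branch)])
    return out.decode().splitlines()
```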

Set credentials to pull from a local git repository with GitPython

I have a git repository on an internal server and now want a scheduled task that automatically pulls changes into my local copy. I found the GitPython package, which seems to be exactly what I need, but I can't get it to work due to the password protection of the repo.
I have already cloned the repo to my local path (git clone git@LOCAL_IP:/repo/MY_GIT.git .) and get prompted for the password every time I execute git pull from the command line (fair enough). Following "How can I call 'git pull' from within Python?", I then tried exactly this:
import git
g = git.cmd.Git(MY_LOCAL_PATH)
g.pull()
and (of course) get an error:
...
git.exc.GitCommandError: Cmd('git') failed due to: exit code(1)
cmdline: git pull
stderr: 'Permission denied, please try again.
Permission denied, please try again.
...
Unfortunately, among the many answers around the web dealing with GitPython, I found none that tells me how to set the password (I know that you should never ever hardcode passwords, but still...).
You can save credentials within git so that every git client can access them without prompting the user. See "How can I save username and password in Git?" for details on how to do that.

How to run git fetch over ssh in a Windows subprocess

I've got some code which needs to grab code from GitHub periodically (on a Windows machine).
When I do pulls manually, I use Git Bash, and I've got ssh keys set up for the repos I check, so everything is fine. However, when I try to run the same actions in a Python subprocess, I don't have the ssh services which Git Bash provides and I'm unable to authenticate to the repo.
How should I proceed from here? I can think of a couple of different options:
I could revert to using https:// fetches. This is problematic because the repos I'm fetching use two-factor authentication and the fetches will run unattended. Is there a way to access an https repo that has 2FA from a command line?
I've tried calling sh.exe with arguments that fire off ssh-agent and then issuing my commands, so that everything runs more or less the way it does in Git Bash, but that doesn't seem to work:
"C:\Program Files (x86)\Git\bin\sh.exe" -c "C:/Program\ Files\ \(x86\)/Git/bin/ssh-agent.exe; C:/Program\ Files\ \(x86\)/Git/bin/ssh.exe -t git#github.com"
produces
SSH_AUTH_SOCK=/tmp/ssh-SiVYsy3660/agent.3660; export SSH_AUTH_SOCK;
SSH_AGENT_PID=8292; export SSH_AGENT_PID;
echo Agent pid 8292;
Could not create directory '/.ssh'.
The authenticity of host 'github.com (192.30.252.129)' can't be established.
RSA key fingerprint is XXXXXXXXXXX
Are you sure you want to continue connecting (yes/no)? yes
Failed to add the host to the list of known hosts (/.ssh/known_hosts).
Permission denied (publickey).
Could I use an ssh module in Python, like paramiko, to establish a connection? It looks to me like that's only for ssh'ing into a remote terminal. Is there a way to make it provide an ssh connection that git.exe can use?
So, I'd be grateful if anybody has done this before or has a better alternative.
Git Bash sets the HOME environment variable, which allows git to find the ssh keys (in %HOME%/.ssh).
You need to make sure the Python process defines HOME with the same path.
As explained in "Python os.environ["HOME"] works in IDLE but not in a script", you need to set HOME to %USERPROFILE% (or, in Python, to os.path.expanduser("~")).
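A small sketch of that fix: build the environment for the git subprocess so HOME is always defined. On Windows, os.path.expanduser("~") falls back to %USERPROFILE% when HOME is unset:

```python
import os
import subprocess

def env_with_home():
    """Copy the current environment, ensuring HOME is set so git/ssh
    can locate the keys in %HOME%/.ssh."""
    env = dict(os.environ)
    env.setdefault("HOME", os.path.expanduser("~"))
    return env

# Example (assumes git is on PATH):
# subprocess.check_call(["git", "fetch", "origin"], env=env_with_home())
```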

hg / Mercurial - how to merge in a 3-way merge window in the shell

I have little experience with Mercurial, and I'm having this problem:
I push to the repo from my Windows PC in TortoiseHG Workbench every time; that works fine.
I pull from the repo to my server with hg pull; that works fine too.
Then it asks me to run hg update, which I do, but it says there is something to merge in my views.py and automatically opens a 3-way merge window in the shell. I am using an SSH tunnel with PuTTY.
In this 3-way merge window, no hg commands are available. What I always do is:
> views.py  # empty the file
Then I copy and paste views.py from my local PC to the server and save it.
This works, but there will always be a conflict because I am changing the same views.py on both sides. How do I solve this so that I don't have to merge every time? I desperately need some help!
The problem is that no hg commands are available in the 3-way merge window.
If you don't have any differences between your production and development scripts that need to be merged, it is safe to always do a clean update with hg update -C. This replaces all local changes you made with the latest version pulled from the repository. So the workflow would be:
hg pull
hg update -C

ssh - getting metadata of many remote files

There's a remote file system which I can access over ssh.
I need to:
scan this file system to find all the files newer than a given datetime;
retrieve a list of those files' names, sizes, and modification timestamps.
Some restrictions:
I can't upload a script to this remote server; I can only run commands through ssh.
There could be well over 100k files on the remote server, and this process should run at least once a minute, so the number of ssh calls should be minimal, preferably exactly 1.
I've already managed to get (1) using this:
touch -am -t {timestamp} /tmp/some_filename; find {path} -newer /tmp/some_filename; rm /tmp/some_filename
I thought I could pipe the results into "xargs ls -l" and then parse the output to extract the size and timestamp, but then I found this article...
Also, I'm running the command from Python (i.e. it's not just a command line), so it's fine to do some post-processing on the results of the ssh command.
I suggest writing or modifying your Python script on the server side as follows:
1. When no data has been acquired in a while, acquire the initial data using the touch/find script you provided, then stat the found files to get the needed properties.
2. Then, in the Python script on the server, subscribe to inotify events to get updates.
3. When a remote client connects and needs this data, serve the latest result of combining 1 and 2.
inotify is a Linux kernel facility that allows you to monitor file system events on a directory in real time.
See:
https://serverfault.com/questions/30292/automatic-notification-of-new-or-changed-files-in-a-folder-or-share
http://linux.die.net/man/7/inotify
https://github.com/seb-m/pyinotify
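If uploading a server-side script really isn't possible, the question's own one-liner can be extended so the single ssh call also returns size and mtime, avoiding ls parsing entirely. A sketch, assuming GNU find's -printf is available on the remote host (the host and path in the usage comment are placeholders):

```python
import subprocess

# One remote command: stamp file, find newer files printing
# path<TAB>size<TAB>mtime-as-epoch, then clean up the stamp.
REMOTE_CMD = (
    "touch -am -t {timestamp} /tmp/stamp && "
    "find {path} -newer /tmp/stamp -type f -printf '%p\\t%s\\t%T@\\n'; "
    "rm -f /tmp/stamp"
)

def parse_listing(text):
    """Parse `path<TAB>size<TAB>mtime` lines into (path, size, mtime) tuples."""
    rows = []
    for line in text.splitlines():
        path, size, mtime = line.rsplit("\t", 2)
        rows.append((path, int(size), float(mtime)))
    return rows

# Usage (hypothetical host and path):
# out = subprocess.check_output(
#     ["ssh", "user@host",
#      REMOTE_CMD.format(timestamp="202401010000", path="/data")])
# files = parse_listing(out.decode())
```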
