Replicate modifications from DevStack to OpenStack (production) - python

I installed OpenStack through DevStack because I had to modify some files.
When I install DevStack, I have all the files under /opt/stack. There I have the service folders (glance, keystone, ...) and the client library folders (python-glanceclient, python-keystoneclient).
If I modify those files, how can I replicate the modifications on an already deployed OpenStack? Installing OpenStack without DevStack builds a different folder structure.
I mean, where are the python-'service'client folders in a fresh OpenStack installation?
Thank you

DevStack pulls its OpenStack software stack from GitHub. The Git repos it installs from are located in /opt/stack.
What you may want to do is fork the repos of the OpenStack projects you wish to modify, then make your DevStack deployment deploy not from the upstream OpenStack repos but from your own forks.
You can do this by modifying the stack.sh script (I believe). devstack.org has a line-by-line explanation of the entire script on their site, and that can point you in the right direction.
Read this: http://devstack.org/stack.sh.html
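If I recall correctly, DevStack also lets you override the per-project repository variables from stackrc in your localrc / local.conf, so you may not even need to edit stack.sh itself. A hedged sketch (the fork URL and branch name are hypothetical), shown here for glance:

    # localrc -- point DevStack at your own fork instead of the upstream repo
    GLANCE_REPO=https://github.com/youruser/glance.git
    GLANCE_BRANCH=my-changes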
Once you've deployed using your own Git repository you can, of course, edit it and commit back to it, even push to it.
Any other DevStack deployment you have can then also be set up to pull from your repository instead of the public OpenStack repos.
Of course, rebasing against upstream OpenStack later will become increasingly difficult, as OpenStack development moves at a fairly brisk pace.
If the modification you are making is one you want to commit back to the open source project, check out this site:
http://wiki.openstack.org/HowToContribute
Basically, OpenStack has a commit review and continuous integration environment based on Gerrit and Jenkins. This is the method by which commits back to the open source repository are gate-tested and manually reviewed by other developers before being merged.
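For what it's worth, the usual contributor flow is built around the git-review tool; a rough sketch (the project and branch names are just examples):

    # one-time setup
    pip install git-review
    git clone https://github.com/openstack/glance.git
    cd glance
    # hack on a topic branch, then submit the change to Gerrit for review
    git checkout -b my-fix
    git commit -a -m "Fix something"
    git review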
If you are intending to deploy this for production use, I recommend against deploying from DevStack. That is not the proper way to do it.

Related

Is it possible to commit and push into a remote repo from my local system using only the github api v3? [duplicate]

OK, this may seem like a stupid question; I don't have a lot of experience using Git, as I have always depended on GitHub Desktop for my work. My question is: can a GitHub remote repo be committed and pushed to via only the GitHub API? Can it be done without installing Git on your system? I have used PyGitHub, git-python, and even Python subprocess calls to create a remote repo, initialize and add to the local repo, and perform commits and pushes to the remote repo, and I know that these approaches require the Git client to be installed on the system.
So I was just wondering whether the standalone GitHub API alone can be called through Python requests to do the same, the requirement being that I don't have to get Git installed on my local system. Any help on the matter would be really enlightening.
Can a GitHub remote repo be committed and pushed to via only the GitHub API? Can it be done without installing Git on your system?
Kinda: you can interact with the raw objects via the API. I'm not sure you can behave as if Git were on your machine and push/pull from a local working copy the way you would if you did have Git installed locally.
My experience of it is that it requires some understanding of the low-level fundamentals of Git (blobs, trees, commits, refs, and their relations): the v3 API exposes a git data / git database endpoint which provides transparent access to low-level git structures. There are some limitations (e.g. can't interact with a brand new empty repository via this, you have to have at least one commit in it somehow, high-level operations like "cherrypick" or "rebase" are unavailable and have to be hand-rolled if you want them, ...) but aside from that it works fine.
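To make that concrete, here is a minimal sketch of the whole "commit and push" cycle using nothing but Python requests against the v3 Git Data endpoints (the owner, repo, token, branch, and file contents are all hypothetical placeholders):

    # "Commit and push" via the GitHub v3 Git Data API alone -- no git client.
    import requests

    API = "https://api.github.com/repos/OWNER/REPO/git"
    HEADERS = {"Authorization": "token YOUR_TOKEN"}

    def gh(method, path, **payload):
        r = requests.request(method, API + path, headers=HEADERS,
                             json=payload or None)
        r.raise_for_status()
        return r.json()

    # 1. Find the commit the branch points at, and that commit's tree.
    parent_sha = gh("GET", "/refs/heads/master")["object"]["sha"]
    parent_tree = gh("GET", "/commits/" + parent_sha)["tree"]["sha"]

    # 2. Upload the new file contents as a blob.
    blob = gh("POST", "/blobs", content="hello world\n", encoding="utf-8")

    # 3. Create a tree that layers the new file on top of the parent tree.
    tree = gh("POST", "/trees", base_tree=parent_tree,
              tree=[{"path": "hello.txt", "mode": "100644",
                     "type": "blob", "sha": blob["sha"]}])

    # 4. Create the commit, then move the branch ref to it (the "push").
    commit = gh("POST", "/commits", message="Add hello.txt",
                tree=tree["sha"], parents=[parent_sha])
    gh("PATCH", "/refs/heads/master", sha=commit["sha"])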
To understand the low-level objects I would recommend the "Git Internals" chapter of the git book, sections 10.2 Git Objects and 10.3 Git References. There are also "from the ground up" tutorials out there which explain these structures in a more hands-on way by building partial git clients from the ground up.
So I was just wondering whether the standalone GitHub API alone can be called through Python requests to do the same, the requirement being that I don't have to get Git installed on my local system.
See above, kinda: you can most certainly interact with a GitHub repository via the API in almost every way possible, but rebuilding a full Git client out of the API might be difficult.
PyGithub will let you deal with whatever is hosted on GitHub.com. To manage your local Git repository (committing, pushing, stashing, and the like), you need a Python module that deals with Git, not GitHub. One example is GitPython:
https://github.com/gitpython-developers/GitPython
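A minimal GitPython sketch (the path and remote name are hypothetical). One caveat relevant to the question: GitPython drives the installed git executable, so it does not remove the need for a local Git installation:

    # Stage, commit, and push from a local working copy with GitPython.
    # Note: GitPython shells out to the `git` binary under the hood.
    from git import Repo

    repo = Repo("/path/to/local/repo")   # hypothetical local checkout
    repo.index.add(["README.md"])        # stage a file
    repo.index.commit("Update README")   # create the commit
    repo.remote(name="origin").push()    # push it to the remote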

What are the risks/benefits of uploading my website to a public repository?

I noticed that the Flask tutorial involves use of pip. It looks like it's only used to create a wheel locally that will make setup on a server easier, but as a web dev newbie I'm curious: Does anyone actually go all the way to uploading their websites to a public repository like PyPI? What are the implications (security-related or otherwise) of doing so?
No, you should not upload private web projects to PyPI (the Python Package Index)! PyPI is for public, published projects intended to be shared.
Creating a package for your web project has advantages when deploying to your production servers, but that doesn't require that your package is available on PyPI. The pip command-line tool can find and install packages from other repositories, including private Git or Mercurial or SVN repositories or private package indexes too, as well as from the filesystem.
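For example (the URLs are hypothetical), all of these are valid pip targets besides PyPI:

    pip install git+https://github.com/youruser/yourproject.git@main
    pip install --index-url https://pypi.example.com/simple/ yourproject
    pip install ./dist/yourproject-1.0-py3-none-any.whl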
For the record: I've not bothered with creating packages for any of the recently deployed Flask projects I (helped) develop. These were put into production on cloud-hosted hardware and/or in Docker containers, directly from their GitHub repositories. Dependencies are installed with pip (driven by the Pipenv tool in all cases), but the project code itself is loaded directly from the checkout.
That said, if those projects start using continuous integration down the line, then it may make sense to use the resulting tested code, packaged as wheels, in production too. Publish those wheels to a private index or server; there are several projects and even a few SaaS services already available that let you manage a private package index.
If you do publish to PyPI, then anyone can download your package and analyse how your website works. It'd make it trivial for black-hat hackers to find and exploit security issues in your project that way.

Creating and transferring a site with Django

As a fledgling Django developer, I was wondering if it was customary, or indeed possible, to create a site with Django then transfer the complete file structure to a different machine where it would "go live".
Thanks,
~Caitlin
You could use Git or Mercurial, or another version control system, to put the site structure on a central server. After that you could deploy the site to multiple servers, for example with Fabric. For the deployment process you should consider using virtualenv to isolate the project from the global Python packages and requirements.
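As a minimal sketch of what such a Fabric deployment can look like (this uses the current Fabric 2.x API; the host, paths, and service name are hypothetical):

    # fabfile.py -- pull the latest code and restart the app server
    from fabric import task

    @task
    def deploy(c):
        with c.cd("/srv/mysite"):
            c.run("git pull origin master")
            c.run("venv/bin/pip install -r requirements.txt")
        c.sudo("systemctl restart gunicorn")

You would then run it with something like fab -H user@yourserver deploy.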
Of course that's possible, and in fact it's the only way to "go live". You don't want to develop on your live server, do you? And that's true for any platform, not just Django.
If I understood your question correctly, you need a system to push your development code to live.
Use a version control system: Git, SVN, Mercurial, etc.
Identify environment-specific code, like settings/config files, and keep a separate instance of each for every environment.
Create a testing/staging/pre-production environment that has live (or live-like) data, and deploy your code there before pushing it to live.
To avoid downtime during the deployment process, a symbolic link is usually created that points to the existing code folder. When a new release is to be pushed, a new folder is created with the new code; once everything else is in place (settings and database changes, for example), the symlink is repointed to the new folder, as sketched below.
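A sketch of that switch (all paths hypothetical):

    mkdir -p /srv/app/releases/r42
    # ...install the new code and its dependencies into the new folder...
    ln -sfn /srv/app/releases/r42 /srv/app/current   # repoint the symlink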

How to use Mercurial to deploy Django applications?

I'm creating a server with Apache2 + mod_python + Django for development and would like to know how to use Mercurial to manage the application's development.
My idea is to make the folder where Mercurial stores the project the same folder from which Django is deployed.
Thank you for your attention!
I've thought about this; it's a good idea for development.
Use Mercurial in the usual way. Of course, you need to set up a Mercurial server first.
If you update your Django project, it will be recompiled on the fly.
My workflow (see the sketch below):
Set up a Mercurial server, or use Bitbucket
Init the repo locally
Push the repo to the central repo
On the server, pull the repo into some target dir
Edit something locally and push it to the central repo
Pull the repo on the server, and everything is fine
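A sketch of that workflow in commands (the URL and paths are hypothetical):

    hg init                                    # init the repo locally
    hg commit -A -m "initial import"
    hg push https://hg.example.com/myproject   # push to the central repo
    # on the server:
    hg clone https://hg.example.com/myproject /srv/django/myproject
    # after editing locally, commit and push again, then on the server:
    cd /srv/django/myproject && hg pull -u     # pull and update the working copy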

How to use github and ec2 together to deploy a python application

I am currently using GitHub to develop a Python application and am looking to deploy it on EC2.
Is there a good way to automatically handle the messiness this entails (setting up SSH key pairs on the EC2 instance for GitHub, pulling from the GitHub repository every time a commit is pushed to the master branch, etc.) without a bunch of custom scripts? Alternatively, is there an open-source project that has focused on this?
I wrote a simple Python script to do this once. I also posted about it on my blog.
You set up mappings of your repositories and branches to local folders which already contain a checkout of that repo and branch. Then you enable GitHub's post-receive hooks to hit the script, which will automatically trigger a git pull in the appropriate folder.
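This is not the author's actual script, but a minimal sketch of the same idea (the repo mapping, route, and port are hypothetical; it assumes the JSON webhook payload GitHub sends today):

    # Tiny Flask endpoint: on each GitHub push webhook, git-pull the
    # local checkout mapped to that repository and branch.
    import subprocess
    from flask import Flask, request

    app = Flask(__name__)

    # hypothetical "owner/repo/branch" -> local checkout mapping
    REPOS = {"myuser/myapp/master": "/srv/checkouts/myapp"}

    @app.route("/hook", methods=["POST"])
    def hook():
        payload = request.get_json(force=True)
        branch = payload["ref"].rsplit("/", 1)[-1]   # "refs/heads/master" -> "master"
        key = payload["repository"]["full_name"] + "/" + branch
        path = REPOS.get(key)
        if path:
            subprocess.check_call(["git", "pull"], cwd=path)
        return "ok"

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)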
