I have a dev Linux server (RHEL) which doesn't have any internet connectivity, and we need to develop a Python application on it. I can connect to this dev box from my local Windows machine, which does have an internet connection.
I need to install Python 3.7.5 and some other packages (some of which need the GCC compiler and other dependencies) on this dev box.
What is the best way to do this, considering that some packages will have many dependencies and there is no internet on the dev box?
Some options that the internet research suggests for package installation are:
Download the packages using pip download on the local machine > copy the package archives to the dev server > pip install the packages
Download and unpack the source distribution > using the package's setup.py file, run python setup.py install --user
Install using wheels: find the wheel for the package > upload it to the dev server > run pip install SomePackage.whl
Please let me know which one of these is good considering the limitations and kindly suggest if there is any other option as well.
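For reference, options 1 and 3 are usually combined into one workflow; a minimal sketch (the package name, paths, and host are illustrative, not from the question):

```shell
# On the internet-connected machine: download a package plus all of its
# dependencies (wheels where available, sdists otherwise) into one directory.
python3 -m pip download requests -d ./offline-pkgs

# Copy the directory to the dev box, e.g.:
#   scp -r ./offline-pkgs user@devbox:/tmp/offline-pkgs

# On the dev box: install from the local directory only, never the network.
python3 -m pip install requests --no-index --find-links /tmp/offline-pkgs
```

The --no-index flag stops pip from contacting PyPI, and --find-links points it at the copied directory, so dependencies are resolved entirely from the local files.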
It's a bit late, but for those who may need it:
To start, we need an internet-connected (virtual) server on which to download and build the files.
You can use VMware or VirtualBox to go through this procedure.
The steps below are the ones you should do on the server with the internet connection.
First, go to https://www.python.org/downloads/source/, find the required version of Python, and download its Gzipped source tarball.
Then copy the downloaded file to that build server. You can copy it using the command below (substitute your own username, address, and directory):
scp file username@ipaddress:dir
Then go to the directory you specified and make sure the file copied successfully. Now you can unpack the file using the command below:
tar -xvf file_name
Now go into the unpacked folder, which contains the configure script, and run the command below:
./configure
This step is done on the machine with the internet connection.
The configure step should create some files, including the Makefiles. Now, in your current directory, run the command below:
make
After the make process finishes, go back to the parent directory and archive the directory that contains the build output. You can create the archive using the command below (choose your own archive and directory names):
tar -czf file_name.tgz dir_name
We are going to copy this archive to our target machine, which doesn't have an internet connection.
Copy the archive to your target machine using the scp command.
Now go to your target machine, change into the directory you copied the archive to, and unpack it with the tar command.
Once you have unpacked it, go into the directory that contains the Makefiles and run the command below:
make install
It should begin to install Python with its dependencies.
You can type python3.8 --version to make sure that Python is installed (instead of 3.8, type your own version).
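A variation on the steps above, if running make install on the dev box is undesirable: finish the install on the build machine into a fixed prefix and ship the installed tree instead of the build tree. This is only a sketch; the /opt/python375 prefix and the hostname are assumptions, and the target needs the same shared libraries (e.g. openssl, zlib) as the build machine, so matching RHEL versions is advisable.

```shell
# On the build machine (internet-connected), inside the unpacked source:
./configure --prefix=/opt/python375   # install location; use the same path on both machines
make
make install                          # installs into /opt/python375 on the build machine

# Pack the installed tree and ship it to the dev box:
tar -czf python375.tgz -C /opt python375
scp python375.tgz username@devbox:/tmp/

# On the dev box: unpack and run, no compiler or make needed.
tar -xzf /tmp/python375.tgz -C /opt
/opt/python375/bin/python3 --version
```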
I've been facing a problem for some time and haven't found a solution. At the company where I work, I'm trying to set up Python, but when I run the usual command pip install pandas in my VS Code terminal, it fails because the company blocks the installation of external libraries, so it's as if I had to install these libraries on a PC without a connection.
How should I follow this procedure?
I downloaded the .whl library from PyPi:
pandas-1.5.2-cp310-cp310-win_amd64.whl
ran pip install pandas-1.5.2-cp310-cp310-win_amd64.whl -f ./ --no-index --no-deps
OK, the installation was successful. But this pandas installation from cmd isn't reaching my system: when I try to import pandas in VS Code it fails, as if it had not been installed.
Would it be possible for me to download several libraries and leave them in a folder where everyone in the company can use them? For example, by declaring a path where all the libraries live and then importing them from there?
First confirm whether you can run any Python commands from VS Code. If the answer is yes, you can proceed to install your .whl file into the Python Scripts folder:
Look for the Scripts folder of your Python installation
Put your .whl file in this folder
Then open the folder in Explorer, click the address bar to reveal the full path, type cmd, and press Enter to open a command prompt in that directory
Then just run your pip install there and you should be good to go
Remember, the key is that VS Code must be able to find python.exe. If it can't from the start, you will need to add your Python directory to the PATH environment variable.
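A common cause of the "installed but not importable" symptom above is that the pip on PATH belongs to a different Python than the one VS Code runs. Invoking pip as a module of a specific interpreter sidesteps this; a sketch (the wheel filename is from the question, python3 stands in for your interpreter):

```shell
# On Windows, pointing pip at a specific interpreter would look like:
#   py -3.10 -m pip install pandas-1.5.2-cp310-cp310-win_amd64.whl --no-index --no-deps
# To check which interpreter a given pip actually targets:
python3 -m pip --version
python3 -c "import sys; print(sys.executable)"
```

If the path printed by sys.executable differs from the interpreter VS Code has selected, pandas went into the wrong installation.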
We can use this sample code from this link:
mkdir keystone-deps
pip download python-keystoneclient -d "/home/aviuser/keystone-deps"
tar cvfz keystone-deps.tgz keystone-deps
to download packages and use the installation files on another PC. But the packages may not be compatible with the destination PC.
Is there a way to download packages based on the destination system's hardware and software information?
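pip download can in fact target a different destination system, as long as you restrict it to wheels; a sketch (the package, platform tag, and version here are illustrative and must be replaced with ones matching your target):

```shell
# Download wheels for a *different* machine than the one running pip.
# The cross-platform flags only work together with --only-binary=:all:.
python3 -m pip download pandas \
    --only-binary=:all: \
    --platform manylinux2014_x86_64 \
    --python-version 3.10 \
    --implementation cp \
    --abi cp310 \
    -d ./target-pkgs
```

This fetches wheels built for the stated platform, Python version, implementation, and ABI rather than for the downloading machine, which addresses the compatibility problem for packages that publish wheels.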
Hi, I want to clone a Python virtualenv to a server that's not connected to the internet. I searched different forums but didn't find a clear answer. Here are the methods I found and the problems I have with each:
Method 1: (safest but most time-consuming)
Save all the libraries via pip freeze > requirements.txt, then download each one manually and store them in a directory. Copy this directory to the offline server, create a new virtualenv on the offline server, and install all the requirements from the downloaded files.
To avoid downloading each one by hand, I used pip download -r requirements.txt -d wheelfiles on the source machine, but I couldn't find a way to install all the packages in one command. I could use a script with a loop to go through each one, though. The problem is when even the source server doesn't have an internet connection to download these packages.
Method 2: (less recommended, but I didn't understand why)
Simply copy the virtualenv directory with all its files to the offline machine. Both machines should apparently have the same Python version, and you'll have to manually modify some hardcoded paths, for example changing every file containing sourceserver\user1\dev\virtualenv to targetserver\user4\dev\virtualenv. The files to modify usually start with activate* or pip*.
But this method is said to be not recommended, and I don't understand why.
Also, if this method works without problems, can I copy the virtualenv folder from a Linux server to a Windows server and vice versa?
You can install all the requirements using
pip install -r requirements.txt
which means the options are:
pip freeze > requirements.txt
pip download -r requirements.txt -d wheelfiles
pip install -r requirements.txt --no-index --find-links path/to/wheels
or
Ensure the target machine has the same architecture, OS, and Python version
Copy virtual environment
Modify various hardcoded paths in files
It should be clear why the former is preferred, especially as it is completely independent of Python version, machine architecture, OS, etc.
Additionally, the former means that the requirements.txt can be committed to source control in order to recreate the environment on demand on any machine, including by other people and when the original machine or copy of the virtual environment is not available. In terms of size, the requirements.txt file is also significantly smaller than an entire virtual environment.
I am SSHed into a remote machine and I do not have rights to download Python packages, but I want to use third-party packages for my project. I found cx_Freeze, but I'm not sure if that is what I need.
What I want to achieve is to be able to run different parts of my project (with main entry points everywhere) with command-line arguments on the remote machine. My project will use a few third-party Python packages. I'm not sure how to get around this, as I cannot pip install and am not a sudoer. I can SCP files to the remote machine.
It is basically useless if you don't have execute permission on the remote machine. You need to contact your administrator to obtain execute permission.
As for SCPing files to the remote server, you may still be able to copy your files, but you may not be able to execute them.
easy_install can install packages in your home directory.
Replace pip install package-name with easy_install --user package-name.
Update: pip also has a --user switch. Try:
pip install --user package-name
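To see where --user installs actually land (useful for checking your PATH and PYTHONPATH afterwards), you can ask Python for the per-user directories; a small runnable check:

```shell
# Directory where "pip install --user" puts importable modules:
python3 -c "import site; print(site.getusersitepackages())"
# Base directory; console scripts from user installs land under <base>/bin:
python3 -c "import site; print(site.getuserbase())"
```

If scripts installed with --user are "not found", adding that bin directory to PATH is usually the fix.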
Can anyone explain how to install MongoDB and PyMongo on a Red Hat server without an internet connection? Although I have used both before, I have never had to install anything myself.
I have downloaded mongodb-linux-x86_64-rhel62-3.0.0.tgz from Official download page and copied it to the server but what do I do next?
Do I need to modify the .repo file shown in docs and install with yum?
For PyMongo, pip and easy_install are not installed, so I'm guessing I need to install from source. The link in the documentation, github.com/mongodb/mongo-python-driver.git, says "Otherwise you can download the project source and do python setup.py install to install." Where do I get the source from (the link doesn't work), and where do I put it on the server?
What I did eventually (hope this helps someone):
I also had access to a RHEL machine with an internet connection (I made one on Google Cloud).
So I modified /etc/yum.conf to set keepcache=1 so that downloaded packages are kept, installed MongoDB there for the framework I needed, and then copied the packages from /var/cache/yum/x86_64/server/10gen/packages.
There were a meta package and four packages, for the server, mongos, the shell, and the tools.
I installed the latter four, one by one, with yum install path/to/package.rpm,
ran service mongod start,
and it worked.
Installing MongoDB on a Linux machine from the binary distribution goes like this:
Download the desired binary from the MongoDB official download page for your architecture and distro:
curl -O https://fastdl.mongodb.org/linux/mongodb-linux-x86_64-3.0.4.tgz
Extract the tarball:
tar -zxvf mongodb-linux-x86_64-3.0.4.tgz
Make a directory for the mongo binaries and copy them there:
mkdir -p /mongodb
cp -R -n mongodb-linux-x86_64-3.0.4/ /mongodb
Add this path to the PATH variable:
export PATH=<PATH_OF_MONGODB_BIN>:$PATH
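The PATH step above can be sketched concretely (the install path is the illustrative one from step 3; adjust it to where you actually copied the binaries):

```shell
# Make the mongo binaries findable for the current shell session.
export PATH=/mongodb/mongodb-linux-x86_64-3.0.4/bin:$PATH
# To make this permanent for your user, append the same export line to ~/.bashrc.
command -v mongod || echo "mongod not on PATH yet"
```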
To install PyMongo, assuming you have internet access on some system, clone the PyMongo repo from GitHub:
$ git clone git://github.com/mongodb/mongo-python-driver.git pymongo
$ cd pymongo/
$ python setup.py install
You will need at least one machine where internet is available.