How to install poppler in Ubuntu 15.04? - python

Poppler is a PDF rendering library based on the xpdf-3.0 code base.
I have already downloaded the tar.xz file from the official site http://poppler.freedesktop.org/
But I do not know what to do with this file.
Is there any command to install or run?
P.S. I am new to Linux, so I don't know a lot about it yet.

Not sure how it worked in 15.04, but I know in 16.04 (Xenial), the package's official name is poppler-utils:
http://packages.ubuntu.com/xenial/poppler-utils
And thus can be installed with:
sudo apt-get install -y poppler-utils
The marked answer by codefreak is not correct if you need the poppler command-line tools, such as pdftotext. Also, installing Python poppler via apt-get doesn't seem to play nicely if you're on a customized system, e.g. one that is running off the Anaconda distribution.
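As a quick sanity check that the command-line tools are on your PATH (a minimal sketch; input.pdf stands in for any PDF you have locally):
pdftotext -v
pdftotext input.pdf output.txt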

What you downloaded from the poppler site is source code, and building it yourself takes some experience. For such situations, Ubuntu and other Linux distros package popular software so you don't have to go through a manual installation from source. In your case, the poppler bindings for Python are available in the python-poppler package, which can be installed via Ubuntu's package manager, apt.
To install the poppler Python bindings, open a terminal and run:
sudo apt-get install python-poppler
You should then have poppler available in Python.
To search for such packages in the future, you can run apt-cache search poppler. It will list all matching packages you can install via apt.
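To confirm the binding is importable afterwards, a one-line check is enough (this assumes the package exposes a module named poppler, which is how python-poppler has traditionally shipped):
python -c "import poppler; print(poppler)"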

Go to the link below:
https://www.ubuntuupdates.org/package/core/focal/main/base/poppler-utils
and click the appropriate box to install.

So run these commands on your system:
step 1 sudo apt-get update
step 2 sudo apt install -y software-properties-common
step 3 sudo apt update
step 4 sudo add-apt-repository main
step 5 sudo add-apt-repository universe
step 6 sudo add-apt-repository restricted
step 7 sudo add-apt-repository multiverse
step 8 sudo apt-get install -y poppler-utils
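If the final step still reports that the package cannot be found, you can check whether the universe repository is actually enabled and which version apt would pull in:
apt-cache policy poppler-utils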

Related

How to install rpm and dependencies on RHEL?

I'm trying to install python3-gnupg on my RHEL EC2 server.
I used the command
wget https://download.fedoraproject.org/../python-gnupg-0.4.6-1.fc32.src.rpm
sudo rpm -i file.rpm
I get the error:
error: Failed dependencies:
python(abi) = 3.8 is needed by python3-gnupg-0.4.6-1.fc32.noarch
rpmlib(PayloadIsZstd) <= 5.4.18-1 is needed by python3-gnupg-0.4.6-1.fc32.noarch
How do I download & install all dependencies at once?
You may want to use dnf or yum (if dnf is not available for some reason) to install your package instead of the rpm command.
Why?
Because it will actually download the dependencies. The rpm command does not come with a remote repository configuration like yum or dnf, so it cannot download missing dependencies.
Command for dnf:
sudo dnf install https://download.fedoraproject.org/../python-gnupg-0.4.6-1.fc32.src.rpm
Command for yum:
sudo yum install https://download.fedoraproject.org/../python-gnupg-0.4.6-1.fc32.src.rpm
(you can replace the URL with the path to your local RPM file, or with the correct URL for your package)
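If you want to see what a downloaded .rpm declares as dependencies before installing anything, rpm can list them without touching the system (the file name below is just the example from the question):
rpm -qpR python-gnupg-0.4.6-1.fc32.src.rpm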
You may still have a problem with Python.
Indeed, the package requires a specific Python version (3.8), and dnf/yum may refuse to install it.
You have two ways to fix this.
The first way
Install the python3-gnupg package directly from the RHEL repo (if available; I'm not quite sure) to directly get the correct dependencies (and the version corresponding to your distro, which has received the RHEL testing blessing).
So you may try
sudo dnf install python3-gnupg
Or
sudo yum install python3-gnupg
The second way
Try to install the corresponding version of Python, either with dnf or yum.
sudo dnf install python3.8
Or
sudo yum install python3.8
What I would recommend
IMHO, the first solution is better because you'll actually get the official RHEL version of python3-gnupg, which has been built for your distro AND tested accordingly. But it may not be available. I tested these commands on my Fedora 33, because it uses the same tools as RHEL, although its dnf/yum repositories are different.
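Whichever route you take, a quick import check tells you whether the module is actually usable (this assumes the package ships the gnupg module, which is what python-gnupg provides):
python3 -c "import gnupg; print(gnupg.__version__)"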

How do we install yaafe on linux?

I've been trying to install the Yaafe library on my Linux system, but I can't compile the yaafe source using ccmake. Does anyone have a detailed step-by-step procedure?
I tried to follow the instructions, which failed for me during compilation. ccmake can be replaced by cmake. I could not install libhdf5-serial-1.8.4, because it has been merged into the main package.
Alternative approach
An alternative to yaafe would be librosa, which has the advantage of being available via PyPI. You install it via (assuming Debian/Ubuntu)
apt-get install python-pip
(for the PyPI client), and
pip install librosa
If you follow their advice and install scikits.samplerate, you also need libsamplerate0-dev:
apt-get install libsamplerate0-dev
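Once that is done, a one-liner confirms librosa imports cleanly (just an import check, nothing yaafe-specific):
python -c "import librosa; print(librosa.__version__)"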
The home page of the library includes a thorough manual for compiling yaafe.
I am citing the beginning here:
$ sudo apt-get install cmake cmake-curses-gui libargtable2-0 libargtable2-dev libsndfile1 libsndfile1-dev libmpg123-0 libmpg123-dev libfftw3-3 libfftw3-dev liblapack-dev libhdf5-serial-dev libhdf5-serial-1.8.4
$ mkdir build
$ cd build
$ ccmake -DCMAKE_PREFIX_PATH=<lib-path> -DCMAKE_INSTALL_PREFIX=<install-path> ..
see the rest there.
That's what I had to do in Ubuntu 14.04 to get it working:
sudo apt-get install build-essential -y
sudo apt-get install libeigen3-dev
sudo apt-get install cmake cmake-curses-gui libargtable2-0 libargtable2-dev libsndfile1 libsndfile1-dev libmpg123-0 libmpg123-dev libfftw3-3 libfftw3-dev liblapack-dev libhdf5-serial-dev libhdf5-7
Download Yaafe from here: https://github.com/Yaafe/Yaafe/archive/master.zip
Extract Yaafe-master.zip
Inside Yaafe-master directory:
mkdir build
cd build
ccmake ..
make
sudo make install
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/
Also add this path to your IDE's environment variables if yaafe does not work with it.
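To check the build from Python, a simple import is usually enough (yaafelib is the module name the Yaafe bindings have typically used; treat that as an assumption and adjust if your build installs it under a different name):
python -c "import yaafelib; print(yaafelib.__file__)"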
I just installed it using Anaconda, and it was extremely easy! Just install Anaconda like the link tells you to. On the last step, I recommend you allow Anaconda to modify your PATH so that when you type python on the command line, it uses the Anaconda version of Python. Then restart your terminal, just to make sure it's using the Anaconda stuff you just installed.
Then, assuming you're using Ubuntu, you just need to type the following command:
conda install --channel https://conda.anaconda.org/Yaafe yaafe

Python3: ImportError: No module named '_ctypes' when using Value from module multiprocessing

I am using Ubuntu and have installed Python 2.7.5 and 3.4.0. In Python 2.7.5 I am able to successfully assign a variable x = Value('i', 2), but not in 3.4.0. I am getting:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.4/multiprocessing/context.py", line 132, in Value
from .sharedctypes import Value
File "/usr/local/lib/python3.4/multiprocessing/sharedctypes.py", line 10, in <
module>
import ctypes
File "/usr/local/lib/python3.4/ctypes/__init__.py", line 7, in <module>
from _ctypes import Union, Structure, Array
ImportError: No module named '_ctypes'
I just updated from 3.3.2 by installing Python 3.4.0 from source. It installed into /usr/local/lib/python3.4.
Did I update to Python 3.4 correctly?
One thing I noticed is that Python 3.4 is installed in /usr/local/lib, while Python 3.3.2 is still installed in /usr/lib, so it was not overwritten.
Installing libffi-dev and re-installing python3.7 fixed the problem for me.
To cleanly build Python 3.7, libffi-dev is required; otherwise later build steps will fail.
If using RHEL/Fedora:
yum install libffi-devel
or
sudo dnf install libffi-devel
If using Debian/Ubuntu:
sudo apt-get install libffi-dev
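After reinstalling Python, a one-liner is enough to confirm the module is back (plain standard library, nothing else assumed; adjust python3.7 to your rebuilt version):
python3.7 -c "import _ctypes, ctypes; print(ctypes.__file__)"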
On a fresh Debian image, cloning https://github.com/python/cpython and running:
sudo apt-get update
sudo apt-get upgrade
sudo apt-get dist-upgrade
sudo apt-get install build-essential python-dev python-setuptools python-pip python-smbus
sudo apt-get install libncursesw5-dev libgdbm-dev libc6-dev
sudo apt-get install zlib1g-dev libsqlite3-dev tk-dev
sudo apt-get install libssl-dev openssl
sudo apt-get install libffi-dev
Now run the configure script in the repository cloned above:
./configure
make # alternatively `make -j 4` will utilize 4 threads
sudo make altinstall
Got 3.7 installed and working for me.
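As a rough check that the freshly built interpreter picked up the libraries installed above (the python3.7 name assumes you built a 3.7 branch; adjust to whatever version you checked out):
python3.7 -c "import ctypes, ssl, sqlite3; print('ok')"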
SLIGHT UPDATE
Looks like I said I would update this answer with some more explanation, and two years later I don't have much to add.
This SO post explains why certain libraries like python-dev might be necessary.
This SO post explains why one might use the altinstall rather than the install target in the make command.
Aside from that, you could read through the cpython codebase looking for #include directives that need to be satisfied, but what I usually do is keep retrying the build and keep installing the required packages flagged in the output until it succeeds.
Reminds me of the story of the Engineer, the Manager and the Programmer whose car rolls down a hill.
If you use pyenv and get the error "No module named '_ctypes'" (like I did) on Debian/Raspbian/Ubuntu, you need to run these commands:
sudo apt-get install libffi-dev
pyenv uninstall 3.7.6
pyenv install 3.7.6
Substitute your Python version for 3.7.6.
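To verify the rebuilt interpreter now finds the module, select it with pyenv and try the import (3.7.6 is just the example version from above):
pyenv shell 3.7.6
python -c "import ctypes; print(ctypes.__file__)"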
Detailed steps to install Python 3.7 on CentOS or any Red Hat Linux machine:
Download Python from https://www.python.org/ftp/python/3.7.0/Python-3.7.0.tar.xz
Extract the contents into a new folder
Open a terminal in the same directory
Run the commands below step by step:
sudo yum -y install gcc gcc-c++
sudo yum -y install zlib zlib-devel
sudo yum -y install libffi-devel
./configure
make
make install
Thought I'd add the CentOS installs:
sudo yum -y install gcc gcc-c++
sudo yum -y install zlib zlib-devel
sudo yum -y install libffi-devel
Check python version:
python3 -V
Create virtualenv:
virtualenv -p python3 venv
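Then activate it and confirm which interpreter it uses (standard virtualenv usage, nothing assumed beyond the venv name created above):
source venv/bin/activate
python -V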
On my Ubuntu 18.04 machine, I had the common problem of Python not finding _ctypes with a pyenv-installed Python.
In my case libffi-dev was already installed. Installing cpython from source, as suggested by @MikeiLL, didn't help either.
It turned out to be a Homebrew issue.
ajkerrigan's suggested solution on pyenv's GitHub issues solved this problem for me.
Solution summary: Tell pyenv to build Python using the Homebrew-managed GCC, with a command like:
CC="$(brew --prefix gcc)/bin/gcc-11" \
pyenv install --verbose 3.10.0
This assumes that any build dependencies have also been installed via Homebrew as specified in the pyenv wiki. As of this writing, that looks like this for Homebrew on Linux:
brew install bzip2 libffi libxml2 libxmlsec1 openssl readline sqlite xz zlib
This solved the same error for me on Debian:
sudo apt-get install libffi-dev
and then compile Python again.
Reference: issue31652
None of the solutions worked for me. You have to recompile Python once all the required packages are installed.
Follow this:
Install required packages
Run ./configure --enable-optimizations
https://gist.github.com/jerblack/798718c1910ccdd4ede92481229043be
I ran into this error when I tried to install Python 3.7.3 on Ubuntu 18.04 with the command $ pyenv install 3.7.3.
Installation succeeded after running $ sudo apt-get update && sudo apt-get install libffi-dev (as suggested here).
The issue was solved there.
Based on this answer, just copy and paste the following into the terminal.
First run:
sudo apt-get -y update
then:
sudo apt-get -y upgrade
sudo apt-get -y dist-upgrade
sudo apt-get -y install build-essential python-dev python-setuptools python-pip python-smbus
sudo apt-get -y install libncursesw5-dev libgdbm-dev libc6-dev
sudo apt-get -y install zlib1g-dev libsqlite3-dev tk-dev
sudo apt-get -y install libssl-dev openssl
sudo apt-get -y install libffi-dev
PS: You can just copy-paste the whole chunk into the terminal in one go.
In my case, what was causing all sorts of Python installation issues, including the one having to do with _ctypes and libffi, was Homebrew on Linux / Linuxbrew. pyenv was happy again once brew was no longer in the $PATH.
Refer to this thread or this thread: with a customized installation of libffi, it is difficult for Python 3.7 to find the library location of libffi. An alternative method is to set the CONFIGURE_LDFLAGS variable in the Makefile, for example CONFIGURE_LDFLAGS="-L/path/to/libffi-3.2.1/lib64".
My solution:
Installing libffi-dev with apt-get didn't help.
But this helped: Installing libffi from source and then installing Python 3.8 from source.
My configuration:
Ubuntu 16.04 LTS
Python 3.8.2
Step by step:
I got the error message "ModuleNotFoundError: No module named '_ctypes'" when starting the debugger from Visual Studio Code, and when running python3 -c "import sklearn; sklearn.show_versions()".
download libffi v3.3 from https://github.com/libffi/libffi/releases
install libtool: sudo apt-get install libtool
The file README.md from libffi mentions that autoconf and automake are also necessary. They were already installed on my system.
configure libffi without docs:
./configure --disable-docs
make check
sudo make install
download python 3.8 from https://www.python.org/downloads/
./configure
make
make test
make install
After that my python installation could find _ctypes.
CentOS without root
Install libffi-3.2 (Do NOT use libffi-3.3)
wget ftp://sourceware.org/pub/libffi/libffi-3.2.tar.gz
tar -xzf libffi-3.2.tar.gz
cd libffi-3.2/
./configure --prefix=$YOUR_LIBFFI_DIR
make && make install
Install Python3
./configure --prefix=$YOUR_PATH/python/3.7.10 LDFLAGS=-L${YOUR_LIBFFI_DIR}/lib64 PKG_CONFIG_PATH=${YOUR_LIBFFI_DIR}/lib/pkgconfig --enable-shared
make && make install
Thanks to JohnWSteill.
I was having the same problem. None of the above solutions worked for me. The key challenge was that I didn't have root access. So I first downloaded the source of libffi, then compiled it with the usual commands:
./configure --prefix=desired_installation_path_to_libffi
make
Then I recompiled python using
./configure --prefix=/home/user123/Softwares/Python/installation3/ LDFLAGS='-L/home/user123/Softwares/library/libffi/installation/lib64'
make
make install
In my case, /home/user123/Softwares/library/libffi/installation/lib64 is the path to the libffi installation directory where libffi.so is located, and /home/user123/Softwares/Python/installation3/ is the path to the Python installation directory. Modify them for your setup.
If you don't mind using Miniconda, the necessary external libraries and _ctypes are installed by default. It does take more space and may require using a moderately older version of Python (e.g. 3.7.6 instead of 3.8.2 as of this writing).
You have to install the missing Python 3 modules from the package manager.
If you have Ubuntu I recommend the Synaptic Package Manager:
sudo apt-get install synaptic
There you can simply search for the missing modules. Search for ctypes and install all the packages. Then go to your Python directory and run:
./configure
make install
This should solve your problem.
How to install Python from source without libffi in /usr/local?
Download libffi from GitHub and install it to /path/to/local
Download the Python source code and compile with the following configuration:
export PKG_CONFIG_PATH=/path/to/local/lib/pkgconfig
./configure --prefix=/path/to/python \
LDFLAGS='-L/path/to/local/lib -Wl,-R/path/to/local/lib' \
--enable-optimizations
make
make install
I am using a Mac M1 and I had this error:
... __boot__.py", line 30, in <module> import ctypes
and something was said about the file libffi.8.dylib
I downloaded this package from Anaconda and now everything works:
https://anaconda.org/wakari/libffi
For reference, since much of the above is either not for Mac or outdated: my Python is the Anaconda version, 3.10.4.
The application file created with py2app works now!
If your issue is with the VS Code debugger, check your currently selected Python interpreter. I had both python3.10.9 and python3.10.6 installed; however, the former was probably missing some dependencies, so I switched to the latter (my OS default interpreter), which solved the issue.
To change your Python interpreter in VS Code:
Press Ctrl+Shift+P
Search for Python: Select Interpreter and try your OS default version (the version you get when you run python3 --version).
If the issue is still not resolved, run sudo apt-get install libffi-dev.
If you are doing something nobody here will listen to you about because "you're doing it the wrong way", but you have to do it "the wrong way" for reasons too asinine to explain and beyond your ability to control, you can try this:
Get libffi and install it into your user install area the usual way.
git clone https://github.com/libffi/libffi.git
cd libffi
./configure --prefix=path/to/your/install/root
make
make install
Then go back to your Python 3 source and find this part of the code in setup.py at the top level of the Python source directory:
ffi_inc = [sysconfig.get_config_var("LIBFFI_INCLUDEDIR")]
if not ffi_inc or ffi_inc[0] == '':
    ffi_inc = find_file('ffi.h', [], inc_dirs)
if ffi_inc is not None:
    ffi_h = ffi_inc[0] + '/ffi.h'
    if not os.path.exists(ffi_h):
        ffi_inc = None
        print('Header file {} does not exist'.format(ffi_h))
ffi_lib = None
if ffi_inc is not None:
    for lib_name in ('ffi', 'ffi_pic'):
        if (self.compiler.find_library_file(lib_dirs, lib_name)):
            ffi_lib = lib_name
            break
ffi_lib = "ffi"  # --- AND INSERT THIS LINE HERE THAT DOES NOT APPEAR ---
if ffi_inc and ffi_lib:
    ext.include_dirs.extend(ffi_inc)
    ext.libraries.append(ffi_lib)
    self.use_system_libffi = True
and add the line I have marked above with the comment. Why it is necessary, and why there is no way to get configure to respect '--without-system-ffi' on Linux platforms ("unsupported", apparently), I may eventually find out, but everything has worked ever since. Otherwise, best of luck... YMMV.
WHAT IT DOES: it just overrides the logic there and causes the compiler linking command to add "-lffi", which is all that it really needs. If you have the library user-installed, it is probably detecting the headers fine as long as your PKG_CONFIG_PATH includes path/to/your/install/root/lib/pkgconfig.
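If you want to confirm that the user-installed libffi is discoverable before rebuilding, pkg-config can report the flags it would hand to the compiler (a minimal check; path/to/your/install/root is the hypothetical prefix from the configure step above):
PKG_CONFIG_PATH=path/to/your/install/root/lib/pkgconfig pkg-config --cflags --libs libffi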

psycopg: Python.h: No such file or directory

I'm compiling psycopg2 and get the following error:
Python.h: No such file or directory
How do I compile it? Ubuntu 12 x64.
Python 2:
sudo apt-get install python-dev
Python 3:
sudo apt-get install python3-dev
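With the headers in place, the original build should go through; a minimal retry, assuming the PostgreSQL client headers (libpq-dev) needed for pg_config are already present, since the reported error was only about Python.h:
pip install psycopg2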
This is a dependency issue.
I resolved this issue on Ubuntu using apt-get. Substitute it with a package manager appropriate to your system.
For any current Python version:
sudo apt-get install python-dev
For alternative Python version:
sudo apt-get install python<version>-dev
For example 3.5 as alternative:
sudo apt-get install python3.5-dev
If you take a look at psycopg's FAQ page (http://initd.org/psycopg/docs/faq.html), you'll see that they recommend installing Python's development package, which is usually called python-dev. You can install it via
sudo apt-get install python-dev
As mentioned in the psycopg documentation (http://initd.org/psycopg/docs/install.html):
Psycopg is a C wrapper around the libpq PostgreSQL client library. To install it from sources you will need:
C compiler
Python header files
They are usually installed in a package such as python-dev. An error message such as Python.h: No such file or directory indicates that you are missing the Python headers.
How can you fix it? First of all, you need to check which Python version is installed in your virtual environment, or in the system itself if you don't use a virtual environment. You can check your Python version with:
python --version
After that, you should install the python-dev package matching the version installed in your virtual env or system. For example, if you use Python 3.7, you should install:
apt-get install python3.7-dev
Hope this answer helps someone.
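As a quick sanity check that the matching headers are now on disk, you can ask the interpreter where it expects them (a standard sysconfig call, nothing extra to install):
python3 -c "import sysconfig; print(sysconfig.get_paths()['include'])"
ls "$(python3 -c 'import sysconfig; print(sysconfig.get_paths()["include"])')/Python.h"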
Based on the Python version your Pipfile requires, you need to install the corresponding dev package.
I was getting this error; my default Python version was 3.8, but the Pipfile required Python 3.9, so I installed python3.9-dev:
$ sudo apt install python3.9-dev
While the answers here are correct, they won't always be enough:
- sudo apt-get install python3-dev
- sudo apt-get install python3.5-dev
- etc.
won't apply when you are using python3.8, python3.9, or future versions.
I recommend a version-agnostic approach instead:
sudo apt install python3-all-dev
On Fedora, Red Hat, or CentOS:
Python 2:
sudo yum install python-devel
Python 3:
sudo yum install python3-devel
If none of the above-suggested answers works, try this; it worked for me:
sudo apt-get install libpq-dev

How to install which programs requires "sudo" in virtualenv?

I'm trying to install kivy; the docs say:
$ sudo apt-get install python-setuptools python-pygame python-opengl \
python-gst0.10 python-enchant gstreamer0.10-plugins-good cython python-dev \
build-essential libgl1-mesa-dev libgles2-mesa-dev
$ sudo easy_install kivy
But I don't want to use sudo. I like to keep my projects organized in a virtualenv, so how do I install the requirements without using sudo? apt-get install won't work unless I use sudo, and I can't find the requirements on pip. Let's say I want to install easy_install in a virtualenv, for example; how do I do that?
I do not think you can get around installing kivy's dependent packages without sudo/root access.
Once you have them installed, follow the steps outlined in Andrew's answer.
When you use virtualenv and activate it, you can use the easy_install / pip that is installed there. That doesn't require sudo because it installs directly into the virtualenv.
In other words, it just works. Have you tried it?
There's a simple example here: http://www.arthurkoziel.com/2008/10/22/working-virtualenv/
sudo easy_install virtualenv (the last sudo you need)
virtualenv kivydir
source kivydir/bin/activate
easy_install kivy (installs to kivydir)
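The same workflow applies with pip, which has largely replaced easy_install; inside the activated virtualenv it installs into kivydir without sudo (this still assumes the system libraries from the apt-get line above are present, since kivy compiles C extensions):
pip install kivy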
