pip install scrapy fails with "bad interpreter" - python

I want to install Scrapy like this
pip install scrapy
However I'm getting
-bash: /usr/local/bin/pip: /usr/bin/python: bad interpreter: No such file or directory
How can I fix this?

It might be that you do not have a Python interpreter installed, or that it is in a non-standard location.
Similar issue with YUM:
How to fix "Bad interpreter" error when using yum?

Getting an error when trying to install dffml in development mode?

These are the commands I am running, as mentioned in the documentation:
git clone https://github.com/intel/dffml
cd dffml
python3 -m pip install -U pip setuptools wheel
python3 -m pip install --prefix=~/.local -e .[dev]
But I am getting a warning and an error on the last command.
warning
WARNING: Ignoring invalid distribution -jango (/home/dhruv/.local/lib/python3.8/site-packages)
error
Found existing installation: Pillow 7.0.0
Uninstalling Pillow-7.0.0:
ERROR: Could not install packages due to an OSError: [Errno 13] Permission denied: '_imagingtk.cpython-38-x86_64-linux-gnu.so'
Consider using the `--user` option or check the permissions.
The program stops executing after the error. I don't know what the source of the error is or how to remove it; any help is appreciated.
For the error, try using --user, like this:
python3 -m pip install --user -e .[dev]
Their docs talk about why they don't use --user, but the explanation doesn't make much sense to me.
Installing to your home directory will reduce permissions issues. To do this we use the --prefix=~/.local flag. pip sometimes gets confused about the --user flag (and will blow up in your face if you try to pass it). So we use the --prefix=~/.local flag, which has the same effect but should always work.
Specifically, I don't know why pip would "get confused", but I'm not an expert.
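If it helps, Python itself can tell you where --user installs go, which is a quick, dffml-agnostic way to confirm that --user and --prefix=~/.local land in essentially the same place:
python3 -m site --user-base    # e.g. /home/dhruv/.local
python3 -m site --user-site    # e.g. /home/dhruv/.local/lib/python3.8/site-packages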

How to fix "Python quit unexpectedly" when trying to install a module through pip install? Terminal Crashes with message "zsh: abort"

I think I broke my terminal. For some reason I am unable to use pip install to install anything on my MacBook Pro.
When I try to install something, such as
pip install Flask
I get
zsh: abort pip install Flask
I've tried both pip and pip3, but I get the same error with both.
Now I am unable to install any Python module. I thought it was due to the Bash -> zsh switch from the Catalina update, but when I changed back to Bash and tried the same thing, I got the same result.
Any idea how I can fix this? I am unable to do any Python work because the modules I need cannot be installed.
Setting DYLD_LIBRARY_PATH before installing any packages with pip solved the issue for me:
export DYLD_LIBRARY_PATH=/usr/local/Cellar/openssl/1.0.2t/lib
(adjust for a valid openssl path in your system)
I found the solution in this github issue.
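If the 1.0.2t path doesn't exist on your machine (that was just the version the poster happened to have), you can ask Homebrew for the current prefix instead of hard-coding it, assuming openssl was installed with brew:
brew --prefix openssl                                    # prints where brew's openssl lives
export DYLD_LIBRARY_PATH="$(brew --prefix openssl)/lib"
pip install Flask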

Unable to execute /usr/local/bin/scrapyd-deploy: No such file or directory

I am trying to get scrapyd to deploy, but every time I run the command
sudo scrapyd-deploy local
I get the following error
Unable to execute /usr/local/bin/scrapyd-deploy: No such file or directory
I did the following to try to troubleshoot:
reinstall python
pip install scrapy
pip install scrapyd
pip install scrapyd-client
I checked /usr/local/bin and found that the following files exist:
scrapy
scrapyd
scrapyd-deploy
I'm not sure why the scrapy files exist in the folder but when I try to run scrapyd-deploy local it cannot find them.
I had upgraded to macOS Mojave, after which all of the errors started. When I first tried to run scrapyd, brew installed python.
I was able to resolve the issue by starting over.
I did the following:
brew uninstall python
pip uninstall scrapy
pip uninstall scrapyd
pip uninstall scrapyd-client
I deleted docker
I then reinstalled scrapyd-client, after which the error was resolved and I was able to deploy with scrapyd.
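If you would rather not start over, it may help to know that "No such file or directory" for a script that clearly exists usually means the shebang line inside it points at a Python interpreter that is gone (for example, one removed or replaced by a brew upgrade). A rough check and repair, assuming a working Python remains installed:
head -n 1 /usr/local/bin/scrapyd-deploy         # the interpreter the script expects
which python python3                            # the interpreters that actually exist
pip install --force-reinstall scrapyd-client    # regenerates the script against the current Python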

CMDER not finding sudo and pip commands

So I have been using the regular Windows command prompt and wanted to try using bash, as most forums give commands in bash and it's a little cumbersome to find the translation to Windows. I'm currently trying out the Spotify API and I want to run a virtual environment.
I do the following windows command and everything runs fine:
[WINDOWS]
python -m pip install virtualenv
This does not:
[BASH]
pip install virtualenv
and I get bash: pip: command not found.
So I went to install pip using sudo easy_install pip and got bash: sudo: command not found.
I am running CMDER as admin in bash, so I thought OK, I will try easy_install pip, which returned bash: easy_install: command not found. So I went to the actual Python directory and tried to install pip again; no luck.
Any insight on how I can address this?
You can try to install pip by downloading get-pip.py from here and then running it with python get-pip.py.
After that, you might need to set your environment variables so that pip is on your PATH. You can use Environment Variables in Control Panel and add the path to System Variables.
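Roughly, that would look like the following in bash (the URL is the official get-pip bootstrap script; the last line is a way to use pip even before the pip command itself is on PATH):
curl -O https://bootstrap.pypa.io/get-pip.py
python get-pip.py
python -m pip install virtualenv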
I ran into this issue as well. Not sure what causes it, but switching to cmd.exe and running pip install ... worked without issue.

Scrapy not installed correctly on mac?

I have tried to install Scrapy on Mac OS X 10.8.2. Here's what I did:
In Terminal, I ran the command from within my user directory:
pip install --user scrapy
I got the following message in Terminal:
Successfully installed scrapy
Cleaning up...
Next I did the following from the same directory:
scrapy shell http://example.com
Here's the error I am getting:
-bash: scrapy: command not found
I believe this is a path issue; scrapy has been installed in /Library/Python/2.7/lib/python/site-packages. How do I get scrapy to run?
The --user option is used when you want to install a package into the local user's $HOME; e.g. on Mac it should be $HOME/Library/Python/2.7/lib/python/site-packages.
The scrapy executable can be found at $HOME/Library/Python/2.7/bin/scrapy. So you should edit your .bash_login file and modify the PATH env variable:
PATH="$HOME/Library/Python/2.7/bin/:$PATH"
Or, just reinstall scrapy without the --user flag.
Hope that helps.
