How to parse python module for openstack - python

Recently, when I was installing OpenStack on 3 VMs on CentOS 7 using an answer file, I had the following error:
10.7.35.174_osclient.pp: [ ERROR ]
Applying Puppet manifests [ ERROR ]
ERROR : Error appeared during Puppet run: 10.7.35.174_osclient.pp
Error: Execution of '/usr/bin/yum -d 0 -e 0 -y list python-iso8601' returned 1: Error: No matching Packages to list
You will find full trace in log /var/tmp/packstack/20160318-124834-91QzZC/manifests/10.7.35.174_osclient.pp.log
Please check log file /var/tmp/packstack/20160318-124834-91QzZC/openstack-setup.log for more information
Additional information:
* Time synchronization installation was skipped. Please note that unsynchronized time on server instances might be problem for some OpenStack components.
* File /root/keystonerc_admin has been created on OpenStack client host 10.7.35.174. To use the command line tools you need to source the file.
* To access the OpenStack Dashboard browse to http://10.7.35.174/dashboard .
Please, find your login credentials stored in the keystonerc_admin in your home directory.
I have already manually installed that module, but the same problem occurs anyway.
The command only succeeds when run like this:
/usr/bin/yum -d 0 -e 0 -y list python2-iso8601
Is there any way to pass that package name through to the installer?
Do you have any ideas how to solve it?

Found that the Kilo version works fine.
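For anyone hitting the same mismatch, a quick way to check which spelling of the package the repos actually carry is a small shell helper (a sketch; `pick_pkg` is a hypothetical name, and the real check command would be `yum -q list`):

```shell
# pick_pkg prints the first package name the given check command accepts,
# e.g.: pick_pkg "yum -q list" python-iso8601 python2-iso8601
pick_pkg() {
    check="$1"; shift
    for pkg in "$@"; do
        if $check "$pkg" >/dev/null 2>&1; then
            echo "$pkg"
            return 0
        fi
    done
    return 1
}
```

On the failing host, `pick_pkg "yum -q list" python-iso8601 python2-iso8601` should print `python2-iso8601`, confirming the rename.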

Related

Issues trying to install AirFlow locally

I'm new at airflow and I'm trying to install locally, following the instructions on the link below:
https://airflow.apache.org/docs/apache-airflow/stable/start/local.html
I'm running this code (as mentioned on the link):
# Airflow needs a home. `~/airflow` is the default, but you can put it
# somewhere else if you prefer (optional)
export AIRFLOW_HOME=~/airflow
# Install Airflow using the constraints file
AIRFLOW_VERSION=2.2.5
PYTHON_VERSION="$(python --version | cut -d " " -f 2 | cut -d "." -f 1-2)"
# For example: 3.6
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"
# For example: https://raw.githubusercontent.com/apache/airflow/constraints-2.2.5/constraints-3.6.txt
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
# The Standalone command will initialise the database, make a user,
# and start all components for you.
airflow standalone
# Visit localhost:8080 in the browser and use the admin account details
# shown on the terminal to login.
# Enable the example_bash_operator dag in the home page
and I'm getting this error:
File "C:\Users\F43555~1\AppData\Local\Temp/ipykernel_12908/3415008398.py", line 3
export AIRFLOW_HOME=~/airflow
^
SyntaxError: invalid syntax
Someone knows how to deal with it?
I'm running on windows 10, vs code (jupyter notebook).
Tks!
Airflow is only supported on Linux, and it looks like you're trying to run this on a Windows machine.
If you want to install Airflow on Windows you'll need to use something like Windows Subsystem for Linux (WSL) or Docker. There are some examples around that show you how to do this on WSL (and plenty using Docker); here is one of them with WSL.
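As a minimal sketch of the Docker route (this assumes Docker Desktop with the WSL 2 backend is installed; the image tag matches the Airflow version used in the question):

```shell
# Pick the image matching the Airflow version being installed above.
AIRFLOW_VERSION=2.2.5
IMAGE="apache/airflow:${AIRFLOW_VERSION}"

# On a machine with Docker, this starts the all-in-one standalone mode
# and serves the UI on http://localhost:8080 (uncomment to run):
# docker run -it -p 8080:8080 "$IMAGE" standalone
echo "$IMAGE"
```

Note that the `export ...` lines from the quick start are shell syntax, which is why pasting them into a Jupyter cell produces a `SyntaxError`; they need to run in a (Linux) shell.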

Run Python script from browser using virtual environment on EC2 instance

TLDR;
I created a virtual environment on my EC2 instance. How can I access this from the browser?
Hey everyone,
I created a virtual environment, following this tutorial, on my EC2 instance to run a simple Python script. Within the terminal, it works without errors. However, I have made a web application and I would like to activate this script from the browser using the virtual environment. When I try this, I get a "Permission denied" error.
PHP
$output=shell_exec('bash /var/app/current/scripts/script.sh');
echo "<pre>$output</pre>";
script.sh
#!/bin/bash
source /home/ec2-user/venv/python3/bin/activate
python3 /var/app/current/scripts/test.py
test.py
from datetime import datetime
from bs4 import BeautifulSoup
import requests
print('hello')
print(datetime.now())
url = "https://www.stackoverflow.com/"
website = requests.get(url).text
soup = BeautifulSoup(website, "html.parser")
print(soup.title)
error
/var/app/current/scripts/script.sh: line 2: /home/ec2-user/venv/python3/bin/activate: Permission denied
Traceback (most recent call last):
File "/var/app/current/scripts/test.py", line 2, in <module>
from bs4 import BeautifulSoup
ModuleNotFoundError: No module named 'bs4'
What I have tried:
I tried to change the permissions on the virtual environment using the following:
chmod a+x /home/ec2-user/venv
This should give all users access to the virtual environment folder: /home/ec2-user/venv
However, I am still getting the error:
/home/ec2-user/venv/python3/bin/activate: Permission denied
I have also tried to give all users the possibility of executing the activation script (/home/ec2-user/venv/python3/bin/activate):
chmod 665 /home/ec2-user/venv/python3/bin/activate
Which results in:
-rw-rw-r-x 1 ec2-user ec2-user /home/ec2-user/venv/python3/bin/activate
However, I still get the same error:
/home/ec2-user/venv/python3/bin/activate: Permission denied
Note:
Note that if I only import datetime and I comment out bs4 and requests (along with everything else regarding BeautifulSoup), then the script works great as it does not have to access the virtual environment to pull in the packages.
*Virtual environment tutorial
You get this error because the libraries used in the Python script have not been installed into the virtual env.
In the tutorial you mentioned, only the boto library is installed.
You need to install the libraries you use.
Run this from the command line:
source /home/ec2-user/venv/python3/bin/activate
pip install beautifulsoup4
pip install requests
Alternatively, you can create a file, for example /home/ec2-user/requirements.txt, and list all the requirements your script uses:
beautifulsoup4
requests
Then you can use this file to install all requirements into virtual env:
source /home/ec2-user/venv/python3/bin/activate
pip install -r /home/ec2-user/requirements.txt
Solved!
I got some help from this post, however, needed to modify a few things.
Let's start from his answer:
sudo chown -R your_username:your_username path/to/virtualenv/
Okay, this is great, but I needed a bit more information.
For me, the web application's username is webapp.
Then, one thing that isn't very clear above is the path. So, my path is:
/home/ec2-user/venv/python3/bin/activate
as mentioned above. Here, you need to change permissions on /home/ec2-user and NOT on /home/ec2-user/venv.
So, to give my application permission to my virtual environment, I ran:
sudo chown -R webapp:webapp /home/ec2-user
That worked in the browser! However, this took away my ability to work with it on the server. To do so, I would have to switch it back to:
sudo chown -R ec2-user:ec2-user /home/ec2-user
Since switching back and forth is far from ideal, I tried changing the permissions with chmod instead.
sudo chmod 711 /home/ec2-user
Now I have read, write, and execute permissions, whereas everyone else, including the web app, can only execute (traverse) the directory.
Now it all works 🤓
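When debugging this kind of "Permission denied", it helps to print the mode of every directory component along the path, since any single component without execute permission blocks everything below it (a sketch; the path is the one from the question):

```shell
# Print mode, owner, and name for each component of the path, from the
# file up to /. Any component lacking x for the web user blocks access.
p=/home/ec2-user/venv/python3/bin/activate
while [ "$p" != "/" ]; do
    stat -c '%A %U %n' "$p" 2>/dev/null
    p=$(dirname "$p")
done
```

`namei -l /home/ec2-user/venv/python3/bin/activate` (from util-linux) prints the same walk in a single command.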

Install z3 on a remote Linux server without being root

I am trying to install z3 on a remote server that I am not the root of. I followed the steps to the point where I have this message:
Z3 was successfully built.
Z3Py scripts can already be executed in the 'build/python'
Use the following command to install Z3 at prefix /usr.
sudo make install
Since it says that Z3Py scripts can already be executed, is the next command necessary? If so, how can I execute it without being root? Is there an alternative?
I have changed the prefix to a directory that I have write access to. Again, it built z3 and z3py successfully, but then it says:
Use the following command to install Z3 at prefix /z3/z3-master.
sudo make install
When I use make install, this is what I get:
mkdir: cannot create directory ‘/z3’: Permission denied
Makefile:4462: recipe for target 'install' failed
make: *** [install] Error 1
Configure it like this: python scripts/mk_make.py --prefix=/a/place/with/write/access
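A sketch of the whole non-root flow, assuming `$HOME/z3-install` as the writable prefix (the exact layout `make install` creates under the prefix depends on your Python version, so check it after installing):

```shell
# Choose a prefix you can write to; no sudo is needed anywhere below.
PREFIX="$HOME/z3-install"
mkdir -p "$PREFIX"

# From the z3 source tree (uncomment to run):
# python scripts/mk_make.py --prefix="$PREFIX"
# cd build && make && make install

# Make the installed z3 binary findable:
export PATH="$PREFIX/bin:$PATH"
```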

How do I download the code for a specific google cloud "service"

This doc shows the command to download the source of an app I have in App Engine:
appcfg.py -A [YOUR_APP_ID] -V [YOUR_APP_VERSION] download_app [OUTPUT_DIR]
That's fine, but I also have services that I deployed. Using this command I can only seem to download the "default" service. I also deployed "myservice01" and "myservice02" to App Engine in my GCP project. How do I specify which service's code to download?
I tried this command as suggested:
appcfg.py -A [YOUR_APP_ID] -M [YOUR_MODULE] -V [YOUR_APP_VERSION] download_app [OUTPUT_DIR]
It didn't fail, but this is the output I got (and it didn't download anything):
01:30 AM Host: appengine.google.com
01:30 AM Fetching file list...
01:30 AM Fetching files...
Now as a test I tried it with the name of a module I know doesn't exist and I got this error:
Error 400: --- begin server output ---
Version ... of Module ... does not exist.
So I at least know it's successfully finding the module and version, but it doesn't seem to want to download them?
Also specify the module (services used to be called modules):
-M MODULE, --module=MODULE
Set the module, overriding the module value from
app.yaml.
So something like:
appcfg.py -A [YOUR_APP_ID] -M [YOUR_MODULE] -V [YOUR_APP_VERSION] download_app [OUTPUT_DIR]
Side note: YOUR_APP_VERSION should really read YOUR_MODULE_VERSION :)
Of course, the answer assumes the app code downloads were not permanently disabled from the Console's GAE App Settings page:
Permanently prohibit code downloads
Once this is set, no one, including yourself, will ever be able to
download the code for this application using the appcfg download_app
command.
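Putting it together, a sketch for grabbing every service in one go (the app ID, version, and service names here are hypothetical stand-ins; substitute your own):

```shell
APP_ID=my-app-id      # hypothetical app ID
VERSION=1             # hypothetical version
for MODULE in default myservice01 myservice02; do
    # Uncomment on a machine with the App Engine SDK installed:
    # appcfg.py -A "$APP_ID" -M "$MODULE" -V "$VERSION" download_app "./src-$MODULE"
    echo "would download service '$MODULE' into ./src-$MODULE"
done
```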

stratum-mining-proxy error - Can't decode message

I'm attempting to run stratum-mining-proxy with minerd. Proxy starts and runs with the following command:
python ./mining_proxy.py -o ltc-stratum.kattare.com -p 3333 -pa scrypt
Proxy starts fine. Then I run minerd (username/password removed):
minerd -a scrypt -r 1 -s 6 -o http://127.0.0.1:3333 -O USERNAME.1:PASSWORD
The following errors are received. This one is from the proxy:
2013-07-18 01:33:59,981 ERROR protocol protocol.dataReceived # Processing of message failed
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/stratum-0.2.12-py2.7.egg/stratum/protocol.py", line 185, in dataReceived
self.lineReceived(line, request_counter)
File "/usr/local/lib/python2.7/dist-packages/stratum-0.2.12-py2.7.egg/stratum/protocol.py", line 216, in lineReceived
raise custom_exceptions.ProtocolException("Cannot decode message '%s'" % line)
'rotocolException: Cannot decode message 'POST / HTTP/1.1
And this from minerd. What am I doing wrong? Any help is appreciated!
[2013-07-18 01:33:59] HTTP request failed: Empty reply from server
[2013-07-18 01:33:59] json_rpc_call failed, retry after 30 seconds
I'm a little curious; I don't know this for a fact, but I was under the impression that the mining proxy was for BTC, not LTC.
But anyway, I believe I got a similar message when I first installed it as well. To fix it, or rather to actually get it running, I had to use the Git installation method instead of installing manually.
Installation on Linux using Git
This is an advanced option for experienced users, but it gives you the easiest way to update the proxy.
1. git clone git://github.com/slush0/stratum-mining-proxy.git
2. cd stratum-mining-proxy
3. sudo apt-get install python-dev # Development packages of Python are necessary
4. sudo python distribute_setup.py # This will upgrade the setuptools package
5. sudo python setup.py develop # This will install the required dependencies (namely the Twisted and Stratum libraries), but doesn't install the package into the system.
6. You can start the proxy by typing "./mining_proxy.py" in the terminal window. Using default settings, the proxy connects to Slush's pool interface.
7. If you want to connect to another pool or change other proxy settings, type "./mining_proxy.py --help".
8. If you want to update the proxy, type "git pull" in the package directory.
