Currently, I have a running Flink Kubernetes session cluster (Flink version 1.13.2). I can access the web UI via port-forward, and I can submit the WordCount jar example from my local environment with ./bin/flink run -m localhost:8081 examples/batch/WordCount.jar.
But when I try to submit the PyFlink example with ./bin/flink run -m localhost:8081 -py examples/python/table/batch/word_count.py, the job freezes and the log says it is waiting for the results.
I tried many approaches, including creating a virtualenv, passing pyClientExecutable and pyexec, and syncing the local and remote Python versions, but none of them worked.
What am I missing? How can I submit the Python example to the remote session cluster?
Note: when I submit the PyFlink word_count example from inside the job manager pod, it runs without any problem.
I don't have a Flink 1.13 installation on hand; however, the same example in Flink 1.15 has a comment reminding you to remove the .wait() call when submitting to a remote cluster.
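As a minimal sketch (the table and sink names below are illustrative placeholders, not the exact example code), the fix is just dropping the blocking call at the end of the script:

# Blocking variant: fine in a local mini cluster, but the client waits for the
# result and appears to freeze when the job runs on a remote session cluster:
#     table.execute_insert('sink').wait()
# Non-blocking variant for `flink run -m localhost:8081 -py word_count.py`:
# the client only submits the job and returns.
table.execute_insert('sink')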
I am building CI/CD with Jenkins, but I have run into a problem.
The plan is to upload the source code first and then start the Flask server through a batch file.
I wrote a shell script for Jenkins' Build > Execute Shell step.
postCommand=/cygdrive/c/workspace/ContactPortal_Flask/run.bat
sshpass -p ${deployPassword} ssh -o StrictHostKeyChecking=no ${deployUser}@${deployServer} ${postCommand}
Here is the run.bat file:
set FLASK_ENV=development
set path=%path%;C:\Program Files\Microsoft SQL Server\110\Tools\Binn\;C:\develop\instantclient_12_1;C:\develop\Anaconda3;C:\develop\Anaconda3\Library\mingw-w64\bin;C:\develop\Anaconda3\Library\usr\bin;C:\develop\Anaconda3\Library\bin;C:\develop\Anaconda3\Scripts;
set "START=C:\workspace\ContactPortal_Flask\start.bat"
cd C:\workspace\ContactPortal_Flask
python -m flask run
Then the source code upload was successful, and starting the Flask server was also successful, but the Jenkins build was never marked as a success and kept running.
please help!!
I think the main problem here is that python -m flask run starts the server and will not finish until the user hits Ctrl+C.
Since the target system is on Windows, you may want to create a custom service and have Jenkins start that service at the end instead. For service creation, see https://learn.microsoft.com/en-us/troubleshoot/windows-client/deployment/create-user-defined-service. By starting this service (e.g. with NET START <service-name>), Jenkins can finish while Flask keeps running in the background.
Also, for a production system, you may want to consider checking this and picking a proper web server instead of the built-in development server provided by Flask.
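As a minimal sketch of that idea, assuming the Flask application object is importable as app from an app.py module (adjust the import to your project) and that waitress is installed, the app could be served like this and then wrapped in the Windows service mentioned above:

# serve.py - minimal sketch; assumes `pip install waitress` and that app.py
# exposes the Flask application object as `app`.
from waitress import serve
from app import app

# Listen on all interfaces on the Flask dev server's default port.
serve(app, host='0.0.0.0', port=5000)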
I have a Python script that does some database operations from an EC2 server by SSHing into another server using paramiko. The script runs fine when I run it directly from the server as ec2-user, but when I run the same script from Jenkins I get a permission error on the /home/ec2-user/.ssh/id_rsa file.
I used the command python3.8 /home/ec2-user/db_refresh.py to run the script from Jenkins.
After some reading, and with the help of the whoami command, I found that's expected, since Jenkins runs the scripts as the jenkins user and no one apart from the owner has permission to read the private keys in the ~/.ssh/ folder.
I could change the permissions so that everyone can read ec2-user's private key, but I think that would be a terrible idea (as far as I've read), and I think SSH wouldn't even work if anyone apart from the owner had read permission on that private key (I remember reading that somewhere, but I'm not sure).
import paramiko
sshcon = paramiko.SSHClient()
sshcon.load_system_host_keys()  # trust hosts already listed in known_hosts
sshcon.connect(MYSQL_HOST, username=SSH_USERNAME, key_filename='/home/ec2-user/.ssh/id_rsa')
That is how I SSH into my database server using paramiko.
Can I run my scripts from Jenkins as ec2-user, or is there some other way I can overcome this?
In the end, it turned out to be quite simple (stupid me).
I just created a key pair for the jenkins user and used it for my operations.
One thing to note: since jenkins is a service account, a normal su jenkins won't work. I had to use sudo su -s /bin/bash jenkins instead.
There is a Python script in my repo that I would like to run whenever I call an API. This Python script merely transfers data from one database to another. The Jenkins server for the project is currently used for builds/pipelines/running tests; I was wondering if I could use this Jenkins service to run the script when I call an API, since I found that Jenkins allows you to remotely trigger jobs via REST.
I was wondering if I could use Jenkins' remote trigger feature to run this Python script in my repo when I need to. The Python script is built using a Python image in the Dockerfile, so Docker sets up the dependencies/Python needed to run the script. The command Jenkins would run is something like docker build followed by docker run.
Yes, you can.
Just set up a pipeline that:
Runs in Docker (with your image). Have a look at this.
Does a git clone of your repository.
Runs your Python script with something like sh "python <your script>".
I'm having trouble running a Python script on DigitalOcean.
I have two questions:
How to upload script.py to the DigitalOcean droplet.
How to run the script.
I'm able to access the console, but beyond that I don't know what to do, and I can't find any specific information on the internet.
I'm running an Ubuntu 14.04 droplet through the web.
OK, first: in order to upload any file to your droplet, you can use the scp command:
scp foobar.txt your_username@remotehost.edu:/some/remote/directory
Here is a related question that shows you how to use scp from Windows.
Then, in a console session on the remote host, check whether you can run the python command. If you do not have it, just follow the steps in the documentation and you will have Python running on your remote machine.
If you put a Python script on the server and ssh in, you can run it from the command line. For instance,
python yourFantasticScript.py
If you want some level of automation for triggering the script to run, you will need to learn more about automation scheduling and server technologies.
I have a Hadoop job packaged in a jar file that I can execute on a server from the command line, storing the results in that server's HDFS.
Now I need to create a web service in Python (Tornado) that must execute the Hadoop job and fetch the results to present them to the user. The web service is hosted on another server.
I googled a lot for how to call the job from outside the server in a Python script, but unfortunately found no answers.
Anyone have a solution for this?
Thanks
One option could be to install the Hadoop binaries on your web-service server, using the same configuration as in your Hadoop cluster. You will need that to be able to talk to the cluster. You don't need to launch any Hadoop daemons there. At a minimum, configure HADOOP_HOME, HADOOP_CONF_DIR, HADOOP_LIBS and set the PATH environment variable properly.
You need the binaries because you will use them to submit the job, and the configuration files tell the Hadoop client where the cluster is (the NameNode and the ResourceManager).
Then, in Python, you can execute the hadoop jar command using subprocess: https://docs.python.org/2/library/subprocess.html
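A rough sketch of that step, in which the install location, jar name, main class and HDFS paths are placeholders to replace with your own:

# submit_job.py - minimal sketch; paths, jar and class names are placeholders.
import os
import subprocess

env = dict(os.environ,
           HADOOP_HOME='/opt/hadoop',                  # assumed install location
           HADOOP_CONF_DIR='/opt/hadoop/etc/hadoop')   # config pointing at the cluster
hadoop_bin = os.path.join(env['HADOOP_HOME'], 'bin', 'hadoop')

cmd = [hadoop_bin, 'jar', '/path/to/your-job.jar', 'com.example.YourJob',
       '/input/path', '/output/path']
# Raises CalledProcessError if the submission fails; stderr is merged into the output.
output = subprocess.check_output(cmd, env=env, stderr=subprocess.STDOUT)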
You can configure the job to notify your server through a callback when it has finished: https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/
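As a rough sketch of the receiving side, assuming the job is configured with a notification URL along the lines of job.end.notification.url=http://<webservice-host>:8888/job-done?jobid=$jobId&status=$jobStatus (Hadoop substitutes the $jobId and $jobStatus placeholders when it calls back), a Tornado handler could look like this:

# callback_server.py - minimal sketch of a Tornado endpoint for the job-end notification.
import tornado.ioloop
import tornado.web

class JobDoneHandler(tornado.web.RequestHandler):
    def get(self):
        job_id = self.get_argument('jobid')
        status = self.get_argument('status')
        # Trigger whatever comes next, e.g. reading the job output from HDFS.
        print('job %s finished with status %s' % (job_id, status))
        self.write('ok')

app = tornado.web.Application([(r'/job-done', JobDoneHandler)])
app.listen(8888)
tornado.ioloop.IOLoop.current().start()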
And finally, you could read the results from HDFS using WebHDFS (the HDFS web API) or a Python HDFS package like https://pypi.python.org/pypi/hdfs/
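A minimal sketch of that last step with the hdfs package linked above, where the NameNode WebHDFS URL, user and output path are placeholders:

# read_results.py - minimal sketch; URL, user and paths are placeholders.
from hdfs import InsecureClient

client = InsecureClient('http://namenode-host:50070', user='hadoop-user')
parts = []
for name in client.list('/output/path'):            # e.g. part-r-00000, part-r-00001, ...
    with client.read('/output/path/' + name) as reader:
        parts.append(reader.read().decode('utf-8'))
results = ''.join(parts)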