I want to make an application in Python that lets the host and a guest virtual machine interact.
But I don't know how to achieve this goal.
My idea is:
Step 1. Write a sample Python script that performs a calculation.
Step 2. Move that script into a shared folder.
Step 3. Execute the script in the VMware guest, so the calculation runs on the VM.
But how can I perform Step 3 from the host?
If my cursor is inside the VMware guest,
I can open a console (e.g. cmd) and run the command python C:\shared\calc_script.py.
But in my case, my cursor is on the host.
Can somebody please tell me how I can perform Step 3 from the host?
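For reference, one possible approach, assuming the host runs VMware Workstation: its bundled vmrun tool can launch a program inside a guest when given guest credentials (it requires VMware Tools in the guest; all paths and credentials below are placeholders):

    vmrun -T ws -gu GUEST_USER -gp GUEST_PASSWORD runProgramInGuest "C:\VMs\myvm.vmx" "C:\Python27\python.exe" "C:\shared\calc_script.py"

You could call the same command from a host-side Python script via subprocess.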
I would like to run a Python script directly on startup of the Raspberry Pi. I can execute the script by calling /home/pi/scripts/script.py. I tried adding the script to /etc/rc.local or /etc/profile, and starting it with systemd. In all cases the script is only executed after I connect to the Pi via SSH and log in.
Therefore I would like to know how I can execute the script on boot (startup) without having to log in.
You were on the right track with systemd and unit files.
First read this:
https://www.digitalocean.com/community/tutorials/understanding-systemd-units-and-unit-files
You probably used multi-user.target as WantedBy, but that is too late.
You could use
sudo systemd-analyze critical-chain
to see which .targets might be better for your needs. For example, basic.target looks good.
You can of course also state explicitly in the unit file that your unit (.service) needs to be loaded before, for example, ifup@eth0.service.
Use sudo systemctl list-units or sudo systemctl list-unit-files to see everything that is going on in this respect.
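For illustration, a minimal unit file along those lines might look like the following (the unit name is a placeholder; the script path is taken from the question):

    [Unit]
    Description=Run script.py at boot
    Before=ifup@eth0.service

    [Service]
    Type=simple
    ExecStart=/usr/bin/python /home/pi/scripts/script.py

    [Install]
    WantedBy=basic.target

Save it as /etc/systemd/system/myscript.service and enable it with sudo systemctl enable myscript.service.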
So there are variants of this question, but none quite hits the nail on the head.
I want to run Spyder and do interactive analysis on a server. I have two servers; neither has Spyder. They both have Python (Linux servers), but I don't have sudo rights to install the packages I need.
In short, the use case is: open Spyder on the local machine, do something (need help here) to use the servers' computation power, and then return results to the local machine.
Update:
I have set up Python with my packages on one server. Now I need to figure out the kernel name and link it to Spyder.
Leaving the previous version of the question up, as that is still useful.
The Docker approach is a little intimidating, as is paramiko. What are my options?
(Spyder maintainer here) What you need to do is create a Spyder kernel on your remote server and connect to it through SSH. That's the only facility we provide to do what you want.
You can find the precise instructions to do that in our docs.
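The gist of those instructions is roughly the following sketch (assuming pip works without sudo; see the docs for the authoritative steps):

    # On the remote server
    pip install --user spyder-kernels
    python -m spyder_kernels.console

The second command prints the name of a connection file (kernel-xxxx.json). In Spyder on the local machine, use "Connect to an existing kernel" and point it at that file, together with the SSH connection details of the server.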
I did a long search for something like this at a past job, when we wanted to quickly iterate on code that had to run across many workers in a cluster. All the commercial and open-source task-queue projects that I found were based on running fixed code with arbitrary inputs, rather than running arbitrary code.
I'd also be interested to see if there's something out there that I missed. But in my case, I ended up building my own solution (unfortunately not open source).
My solution was:
1) I made a Redis queue where each task consisted of a zip file with a bash setup script (for pip installs, etc), a "payload" Python script to run, and a pickle file with input data.
2) The "payload" Python script would read in the pickle file or other files contained in the zip file. It would output a file named output.zip.
3) The task worker was a Python script (running on the remote machine, listening to the Redis queue) that would unzip the file, run the bash setup script, then run the Python script. When the script exited, the worker would upload output.zip.
There were various optimizations; for example, the worker wouldn't run the same bash setup script twice in a row (it remembered the SHA-1 hash of the most recent setup script). So, anyway, in the worst case you could do that. It was a week or two of work to set up.
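To give a rough idea of the shape of such a worker (this is not the original code, which wasn't open-sourced; the queue names and file names here are made up):

    import io
    import os
    import subprocess
    import tempfile
    import zipfile

    import redis  # pip install redis

    r = redis.Redis()

    while True:
        # Block until a task (a zip blob) arrives on the queue
        _, blob = r.blpop("tasks")
        with tempfile.TemporaryDirectory() as workdir:
            zipfile.ZipFile(io.BytesIO(blob)).extractall(workdir)
            # Environment setup (pip installs, etc.), then the payload itself
            subprocess.run(["bash", "setup.sh"], cwd=workdir, check=True)
            subprocess.run(["python", "payload.py"], cwd=workdir, check=True)
            # The payload writes output.zip; push it back as the result
            with open(os.path.join(workdir, "output.zip"), "rb") as f:
                r.rpush("results", f.read())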
Edit:
A second (much more manual) option, if you just need to run on one remote machine, is to use sshfs to mount the remote filesystem locally, so you can quickly edit the files in Spyder. Then keep an ssh window open to the remote machine, and run Python from the command line to test-run the scripts on that machine. (That's my standard setup for developing Raspberry Pi programs.)
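The mount itself is a one-liner (the host and paths below are placeholders):

    mkdir -p ~/remote-project
    sshfs user@remotehost:/home/user/project ~/remote-project

Unmount with fusermount -u ~/remote-project (Linux) or umount ~/remote-project (macOS) when you're done.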
I'm wondering about the best way to use SSH in Python 2.7 to work with routers interactively.
Scenario:
1. Access 20 devices at the same time.
2. Run 10 commands on each device.
3. Parse specific words from the commands' output.
4. Run another command based on step 3.
5. Exit the devices.
Take a look at the Netmiko package that ktbyers made.
The package allows you to create SSH connections and send commands/configurations/etc.
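A minimal sketch of the pattern from the question (the device details are placeholders, and device_type must match your router platform):

    from netmiko import ConnectHandler

    device = {
        "device_type": "cisco_ios",  # placeholder: adjust to your platform
        "host": "192.0.2.1",
        "username": "admin",
        "password": "secret",
    }

    conn = ConnectHandler(**device)
    output = conn.send_command("show ip interface brief")
    # Parse the output, then run another command based on it
    if "down" in output:
        print(conn.send_command("show logging"))
    conn.disconnect()

For the 20-devices-at-once part, you would run this per device, e.g. from a thread pool.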
I have a Python script that checks the temperature every 24 hours. Is there a way to leave it running if I shut the computer down or log off?
Shutdown - no.
Logoff - potentially, yes.
If you want the script to start automatically when you turn the computer back on, you can add it to your startup folder (Windows) or schedule it (Windows Task Scheduler, cron job, systemd timer).
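On Windows, for example, scheduling it could look like this (the task name and script path are placeholders):

    schtasks /create /tn "TempCheck" /tr "python C:\scripts\check_temp.py" /sc daily /st 07:00

The cron and systemd-timer equivalents are similar one-time setups on Linux.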
If you really want a temperature tracker that is permanently available, you can use a low-power solution like the Raspberry Pi rather than leaving your pc on.
The best way to accomplish this is to have your program run on some type of server that your computer can connect to. A server could be anything from a Raspberry Pi to an old disused computer, a web server, or a cloud server. You would have to build a program that can be accessed from your computer; how you access it depends on how you build your program and on the server you choose.
Doing things this way means your script will always be able to check the temperature because it will be running on a system that stays on.
Scripts are unable to run while your computer is powered off. What operating system are you running? How are you collecting the temperature? It is hard to give much more help without this information.
One thing I might suggest is powering on the system remotely at a scheduled time, using another networked machine.
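For example, if the machine's BIOS and network card support Wake-on-LAN, another machine on the network can wake it up by sending a magic packet to its MAC address (the address below is a placeholder; etherwake is a common alternative tool):

    wakeonlan AA:BB:CC:DD:EE:FF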
You can take a look at the following pages
http://www.wikihow.com/Automatically-Turn-on-a-Computer-at-a-Specified-Time
http://lifehacker.com/5831504/how-can-i-start-and-shut-down-my-computer-automatically-every-morning
Additionally, once it turns on, you can use a cron job to execute your Python code with a console command such as python yourfile.py. (On Windows, the rough equivalent of cron is the Task Scheduler.)
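A cron entry for that could look like this (edit your crontab with crontab -e; the time and path are placeholders):

    # Run the script every morning at 07:05
    5 7 * * * python /home/user/yourfile.py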
I have written a Python script for Twitter automation using Tweepy. When I run it on my own Linux machine as python file.py, it runs successfully and keeps running, because I have specified repeated tasks inside the script and I don't want it to stop. But as it is on my local machine, the script might get stopped when my internet connection is off or at night, so I can't keep the script running all day on my PC.
Is there any way, website, or method where I could deploy my script and make it execute forever? I have heard about cron jobs in cPanel, which can help with repeated tasks, but in my case I want to keep my script running on the machine until I close it myself.
Are there any such solutions? Most Twitter bots I see run forever, meaning their script is being executed somewhere 24x7. That is what I want to know: how is that possible?
As mentioned by Jon and Vincent, it's better to run the code from a cloud service. But either way, I think what you're looking for is what to type into the terminal so the code keeps running even after you close the terminal. This is what worked for me:
nohup python code.py &
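(The trailing & puts the process in the background, and nohup keeps it alive after the terminal closes; by default the script's output goes to nohup.out.)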
You can add a systemd .service file (a sketch follows below), which can have added benefits such as:
- logging (compressed logs in a central place, or over the network to a log server)
- disallowing access to /tmp and home directories
- restarting the service if it fails
- starting the service at boot
- setting capabilities (see setcap/getcap), e.g. disallowing file access if the process only needs network access
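As an illustration, a unit file using some of those features might look like this (the names and paths are placeholders; the directives are standard systemd options):

    [Unit]
    Description=Twitter bot

    [Service]
    ExecStart=/usr/bin/python /home/user/file.py
    # Restart the service if it fails
    Restart=on-failure
    # Private /tmp, and no access to home directories
    PrivateTmp=true
    ProtectHome=true
    # journald collects and compresses the logs centrally
    StandardOutput=journal

    [Install]
    # Start at boot
    WantedBy=multi-user.target

Save it as /etc/systemd/system/mybot.service and run sudo systemctl enable --now mybot.service.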