Run Python in background

I've been searching a lot for this problem, but I didn't find any valuable answer.
I want to make a script (let's say it is a library) which runs some functions at reboot. Inside my library, there will be a function like
def randomfunction():
    print("randomtext")
After loading this function, every call to randomfunction() in any Python run (I will use .py files as CGI scripts) should return "randomtext".
Is that possible, or am I missing something?
It works in Python IDLE if I use exec, but I want this exec to persist on the system. This would be for a Linux OS.

Don't you need some kind of interprocess communication for this?
Might be worth taking a look at these docs: Python IPC
Also, this SO post might help you. I think it offers a solution to what you are looking for.
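As a rough sketch of what that IPC could look like: one option is a small, long-running XML-RPC service started at reboot, which the CGI scripts then call into. The port number and the way the service is started (cron @reboot, systemd, etc.) are assumptions, and the module names below are the Python 3 ones (SimpleXMLRPCServer and xmlrpclib in Python 2).

# service.py -- a long-running process started at boot (e.g. via cron @reboot or systemd)
from xmlrpc.server import SimpleXMLRPCServer

def randomfunction():
    return "randomtext"

server = SimpleXMLRPCServer(("127.0.0.1", 8000))   # port 8000 is an arbitrary choice
server.register_function(randomfunction)
server.serve_forever()

# client.py -- e.g. one of the CGI scripts
import xmlrpc.client

proxy = xmlrpc.client.ServerProxy("http://127.0.0.1:8000/")
print(proxy.randomfunction())   # prints "randomtext"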

Related

How to read the documentation of a certain module?

I've just finished my Python course, so now I can write my own scripts. To do that I started writing a script with the Scapy module, but the problem is that Scapy's documentation is written for the Scapy interpreter, so I don't know how to use it, find the functions, etc.
I've found a few tutorials on the Internet with a few examples, but it's pretty hard. For example, I found a script using the function "set_payload" to inject some code into a layer, but I really don't know where the author found this function.
What do you suggest for finding out how a module works and how to use it correctly? I don't really like having to pick through other people's scripts on the Internet.
If I have understood the question correctly, roughly what you are asking is how to find the best source for understanding a module.
If you are using a built-in Python module, the best source is the Python documentation.
Scapy is not a built-in Python module, so you may have some issues with some of the external modules (by external I mean the ones you need to explicitly install).
For those, if the docs aren't enough, I prefer to look at GitHub projects that use that module one way or the other, and most of the time it works out. If it doesn't, I go to blogs or third-party tutorials. There is no single right way to do it; you will have to put in the effort where it's needed.
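As a rough complement to the docs, Python's standard introspection tools can also show you what a class or module exposes. The Scapy import below assumes Scapy is installed, and "payload" is just an example search term.

import inspect
from scapy.all import IP            # assumes Scapy is installed

print([name for name in dir(IP) if "payload" in name])   # search attribute names for a keyword
help(IP)                                                  # docstrings for the class and its methods
print(inspect.getsourcefile(IP))                          # path to the source file defining the class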
I've never used Scapy, but it seems well documented.
https://buildmedia.readthedocs.org/media/pdf/scapy/latest/scapy.pdf
This version appears to have been current at the time of writing.

Is it Possible to use schtasks on Python Code?

I have set up a Windows server on AWS and have set it up to run Python. I'm trying to get this to run on a regular basis, but I'm not sure if schtasks will work to run the Python code. Could you give some advice around this, please?
Also, the reason I didn't set this up on a more Python-friendly OS is that I was having some issues installing the libraries I needed.
Any help or advice is hugely appreciated.
schtasks is just an executable. You can always launch it from within a Python script, using subprocess for example (see the sketch below). You need to keep the following in mind, though:
It could require elevation. If your script runs in an ordinary process, you could get a UAC prompt and will need a way to deal with it.
If you'd like to create a task that runs without a user logging on, you need to manage a user/password.
If you create a bunch of tasks, you need to manage them: query/pause/stop/delete?
That's all I can think of right now...
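A minimal sketch of that subprocess route, assuming a daily schedule. The task name, start time, and the paths to python.exe and the script are placeholders; the schtasks switches used (/Create, /SC, /ST, /TN, /TR, /F) are the standard documented ones.

import subprocess

cmd = [
    "schtasks", "/Create",
    "/SC", "DAILY",                  # run once per day
    "/ST", "06:00",                  # start time
    "/TN", "MyPythonJob",            # task name (placeholder)
    "/TR", r'"C:\Python39\python.exe" "C:\scripts\job.py"',   # command the task runs
    "/F",                            # overwrite the task if it already exists
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.returncode, result.stdout, result.stderr)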

Execution permissions in Python

I need to send code to remote clients to be executed on them, but security is a concern for me right now. I don't want unsafe code to be executed there, so I would like to control what a program is doing. I mean, for example, knowing whether it is making connections, where it is connecting to, whether it is reading local files, etc. Is this possible with Python?
EDIT: I'm thinking of something similar to the Android permission system. I want to know what a piece of code will do and, if it does something different, stop it.
You could use a different Python runtime:
if you run your script using Jython, you can exploit Java's permission system
with PyPy's sandboxed version you can choose what is allowed to run in your controller script
There used to be a module in Python called Bastion, but that was deprecated as it wasn't that secure. There's also, I believe, something called RPython, but I don't know too much about that.
I would in this case use Pyro and write the code on the target server. That way you know clients can only execute written and tested code.
Edit: it's probably worth noting that Pyro also supports http://en.wikipedia.org/wiki/Privilege_separation - although I've not had to use it for that.
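A minimal sketch of that approach using Pyro4 (the current Pyro); the host, port, and object id are placeholders. The point is that clients can only invoke the methods you explicitly expose on the server, never arbitrary code.

# server side -- the only code clients can ever run is what you expose here
import Pyro4

@Pyro4.expose
class SafeOperations(object):
    def add(self, a, b):
        return a + b

daemon = Pyro4.Daemon(host="0.0.0.0", port=9090)
daemon.register(SafeOperations, objectId="safe.ops")
daemon.requestLoop()

# client side
import Pyro4

ops = Pyro4.Proxy("PYRO:safe.ops@server-host:9090")
print(ops.add(2, 3))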
I think you are looking for a sandboxed Python. There used to be an effort to implement this, but it was abandoned a couple of years ago.
Sandboxed Python on the Python wiki offers a nice overview of the possible options for your use case.
The most rigorous (but probably the slowest) way is to run Python on a bare OS in an emulator.
Depending on the OS you use, there are several ways of running programs with restrictions, but without the overhead of an emulator:
FreeBSD has a nice integrated solution in the form of jails.
These grew out of the chroot system call.
Linux-VServer aims to do more or less the same on Linux.
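To make the chroot idea concrete, here is a rough illustration using only the standard library. It must run as root on Linux/BSD, /srv/sandbox is a placeholder path, and real jails or containers provide far more isolation than this.

import os

os.chroot("/srv/sandbox")    # confine the process's view of the filesystem
os.chdir("/")                # move inside the new root
os.setuid(65534)             # drop privileges (e.g. to the 'nobody' uid)
# ... now exec or import the untrusted code ...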

What is the best method to call a Python 3.x program from within Python 2.x?

I'm writing a Django web application. As of now, Django does not support Python 3. For the purposes of my web application, and without getting into too much detail, I essentially need to use some libraries that only support Python 3. Suffice it to say that after much thorough research, no 2.x alternative was found.
So my question is this: How should I go about this?
I have both Python 2 and 3 installed on my server, and I have the Python 3 code written and waiting to be called. I was considering simply using the subprocess module, effectively calling Python 3 from the command line, but the question is, is this the best method or is there a best practice I could use instead here? Using subprocess seems pretty hackish to me. Don't get me wrong, I'm okay with hackish, I just want to make sure there's nothing else I should be doing instead.
Since the Python 3 and Python 2 interpreters are totally separate executables and have separate libraries installed on your system, using subprocess to invoke one from the other is the best practice. It's not a hack at all. There are a number of ways to pass data between them but the two interpreters should be run as separate processes.
That said, you may need to keep in mind the startup time associated with launching an interpreter process. That gets back to how to pass data between the two processes. If your Python 2 code is going to be frequently calling the Python 3 routines, you may need to structure the Python 3 program as a daemon. But you would still use subprocess to launch it.
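A rough sketch of the subprocess route with JSON as the data format; the interpreter name ("python3"), the worker script path, and the payload structure are all assumptions about your setup.

# Python 2 side (e.g. inside a Django view)
import json
import subprocess

payload = json.dumps({"numbers": [1, 2, 3]})
proc = subprocess.Popen(
    ["python3", "/path/to/py3_worker.py"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE,
)
out, _ = proc.communicate(payload)
result = json.loads(out)

# /path/to/py3_worker.py (Python 3 side): read JSON from stdin,
# do the work with the 3.x-only libraries, write JSON to stdout.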
Run the 3.x program as a separate service and then connect using some kind of RPC mechanism?

Writing a kernel mode profiler for processes in python

I would like to seek some guidance in writing a "process profiler" which runs in kernel mode. The reason I am asking for a kernel-mode profiler is that I run loads of applications and I do not want my profiler to be swapped out.
By "process profiler" I mean something that would monitor resource usage by a process, including usage of threads and their statistics.
And I wish to write this in Python. Please point me to some modules or helpful resources.
Please provide me with guidance/suggestions for doing it.
Thanks.
Edit: I would like to add that currently my interest is to write this only for Linux; however, after I build it I will have to support Windows.
It's going to be very difficult to do the process monitoring part in Python, since the Python interpreter doesn't run in the kernel.
I suspect there are two easy approaches to this:
Use the /proc filesystem if you have one (you don't mention your OS)
Use dtrace if you have dtrace (again, without knowing the OS, who knows?)
Okay, following up after the edit.
First, there's no way you're going to be able to write code that runs in the kernel, in python, and is portable between Linux and Windows. Or at least if you were to, it would be a hack that would live in glory forever.
That said, though, if your purpose is to process Python, there are a lot of Python tools available to get information from the Python interpreter at run time.
If instead your desire is to get process information from other processes in general, you're going to need to examine the options available to you in the various OS APIs. Linux has a /proc filesystem; that's a useful start. I suspect Windows has similar APIs, but I don't know them.
If you have to write kernel code, you'll almost certainly need to write it in C or C++.
Don't try to get Python running in kernel space!
You would be much better off using an existing tool and getting it to spit out XML that can be sucked into Python. I wouldn't want to port the Python interpreter to kernel mode (it sounds grim even writing that).
The /proc option does sound good.
Some code that reads /proc information to determine memory usage and such should get you going:
http://www.pixelbeat.org/scripts/ps_mem.py reads memory information of processes in Python through /proc/smaps, as charlie suggested.
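As a rough, Linux-only sketch of reading per-process memory straight from /proc (the field name follows /proc/<pid>/status; adjust for whatever statistics you actually need):

def rss_kb(pid):
    """Return the resident set size of a process in kB, or None if not found."""
    with open("/proc/%d/status" % pid) as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])   # the value is reported in kB
    return None

print(rss_kb(1))   # e.g. PID 1 (init/systemd)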
Some of your comments on other answers suggest that you are a relatively inexperienced programmer. Therefore I would strongly suggest that you stay away from kernel programming, as it is very hard even for experienced programmers.
Why would you want to write something that
is a very complex system (just look at existing profiling infrastructures and how complex they are)
cannot be done in Python (I don't know of any kernel that would allow execution of Python in kernel mode)
already exists (oprofile on Linux)
Have you looked at PSI? (http://www.psychofx.com/psi/)
"PSI is a Python module providing direct access to real-time system and process information. PSI is a Python C extension, providing the most efficient access to system information directly from system calls."
It might give you what you are looking for ... or at least a starting point.
Edit 2014:
I'd recommend checking out psutil instead:
https://pypi.python.org/pypi/psutil
psutil is actively maintained and has some nifty process monitoring features. PSI seems to be somewhat dead (last release 2009).
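A quick psutil sketch of the kind of per-process monitoring discussed above (inspecting the current process here; pass a pid to Process() to watch another one):

import psutil

p = psutil.Process()                  # current process; psutil.Process(pid) for another one
print(p.name(), p.num_threads())
print(p.cpu_percent(interval=0.5))    # % CPU over a short sampling window
print(p.memory_info().rss)            # resident set size in bytes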
