How can I run untrusted code safely using Lupa? - python

I'm working on this project where I am using Python + Lupa to run Lua code.
I want to run untrusted Lua code (as a string) within my Python script using lupa.LuaRuntime().eval(). I've looked around to see what I need to do to restrict what this Lua code has access to.
I stumbled upon this old post (https://stackoverflow.com/a/17455485) that shows you how to do it in Lua 5.1 using setfenv():
import lupa
L = lupa.LuaRuntime()
sandbox = L.eval("{}")
setfenv = L.eval("setfenv")
sandbox.print = L.globals().print
sandbox.math = L.globals().math
sandbox.string = L.globals().string
sandbox.foobar = foobar
# etc...
setfenv(0, sandbox)
L.execute("os.execute('rm -rf *')")
The setfenv function doesn't exist in Lua 5.4, which is what I'm using.
How do you do this in more modern versions of Lua?
I've tried to define a new function (sb_func()) with load() and then call it, but it does nothing:
import lupa
L = lupa.LuaRuntime()
sandbox = L.eval("{}")
load = L.eval("load")
sandbox.math = L.globals().math
sandbox.print = L.globals().print
sb_func = load("function sb_func() print('test') return nil end", "", "t", sandbox)
sb_func()  # executes the chunk, but nothing visible happens
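For reference, here is a minimal sketch of how the load() route can work in Lua 5.2+/5.4 through lupa (this fills in the step the attempt above seems to be missing, so treat it as an assumption rather than a confirmed answer): load() only returns the compiled chunk, so the chunk has to be executed once before the function it defines appears inside the sandbox table, from which it can then be called.
import lupa

L = lupa.LuaRuntime()
sandbox = L.eval("{}")
lua_load = L.eval("load")

# expose only what the untrusted code is allowed to use
sandbox.print = L.globals().print
sandbox.math = L.globals().math
sandbox.string = L.globals().string

# compile the untrusted source with the sandbox table as its _ENV (4th argument)
chunk = lua_load("function sb_func() print('test') return nil end", "sandbox", "t", sandbox)

# running the chunk defines sb_func inside the sandbox environment
chunk()

# the function only sees what was placed in the sandbox table
sandbox.sb_func()  # prints: test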

Related

Could not load the file : Error when calling a dll in python project

We have a Python library (let's call it TestLibrary), packaged as a wheel ('whl') file.
We consume that library in another Python project (Main Project), which is Flask based.
In TestLibrary we call a DLL (C#, .NET Standard 2.0) that has a few encryption methods which return encrypted data.
TestLibrary now raises an error when those encryption methods are called.
How can we consume those DLLs in TestLibrary and get the data in the main project?
# below code is in TestLibrary
import clr  # pythonnet

def get_encrypted_data():
    try:
        clr.AddReference('folder/dlls/EncryptionDLL')
        from EncryptionDLL import EncryptionClass
        encryptionObj = EncryptionClass()
        encryptedData = encryptionObj.Encrypt('Data', 'Encryption Key')
        return encryptedData
    except Exception as e:
        return e
# below code is in the Flask application
# pip install TestLibrary
from TestLibrary import get_encrypted_data
encryptedData = get_encrypted_data()  # error here, not able to read the DLL
I have tried it with Python.NET and a libmono installation. It works fine in a POC created with only that DLL in Python. When we place it inside another library and consume that library, we get the error.
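The question doesn't show how the relative path 'folder/dlls/EncryptionDLL' resolves once TestLibrary is installed as a wheel into another project; a common cause of this kind of failure is that the relative path is resolved against the Flask app's working directory rather than the installed package. A minimal sketch of one way around that (the dlls/ layout and helper name are assumptions, not from the original code) is to build the DLL directory from the package's own location and put it on sys.path before referencing the assembly by name:
import os
import sys
import clr  # pythonnet

def load_encryption_class():
    # hypothetical layout: the DLL ships inside the installed package under dlls/
    dll_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'dlls')
    if dll_dir not in sys.path:
        sys.path.append(dll_dir)
    # pythonnet searches sys.path when referencing an assembly by name
    clr.AddReference('EncryptionDLL')
    from EncryptionDLL import EncryptionClass
    return EncryptionClass()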

Trying to append a string within a shell script using python

I'm trying to change the value of "dockerversion=" in this bash script.
# Docker Variables
containerid=$(docker ps -qf "name=vaultwarden")
imageid=$(docker images -q vaultwarden/server)
dockerversion=1
---------------
# Stop/RM Image
docker stop $containerid
docker rm $containerid
docker rmi $imageid
I'm using Python and this is what I currently have:
# Pull Portainer Version
url = 'https://github.com/dani-garcia/vaultwarden/releases/latest'
r = requests.get(url)
version = r.url.split('/')[-1]
# Pull Current Version
with open('vaultwarden-update', 'r') as vaultwarden:
    fileversion = vaultwarden.readlines()
    vcurrentversion = re.sub(r'dockerversion=', '', fileversion[14])
# Check who is higher
if version > vcurrentversion:
    with open('vaultwarden-update', '') as vaultwarden:
        for line in fileversion[14]:
            vaultwarden.write(re.sub(re.escape(vcurrentversion), version))
I basically want Python to check the GitHub releases, see if there's a change, compare that number to the bash script variable, update it within the bash script and then run the script.
The # Check who is higher part won't work, since I need to keep the rest of the script file intact. I'm mainly looking for ways to change a value in a file through Python, dynamically.
Any thoughts?
(This is literally my first Python script.)
I took your code and changed it just enough to do what you want it to do. You would still need to add some value validation, so that you can log and exit before execution reaches the final part where you rewrite the file (you don't want to open and rewrite it if there is nothing to rewrite with).
import re
import requests
from distutils.version import LooseVersion

# Pull Portainer Version
url = 'https://github.com/dani-garcia/vaultwarden/releases/latest'
r = requests.get(url)
github_version = r.url.split('/')[-1]

# Pull Current Version
with open('vaultwarden-update') as vaultwarden:
    file_content = vaultwarden.read()
file_match = re.search(r'(dockerversion=([0-9.]*))', file_content)
file_version = file_match.group(2)

# Check who is higher
if LooseVersion(github_version) > LooseVersion(file_version):
    print(f'github version ({github_version}) > file version ({file_version})')
    new_file_content = file_content.replace(file_match.group(1), f'dockerversion={github_version}')
    with open('vaultwarden-update', 'w') as vaultwarden:
        vaultwarden.write(new_file_content)
Currently for your content, it outputs:
github version (1.23.0) > file version (1)
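Since the question also asks about running the bash script after it has been updated (not covered above), a minimal sketch of that last step, assuming the file is a plain bash script named vaultwarden-update, could be:
import subprocess

# run the updated script and raise if it exits with a non-zero status
subprocess.run(['bash', 'vaultwarden-update'], check=True)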

Function executed via Erlport stops responding

I am writing my thesis application. I need linear programming, but my app is written in Elixir, which is really not the language for such operations. That is why I decided to use ErlPort as an Elixir dependency, which is capable of connecting Python code with Elixir. I'm also using PuLP as the Python library for the optimization.
Elixir version: 1.10.4,
Erlport version: 0.10.1,
Python version: 3.8.5,
PuLP version: 2.3
I've written the following module for Elixir-Python communication, which uses a GenServer as the main 'communication hub' between Elixir and Python:
defmodule MyApp.PythonHub do
  use GenServer

  def start_link(_) do
    GenServer.start_link(__MODULE__, nil, name: __MODULE__)
  end

  def init(_opts) do
    path =
      [:code.priv_dir(:feed), "python"]
      |> Path.join()
      |> to_charlist()

    {:ok, pid} = :python.start([{:python_path, path}, {:python, 'python3'}])
    {:ok, pid}
  end

  def handle_call({:call_function, module, function_name, arguments}, _sender, pid) do
    result = :python.call(pid, module, function_name, arguments)
    {:reply, result, pid}
  end

  def call_python_function(file_name, function_name, arguments) do
    GenServer.call(__MODULE__, {:call_function, file_name, function_name, arguments}, 10_000)
  end
end
The GenServer module calls a Python file, which contains the following function:
def calculate_meal_4(products_json, diet_json, lower_boundary, upper_boundary, enhance):
    from pulp import LpMinimize, LpProblem, LpStatus, lpSum, LpVariable, value
    import json

    products_dictionary = json.loads(products_json)
    print(products_dictionary)
    diets_dictionary = json.loads(diet_json)
    print(diets_dictionary)

    model = LpProblem(name="diet-minimization", sense=LpMinimize)

    # ... products setup ...
    x = LpVariable("prod_1_100g", lower_boundary, upper_boundary)
    y = LpVariable("prod_2_100g", lower_boundary, upper_boundary)
    z = LpVariable("prod_3_100g", lower_boundary, upper_boundary)
    w = LpVariable("prod_4_100g", lower_boundary, upper_boundary)

    optimization_function = # ... optimization function setup ...
    model += # ... optimization boundary function setup ...
    model += optimization_function

    print(model)
    solved_model = model.solve()
    print(value(model.objective))

    return [value(x), value(y), value(z), value(w)]
The call to the GenServer itself looks like this:
PythonHub.call_python_function(:diets, python_function, [products_json, meal_statistics_json, #min_portion, #max_portion, #macro_enhancement])
where python_function is :calculate_meal_4, and products_json and meal_statistics_json are JSON strings containing the required data.
Calling calculate_meal_4 via python3 diets.py (which runs the Python script above with example, but real, data taken from the app) works fine: I get the minimized result almost instantly. The problem occurs when the Python script is called via Elixir and ErlPort. Judging by the printed output, it seems to work until
solved_model = model.solve()
is called. Then the script seems to freeze and the GenServer finally hits the timeout in GenServer.call.
I've also tested the call with a simple Python test file:
def pass_var(a):
    print(a)
    return [a, a, a]
and it worked fine.
That is why I am really at a loss right now and am looking for any advice. Shamefully, I have found nothing yet.
Hmm, it might be that calling the external solver freezes the process.
Given that you can execute shell commands from Elixir, you can easily change the Python script into a command line executable (I recommend click). Then you can write the output to a .json or .csv file and read it back in with Elixir when you're done.
import click

@click.group()
def cli():
    pass

@cli.command()
@click.argument('products_json')  # your array of products (click arguments do not take help text)
@click.argument('diet_json')      # your dietary wishes
@click.option('--lower-bound', default=0, help='your minimum number of desired calories')
@click.option('--upper-bound', default=100, help='your maximum number of desired calories')
@click.option('--enhance', default=False, help="whether you'd like to experience our enhanced experience")
def calculate_meal_4(products_json, diet_json, lower_bound, upper_bound, enhance):
    pass

if __name__ == '__main__':
    cli()
which you can then call using python3 my_file.py <products_json> <diet_json> ... et cetera.
You can even validate the JSON and then return the parsed data directly.
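To illustrate the "write the output to a .json file" part of this suggestion, here is a minimal sketch (the file name, the --output option, and the placeholder result are assumptions, not part of the original code); Elixir can then read and decode the file once the command returns:
import json
import click

@click.command()
@click.argument('products_json')
@click.argument('diet_json')
@click.option('--output', default='result.json', help='where to write the solver result')
def calculate_meal_4(products_json, diet_json, output):
    # hypothetical: build and solve the PuLP model here, then collect the portions
    result = {"portions": [0.0, 0.0, 0.0, 0.0]}  # placeholder values
    with open(output, 'w') as f:
        json.dump(result, f)

if __name__ == '__main__':
    calculate_meal_4()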

Is it possible to let the entire R Studio program run from Python? [duplicate]

I searched for this question and found some answers, but none of them seem to work. This is the script that I'm using in Python to run my R script:
import subprocess
retcode = subprocess.call("/usr/bin/Rscript --vanilla -e 'source(\"/pathto/MyrScript.r\")'", shell=True)
and I get this error:
Error in read.table(file = file, header = header, sep = sep, quote = quote, :
no lines available in input
Calls: source ... withVisible -> eval -> eval -> read.csv -> read.table
Execution halted
and here is the content of my R script (pretty simple!)
data = read.csv('features.csv')
data1 = read.csv("BagofWords.csv")
merged = merge(data, data1)
write.table(merged, "merged.csv", quote=FALSE, sep=",", row.names=FALSE)
for (i in 1:length(merged$fileName))
{
  fileConn <- file(paste("output/", toString(merged$fileName[i]), ".txt", sep=""))
  writeLines((toString(merged$BagofWord[i])), fileConn)
  close(fileConn)
}
The R script works fine when I use source('MyrScript.r') in the R command line. Moreover, when I try the exact command I pass to subprocess.call (i.e., /usr/bin/Rscript --vanilla -e 'source("/pathto/MyrScript.r")') in my command line, it works fine as well, so I don't really get what the problem is.
I would not put too much trust in calling source() within the Rscript call, as you may not fully understand where your different nested R sessions are running. The process may fail because of simple things such as your working directory not being the one you think.
Rscript lets you run a script directly (see man Rscript if you are using Linux).
Then you can do directly:
subprocess.call ("/usr/bin/Rscript --vanilla /pathto/MyrScript.r", shell=True)
or, better, passing the Rscript command and its parameters as a list:
subprocess.call (["/usr/bin/Rscript", "--vanilla", "/pathto/MyrScript.r"])
Also, to make things easier you could create an R executable file. For this you just need to add this in the first line of the script:
#! /usr/bin/Rscript
and give it execution rights. See here for details.
Then you can just do your python call as if it was any other shell command or script:
subprocess.call ("/pathto/MyrScript.r")
I think RPy2 is worth looking into, here is a cool presentation on R-bloggers.com to get you started:
http://www.r-bloggers.com/accessing-r-from-python-using-rpy2/
Essentially, it allows you to have access to R libraries with R objects that provides both a high level and low level interface.
Here are the docs on the most recent version: https://rpy2.github.io/doc/latest/html/
I like to point Python users to Anaconda, and if you use the package manager, conda, to install rpy2, it will also ensure you install R.
$ conda install rpy2
And here's a vignette based on the documentation's introduction:
>>> from rpy2 import robjects
>>> pi = robjects.r['pi']
>>> pi
R object with classes: ('numeric',) mapped to:
<FloatVector - Python:0x7fde1c00a088 / R:0x562b8fbbe118>
[3.141593]
>>> from rpy2.robjects.packages import importr
>>> base = importr('base')
>>> utils = importr('utils')
>>> import rpy2.robjects.packages as rpackages
>>> utils = rpackages.importr('utils')
>>> packnames = ('ggplot2', 'hexbin')
>>> from rpy2.robjects.vectors import StrVector
>>> names_to_install = [x for x in packnames if not rpackages.isinstalled(x)]
>>> if len(names_to_install) > 0:
... utils.install_packages(StrVector(names_to_install))
And running an R snippet:
>>> robjects.r('''
... # create a function `f`
... f <- function(r, verbose=FALSE) {
... if (verbose) {
... cat("I am calling f().\n")
... }
... 2 * pi * r
... }
... # call the function `f` with argument value 3
... f(3)
... ''')
R object with classes: ('numeric',) mapped to:
<FloatVector - Python:0x7fde1be0d8c8 / R:0x562b91196b18>
[18.849556]
And a small self-contained graphics demo:
from rpy2.robjects.packages import importr
graphics = importr('graphics')
grdevices = importr('grDevices')
base = importr('base')
stats = importr('stats')
import array
x = array.array('i', range(10))
y = stats.rnorm(10)
grdevices.X11()
graphics.par(mfrow = array.array('i', [2,2]))
graphics.plot(x, y, ylab = "foo/bar", col = "red")
kwargs = {'ylab':"foo/bar", 'type':"b", 'col':"blue", 'log':"x"}
graphics.plot(x, y, **kwargs)
m = base.matrix(stats.rnorm(100), ncol=5)
pca = stats.princomp(m)
graphics.plot(pca, main="Eigen values")
stats.biplot(pca, main="biplot")
The following code should work out:
import rpy2.robjects as robjects
robjects.r.source("/pathto/MyrScript.r", encoding="utf-8")
I would not suggest using a system call, as there are many differences between Python and R, especially when passing data around.
There are many standard libraries for calling R from Python to choose from; see this answer.
If you just want to run a script, you can use os.system("shell command") from the os module (import os). If you have useful output, you can redirect it by adding " > outputfilename" at the end of your shell command.
For example:
import os
os.system("ls -al > output.txt")
Try adding a line to the beginning of your R script that says:
setwd("path-to-working-directory")
Except, replace the path with the path to the folder containing the files features.csv and BagofWords.csv.
I think the problem you are having is because when you run this script from R your working directory is already the correct path, but when you run the script from python, it defaults to a working directory somewhere else (likely the top of the user directory).
By adding the extra line at the beginning of your R script, you are explicitly setting the working directory and the code to read in these files will work. Alternatively, you could replace the filenames in read.csv() with the full filepaths of these files.
@dmontaner suggested this possibility in his answer:
The process may fail because of simple things such as your working directory not being the one you think.
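An alternative that follows from the same diagnosis (my own suggestion, not taken from the answers above) is to leave the R script untouched and set the working directory from the Python side, since subprocess.call accepts a cwd argument:
import subprocess

# run the script with the folder containing features.csv and BagofWords.csv as the working directory
subprocess.call(["/usr/bin/Rscript", "--vanilla", "/pathto/MyrScript.r"], cwd="/pathto")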

Import Existing Python App into AWS Lambda

I need to create an AWS Lambda version of an existing Python 2.7 program written by someone else who has left the company.
Here is one function, as an example, that I need to convert:
#!/usr/bin/env python
from aws_common import get_profiles, get_regions
from aws_ips import get_all_public_ips
import sys

def main(cloud_type):
    # csv header
    output_header = "profile,region,public ip"
    profiles = get_profiles(cloud_type)
    regions = get_regions(cloud_type)
    print output_header
    for profile in profiles:
        for region in regions:
            # public_ips = get_public_ips(profile,region)
            public_ips = get_all_public_ips(profile, region)
            for aws_ip in public_ips:
                print "%s,%s,%s" % (profile, region, aws_ip)

if __name__ == "__main__":
    cloud_type = 'commercial'
    if sys.argv[1]:
        if sys.argv[1] == 'govcloud':
            cloud_type = 'govcloud'
    main(cloud_type)
I need to know how to create this as an AWS handler with event and context arguments from the code above.
If I could get some pointers on how to do this it would be appreciated.
You can simply start writing your Python function inside the handler of the AWS Lambda.
In the handler, start defining your functions and variables, and upload a zip file to Lambda if there is any kind of dependency.
You can change the Python version in Lambda to 2.7 if that is what you are using.
I would suggest the Serverless Framework for uploading your code to Lambda; it makes dependency and code management from your local machine much easier.
Since you are importing aws_common here, you have to check whether it is part of the AWS SDK or not.
You can import the aws-sdk and use it:
var aws = require('aws-sdk');
exports.handler = function (event, context) {
};
Inside the handler you can start writing your for loops and go on from there.
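For the concrete question of turning main(cloud_type) into a handler with event and context arguments, a minimal sketch could look like the following; it assumes the code is adapted to return its rows instead of printing them, and the event key cloud_type is an assumption rather than part of the original program:
from aws_common import get_profiles, get_regions
from aws_ips import get_all_public_ips

def lambda_handler(event, context):
    # read the cloud type from the invoking event; the default mirrors the old CLI behaviour
    cloud_type = event.get('cloud_type', 'commercial')

    rows = ["profile,region,public ip"]
    for profile in get_profiles(cloud_type):
        for region in get_regions(cloud_type):
            for aws_ip in get_all_public_ips(profile, region):
                rows.append("%s,%s,%s" % (profile, region, aws_ip))

    # Lambda returns the CSV lines instead of printing them to stdout
    return {"csv": "\n".join(rows)}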
