Use child_process and sys to send data back from a Python file - python

I have a Node.js app that uses child_process to send data to a Python file. That part works. The issue is that I don't quite understand how to return specific new data from the Python script back to the Node process that called it.
The nodejs:
const spawn = require("child_process").spawn;

app.get("/", (request, response) => {
  var dataToSend;
  var nodeData = "jeff";
  var obj = { nodeData: nodeData };
  const python = spawn('python', ['script1.py'], obj);
  python.stdout.on('data', function (data) {
    dataToSend = data.toString();
  });
  python.stderr.on('data', data => {
    console.error(`stderr: ${data}`);
    console.log(data);
  });
  python.on('exit', (code) => {
    console.log('exited');
    console.log(dataToSend);
    // Ideally dataToSend is the returned image buffer
  });
});
The python:
import sys
print('hello' + sys.argv[0][0])
sys.stdout.flush()
In the Python there will be a variable "x = imageBufferYThing or Image file".
I need some way to send that image buffer from Python back to Node.js.
Something like
"sys.stdout.flush(imageBuffer)"
if that is a thing.
I know that some data is already sent back to Node as the program runs, but I don't understand that part well. All I really need is, back in Node.js after the program runs, to be able to do "var theImageBuffer = theImageBufferFromPythonProgram".
Thanks, and let me know if there is more info needed.

Related

How to fetch data analyzed in python to node.js and pass it to angular?

I am new to Angular, and I want to display JSON data from Python in Angular with the help of Node.js. I used child_process to connect Python and Node.js, but I don't know how to pass the result on to an Angular service.
node.js file
const express = require('express')
const { spawn } = require('child_process')
const app = express()
const port = 8000

app.get('/', (req, res) => {
  let dataToSend
  let largeDataSet = []
  // spawn new child process to call the python script
  const python = spawn('python', ['test.py'])
  // collect data from script
  python.stdout.on('data', function (data) {
    console.log('Pipe data from python script ...')
    //dataToSend = data;
    largeDataSet.push(data)
  })
  // in the close event we are sure that the stream from the child process is closed
  python.on('close', (code) => {
    console.log(`child process close all stdio with code ${code}`)
    // send data to browser
    res.send(largeDataSet.join(''))
  })
})

app.listen(port, () => {
  console.log(`App listening on port ${port}!`)
})
Technically you just have to send an HTTP GET request from your service.
I suggest you read and follow the official HTTP client guide to set it up correctly.
Here is a simple service snippet; this should be enough.
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({
  providedIn: 'root',
})
export class MyService {
  constructor(private http: HttpClient) {}

  getData(): Observable<any> {
    const url = '';
    return this.http.get(url);
  }
}

tRPC Mutation Wait For Data Updated

I'm using tRPC, Next.js, and PyShell for my project. I send user input to tRPC and use that info as input to a Python script, wait for the Python file to finish, return the updated data to tRPC, and then send it back to the frontend.
Currently the Python file takes a while to finish, and tRPC does not send the right info back to the frontend. Is there a way to fix that?
The code below is for trpc:
.mutation("upload", {
  // validate input with Zod
  input: z.object({ input: z.string() }).nullish(),
  async resolve({ input }) {
    var msg = "";
    if (input?.input) {
      console.log("-----------------------");
      pyshell.on('message', async function (message) {
        console.log("MSG: ", message);
        msg = message;
      });
      pyshell.send(['sent', input.input]).end(function (err) {
        if (err) console.error(err);
        console.log("done");
      });
    }
    return msg.length != 0;
  },
});
msg is supposed to get updated with the info from the print statement in the Python file, but right now it is still the empty string.
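One way to fix it, sketched here rather than tested against python-shell itself: wrap the one-shot send/receive in a Promise and await it inside the resolver, so the value is only returned after the 'message' event has actually fired. `runScript` is a hypothetical helper name.

```javascript
// Hypothetical helper, assuming a python-shell-like instance as in the
// question: resolve on the first 'message' event, reject if end() reports
// an error.
function runScript(pyshell, payload) {
  return new Promise((resolve, reject) => {
    pyshell.once("message", (message) => resolve(message));
    pyshell.send(payload).end((err) => {
      if (err) reject(err);
    });
  });
}

// In the tRPC resolver the call would then become something like:
//   const msg = await runScript(pyshell, ['sent', input.input]);
//   return msg.length != 0;
```

The point is that resolve() ties the return value to the event, instead of reading msg synchronously before the event loop has had a chance to deliver the Python output.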

How do I correctly make consecutive calls to a child process in Node.js?

I have a Node.js application which is currently a web-based API. For one of my API functions, I make a call to a short Python script that I've written to achieve some extra functionality.
After reading up on communicating between Node and Python using the child_process module, I gave it a try and achieved my desired results. I call my Node function with an email address, send it to Python through stdin, my Python script performs the necessary external API call using the provided e-mail, writes the output of that call to stdout, and sends it back to my Node function.
Everything works properly until I fire off several requests consecutively. Despite Python correctly logging the changed e-mail address and also making the request to the external API with the updated e-mail address, after the first request I make to my API (returning the correct data), I keep receiving the same old data again and again.
My initial guess was that Python's input stream wasn't being flushed, but after testing the Python script I saw that I was correctly updating the e-mail address being received from Node and receiving the proper query results.
I think there's some underlying workings of the child_process module that I may not be understanding... since I'm fairly certain that the corresponding data is being correctly passed back and forth.
Below is the Node function:
exports.callPythonScript = (email) => {
  let getPythonData = new Promise(function (success, fail) {
    const spawn = require('child_process').spawn;
    const pythonProcess = spawn('python', ['./util/emailage_query.py']);
    pythonProcess.stdout.on('data', (data) => {
      let dataString = singleToDoubleQuote(data.toString());
      let emailageResponse = JSON.parse(dataString);
      success(emailageResponse);
    });
    pythonProcess.stdout.on('end', function () {
      console.log("python script done");
    });
    pythonProcess.stderr.on('data', (data) => {
      fail(data);
    });
    pythonProcess.stdin.write(email);
    pythonProcess.stdin.end();
  });
  return getPythonData;
};
And here is the Python script:
import sys
from emailage.client import EmailageClient

def read_in():
    lines = sys.stdin.readlines()
    return lines[0]

def main():
    client = EmailageClient('key', 'auth')
    email = read_in()
    json_response = client.query(email, user_email='authemail@mail.com')
    print(json_response)
    sys.stdout.flush()

if __name__ == '__main__':
    main()
Again, upon making a single call to callPythonScript everything is returned perfectly. It is only upon making multiple calls that I'm stuck returning the same output over and over.
I'm hitting a wall here and any and all help would be appreciated. Thanks all!
I've used a Mutex lock for this kind of example. I can't seem to find the question the code comes from though, as I found it on SO when I had the same kind of issue:
class Lock {
  constructor() {
    this._locked = false;
    this._waiting = [];
  }

  lock() {
    const unlock = () => {
      let nextResolve;
      if (this._waiting.length > 0) {
        nextResolve = this._waiting.shift(); // shift() for FIFO; Array.prototype.pop takes no index
        nextResolve(unlock);
      } else {
        this._locked = false;
      }
    };
    if (this._locked) {
      return new Promise((resolve) => {
        this._waiting.push(resolve);
      });
    } else {
      this._locked = true;
      return new Promise((resolve) => {
        resolve(unlock);
      });
    }
  }
}

module.exports = Lock;
I would then implement it like this with your code:
class Email {
  constructor(Lock) {
    this._lock = new Lock();
  }

  async callPythonScript(email) {
    const unlock = await this._lock.lock();
    let getPythonData = new Promise(function (success, fail) {
      const spawn = require('child_process').spawn;
      const pythonProcess = spawn('python', ['./util/emailage_query.py']);
      pythonProcess.stdout.on('data', (data) => {
        let dataString = singleToDoubleQuote(data.toString());
        let emailageResponse = JSON.parse(dataString);
        success(emailageResponse);
      });
      pythonProcess.stdout.on('end', function () {
        console.log("python script done");
      });
      pythonProcess.stderr.on('data', (data) => {
        fail(data);
      });
      pythonProcess.stdin.write(email);
      pythonProcess.stdin.end();
    });
    const result = await getPythonData; // hold the lock until the child has answered
    unlock();
    return result;
  }
}
I haven't tested this code, and I've implemented it where I'm dealing with arrays, with each array value calling Python... but this should at least give you a good start.
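Untested against the asker's Python script, but here is a standalone sanity check of the serialization the Lock is meant to provide (the class is condensed inline so the snippet runs on its own): even though the second task is faster, it cannot start until the first has released the lock, which is what keeps one child's response from bleeding into the next call.

```javascript
// Condensed copy of the Lock above, just so this sketch is self-contained.
class Lock {
  constructor() { this._locked = false; this._waiting = []; }
  lock() {
    const unlock = () => {
      const next = this._waiting.shift();
      if (next) next(unlock); else this._locked = false;
    };
    if (this._locked) return new Promise((resolve) => this._waiting.push(resolve));
    this._locked = true;
    return Promise.resolve(unlock);
  }
}

const lock = new Lock();
const order = [];

async function fakePythonCall(name, ms) {
  const unlock = await lock.lock();            // wait for our turn
  order.push(name + ":start");
  await new Promise((r) => setTimeout(r, ms)); // stand-in for the child process
  order.push(name + ":end");
  unlock();
}

Promise.all([fakePythonCall("a", 20), fakePythonCall("b", 5)])
  .then(() => console.log(order.join(","))); // a:start,a:end,b:start,b:end
```

Without the lock, "b" would finish first and the two results could arrive interleaved; with it, the calls are strictly ordered.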

Best way to send TypedArray data from NodeJS to Python using python-shell

I'm trying to send data from NodeJS to process in Python using python-shell. My Python script requires a float32 array but I am not sure how to send that data type using python-shell. I can send a string without issue and I know my python script works fine otherwise. Is there a way to send the array directly or do I need to do some data conversion or parsing in python?
Here is what I am trying right now:
In Python:
import sys
input = sys.stdin.read()
print type(input)
In Node:
var PythonShell = require('python-shell');
var pyshell = new PythonShell('script.py', { mode: 'binary' });

// data is a float32 TypedArray
pyshell.send(data).end(function (err) {
  if (err) { console.log(err, 'did not work') };
});
pyshell.on('message', function (message) {
  console.log('message received', message);
});
Here I get the following error:
net.js:655
throw new TypeError(
^
TypeError: Invalid data, chunk must be a string or buffer, not object
at Socket.write (net.js:655:11)
at PythonShell.send (/project/node_modules/python-shell/index.js:205:16)
at Object.<anonymous> (/project/server.js:59:11)
at Module._compile (module.js:570:32)
at Object.Module._extensions..js (module.js:579:10)
at Module.load (module.js:487:32)
at tryModuleLoad (module.js:446:12)
at Function.Module._load (module.js:438:3)
at Module.runMain (module.js:604:10)
at run (bootstrap_node.js:394:7)
If I convert the TypedArray to string it sends fine but it feels wrong to receive this long string in Python rather than an array. I'm sure there is a simple fix. Any advice would be hugely appreciated!
In the end I converted my float32 arrays to JavaScript Buffer objects and used 'binary' mode. I also needed to switch from pyshell.on to pyshell.stdout.on, which appears in the python-shell test scripts for binary mode but not in the readme...
In Node:
var options = { mode: 'binary' };
var pyshell = new PythonShell('test.py', options);
var data = Buffer.from(myFloat32TypedArray.buffer, 'float32');
pyshell.send(data).end((err) => {
  if (err) {
    console.log(err);
  } else {
    console.log('data sent');
  }
});
pyshell.stdout.on('data', function (data) {
  console.log(data);
});
In Python:
import sys
import numpy as np

input_data = np.frombuffer(sys.stdin.read(), dtype=np.float32)
sys.stdout.write(input_data)
I don't know of a direct way (and I assume none is available), but you can use JSON.stringify on the JavaScript side to convert the array into a string, send it to Python, read it as raw input, and convert it back into a JSON object (which will be an array).
JAVASCRIPT SIDE:
var a = [1, 2, 3];
pyshell.send(JSON.stringify(a), ....)
PYTHON SIDE:
import json,sys
input = sys.stdin.read()
print(json.loads(input))
In Node:
let shell_bin = new PythonShell('test.py', { mode: 'binary' });
shell_bin.on('stderr', function (stderr) {
  console.log(stderr);
});
let myFloat32TypedArray = new Float32Array(10);
myFloat32TypedArray[1] = 10.0;
var data = Buffer.from(myFloat32TypedArray.buffer, 'float32');
shell_bin.send(data).end((err) => {
  if (err) {
    console.log(err);
  } else {
    console.log('data sent');
  }
});
shell_bin.stdout.on('data', function (data) {
  console.log(data);
  let myFloat32TypedArray = new Float32Array(data.buffer);
  console.log(myFloat32TypedArray);
});
In Python:
import sys, numpy as np
xb = sys.stdin.buffer.read()
input_data = np.frombuffer(xb, dtype=np.float32)
sys.stdout.buffer.write(input_data)
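The byte-level contract those two snippets rely on can be checked without a child process at all: a Float32Array and a Buffer can view the same bytes, and the receiving side reconstructs the floats from them. One caveat worth noting: a Node Buffer is often a view into a shared pooled ArrayBuffer, so `new Float32Array(data.buffer)` alone can read the wrong region; passing the explicit byteOffset avoids that.

```javascript
// What send(data) puts on the wire: the raw little-endian float32 bytes.
const floats = new Float32Array([1.5, -2.25, 10]); // all exactly representable
const wire = Buffer.from(floats.buffer, floats.byteOffset, floats.byteLength);

// What the receiving side should do with a Buffer chunk: respect the offset.
const received = new Float32Array(
  wire.buffer, wire.byteOffset, wire.byteLength / Float32Array.BYTES_PER_ELEMENT
);

console.log(Array.from(received)); // [ 1.5, -2.25, 10 ]
```

The numpy side is the mirror image: np.frombuffer(xb, dtype=np.float32) interprets the same bytes, so both ends must agree on dtype and endianness.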

return JSON from python to node via spawn

I have a python script that takes two arguments; a directory and a file name.
The python script will create a JSON object from specific files in the directory provided and save it with the name being the second argument.
However, if the second argument is equal to the string "stream", the JSON data is output to STDOUT instead.
I wrote a node script that spawns a child process to call the python script from terminal and it works as intended.
"use strict";
const spawn = require("child_process").spawn;
const command = "(path to python)";
const loc = "(path to .py script)";
const acct = process.argv[2];
const output = process.argv[3];

let callPy = spawn(command, ["erik.py", acct, output], {
  cwd: loc,
  stdio: "pipe"
});
callPy.stdout.on("data", (data) => {
  if (data.toString() === "success") {
    console.log(acct, "generated");
  } else {
    console.log(data.toString());
  }
});
EDIT:
I have unmarked this issue as solved: after spending a bit more time trying to implement this, I have not come to a satisfactory solution that lets me synchronously call a child process from Node, signal the Python script to emit JSON data, receive the data, and then send it to the browser. I tried using a promise chain on the child process:
let child = require("child_process").spawn; // or spawnSync

let spawn = () => {
  let spawned = child(command, args, options, (err, stdout, stderr) => {
    if (err) { console.log(err) };
  });
  return spawned;
};

let listen = (child) => {
  child.stdout.on("data", (data) => {
    console.log("PID", child.pid);
    console.log("data from listen func: ", data);
    return child;
  });
};

let kill = (child) => {
  child.kill("SIGTERM");
};

var p = new Promise((res, e) => {
  res(spawn());
  e(console.error(e));
});
p.then(result => {
  return listen(result);
});
p.then(result => {
  return kill(result);
});
Using spawn(), the child terminates before any of the data is returned.
Using spawnSync(), the promise chain tries (and fails) to listen on the child's stdio before the child is spawned.
I have yet to try websockets to transmit the data, but I doubt that will solve this: the promise returns an empty object to my router function before the chain retrieves the chunks from the Python script.
Any further insight is welcome.
So you need at least two things to do this:
a way to queue commands to execute with spawn, and
an async pattern that waits for each command to execute and joins the processes as they terminate.
A minimalistic example is
var cmd = new CommandLine({
  debug: true,
  error: true,
  delay: true
});
// commandItemsArray is a list of commands, command options, and command arguments
commandItemsArray = [['ls', '-l', './'], ['ls', '-a', './']];
cmd.executeCommands(commandItemsArray,
  function (results) {
    console.log(results);
  },
  function (error) {
    console.log(error);
  });
There are several packages on npm that do both (search for node cli, command line, etc.); one is node-commander, which uses a Promise.all pattern to achieve the second task:
function PromiseAll(items, block, done, fail) {
  var self = this;
  var promises = [], index = 0;
  items.forEach(function (item) {
    promises.push(function (item, i) {
      return new Promise(function (resolve, reject) {
        return block.apply(this, [item, index, resolve, reject]);
      });
    }(item, ++index));
  });
  Promise.all(promises).then(function AcceptHandler(results) {
    if (done) done(results);
  }, function ErrorHandler(error) {
    if (fail) fail(error);
  });
} // promiseAll
I was able to resolve this issue relatively simply using websockets:
The client submits the request, which is communicated to the server via Socket.IO. The request is received and the spawn event is triggered. When the chunks have finished appending, a termination event is emitted, the child process is killed, and the data is returned to the client.
