tRPC Mutation Wait For Data Updated - python

I'm using tRPC, Next.js, and PyShell in my project. I send user input to tRPC, use it as input to a Python script, wait for the Python script to finish, and return the updated data through tRPC back to the frontend.
Currently the Python script takes a while to finish, and tRPC responds before it does, so the frontend does not receive the right info. Is there a way to fix that?
The code below is for trpc:
.mutation("upload", {
  // validate input with Zod
  input: z.object({ input: z.string() }).nullish(),
  async resolve({ input }) {
    var msg = ""
    if (input?.input) {
      console.log("-----------------------")
      pyshell.on('message', async function (message) {
        console.log("MSG: ", message);
        msg = message;
      });
      pyshell.send(['sent', input.input]).end(function (err) {
        if (err) console.error(err);
        console.log("done");
      });
    }
    return msg.length != 0;
  },
});
msg is supposed to be updated with the output of the print statement in the Python file, but right now it is still an empty string.
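One way around this (a sketch, not tested against python-shell itself) is to wrap the one-shot exchange in a Promise and await it inside the resolver, so the mutation cannot return before the 'message' event fires. The helper below only assumes the shell object exposes EventEmitter-style once() and a send() method, as PyShell does:

```javascript
// Wrap a single send/receive exchange in a Promise so the tRPC
// resolver can `await` the Python output instead of returning early.
// `shell` is assumed to expose `once('message')`, `once('error')`
// and `send()`, like python-shell's PyShell instance.
function runPython(shell, payload) {
  return new Promise((resolve, reject) => {
    shell.once("message", (message) => resolve(message));
    shell.once("error", (err) => reject(err));
    shell.send(payload);
  });
}

// Inside the resolver this would become roughly:
//   const msg = await runPython(pyshell, ["sent", input.input]);
//   return msg.length !== 0;
```

Because the resolver is already async, awaiting the helper makes tRPC hold the response until the Python side has actually printed its result.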

Related

Use child_process and sys to send back data from python file

I have a nodejs app that uses child_process to send data to a python file. That aspect works. The issue is I don't quite understand how to return specific new data back to the node that called it.
The nodejs:
const spawn = require("child_process").spawn;
app.get("/", (request, response) => {
  var dataToSend;
  var nodeData = "jeff";
  var obj = {nodeData: nodeData}
  const python = spawn('python', ['script1.py'], obj);
  python.stdout.on('data', function (data) {
    dataToSend = data.toString();
  });
  python.stderr.on('data', data => {
    console.error(`stderr: ${data}`);
    console.log(data)
  })
  python.on('exit', (code) => {
    console.log('exited')
    console.log(dataToSend);
    // Ideally dataToSend is the returned image buffer
  });
})
The python:
import sys
print('hello' + sys.argv[0][0])
sys.stdout.flush()
In the python, there will be a var "x = imageBufferYThing or Image file"
I need some way to send that imagebuffer from python back to nodejs.
Like
"sys.stdout.flush(imageBuffer)"
If that is a thing.
I know that some data is already sent back to the node as the program runs, but I don't understand that well. Like, all I really need is to go back in nodejs after the program runs and do "var theImageBuffer = theImageBufferFromPythonProgram".
Thanks, and let me know if there is more info needed.

Can not get python file output using node.js

I am trying to fetch Python output from a Node.js script using Postman but am unable to get the required output. My code is explained below.
app.js:
router.post('/usecase-workflow', async (req, res) => {
  try {
    let responseUsecase = await usecaseWorkflow.fetchUsecaseWorkflow(req);
    res.send(responseUsecase);
  } catch (error) {
    const responseObj = {
      status: 'error',
      msg: 'Error occurred while downloading ubot file',
      body: error
    }
    res.send(responseObj);
  }
})
usecaseWorkflow.js:
const mongoose = require('mongoose');
const axios = require('axios');
const request = require('request');

class DefineUseCase {
  fetchUsecaseWorkflow = async (req) => {
    try {
      const response = await axios.post('http://127.0.0.1:5005/usecase-workflow', req);
      //console.log(response);
      return response;
    } catch (error) {
      console.log(error);
    }
  }
}

module.exports = new DefineUseCase();
The above code executes when I make the REST call from Postman; I am giving a screenshot of Postman below.
My requirement is that I upload one zip file and one Node.js REST API is called. Inside the Node script I call a Python file to get the final output, but as per my code it is not giving any result. If I call the Python service directly from Postman, it does give a result; I am also giving a screenshot of the Python call below.
So I need to fetch that same output via the Node.js REST API.
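One likely culprit (an assumption, since the actual error is not shown): the code above res.send()s the whole Axios response object, which contains circular references (the underlying request and socket) and cannot be serialized to JSON, and it also forwards the entire Express request object as the POST body. A sketch that forwards only the serializable parts, with the HTTP client injected so it can be exercised without a live Python service:

```javascript
// Forward the request body to the Python service and return only the
// serializable part of the response. `http` is an axios-like client,
// injected here so the function can run without a live server.
async function fetchUsecaseWorkflow(http, req) {
  const response = await http.post(
    "http://127.0.0.1:5005/usecase-workflow",
    req.body // send the body, not the whole Express request object
  );
  // `response` itself holds circular references (request, socket),
  // so `res.send(response)` would fail; `response.data` is plain JSON.
  return response.data;
}
```

The route would then res.send() whatever this returns, which is exactly what the Flask endpoint produced.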

How do I correctly make consecutive calls to a child process in Node.js?

I have a Node.js application which is currently a web-based API. For one of my API functions, I make a call to a short Python script that I've written to achieve some extra functionality.
After reading up on communicating between Node and Python using the child_process module, I gave it a try and achieved my desired results. I call my Node function that takes in an email address and sends it to Python through stdin; my Python script performs the necessary external API call using the provided e-mail, writes the output of the external API call to stdout, and sends it back to my Node function.
Everything works properly until I fire off several requests consecutively. Despite Python correctly logging the changed e-mail address and also making the request to the external API with the updated e-mail address, after the first request I make to my API (returning the correct data), I keep receiving the same old data again and again.
My initial guess was that Python's input stream wasn't being flushed, but after testing the Python script I saw that I was correctly updating the e-mail address being received from Node and receiving the proper query results.
I think there's some underlying workings of the child_process module that I may not be understanding... since I'm fairly certain that the corresponding data is being correctly passed back and forth.
Below is the Node function:
exports.callPythonScript = (email) => {
  let getPythonData = new Promise(function(success, fail) {
    const spawn = require('child_process').spawn;
    const pythonProcess = spawn('python', ['./util/emailage_query.py']);
    pythonProcess.stdout.on('data', (data) => {
      let dataString = singleToDoubleQuote(data.toString());
      let emailageResponse = JSON.parse(dataString);
      success(emailageResponse);
    })
    pythonProcess.stdout.on('end', function() {
      console.log("python script done");
    })
    pythonProcess.stderr.on('data', (data) => {
      fail(data);
    })
    pythonProcess.stdin.write(email);
    pythonProcess.stdin.end();
  })
  return getPythonData;
}
And here is the Python script:
import sys
from emailage.client import EmailageClient

def read_in():
    lines = sys.stdin.readlines()
    return lines[0]

def main():
    client = EmailageClient('key', 'auth')
    email = read_in()
    json_response = client.query(email, user_email='authemail#mail.com')
    print(json_response)
    sys.stdout.flush()

if __name__ == '__main__':
    main()
Again, upon making a single call to callPythonScript everything is returned perfectly. It is only upon making multiple calls that I'm stuck returning the same output over and over.
I'm hitting a wall here and any and all help would be appreciated. Thanks all!
I've used a Mutex lock for this kind of example. I can't seem to find the question the code comes from though, as I found it on SO when I had the same kind of issue:
class Lock {
  constructor() {
    this._locked = false;
    this._waiting = [];
  }

  lock() {
    const unlock = () => {
      if (this._waiting.length > 0) {
        // take the oldest waiter (FIFO); Array#pop ignores its
        // argument and would take from the wrong end of the queue
        const nextResolve = this._waiting.shift();
        nextResolve(unlock);
      } else {
        this._locked = false;
      }
    };
    if (this._locked) {
      return new Promise((resolve) => {
        this._waiting.push(resolve);
      });
    } else {
      this._locked = true;
      return new Promise((resolve) => {
        resolve(unlock);
      });
    }
  }
}

module.exports = Lock;
I would then implement it like this, with your code:
class Email {
  constructor(Lock) {
    this._lock = new Lock();
  }

  async callPythonScript(email) {
    const unlock = await this._lock.lock();
    try {
      return await new Promise(function(success, fail) {
        const spawn = require('child_process').spawn;
        const pythonProcess = spawn('python', ['./util/emailage_query.py']);
        pythonProcess.stdout.on('data', (data) => {
          let dataString = singleToDoubleQuote(data.toString());
          let emailageResponse = JSON.parse(dataString);
          success(emailageResponse);
        })
        pythonProcess.stdout.on('end', function() {
          console.log("python script done");
        })
        pythonProcess.stderr.on('data', (data) => {
          fail(data);
        })
        pythonProcess.stdin.write(email);
        pythonProcess.stdin.end();
      });
    } finally {
      // release the lock only after this child has finished, otherwise
      // the next caller starts before this one's data has been read
      unlock();
    }
  }
}
I haven't tested this code, and I've implemented it where I'm dealing with arrays, with each array value calling Python... but this should at least give you a good start.

return JSON from python to node via spawn

I have a python script that takes two arguments; a directory and a file name.
The python script will create a JSON object from specific files in the directory provided and save it with the name being the second argument.
However, if the second argument is equal to the string "stream", the JSON data is output to STDOUT.
I wrote a node script that spawns a child process to call the python script from terminal and it works as intended.
"use strict";
const spawn = require("child_process").spawn;
const command = "(path to python)";
const loc = "(path to .py script)";
const acct = process.argv[2];
const output = process.argv[3];

let callPy = spawn(command, ["erik.py", acct, output], {
  cwd: loc,
  stdio: "pipe"
});

callPy.stdout.on("data", (data) => {
  if (data.toString() === "success") {
    console.log(acct, "generated");
  } else {
    console.log(data.toString());
  }
});
EDIT:
I have unmarked this issue as solved: after spending a bit more time trying to implement this, I have not come to a satisfactory solution that allows me to synchronously call a child process from node, signal the python script to emit JSON data, receive the data, and then send the data to the browser. I tried using a promise chain on the child process:
let child = require("child_process").spawn; // or spawnSync

let spawn = () => {
  let spawned = child(command, args, options, (err, stdout, stderr) => {
    if (err) { console.log(err) };
  });
  return spawned
};

let listen = (child) => {
  child.stdout.on("data", (data) => {
    console.log("PID", child.pid);
    console.log("data from listen func: ", data);
    return child
  });
};

let kill = (child) => {
  child.kill("SIGTERM");
}

var p = new Promise((res, e) => {
  res( spawn() )
  e( console.error(e) )
});
p.then(result => {
  return listen(result);
})
p.then(result => {
  return kill(result);
});
using spawn() the child terminates before any of the data is returned
using spawnSync() the promise chain tries (and fails) to listen on the child's io before the child is spawned
I have yet to try websockets to transmit the data but I doubt that will solve this, the promise is returning an empty object to my router function invocation before the promise chain retrieves the chunks from the python script.
Any further insight is welcome.
So you need at least two things to do this:
A way to queue commands to execute with spawn
An async pattern to wait for a command to finish and join processes when each executable terminates
A minimal example is:
var cmd = new CommandLine({
  debug : true,
  error : true,
  delay : true });

// commandItemsArray is a list of command lists: command, options, arguments
commandItemsArray = [ ['ls','-l','./'], ['ls','-a','./'] ];
cmd.executeCommands( commandItemsArray
  , function(results) {
      console.log( results );
  }
  , function(error) {
      console.log( error );
  });
There are several packages on npm that do both (search for node cli, command line, etc.); one is node-commander, which uses a Promise.all pattern to achieve the second task:
function PromiseAll(items, block, done, fail) {
  var promises = [], index = 0;
  items.forEach(function(item) {
    promises.push( function(item, i) {
      return new Promise(function(resolve, reject) {
        // use the captured index `i`, not the outer mutable `index`
        return block.apply(this, [item, i, resolve, reject]);
      });
    }(item, ++index))
  });
  Promise.all(promises).then(function AcceptHandler(results) {
    if (done) done( results );
  }, function ErrorHandler(error) {
    if (fail) fail( error );
  });
} // promiseAll
I was able to resolve this issue relatively simply using websockets:
the client submits the request, which is communicated to the server via Socket.IO; the server receives it and triggers the spawn; and when the chunks have finished appending, a termination event is emitted, which kills the child process and returns the data to the client.

zeromq ROUTER,DEALER message encoding format

I have written an application in which a Node.js (main) app and a Python client communicate with each other using the ZeroMQ ROUTER/DEALER pattern.
The problem is that I cannot read the messages sent from the clients to the Node.js (router) app; they arrive encoded somehow.
the code is as simple as below:
var responder = zmq.socket('router');

responder.on('message', function(request) {
  console.log(request);
  // I cannot read the messages here; they are obfuscated
});

responder.bind('tcp://127.0.0.1:8000', function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log('Listening on 8000...');
  }
});
python:
import zmq

context = zmq.Context()
socket = context.socket(zmq.DEALER)
socket.connect("tcp://127.0.0.1:8000")
socket.send('blaaaa')
print 'message sent!'
If you wish to use the DEALER/ROUTER sockets, then the message is actually given as the second argument for the callback function.
var responder = zmq.socket('router');
responder.on('message', function(header, body) {
  console.log(body.toString('utf8'));
});
The message is in the format of a Buffer, but you can turn it into a string using .toString(encoding);
The header contains an identity, this allows you to later route the response/answer back to the correct sender/requester that made the original request.
For your application, PUSH-PULL seems more appropriate:
var zmq = require('zmq');
var responder = zmq.socket('pull');

responder.on('message', function(request) {
  console.log(request.toString());
  // Use `toString` to convert the Buffer to a string
});

responder.bind('tcp://127.0.0.1:8000', function(err) {
  if (err) {
    console.log(err);
  } else {
    console.log('Listening on 8000...');
  }
});
import zmq
context = zmq.Context()
socket = context.socket(zmq.PUSH)
socket.connect("tcp://127.0.0.1:8000")
socket.send('blaaaa')
print 'message sent!'
