Running Python Script From Informatica Cloud

I am trying to run a Python script that runs a bunch of queries against my tables in my Snowflake database and, based on the results of those queries, stores the output in Snowflake tables. The new company I work for leverages Informatica Cloud as its ETL tool, and while my setup works on Microsoft Azure (ADF) and Azure Batch, I cannot figure out for the life of me how to trigger the Python script from the Informatica Cloud Data Integration tool.

This can be tricky in a cloud implementation.
You can create an executable from your .py script, put that file on the Informatica Cloud Secure Agent server, and call it using a shell command.
You can also put the .py file on the agent server and run it with a shell command like
$PYTHON_HOME/python your_script.py
You need to make sure the Python version is compatible and that all required packages are installed on the agent server.
Once the shell command is set up, you can run it as part of a workflow and schedule it if needed.
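For reference, a minimal sketch of the kind of script such a shell command would invoke, using the snowflake-connector-python package; the credentials, table names, and queries below are placeholders, not values from the question:

# pip install snowflake-connector-python  (must be installed on the agent server)
import snowflake.connector

# Placeholder credentials -- in practice, read these from a config file or
# environment variables available to the Secure Agent.
conn = snowflake.connector.connect(
    user="MY_USER",
    password="MY_PASSWORD",
    account="MY_ACCOUNT",
    warehouse="MY_WH",
    database="MY_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Hypothetical check query; store its result in a hypothetical results table.
    cur.execute("SELECT COUNT(*) FROM SRC_TABLE WHERE load_date = CURRENT_DATE")
    row_count = cur.fetchone()[0]
    cur.execute(
        "INSERT INTO QC_RESULTS (check_name, result) VALUES (%s, %s)",
        ("daily_row_count", row_count),
    )
finally:
    conn.close()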

Related

How to run a Robot Framework script from an Azure Databricks notebook?

Is it possible to run a Robot Framework test suite using an Azure Databricks notebook?
I have a set of Robot Framework test suites that use the Database library, the Operating System library, etc.
On my local machine, I install Python, pip install all the necessary libraries, and then run my robot code like
"python -m robot filename.robot"
I want to do the same using Azure Databricks notebooks. Is it possible?
Databricks supports 4 default languages: Python, Scala, SQL, and R.
I was unable to find any documentation that shows the use of Robot Framework on Databricks.
However, you can try running the same commands on Azure Databricks that you ran on your local machine.
Databricks is essentially a cloud infrastructure provider for running your Spark workloads, with some add-on capabilities.
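For example, since Robot Framework is itself a Python package, something along these lines may work in a Python notebook cell (untested on Databricks; the suite path and output directory are placeholders):

# In a Databricks notebook, install the packages first in their own cell:
# %pip install robotframework robotframework-databaselibrary

# Robot Framework exposes a Python entry point equivalent to "python -m robot".
from robot import run

# Run a suite uploaded to DBFS (placeholder path) and write results next to it.
result_code = run("/dbfs/tests/filename.robot", outputdir="/dbfs/tests/results")
print("robot exit code:", result_code)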

Firebase Cloud Functions running a python script - needs dependencies

I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works; I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
"the logs in the Firebase console show that the python script is missing dependencies."
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI and should instead use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it will be the right way to run Python in Cloud Functions.
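As an illustration, a minimal HTTP-triggered Cloud Function on the Python runtime might look like the sketch below; the function name, deploy command, and algorithm body are assumptions. Dependencies go in a requirements.txt that Cloud Functions installs during deployment, which avoids copying site-packages by hand:

# main.py -- deployed with something like:
#   gcloud functions deploy run_algorithm --runtime python39 --trigger-http
# A requirements.txt in the same directory lists the dependencies, e.g. numpy.

import numpy as np

def run_algorithm(request):
    # HTTP entry point; expects JSON like {"values": [1, 2, 3]}.
    data = request.get_json(silent=True) or {}
    values = data.get("values", [])
    # Placeholder for the real algorithm -- here just a numpy mean.
    result = float(np.mean(values)) if values else 0.0
    return {"result": result}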

Run Python script on Azure and save to SQL database

We have just signed up with Azure and were wondering how to schedule and run Python scripts that extract data from various sources like APIs, web-scraping scripts, etc. What is the best tool on Azure that can run and schedule those scripts as well as save the output to a target destination?
The output of the scripts will be saved to a data lake and/or an Azure SQL database.
Thank you.
There are several services in Azure that can do this task.
I suggest you make use of Azure WebJobs (it supports Python as well as running on a schedule).
The rough guidelines are as below:
1. Develop your Python script locally and make sure it works locally (e.g., extracts data from other sources, saves to the Azure database).
2. In the Azure portal, create a scheduled WebJob. During creation, you need to upload the .py file (zip all the files into a .zip file). For "Type", select "Triggered"; in the "Triggers" dropdown, select "Scheduled"; then specify at which time to run the .py file by using a CRON expression.
3. It's done.
You can also consider other Azure services like Azure Functions with a timer trigger, but the WebJob is much easier.
Hope it helps, and please let me know if you still have more issues.
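For instance, a sketch of the kind of script such a WebJob could run; the API URL, connection string, and target table are placeholders. A settings.job file in the same zip can hold the CRON schedule instead of setting it in the portal:

# run.py -- zipped and uploaded as a triggered WebJob.
# An optional settings.job alongside it, e.g. {"schedule": "0 0 6 * * *"},
# makes the WebJob run daily at 06:00.
import requests
import pyodbc

API_URL = "https://api.example.com/data"  # placeholder source API
CONN_STR = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=myuser;Pwd=mypassword;"
)

def main():
    rows = requests.get(API_URL, timeout=30).json()
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        for row in rows:
            # Hypothetical target table with (id, value) columns.
            cur.execute(
                "INSERT INTO dbo.ApiData (id, value) VALUES (?, ?)",
                row["id"], row["value"],
            )
        conn.commit()

if __name__ == "__main__":
    main()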

Test Python shell job scripts: dev endpoint?

I would like to set up a Python shell job as an ETL job. This job would do some basic data cleaning.
For now, I have a script that runs locally. I would like to test this script (or parts of it). I tried to set up a dev endpoint as explained here: https://docs.aws.amazon.com/glue/latest/dg/dev-endpoint-tutorial-repl.html
Everything went fine, and I can ssh into the machine. But with this tutorial I get a "gluepyspark" shell, and this shell isn't compatible with the AWS Data Wrangler library: https://github.com/awslabs/aws-data-wrangler
I would like to know if it's possible to set up a dev endpoint to test Python shell jobs. Alternatively, I would also accept a workflow for testing Python shell jobs.
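By way of illustration, since a Python shell job is plain Python rather than Spark, the local-testing workflow described above can be as simple as running the script in a virtual environment with the same library installed; a minimal sketch, assuming the awswrangler package and placeholder S3 paths:

# pip install awswrangler   (the library a Glue Python shell job would also use)
import awswrangler as wr

# Placeholder S3 locations -- point these at real buckets when testing.
SOURCE = "s3://my-bucket/raw/data.csv"
TARGET = "s3://my-bucket/clean/data.parquet"

# Basic data cleaning, runnable locally with normal AWS credentials,
# then uploadable unchanged as a Glue Python shell job.
df = wr.s3.read_csv(SOURCE)
df = df.dropna().drop_duplicates()
wr.s3.to_parquet(df=df, path=TARGET)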

Execute python script on AWS GPU cloud

I have a Python script that takes a lot of time to execute, so we decided to execute it on an AWS GPU instance, but it is part of an ASP.NET web application.
What I want to do is: from my ASP.NET application, send a text file to the Python script. The Python script executes on the GPU when required and comes back with the results.
Is it possible? Can we create a Python web service on an AWS GPU instance?
I am new to Amazon AWS.
Can anyone please help me?
My Python script runs properly on my PC, and I have created an Amazon AWS account.
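One common pattern for this is to expose the script as a small HTTP service on a GPU EC2 instance and have the ASP.NET application POST the text file to it; a minimal sketch using Flask, where the endpoint name and processing function are placeholders:

# pip install flask -- run this on the GPU EC2 instance.
from flask import Flask, request, jsonify

app = Flask(__name__)

def process_on_gpu(text):
    # Placeholder for the real GPU-bound computation.
    return text.upper()

@app.route("/process", methods=["POST"])
def process():
    # The ASP.NET app POSTs the text file as multipart form data.
    uploaded = request.files["file"]
    result = process_on_gpu(uploaded.read().decode("utf-8"))
    return jsonify({"result": result})

if __name__ == "__main__":
    # Bind to all interfaces so the ASP.NET app can reach it;
    # restrict access with security groups in practice.
    app.run(host="0.0.0.0", port=5000)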
