Is it possible to run a Robot Framework test suite using an Azure Databricks notebook?
I have a set of Robot Framework test suites that use the DatabaseLibrary, the OperatingSystem library, etc.
On my local machine, I install Python, pip install all the necessary libraries, and then run my Robot code like
"python -m robot filename.robot"
I want to do the same using Azure Databricks notebooks. Is it possible?
Databricks supports four default languages:
Python,
Scala,
SQL,
R
I was unable to find any documentation that shows Robot Framework being used on Databricks.
However, you can try running the same commands on Azure Databricks that you ran on your local machine.
Databricks is essentially a cloud infrastructure provider for running your Spark workloads, with some add-on capabilities.
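For example, one thing you could try in a notebook cell is installing Robot Framework with %pip and then invoking it through a subprocess, the same way you do locally. This is only a sketch under those assumptions; the DBFS paths and library names below are placeholders, and I have not seen this documented:

# In a separate notebook cell first, install the libraries on the cluster:
# %pip install robotframework robotframework-databaselibrary

import subprocess

# Run the suite the same way as on a local machine; the /dbfs paths
# are placeholders for wherever you upload your .robot files.
result = subprocess.run(
    ["python", "-m", "robot",
     "--outputdir", "/dbfs/tmp/robot-results",
     "/dbfs/tmp/tests/filename.robot"],
    capture_output=True,
    text=True,
)
print(result.stdout)
print(result.stderr)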
I have a Node.js API which uses child_process to run a Python file.
I know Heroku has a way to add another buildpack with a script.
Is there a way with Microsoft Azure (Web App / App Services) to use a Node API and Python files in an App Service?
PS: The Python file is not an API, it is just a script that runs from Node.
Thank you
When you host a Web App in Azure, you already have all the runtime environments by default.
(Screenshots: Web App runtime on the Azure Portal; Python runtime in Kudu Explorer; Node.js and PHP in Kudu Explorer.)
Check the other stack versions in the Console, and if they are not sufficient, update them.
Note:
When creating the Web App, it is recommended to choose the Windows platform, which supports virtual applications: when you find that you cannot use the runtime you require, you can try to install runtime extensions for other programs.
This is also possible under Linux, but it requires the command line to operate.
I am trying to run a Python script that runs a bunch of queries against tables in my Snowflake database and, based on the results of the queries, stores the output in Snowflake tables. The company I work for uses Informatica Cloud as its ETL tool, and while my tool works on Microsoft Azure (ADF) and Azure Batch, I cannot figure out for the life of me how to trigger the Python script from the Informatica Cloud Data Integration tool.
I think this can be tricky for a cloud implementation.
You can create an executable from your .py script, put that file on the Informatica Cloud agent server, and then call it using a shell command.
You can also put the .py file on the agent server and run it from the shell, like
$PYTHON_HOME/python your_script.py
You need to make sure the Python version is compatible and that you have all the required packages installed on the agent server.
Set up the shell command accordingly; then you can run it as part of a workflow and schedule it if needed.
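For reference, here is a minimal sketch of the kind of script the agent would run, using the snowflake-connector-python package; the account, credentials, and table names below are placeholders:

import snowflake.connector

# Placeholder connection details; in practice read these from config or env.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)
try:
    cur = conn.cursor()
    # Run a query against an existing table...
    cur.execute("SELECT COUNT(*) FROM source_table")
    (row_count,) = cur.fetchone()
    # ...and store the result back into a Snowflake table.
    cur.execute(
        "INSERT INTO results_table (metric, value) VALUES (%s, %s)",
        ("source_row_count", row_count),
    )
    conn.commit()
finally:
    conn.close()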
I am trying to set up a development environment for Databricks, so my developers can write code using the VS Code IDE (or some other IDE) and execute it against the Databricks cluster.
So I went through the Databricks Connect documentation and did the setup as suggested there:
https://docs.databricks.com/dev-tools/databricks-connect.html#overview
After the setup, I am able to execute Python code on the Azure Databricks cluster, but not Scala code.
While running the setup I found that it says "Skipping scala command test on windows"; I am not sure whether I am missing some configuration here.
Please suggest how to resolve this issue.
This is not an error, just a statement that databricks-connect test skips testing Scala code on Windows. You can still execute code from your local machine on the cluster using databricks-connect; you need to add the jars from the databricks-connect get-jar-dir directory to your project structure in the IDE, as described in these documentation steps: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect#intellij-scala-or-java
Also note that when using Azure Databricks, you enter a generic Databricks Host along with your workspace ID (org-id) when you execute databricks-connect configure,
e.g. https://westeurope.azuredatabricks.net/?o=xxxx instead of https://adb-xxxx.yz.azuredatabricks.net
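Once databricks-connect configure has been run, a quick sanity check from the local machine looks something like this (just a sketch; with databricks-connect installed, the session is routed to the remote cluster):

from pyspark.sql import SparkSession

# With databricks-connect, this SparkSession is backed by the remote
# Azure Databricks cluster, not a local Spark instance.
spark = SparkSession.builder.getOrCreate()

# Should print 10 if the connection to the cluster works.
print(spark.range(10).count())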
I have a Python AWS Lambda running on Linux, but due to some dependencies, I need it to be deployed on Windows. I have tried using Python Azure Functions and have successfully deployed on Linux as well, but found out they cannot be deployed on Windows. Is it possible to do this with AWS Lambda?
Basically, my solution has a few .exe files that need to be run by a Python library (Tesseract OCR and pytesseract).
AWS Lambda and Azure Functions are Function as a Service (FaaS) offerings, where the developer worries about the code and the cloud provider worries about availability, scalability, and the platform underneath that runs the code.
In that respect, you can't choose the server or OS that either of them runs on. If you need specific Windows dependencies, you must create a Python project as you normally would, install the dependencies, and configure a Windows Server yourself, taking responsibility for infrastructure and OS configuration and management.
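If the only Windows requirement is the Tesseract .exe, one workaround worth noting (not part of the answer above, and only a hedged sketch) is to bundle a Linux build of tesseract in a Lambda layer and point pytesseract at it; the /opt path below is an assumption about where your layer places the binary:

import pytesseract
from PIL import Image

# Lambda layers are mounted under /opt; the exact binary path depends
# on how the layer is packaged (placeholder below).
pytesseract.pytesseract.tesseract_cmd = "/opt/bin/tesseract"

def handler(event, context):
    # Assume the image was downloaded to /tmp earlier in the handler.
    text = pytesseract.image_to_string(Image.open("/tmp/input.png"))
    return {"text": text}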
I'm using the Azure Python SDK to deploy a Linux VM in the cloud. This VM has a public IP and SSH enabled. I need this VM to run a custom script immediately after it boots. This script would install pip, Python, Docker, etc., and start a Docker container.
How could I pass this script when deploying the VM? / How could I instruct the VM to run this script after it has started?
Cheers,
Steve
For your scenario, you could use the Azure Custom Script Extension.
The Custom Script Extension downloads and executes scripts on Azure
virtual machines. This extension is useful for post deployment
configuration, software installation, or any other configuration /
management task. Scripts can be downloaded from Azure storage or other
accessible internet location, or provided to the extension run time.
If you want to use Python to do this, please refer to the Python SDK documentation.
Please also refer to this similar question.
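As a hedged sketch of what this can look like with the Python SDK (using azure-identity and azure-mgmt-compute; the resource names and script URL are placeholders, and the exact model field names may vary across SDK versions):

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()
compute_client = ComputeManagementClient(credential, "your-subscription-id")

# Attach a Custom Script Extension to an existing Linux VM; the script
# would install your dependencies and start the Docker container.
poller = compute_client.virtual_machine_extensions.begin_create_or_update(
    "your-resource-group",
    "your-vm-name",
    "customScript",
    {
        "location": "westeurope",
        "publisher": "Microsoft.Azure.Extensions",
        "type_properties_type": "CustomScript",
        "type_handler_version": "2.1",
        "settings": {
            "fileUris": ["https://example.com/bootstrap.sh"],
            "commandToExecute": "bash bootstrap.sh",
        },
    },
)
extension = poller.result()
print(extension.provisioning_state)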