Power BI Service refresh with Python script not working

A very quick question.
I have built a Power BI dashboard that runs a Python script in the Query Editor. It works fine on my desktop, but when uploaded to the Power BI Service it errors when the dataset is refreshed:
Data source error: Unable to refresh the model (id=853989) because it references an unsupported data source.
I have read online that the Power BI Service has supported Python scripting since 2019 (I think), so I'm not sure why this fails.
Thanks

Related

How to automatically sync JSON (or any data file, like CSV) to GitHub

I have a charting program I wrote in Python using Dash/Plotly and published to Heroku (which automatically deploys when there are new changes in the GitHub repo). To keep the data updated, I run a script I wrote daily that downloads the data, about 4,000 JSON files. It takes a little while, and then I have to publish to GitHub; the rest is taken care of by the automatic deployment via Heroku. My question is: how do I automate this process so that the JSON files are downloaded nightly (from the cloud, as my computer will not be on) and the GitHub repo is automatically synced with the new data sets? I just want to be able to open my app on Heroku and know that the charts are pulling from the latest available data, without having to do this every day. Currently I am using VSCode on Linux.
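One way to approach this, sketched below under stated assumptions: run the download on any always-on host and let a Python wrapper push the refreshed files to GitHub with plain git commands, after which Heroku redeploys on its own. Here download_data, the repo path, and the branch name are hypothetical stand-ins for the asker's setup, and the clone's remote is assumed to carry credentials (e.g. a personal access token in the remote URL).

# Nightly sync sketch: refresh the JSON data, then commit and push it so
# Heroku's GitHub integration redeploys automatically.
import subprocess
from pathlib import Path

REPO_DIR = Path("/srv/chart-data")  # hypothetical clone of the GitHub repo

def download_data(dest: Path) -> None:
    ...  # placeholder for the existing script that fetches the ~4000 JSON files

def sync_to_github() -> None:
    download_data(REPO_DIR)
    subprocess.run(["git", "add", "-A"], cwd=REPO_DIR, check=True)
    # Only commit if something actually changed; git commit fails on a clean tree.
    status = subprocess.run(["git", "status", "--porcelain"],
                            cwd=REPO_DIR, capture_output=True, text=True)
    if status.stdout.strip():
        subprocess.run(["git", "commit", "-m", "Nightly data refresh"],
                       cwd=REPO_DIR, check=True)
        subprocess.run(["git", "push", "origin", "main"], cwd=REPO_DIR, check=True)

if __name__ == "__main__":
    sync_to_github()

A script like this can be scheduled with a cron entry such as 0 2 * * * python /srv/sync.py on any small server, or with a hosted scheduler, so no local machine needs to stay on.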

How do I run a Python script on the Azure server?

I am new to Microsoft Azure. I created a bot in Python that takes a message posted in Messenger, from a page that I created on Facebook, and processes each message with a function that produces output to present back to Messenger (through a webhook).
Since I have no web space, my Python script runs locally on my machine, and I use local hosting to 'communicate' back and forth with Messenger, with my local hosting address as my webhook link. The script, and the local hosting, must both be running on my machine for the bot to work in Facebook Messenger.
I was wondering whether there is a way to run my script directly from a server on Azure, when and if I get web space, instead of from my machine locally (as I THINK Azure is the place to do this from), and how I would keep it running there on a 'constant basis'. Thank you for any assistance.
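For what it's worth, the usual shape of this kind of deployment is to wrap the message handler in a small web app, so that any always-on host with a public URL (an Azure App Service web app, for instance) can serve the webhook. A minimal sketch, assuming Flask; the verify token and process_message are placeholders for the asker's existing logic.

# Minimal Messenger-style webhook as a Flask app; could run on Azure App
# Service or any other always-on host instead of a local machine.
from flask import Flask, request

app = Flask(__name__)
VERIFY_TOKEN = "my-verify-token"  # assumption: token registered with Facebook

@app.route("/webhook", methods=["GET"])
def verify():
    # Messenger's one-time subscription handshake.
    if request.args.get("hub.verify_token") == VERIFY_TOKEN:
        return request.args.get("hub.challenge", "")
    return "Verification failed", 403

@app.route("/webhook", methods=["POST"])
def handle_message():
    payload = request.get_json(silent=True) or {}
    process_message(payload)
    return "ok", 200

def process_message(payload: dict) -> None:
    ...  # placeholder for the existing function that builds and sends the reply

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)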

How can I execute custom code using Flow?

I've made a PowerApps app which uploads an image to SharePoint. When Flow detects that this image has been uploaded, I want to run a custom script that can interact with an Excel file. PowerShell should accomplish that, but I'm completely lost when it comes to running the PowerShell code from Flow.
My goal is to use an Excel macro to combine the image with an Excel file that is stored in the same location in SharePoint. PowerShell will execute the macro and delete the picture afterwards.
I've found this guide, https://flow.microsoft.com/en-us/blog/flow-of-the-week-local-code-execution/, but I don't think it will work for me, as the app will be running on more devices than just my local computer.
What technology can I use to run code using Flow as a trigger? The code must have access to a specific SharePoint site as well.
I believe you can create an Azure Function that will execute PowerShell. It will run in the cloud rather than on your local machine.
I'd also like to add a solution that worked great for me: Using Flow to send HTTP requests to a REST API!
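To make the Azure Function / HTTP route concrete, here is a minimal sketch of an HTTP-triggered function that Flow could call with its built-in HTTP action. It uses the Python programming model purely for illustration (the answer above suggests PowerShell); the request field and the SharePoint work are hypothetical placeholders, and the accompanying function.json HTTP binding is assumed.

# __init__.py of an HTTP-triggered Azure Function (Python v1 model).
import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Flow's HTTP action POSTs details about the uploaded image here.
    body = req.get_json()
    image_url = body.get("imageUrl")  # hypothetical field set in the Flow

    # Placeholder: authenticate to the SharePoint site and run the
    # combine/cleanup logic here (e.g. via the Microsoft Graph REST API).
    result = {"processed": image_url}

    return func.HttpResponse(json.dumps(result), mimetype="application/json")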

Running an exe file on Azure

I am working on an Azure web app, and inside the web app I use Python code to run an exe file. The web app receives certain inputs (numbers) from the user and stores them in a text file. Afterwards, the exe file runs, reads the inputs, and generates another text file called "results". The problem is that although the code works fine on my local computer, as soon as I put it on Azure, the exe file does not get triggered by the following line of code:
subprocess.call('process.exe', cwd=case_directory.path, shell=True)
I even tried running the exe file on Azure manually from Visual Studio Team Services (formerly Visual Studio Online) using the "run from Console" option. It just did not do anything. I'd appreciate it if anyone can help me.
Have you looked at using a WebJob to host/run your executable? A WebJob can be virtually any kind of script or Windows executable. There are a number of ways to trigger your WebJob, and you also get a lot of built-in monitoring and logging for free via the Kudu interface.
@F.K I searched for some information which may be helpful for you; please see below.
According to the Python documentation for the subprocess module, using shell=True can be a security hazard; see the warning under "Frequently Used Arguments" for details.
There is a comment on that article which gives a pointer for this issue.
However, the normally recommended way to satisfy your needs is to combine Azure Queue Storage, Blob Storage, and Azure WebJobs: save each input file reference to a storage queue, then have a continuous WebJob pick the work off the queue, run the executable, and save the result files into blob storage.
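A rough sketch of that queue-driven pattern using the azure-storage-queue and azure-storage-blob packages; the queue and container names, the connection-string variable, and run_process are all assumptions standing in for the real setup.

# Continuous WebJob sketch: poll a storage queue for input file names,
# run the executable on each, and upload the result to blob storage.
import os
import subprocess
import time

from azure.storage.queue import QueueClient
from azure.storage.blob import BlobClient

CONN_STR = os.environ["AZURE_STORAGE_CONNECTION_STRING"]

queue = QueueClient.from_connection_string(CONN_STR, queue_name="inputs")

def run_process(input_name: str) -> str:
    # Placeholder for the actual exe call; assumes it writes results.txt.
    subprocess.call(["process.exe", input_name])
    return "results.txt"

while True:
    for msg in queue.receive_messages():
        result_path = run_process(msg.content)
        blob = BlobClient.from_connection_string(
            CONN_STR, container_name="results",
            blob_name="%s-results.txt" % msg.content)
        with open(result_path, "rb") as f:
            blob.upload_blob(f, overwrite=True)
        queue.delete_message(msg)  # remove the message only once processed
    time.sleep(30)  # idle between polls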

Remote server: running a Python 2.7 script and making *.csv files publicly available

I have a Python 2.7 script that produces *.csv files. I'd like to run this Python script on a remote server and make the *.csv files publicly available to read.
Can this be done on Heroku? I've gone through the tutorial, but it seems to be geared towards people who want to create a whole web site.
If Heroku isn't the solution for me, what are the alternatives? I tried Google App Engine, but it requires Python 2.5 and won't work with 2.7.
MORE DETAILS:
I have a Python 2.7 script that analyzes all stocks that trade on the AMEX, NYSE, and NASDAQ exchanges and writes the output into *.csv files that can be read with a spreadsheet application. I want the script to automatically run every night on a remote server, and I want the *.csv files it produces to be publicly available.
Web hosting
OK, so you should be able to achieve what you need pretty simply. There are many web hosts with Python support, and your requirement is straightforward. Just upload your Python script to the web server, then schedule a cron job to call it at a specific time every day. The script will run as scheduled and should save the CSV files in the web server's document root. Keep in mind you don't need your script to run inside the web server, just on the same server: the web server will serve your static CSV files for you once you place them in its document root.
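A minimal sketch of that nightly job, assuming a conventional document root and made-up output columns (kept Python 2.7-friendly to match the question); the crontab line in the trailing comment is an example schedule.

# Nightly job sketch: write the analysis output as a CSV directly into the
# web server's document root, where it is served as a static file.
import csv
import os

DOC_ROOT = "/var/www/html"  # hypothetical document root

def write_results(rows):
    # rows: iterable of (ticker, signal) pairs produced by the analysis.
    with open(os.path.join(DOC_ROOT, "stocks.csv"), "w") as f:
        writer = csv.writer(f)
        writer.writerow(["ticker", "signal"])
        writer.writerows(rows)

if __name__ == "__main__":
    write_results([("AAPL", "hold")])  # stand-in data

# Example crontab entry to run the script at 2 a.m. every night:
# 0 2 * * * /usr/bin/python /home/user/nightly.py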
Desktop with Dropbox
Another, maybe easier, option is to take any desktop machine and schedule your Python script to run on it each night; you can do this on Windows, Linux, or Mac. Also install Dropbox, which gives you 2 GB of free online storage. Your script then just has to save the CSV files to the Dropbox/Public directory. When it does, they will automatically be synced to the Dropbox servers and can be accessed through your public URL like any other web page on the internet. The 2 GB you get for free should be more than enough for a whole bunch of CSV files.
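The write itself is then just a path change; a short sketch assuming the default Dropbox folder location.

# Save into the locally synced Dropbox/Public folder; the Dropbox client
# uploads the file and exposes it at the folder's public URL.
import os

PUBLIC_DIR = os.path.expanduser("~/Dropbox/Public")  # default location assumed

with open(os.path.join(PUBLIC_DIR, "stocks.csv"), "w") as f:
    f.write("ticker,signal\nAAPL,hold\n")  # stand-in data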
