To start, I managed to successfully run pywin32 locally: it opened the Excel workbooks, refreshed the SQL query, then saved and closed them.
I had to download those workbooks locally from SharePoint and sync the changes back using OneDrive.
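For reference, this is roughly what that local pywin32 step looks like; the workbook path is just a placeholder for the OneDrive-synced copy.

    import win32com.client

    excel = win32com.client.DispatchEx("Excel.Application")
    excel.Visible = False
    excel.DisplayAlerts = False
    try:
        # placeholder path: the OneDrive-synced copy of the SharePoint workbook
        wb = excel.Workbooks.Open(r"C:\Users\me\OneDrive - Company\Reports\report.xlsx")
        wb.RefreshAll()                          # refresh all connections, including the SQL query
        excel.CalculateUntilAsyncQueriesDone()   # wait for background refreshes to finish
        wb.Save()
        wb.Close()
    finally:
        excel.Quit()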
My question is: would this be possible to do within SharePoint itself? Could a Python script be scheduled on a server and have the process run there in the backend through a command?
I use a program called Alteryx where I can have batch files execute scripts, and maybe I could use an API of some sort to accomplish this on a scheduled basis, since that's the only server I have access to.
I have tried looking on this site and other sources, but I can't find a post that addresses this specifically.
I use Jupyter Notebooks to write my scripts and Alteryx to build a workflow with those scripts but I can use other IDEs if I need to.
I am developing a Python script that downloads two Excel files from a web service. These two files are combined with another one stored locally on my computer to produce the final file. This final file is loaded into a database and a Power BI dashboard to finally visualize the data.
My question is: how can I schedule this to run daily if my computer is turned off? As I said, two files are web scraped (so no problem scheduling those), but one file is stored locally.
One solution that comes to my mind: store the local file in Google Drive/OneDrive and download it with the API, so my script is not dependent on my computer. But if this were the case, how could I schedule it? What service would you use? Heroku, ...?
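For what it's worth, a rough sketch of the Google Drive idea mentioned above, using google-api-python-client; the file ID and token.json are placeholders, and the OAuth setup is omitted.

    import io
    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaIoBaseDownload

    creds = Credentials.from_authorized_user_file("token.json")   # produced by an earlier OAuth flow
    drive = build("drive", "v3", credentials=creds)

    # download the "local" file kept in Drive instead of on my computer
    request = drive.files().get_media(fileId="YOUR_FILE_ID")
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, request)
    done = False
    while not done:
        _, done = downloader.next_chunk()

    with open("local_copy.xlsx", "wb") as f:
        f.write(buf.getvalue())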
I am not entirely sure about your context, but I think you could look into using AWS Lambda for this. It is reasonably easy to set up, and you can also create a schedule for running the code.
It is even easier to achieve this using the Serverless Framework. This link shows an example built with Python that will run on a schedule.
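As a hedged illustration (the handler name, URL, and storage choices are assumptions, not taken from the linked example), the Lambda side of such a job could look like this; the daily schedule itself is configured separately, e.g. with an EventBridge (CloudWatch Events) rule or the Serverless Framework's schedule event:

    import urllib.request

    def handler(event, context):
        # download one of the files from the web service (placeholder URL)
        data = urllib.request.urlopen("https://example.com/report.xlsx").read()
        # ... combine it with the other inputs (e.g. kept in S3 instead of on the local disk),
        # then write the final file to S3 / load it into the database
        return {"status": "ok", "bytes": len(data)}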
I am running the schedule package for exactly something like that.
It's easy to set up and works very well.
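A minimal sketch of that approach, assuming the process stays running on some always-on machine; job() stands in for the real script:

    import time
    import schedule

    def job():
        print("running the daily export...")   # placeholder for the actual work

    schedule.every().day.at("03:00").do(job)

    while True:
        schedule.run_pending()
        time.sleep(60)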
We have just signed up with Azure and are wondering how to schedule and run Python scripts that extract data from various sources like APIs, web scraping scripts, etc. What is the best tool on Azure that can run and schedule those scripts as well as save the output to a target destination?
The output of the scripts will be saved to data lakes and/or an Azure SQL database.
Thank you.
There are several Azure services that can do this task.
I suggest making use of Azure WebJobs (it supports Python and supports running on a schedule).
The rough guidelines are as below:
1. Develop your Python scripts locally and make sure they work locally (extract data from the other sources, save to the Azure database); a minimal sketch of such a script follows this list.
2. In the Azure portal, create a scheduled WebJob. During creation, you need to upload the .py file (zip all the files into a .zip file); for "Type", select "Triggered"; in the "Triggers" dropdown, select "Scheduled"; then specify when to run the .py file using a CRON expression.
3. That's it.
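As mentioned in step 1, here is a minimal sketch of the kind of .py file you would zip and upload; the API URL, table, and connection string are placeholders, and pyodbc with an appropriate ODBC driver is assumed to be available:

    import json
    import urllib.request
    import pyodbc

    # placeholder source: any API / scraping logic goes here
    data = json.loads(urllib.request.urlopen("https://example.com/api/metrics").read())

    # placeholder Azure SQL connection string
    conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};Server=myserver.database.windows.net;"
        "Database=mydb;Uid=user;Pwd=password;Encrypt=yes;"
    )
    cur = conn.cursor()
    for row in data:
        cur.execute("INSERT INTO metrics (name, value) VALUES (?, ?)", row["name"], row["value"])
    conn.commit()
    conn.close()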
You can also consider other Azure services, like an Azure Function with a timer trigger, but the WebJob is much easier.
Hope it helps; please let me know if you still have more issues with it.
I've made a PowerApps app which uploads an image to SharePoint. When Flow detects that this image is uploaded, I want to run a custom script that can interact with the Excel file. PowerShell should accomplish that, but I'm completely lost when it comes to running the PowerShell code from Flow.
My goal is to use an Excel macro to combine the image and an Excel file that is stored in the same location in SharePoint. PowerShell will execute the macro and delete the picture after.
I've found this guide: https://flow.microsoft.com/en-us/blog/flow-of-the-week-local-code-execution/, but I don't think it will work for me, as the app will be running on more devices than just my local computer.
What technology can I use to run code using Flow as a trigger? The code must have access to a specific SharePoint site as well.
I believe you can create an Azure Function that will execute PowerShell. This will execute from the cloud rather than on your local machine.
I'd also like to add a solution that worked great for me: Using Flow to send HTTP requests to a REST API!
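To illustrate (this is not taken from the original answer), the request Flow's HTTP action would send to delete the uploaded picture via the SharePoint REST API looks roughly like this; it is shown with Python requests only for readability, and the site URL, file path, and token are placeholders:

    import requests

    site = "https://contoso.sharepoint.com/sites/mysite"
    file_url = "/sites/mysite/Shared Documents/uploaded-image.jpg"

    resp = requests.post(
        site + "/_api/web/GetFileByServerRelativeUrl('" + file_url + "')",
        headers={
            "Authorization": "Bearer <access token>",
            "X-HTTP-Method": "DELETE",   # SharePoint REST convention for deletes
            "IF-MATCH": "*",
        },
    )
    resp.raise_for_status()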
I just learned about Google Apps Script and am wondering whether this is a solution for me.
I have a Python script on my desktop which eventually creates a CSV file stored on my computer. In the end it would be great to have the values from this CSV file appended to an existing Google Spreadsheet.
So now I'm wondering: is it possible to create a Google Apps Script which fetches these values from the locally stored CSV, and ideally even to call this Google Apps Script from within my Python script?
Python scripts can invoke Google Apps Script via its Execution API. (You'll find a Python Quickstart at that link.)
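A hedged sketch of that call with google-api-python-client; the script ID, the function name appendCsvValues, and token.json are placeholders, and the scopes must match whatever the Apps Script itself uses (the Quickstart covers the OAuth flow):

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    creds = Credentials.from_authorized_user_file(
        "token.json", ["https://www.googleapis.com/auth/spreadsheets"]
    )
    service = build("script", "v1", credentials=creds)

    # call a (hypothetical) Apps Script function that appends rows to the spreadsheet
    request = {"function": "appendCsvValues", "parameters": [["2024-01-01", "42"]]}
    response = service.scripts().run(scriptId="YOUR_SCRIPT_ID", body=request).execute()
    print(response.get("response", {}).get("result"))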
To go the other way (have Google Apps Script pull the file from your computer) is impossible, but if your Python script puts the CSV file into your local Google Drive folder (which syncs to the cloud service), a Google Apps Script can access the synced file. You cannot trigger this based on any event; instead, it would need to be time based. (Fine if you know the file is generated periodically, e.g. daily at 3 AM, say.)
Yes. You can go down the Execution API route. For a simpler approach, you could have a doGet() function in your script, make an HTTP GET request from Python to the script's URL, and pass on the data as an input. Example here: GAS as backend service
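A rough sketch of the doGet() approach, assuming the script is deployed as a web app; the deployment URL and parameter name are placeholders:

    import requests

    url = "https://script.google.com/macros/s/DEPLOYMENT_ID/exec"
    resp = requests.get(url, params={"values": "2024-01-01,42"})   # doGet() reads e.parameter.values
    resp.raise_for_status()
    print(resp.text)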
I have a Python 2.7 script that produces *.csv files. I'd like to run this Python script on a remote server and make the *.csv files publicly available to read.
Can this be done on Heroku? I've gone through the tutorial, but it seems to be geared towards people who want to create a whole web site.
If Heroku isn't the solution for me, what are the alternatives? I tried Google App Engine, but it requires Python 2.5 and won't work with 2.7.
MORE DETAILS:
I have a Python 2.7 script that analyzes all stocks that trade on the AMEX, NYSE, and NASDAQ exchanges and writes the output into *.csv files that can be read with a spreadsheet application. I want the script to automatically run every night on a remote server, and I want the *.csv files it produces to be publicly available.
Web hosting
Ok, so you should be able to achieve what you need pretty simply. There are many web hosts with Python support, and your requirement is pretty simple. Just upload your Python scripts to the web server. Then you can schedule a cron job to call your script at a specific time every day. Your script will run as scheduled and should save the CSV files in the web server's document root. Keep in mind you don't need your script to run in the web server, just on the same server. The web server will just serve your static CSV files for you once you place them in its document root.
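A minimal sketch of that setup, with placeholder paths: the script writes its CSVs straight into the document root, and a crontab entry runs it nightly.

    # Example crontab entry (runs every night at 02:00):
    #   0 2 * * * /usr/bin/python /home/me/stock_report.py
    import csv
    import os

    DOCROOT = "/var/www/html/reports"   # placeholder: the web server's document root

    out_path = os.path.join(DOCROOT, "nyse.csv")
    with open(out_path, "w") as f:
        writer = csv.writer(f)
        writer.writerow(["symbol", "close", "signal"])
        writer.writerow(["XYZ", 12.34, "buy"])   # placeholder row; the real analysis goes here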
Desktop with Dropbox
Another, maybe easier, option is to take any desktop machine and schedule your Python script to run on it each night; you can do this on Windows, Linux, or Mac. Also install Dropbox, which gives you 2 GB of free online storage. Then your script just has to save the CSV files to the Dropbox/Public directory. When it does, they will automatically get synced to the Dropbox servers and can be accessed through your public URL like any other web page on the internet. You get 2 GB for free, which should be more than enough for a whole bunch of CSV files.