Novice here, apologies if this is a rookie question. I have set up a test Azure VM (using the $200 free credit) and am trying to make API calls to two different websites using Python. Both calls work fine on my local desktop, but when I run them on the Azure VM, one API fails (response code 403) while the other succeeds (response code 200). Neither API request takes long when run on my desktop - each takes about 2 seconds to complete.
Here are the two API calls, with their results, when run on the VM:
In the Azure VM Networking screen, I have added an Inbound rule to allow any HTTPS.
VM > Networking > Inbound port rules
My goal is to be able to run my Python code on a task scheduler on the Azure VM and write the results to SQL Server on the VM. But I can't get both API calls to be successful. Thanks for any guidance on this.
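Here is a minimal version of what I'm running (the URLs are placeholders for the two APIs). It prints each response's status code and Server header, which can at least show which service is returning the 403:

```python
# Minimal repro: fetch both APIs and print status + Server header.
# The URLs below are placeholders, not the real endpoints.
import urllib.request
import urllib.error

def fetch_status(url):
    """Return (status_code, headers dict) for a GET, without raising on 4xx."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status, dict(resp.headers)
    except urllib.error.HTTPError as err:
        # HTTPError carries the response headers for 4xx/5xx replies
        return err.code, dict(err.headers)

def summarize(status, headers):
    """One-line summary used to compare the two endpoints."""
    server = headers.get("Server", "unknown")
    return f"status={status} server={server}"

if __name__ == "__main__":
    for url in ("https://api-one.example.com/", "https://api-two.example.com/"):
        print(url, summarize(*fetch_status(url)))
```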
I would like to run a Python script on Azure that connects to an Azure Service Bus queue and saves the data to a database. I installed my script on an Azure App Service plan, but the script runs only while I am connected to the app over SSH (why?). I have seen many posts recommending either Azure Functions (but they are triggered by HTTP requests, which doesn't fit my case) or runbooks, but I have no idea how those work or whether they fit my case. Do you have any advice?
What should I use: an App Service app, Functions, or a runbook?
or
how can I make my script keep running even when I don't have an SSH connection to the app?
Thank you in advance for your help
I've created a Python script that grabs information from an API and sends it in an email. I'd like to automate this process so that it runs on a daily basis, say at 9 AM.
The servers must be asleep whenever they are not running this automation.
What is the easiest way to achieve this?
Note: I'm on the free tier of AWS.
Cloud9 is an IDE that lets you write, run, and debug your code with just a browser.
"It preconfigures the development environment with all the SDKs, libraries, and plug-ins needed for serverless development. Cloud9 also provides an environment for locally testing and debugging AWS Lambda functions. This allows you to iterate on your code directly, saving you time and improving the quality of your code."
Okay, for the requirement you have posted:
There are two ways of achieving this.
On a local system, use the cron job scheduler daemon to run the script. Here is a tutorial for cron.
The same thing can also be achieved with a Lambda function. Lambda only runs when it is triggered, consuming compute resources only for the time it is invoked, so your servers are asleep the rest of the time (technically you are not provisioning any server for Lambda at all).
Convert your script into a Lambda handler function, then use the EventBridge service, where you can specify a cron expression to run your script every day at 9 AM. I wrote an article on this that may help.
Note: for the email service you can use SES (https://aws.amazon.com/ses/); my article uses SES.
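A minimal sketch of the Lambda route, with illustrative names (the API URL and email addresses are placeholders, not from the question; an EventBridge rule with the cron expression cron(0 9 * * ? *) would invoke the handler daily at 9 AM UTC):

```python
# Sketch of a Lambda handler for the "fetch API, email the result" pattern.
# API_URL and the SES addresses are placeholders.
import json
import urllib.request

API_URL = "https://api.example.com/data"  # placeholder endpoint

def fetch_data(url=API_URL):
    """Call the API and return the parsed JSON payload."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def build_email_body(payload):
    """Format the API response as a plain-text email body."""
    lines = [f"{key}: {value}" for key, value in sorted(payload.items())]
    return "Daily report\n" + "\n".join(lines)

def lambda_handler(event, context):
    payload = fetch_data()
    body = build_email_body(payload)
    # Sending via SES would look roughly like this (boto3 ships with Lambda):
    # import boto3
    # boto3.client("ses").send_email(
    #     Source="sender@example.com",
    #     Destination={"ToAddresses": ["you@example.com"]},
    #     Message={"Subject": {"Data": "Daily report"},
    #              "Body": {"Text": {"Data": body}}},
    # )
    return {"statusCode": 200, "body": body}
```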
To schedule events you'd need a Lambda function triggered by CloudWatch Events, as follows: https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
Cloud9 is an IDE.
I have a Python script that takes 20 minutes to run. I need to be able to trigger this script via my Azure .NET application.
I am looking for a possible cloud based host to help me do this. Preferably Azure, but open to other options.
I have tried the following Options:
Azure Functions
Assessment: Found too many limitations on code structure (e.g., Python files must be organized in a certain way)
Azure Web App
Assessment: Works to create an endpoint but has timeout issues for long requests
Azure Virtual Machine (VM)
Assessment: I simulated a trigger by scheduling the script to run frequently on the VM. This is not a bad solution, but not ideal either
What other viable options exist?
You can also choose to use Azure WebJobs for this purpose.
It has a setting to specify the idle timeout (WEBJOBS_IDLE_TIMEOUT):
The value is in seconds; for example, 3600 means the job times out after one hour of idle time. Note that this option affects all scheduled WebJobs, whether under a Web App or an Azure Function.
Reference: https://jtabuloc.wordpress.com/2018/06/05/how-to-avoid-azure-webjob-idle-timeout-exception/
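For reference, the setting can be applied with the Azure CLI (the app and resource-group names below are placeholders):

```shell
# Set the WebJobs idle timeout to one hour (3600 seconds).
# <my-app> and <my-rg> are placeholders for your app and resource group.
az webapp config appsettings set \
  --name <my-app> \
  --resource-group <my-rg> \
  --settings WEBJOBS_IDLE_TIMEOUT=3600
```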
I am new to Microsoft Azure. I created a bot in Python that takes a message sent in Messenger to a Facebook page I created, passes each posted message into a function, and produces output to present back in Messenger (through a webhook).
Since I have no web space, my Python script is run locally on my machine, and I am using local hosting to 'communicate' back and forth with Messenger, using my local hosting address for my webhook link. The script must be running locally, with local hosting up, for my bot to work in Facebook Messenger.
I was wondering whether there is a way to run my script directly from a server on Azure, when and if I get web space, instead of locally from my machine (as I THINK Azure is the place to do this from), and how I would run my script from there on a 'constant basis'. Thank you for any assistance.
I'm getting my feet wet with GCP and GAE, also nodejs and python and networking (I know).
[+] What I have:
Basically I have some nodejs code that takes in some input and is supposed to then send that input to some python code that will do more stuff to it. My first idea was to deploy the nodejs code via GAE, then host the python code in a python server, then make post requests from the nodejs front-end to the python server backend.
[+] What I would like to be able to do:
Just deploy both my nodejs code and my python code in the same project and instance of GAE, so that nodejs is the frontend that people see, but the python server also runs in the same environment and can communicate with nodejs without sending anything over the public internet.
[+] What I have read
https://www.netguru.co/blog/use-node-js-backend
Google App Engine - Front and Backend Web Development
and countless other Google searches for this type of setup, but to no avail.
If anyone can point me in the right direction I would really appreciate it.
You can't have both python and nodejs running in the same instance, but they can run as separate services, each with their own instance(s) inside the same GAE app/project. See Service isolation and maybe Deploying different languages services to the same Application [Google App Engine]
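A minimal sketch of that separate-services layout, assuming flexible-environment services (the service name and file paths are illustrative):

```yaml
# app.yaml -- the default (frontend) nodejs service
runtime: nodejs
env: flex
---
# backend/app.yaml -- the python service, deployed separately
# with `gcloud app deploy backend/app.yaml`
service: python-backend
runtime: python
env: flex
```

Each service is deployed with its own `gcloud app deploy` call, and both live inside the same GAE app/project.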
Using POST requests can work pretty well, but it will likely take some effort to ensure no outside access.
Since you intend to use the nodejs service as the frontend, you're limited to the flexible environment for it, which limits the inter-service communication options: you can't use push queues (properly supported only in the standard environment), which IMHO would be a better/more secure solution than POST requests.
Another secure communication option would be for the nodejs service to place the data into the Datastore and have the python service pick it up from there; the Datastore is shared by all instances/versions/services inside the same GAE app. This is also more loosely coupled, IMHO: each service can function (at least for a while) without the other being alive, which is not possible with POST requests.
Maybe of interest: How to tell if a Google App Engine documentation page applies to the standard or the flexible environment
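On the python side, the Datastore hand-off could look roughly like this (the entity kind and property names are my own, not from the docs; it assumes the google-cloud-datastore client library):

```python
# Sketch of the Datastore hand-off: the nodejs frontend writes
# "HandoffTask" entities; this python service polls for unprocessed ones.
# Kind and property names ("HandoffTask", "payload", "done") are illustrative.
import json

def encode_payload(data):
    """Serialize the frontend's input so it fits in a Datastore property."""
    return json.dumps(data, sort_keys=True)

def fetch_pending(client):
    """Query unprocessed hand-off entities (client is a datastore.Client)."""
    query = client.query(kind="HandoffTask")
    query.add_filter("done", "=", False)
    return list(query.fetch())

def mark_done(client, entity):
    """Flag an entity as processed so it isn't picked up again."""
    entity["done"] = True
    client.put(entity)
```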
UPDATE:
Node.js is now available in the standard environment as well, so you can use those features; see:
Now, you can deploy your Node.js app to App Engine standard environment
Google App Engine Node.js Standard Environment Documentation