I need to make an outbound FTP request to retrieve a number of small
files. There are six files, each less than 10 KB, and I only need to
retrieve them once every couple of hours.
When I try to do this with urllib2.urlopen("ftp://xxx.xxx.xxx"), I get
an exception: AttributeError: 'module' object has no attribute
'FTP_PORT'.
I have read through the documentation and see that you are only allowed
to make HTTP and HTTPS requests from App Engine. Unfortunately my
application needs to consume the FTP data. Does this restriction mean
I can't use App Engine at all? I sincerely hope not.
So, has anyone else here found a way to make FTP requests, perhaps with
a paid account? And if not, what have other people chosen to do?
Does Azure or EC2 allow outbound FTP requests?
You're correct. Google App Engine does not allow you to make FTP requests. Not even with a paid account.
I had to use a LAMP instance on EC2 that handles the FTP transfers through cURL, and make HTTP requests to it from GAE.
This limitation used to drive me nuts; implementing the overhead around dynamically instantiating EC2 slave workers to relay FTP data felt like a real waste of time. Fortunately, as of April 9 this year (SDK 1.7.7), this isn't a problem any longer: outbound sockets (e.g. FTP) are generally available to all billing-enabled apps.
Sockets API Overview (Python): https://developers.google.com/appengine/docs/python/sockets/
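With outbound sockets enabled, the stdlib ftplib module covers the original question's use case (a handful of small files every couple of hours). This is a minimal sketch, assuming placeholder host and filenames — substitute your real server and credentials:

```python
from ftplib import FTP
from io import BytesIO

def retr_command(name):
    """Build the FTP RETR command string for one file."""
    return "RETR " + name

def fetch_files(host, filenames, user="anonymous", passwd=""):
    """Return {filename: file bytes} for each requested file."""
    results = {}
    with FTP(host) as ftp:  # FTP objects support the with-statement in Python 3
        ftp.login(user, passwd)
        for name in filenames:
            buf = BytesIO()
            ftp.retrbinary(retr_command(name), buf.write)
            results[name] = buf.getvalue()
    return results

# Hypothetical usage:
# files = fetch_files("ftp.example.com", ["a.txt", "b.txt"])
```

Since the files are small, buffering each one in memory with BytesIO is simpler than writing temp files.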
drivehq.com is another option. It provides both a web server and an FTP server, so a third party I needed to interface with (which spoke only FTP) could upload files via FTP, and I would then urlfetch them from App Engine.
Related
I have created a Google Cloud Function with a static IP, as my customer requires a specific IP address to be allowed into their FTP network. My goal now is to update this Cloud Function to connect to my customer's server and retrieve files into my own Google Cloud Storage bucket.
I've read various posts from here and here, and it seems this is not done very often. Could I ask if this is possible? If so, could you point me to which package to look into (I'm looking at pysftp, but there's no good documentation on using it from a Cloud Function)? Lastly, am I correct that Secret Manager is recommended for better privacy, especially since I will be receiving my customer's FTP password?
Thanks so much everyone for your time.
Per request, I am posting an answer here. It turns out I needed to establish an SFTP connection. Both the terms SFTP and FTP were floating around, which made it hard to get a clear picture, but I was able to use the paramiko package to connect to the desired server.
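A minimal sketch of the paramiko connection described above. The host, credentials, and directory names are placeholders; in the actual Cloud Function the password would come from Secret Manager rather than being hard-coded:

```python
import os
import stat

def remote_to_local(remote_name, local_dir):
    """Map a remote filename onto the local download directory."""
    return os.path.join(local_dir, os.path.basename(remote_name))

def download_all(host, username, password, remote_dir, local_dir, port=22):
    """Connect over SFTP and copy every regular file in remote_dir to local_dir."""
    import paramiko  # third-party: pip install paramiko

    transport = paramiko.Transport((host, port))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        for entry in sftp.listdir_attr(remote_dir):
            if stat.S_ISREG(entry.st_mode):  # skip subdirectories
                sftp.get(remote_dir + "/" + entry.filename,
                         remote_to_local(entry.filename, local_dir))
    finally:
        sftp.close()
        transport.close()
```

From there, uploading to the GCS bucket would use the google-cloud-storage client on the downloaded local files.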
I have a Python script that I want to upload to our server so that it can run every day, and part of the script's function is to download a file from an SFTP connection, do some work on it and upload it to our Amazon S3 bucket.* However, I want my program to be as lean as possible so that it doesn't mess with the other daily tasks we have on the server, so I want to use only REST services and not import anything. I want to get my requests working in a REST client (I've been using Insomnia) before I put them into code.
I've searched high and low for tutorials on how to do this, but I've found nothing. Amazon's S3 docs say things like "Request syntax: GET / HTTP/1.1", but that doesn't tell me how to actually make the request, or even what URL to make it to.
Can anyone give me some guidance (or at least a URL)? Thanks!
*Edit: As pointed out by Martin Prikryl in the comments, I cannot make SFTP requests using REST. I still want to find out how to make S3 requests though.
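On the "what URL" part of the question: S3 objects are addressable over plain HTTPS with a virtual-hosted-style URL. A minimal stdlib-only sketch, assuming a hypothetical bucket and key and a publicly readable object (a private object would additionally need an AWS Signature Version 4 Authorization header):

```python
from urllib.request import Request

BUCKET = "my-example-bucket"  # hypothetical bucket name
KEY = "reports/daily.csv"     # hypothetical object key

# Virtual-hosted-style URL: https://<bucket>.s3.<region>.amazonaws.com/<key>
url = f"https://{BUCKET}.s3.us-east-1.amazonaws.com/{KEY}"
req = Request(url, method="GET")  # pass req to urllib.request.urlopen() to fetch

print(url)
```

The "GET / HTTP/1.1" in the docs is the raw request line; the host part of this URL is what supplies the bucket and region. In a REST client like Insomnia you would paste the same URL directly.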
I’ve got a standard client-server setup with ReScript (ReasonML) on the front-end and a Python server on the back-end.
The user is running a separate process on localhost:2000 that I’m connecting to from the browser (UI). I can send requests to their server and receive responses.
Now I need to issue those requests from my back-end server, but it cannot reach the user’s machine directly. I’m assuming I need some way of doing it through the browser, which can talk to localhost on the user’s computer.
What are some conceptual ways to implement this (ideally with GraphQL)? Do I need to have a subscription or web sockets or something else?
Are there any specific libraries you can recommend for this (perhaps as examples from other programming languages)?
I think the easiest solution with GraphQL would indeed be to use Subscriptions; the most common ReScript GraphQL clients already have such a feature. At least ReasonRelay, Reason Apollo Hooks, and Reason-URQL support it.
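One way to frame the subscription idea as a tiny relay protocol: the back-end pushes "command" messages to the browser over the subscription, the browser performs the call against localhost:2000, and it posts a "result" message back via an ordinary mutation. A sketch of the message envelopes on the Python side — all field names here are hypothetical:

```python
import uuid

def make_command(method, path, body=None):
    """Envelope the back-end pushes to the browser over a subscription."""
    return {
        "id": str(uuid.uuid4()),  # correlates a command with its result
        "type": "command",
        "method": method,
        "path": path,             # path on the user's localhost:2000 server
        "body": body,
    }

def make_result(command_id, status, payload):
    """Envelope the browser sends back via a mutation."""
    return {"id": command_id, "type": "result",
            "status": status, "payload": payload}

cmd = make_command("GET", "/status")
res = make_result(cmd["id"], 200, {"ok": True})
```

The correlation id is what lets the back-end match an asynchronous result to the command that requested it, since the two travel over different channels.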
Is there a way to specify a proxy server when using urlfetch on Google App Engine?
Specifically, every time I make a call using urlfetch, I want GAE to go through a proxy server. I want to do this on production, not just dev.
I want to use a proxy because there are problems with using Google's outbound IP addresses (rate limiting, no static outbound IP, occasional blacklisting, etc.). Setting a proxy is normally easy if you can edit the HTTP message itself, but GAE's API does not appear to let you do this.
You can always roll your own:
In the case of a fixed destination: just set up fixed port forwarding on a proxy server, then send requests from GAE to the proxy. If you have multiple destinations, set up forwarding on separate ports, one for each destination.
In the case of dynamic destinations (too many to handle via fixed port forwarding), your GAE app adds a custom HTTP header (X-Something) containing the final destination and then connects to a custom proxy. The proxy inspects this header and forwards the request to the destination.
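The custom-header variant can be sketched with nothing but the standard library. The header name X-Forward-To is hypothetical (the answer's "X-Something"); the GAE app would set it to the real destination's base URL:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen
from urllib.error import URLError

FORWARD_HEADER = "X-Forward-To"  # hypothetical header name

def resolve_destination(headers, path):
    """Build the real URL from the custom header plus the request path."""
    target = headers.get(FORWARD_HEADER)
    if not target:
        raise ValueError("missing " + FORWARD_HEADER + " header")
    return target.rstrip("/") + path

class RelayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        try:
            url = resolve_destination(self.headers, self.path)
            with urlopen(Request(url)) as resp:  # replay the request downstream
                body = resp.read()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)
        except (ValueError, URLError) as exc:
            self.send_error(502, str(exc))

# To run the relay:
# HTTPServer(("", 8080), RelayHandler).serve_forever()
```

A production version would also forward the method, query string, body, and most of the original headers, and authenticate the caller so the relay isn't an open proxy.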
We ran into this issue and reached out to Google Cloud support. They suggested we use Google App Engine flexible with some app.yaml settings, custom network, and an ip-forwarding NAT gateway instance.
This didn't work for us because many core features from App Engine Standard are not implemented in App Engine Flexible. In essence we would need to rewrite our product.
So, to make the applicable URL fetch requests appear to come from a static IP, we made a custom proxy: https://github.com/csgactuarial/app-engine-proxy
For redundancy reasons, I suggest implementing this as a multi region, multi zone, load balanced system.
I'm working on a Google App Engine project that needs to access IMAP. Context.IO isn't quite powerful enough for my purposes, but I'd like something in the same spirit: I want to log into, access, and manipulate hundreds of IMAP mailboxes from within Google App Engine, using either a third-party service or an application server I put on a dedicated hosting server.
As you might imagine, this is mostly to work around the opening sockets limitation in GAE.
Any recommendations?
I don't know of any pre-made solution, but rolling your own shouldn't be very difficult or take too long. You can build on IMAPClient and SimpleXMLRPCServer on the server side, and use xmlrpclib on the client.
You would need to think about a way to retain state between calls, though, since XML-RPC is a connectionless protocol (as are most other RPC mechanisms), and you'd need to implement some form of service authentication. I have written a class inherited from SimpleXMLRPCServer which supports SSL connections and HTTP Basic Auth (xmlrpclib already has support for both). If you're interested in the code, give me a shout.
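A minimal sketch of the relay described above: an XML-RPC server that wraps IMAP access, so the GAE app can call it over plain HTTP. Mailbox host and credentials are placeholders, and the sketch uses the Python 3 module names (xmlrpc.server and the stdlib imaplib rather than the third-party IMAPClient the answer suggests):

```python
import imaplib
from xmlrpc.server import SimpleXMLRPCServer

def fetch_unseen_count(host, user, password):
    """Log in, count UNSEEN messages in INBOX, and log out."""
    conn = imaplib.IMAP4_SSL(host)
    try:
        conn.login(user, password)
        conn.select("INBOX", readonly=True)
        status, data = conn.search(None, "UNSEEN")
        return len(data[0].split()) if status == "OK" else 0
    finally:
        conn.logout()

def make_server(port=8000):
    """Expose the IMAP wrapper over XML-RPC."""
    server = SimpleXMLRPCServer(("127.0.0.1", port), allow_none=True)
    server.register_function(fetch_unseen_count)
    return server

# To run the relay (clients use xmlrpc.client.ServerProxy):
# make_server().serve_forever()
```

Passing credentials on every call sidesteps the state-retention problem at the cost of one IMAP login per request; a stateful version would keep a connection pool keyed by mailbox, plus the authentication layer the answer mentions.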
Have a look at Mailgun; it offers a robust API and supports IMAP V4.*
* IMAP mailboxes are on Mailgun