Let's say I have a web page, and I want to translate it to another language.
One way to do it would be to parse the DOM, find the strings that need to be translated (like the title), run them through something like the Google Translate API, and show the result.
However, sending HTTP requests to Google would make the application slow. Is there a better way to do this, ideally within Python itself?
You could install a machine translation system locally and use it to translate the strings. One freely available system is Moses (http://www.statmt.org/moses/).
Note that Moses requires training data and a training step before it can translate anything.
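Whatever translation backend you use, the first step the question describes (pulling the translatable strings out of the DOM) can be done with the standard library alone. Here is a minimal sketch; the choice of which tags to skip is my own assumption:

```python
# Sketch: collect the text nodes of an HTML page so they can be handed
# to a translation backend (local or remote). Skipping script/style is
# an assumption about what should not be translated.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect text nodes of an HTML document, skipping script/style."""

    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.strings = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        text = data.strip()
        if text and not self._skip_depth:
            self.strings.append(text)


def translatable_strings(html_text):
    parser = TextExtractor()
    parser.feed(html_text)
    return parser.strings
```

Each string in the returned list can then be translated individually and substituted back into the document.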
My company uses Office 365 through OneLogin, so I have to log in to OneLogin first and then sign in to Office 365 (OneDrive). How do I handle this authentication in Python? I want to read an Excel file stored in OneDrive directly. I tried retrieving it with the requests module using HTTPBasicAuth and HttpNtlmAuth, but only got a 403 error. I have also looked at python-saml:
https://github.com/onelogin/python-saml
However, I am not sure this is what I need; it seems to be more about building a SAML service provider for OneLogin in Python. What is the correct approach here?
Unfortunately, I have not found a way to do this using Python packages. There is one workaround, though: I used Selenium to simulate the login, typing and clicking the way a real user would. It takes some scripting, but this approach can get past the OneLogin flow (and other complicated SSO logins).
I'm looking for a good search engine API for a project: a simple one where I can send a query and get a result set back via the API.
Google, Bing, and Yahoo all seem to have stopped providing this kind of service, and Faroo requires me to have an actual website.
Thanks.
You can look into DuckDuckGo. It is not a complete search API, but it may partially serve your purpose.
Search engines are very large, complex, and involved systems. The most valuable brand on the planet is a search engine giant, so I am not surprised that this data is not readily available for people to consume and build their own apps with.
I also think Bing gives you search results via an API; I have used it personally and it works fine.
I am not sure any of them has an official Python client (DuckDuckGo seems to have a community one), but with a little work you can build one easily.
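As a concrete sketch, DuckDuckGo exposes an Instant Answer API at https://api.duckduckgo.com/ that returns JSON; it serves topic summaries rather than full web results. The response field names parsed below ("AbstractText", "RelatedTopics") match what the endpoint returns at the time of writing, but treat them as assumptions:

```python
# Tiny sketch of a client for DuckDuckGo's Instant Answer API.
# Only the URL building and response parsing are shown as pure
# functions so they can be exercised without network access.
import json
from urllib.parse import urlencode
from urllib.request import urlopen


def build_query_url(query):
    """Build the Instant Answer API URL for a query string."""
    params = urlencode({"q": query, "format": "json", "no_html": 1})
    return "https://api.duckduckgo.com/?" + params


def parse_results(raw_json):
    """Pull the abstract and related-topic snippets out of a response."""
    data = json.loads(raw_json)
    results = []
    if data.get("AbstractText"):
        results.append(data["AbstractText"])
    for topic in data.get("RelatedTopics", []):
        if "Text" in topic:
            results.append(topic["Text"])
    return results


def search(query):
    """Run a live query (requires network access)."""
    with urlopen(build_query_url(query)) as resp:
        return parse_results(resp.read().decode("utf-8"))
```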
I would like to know what is the fastest way to turn a simple Python script into a basic web app.
For example, say I would like to create a web app that takes a keyword from the user and displays the most retweeted tweet on Twitter. If I write a Python script that can perform that task using Twitter's API, how would I go about turning it into a web app for people to access?
I have looked at frameworks such as Django, but it would take me weeks or months to learn how to use it. I just need something quick and simple. Any such alternatives?
Make a CGI script out of it. You get the request information from the web server via environment variables, and you print the desired HTML to stdout. Helper libraries such as Werkzeug can abstract away the environment-variable handling by wrapping it in a Request object.
This technique is fairly dated and not commonly used nowadays, because the script has to be started for every request and so pays the interpreter startup cost each time.
Nevertheless, it may actually be a good solution for you, because it is quick and every web server supports it.
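A complete CGI script of the kind described above can be very small. In this sketch (the `name` query parameter is made up for illustration), the query string comes in via the environment and the response, headers first, goes to stdout:

```python
#!/usr/bin/env python
# Minimal CGI script: the web server passes the request in environment
# variables and expects an HTTP response (headers, blank line, body)
# on stdout. Dropped into a cgi-bin directory, this would answer a
# request like /cgi-bin/hello.py?name=World.
import html
import os
from urllib.parse import parse_qs


def render(environ):
    """Build the full CGI response for a request described by environ."""
    query = parse_qs(environ.get("QUERY_STRING", ""))
    name = query.get("name", ["stranger"])[0]
    body = "<html><body><h1>Hello, %s!</h1></body></html>" % html.escape(name)
    return "Content-Type: text/html\r\n\r\n" + body


if __name__ == "__main__":
    print(render(os.environ), end="")
```

Keeping the response building in a plain function like `render` also makes the script easy to test without a web server.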
I have zero experience with web development, but I am working on a group project and am wondering whether it is possible to create an interaction between a simple HTML/CSS website and my Python function.
Required functionality:
I have to take a simple string input from a text box on the website and pass it to my Python function, which returns a single list of strings; that list is then passed back to the website. I would just like a basic tutorial for achieving this. Please do not link me to the Python CGI documentation; I have already read it and would like a more basic, descriptive explanation of how this is done. I would really appreciate your help.
First you will need to understand HTTP. It is a text-based protocol.
I assume that by "website" you mean a user agent, like Firefox.
Now, you're talking about an input box, which means you have already handled an HTTP request for your content. In most web applications this would actually have been several requests (one for the dynamically generated application HTML, and more for the static CSS and JS files).
CGI is the most basic way to programmatically inspect an already-parsed HTTP request and build an HTTP response yourself.
Your application is simple enough that you could probably do all the HTTP parsing yourself to gain a basic understanding of what's going on, but you would still need to learn how to write a server that listens on a socket.
To avoid all that, just find a Python application server that has already implemented all of the above and much more. There are many to choose from; for something as simple as this, use one with a small learning curve. Some of them are labelled "micro-frameworks".
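If you would rather not install anything at all, even the standard library's `http.server` is enough to wire a form to a function. In this sketch, `my_function` is a placeholder for your own list-returning function:

```python
# Minimal stdlib application server: serves a form on GET, and on POST
# runs a Python function on the submitted text and renders the
# resulting list of strings. my_function is a placeholder.
import html
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs

FORM = ('<html><body><form method="post">'
        '<input name="text"><input type="submit"></form>%s</body></html>')


def my_function(text):
    # Placeholder: return a list of strings derived from the input.
    return text.split()


class Handler(BaseHTTPRequestHandler):
    def _send(self, body):
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def do_GET(self):
        self._send(FORM % "")

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        fields = parse_qs(self.rfile.read(length).decode("utf-8"))
        text = fields.get("text", [""])[0]
        items = "".join("<li>%s</li>" % html.escape(s)
                        for s in my_function(text))
        self._send(FORM % ("<ul>%s</ul>" % items))


# To run: HTTPServer(("localhost", 8000), Handler).serve_forever()
```

This is roughly what the micro-frameworks do for you, with nicer routing and templating on top.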
Have you considered creating an app on Google App Engine (https://developers.google.com/appengine/)?
Their Handling Forms tutorial seems to describe your problem:
https://developers.google.com/appengine/docs/python/gettingstartedpython27/handlingforms
To start off, this desktop app is really an excuse for me to learn Python and how a GUI works.
I'm trying to help my clients visualize how much bandwidth they are using, when it's happening, and where their visitors are coming from. All of this would be displayed with graphs or whatever is most convenient. (Down the road, I'd like to add CPU/memory usage.)
I was thinking the easiest way would be for the app to connect via SFTP, download the specified log, and then use regular expressions to filter out the necessary information.
I was thinking of using:
Python 2.6
PySide
Paramiko
to start out with. I was looking at Twisted for the SFTP part, but I thought keeping it simple for now would be a better choice.
Does this seem right? Should I be using SFTP? Or should I try to have some subdomain of my site (i.e. app.mysite.com) push the logs to the client?
How about regular expressions to parse the logs?
SFTP or shelling out to rsync seems like a reasonable way to retrieve the logs. As for parsing them, regular expressions are what most people tend to use, although there are other approaches too. For instance:
Parse Apache logs to SQLite database
Using pyparsing to parse logs. This one is parsing a different kind of log file, but the approach is still interesting.
Parsing Apache access logs with Python. The author actually wrote a little parser, which is available in an apachelogs module.
You get the idea.
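As a small illustration of the regex approach, here is a sketch that parses one line of Apache's standard "combined" log format; adjust the pattern if your LogFormat differs:

```python
# Sketch: parse one line of an Apache "combined" format access log
# into a dict of named fields using a regular expression.
import re

COMBINED = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)


def parse_line(line):
    """Return the fields of one combined-format log line, or None."""
    match = COMBINED.match(line)
    if match is None:
        return None
    entry = match.groupdict()
    # A "-" response size means no bytes were sent.
    entry["size"] = 0 if entry["size"] == "-" else int(entry["size"])
    return entry
```

Summing the `size` fields per client or per hour is then enough for the bandwidth graphs described above.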