Got a bit of a weird request here, and it's one I can't really figure out the answer to.
I'm writing a Python application that displays web pages and locally stored images.
What I need is a way of displaying a web page using python that is really lightweight and quite fast. The reason for this is that it is running on a Raspberry Pi.
Of course I have many options: I can run it through the web browser installed on the Raspbian distribution as a separate process from Python, I can download an Arch-Linux-compatible browser and run that as a separate process, or I can write my own native Python viewer using GTK or PyQt.
All of these approaches have downsides as well as serious overheads. The browser must also go full screen when I have a web page to display, and be minimised when I'm displaying an image.
The main issue I have had with GTK and PyQt is that they have to run on the main thread, which is impossible as it doesn't align with my multithreaded architecture. The downside to the web browsers that come pre-installed on Raspbian is that you lack control from Python and they're slow. And finally, the issue with using an Arch-Linux browser is that it ends up being messy and hard to control.
What I would ideally need is a web browser that loads a web page almost instantaneously, or a multithreaded web browser that can handle multiple instances. That way I could buffer one web page in the background while another is being displayed.
Do you guys have any advice to point me in the right direction? I would've thought there would be a neat multithreaded Python-based solution by now; either no one needs to do what I'm doing (less likely), or I'm missing something big (more likely)!
Any advice would be appreciated.
James.
I'd use PyQt to display the page, but if the way PyQt uses threads does not fit your application, you could just write a minimalist (I'm speaking of ~10 lines of code here) web browser using PyQt and fork it from your main application.
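Something along these lines, as a minimal sketch, assuming PyQt5 with the QtWebEngineWidgets module (on older installs, QtWebKit's QWebView plays the same role); the URL comes from the command line so the main application can spawn it as a separate process:

# minimal_browser.py - meant to run as its own process
import sys
from PyQt5.QtCore import QUrl
from PyQt5.QtWidgets import QApplication
from PyQt5.QtWebEngineWidgets import QWebEngineView

app = QApplication(sys.argv)
view = QWebEngineView()
view.load(QUrl(sys.argv[1]))  # page to display, passed as the first argument
view.showFullScreen()         # full screen, as the question requires
sys.exit(app.exec_())

Your main application could then launch it with something like subprocess.Popen(['python', 'minimal_browser.py', url]) and kill the process when the page should disappear.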
The solution that I came to was to use a couple of framebuffer browsers for Linux: netsurf-fb, and links2 with its graphics-mode switch.
However, after extensive testing, I decided it was not appropriate to use these, because they don't have JavaScript support.
Therefore the end solution was to use a tool called CutyCapt running in a virtual X framebuffer (Xvfb).
I called it from Python using the pexpect library like so:
import pexpect  # CutyCapt runs under xvfb-run, so no real X display is needed

process = pexpect.spawn(xvfb_run_bin + ' --server-args "-screen 0, '
                        + self.width_height + 'x24" cutycapt --url=' + uri
                        + ' --out=' + temp_path)
process.wait()  # block until the screenshot has been written
In my implementation, this goes off, renders the page and saves the screenshot. An image viewer then fetches the image from my cache and displays it.
If anyone has any further questions, feel free to comment on this question...
IMPORTANT CutyCapt information: if you want to render a whole page that uses JavaScript, it will take longer. Specify --delay=1000 to delay the capture by one second, for instance; for me it took a delay of around 7000 ms to get it just right...
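In the pexpect call from the answer above, that just means adding the flag (the 7000 ms value is simply what worked for my pages):

# same call as before, with CutyCapt's --delay giving JavaScript time to finish
process = pexpect.spawn(xvfb_run_bin + ' --server-args "-screen 0, '
                        + self.width_height + 'x24" cutycapt --url=' + uri
                        + ' --delay=7000 --out=' + temp_path)
process.wait()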
I have written winks-up in Vala. It's small and fast and compiles well on Raspbian.
All the code was optimized to reduce memory usage.
It isn't perfect, but it's better than nothing.
We (my team and I) are developing a prototype sensor that uses a Raspberry Pi to preprocess sensor readings.
We have a working Python script and made a simple GUI using guizero to test everything, but this needs a screen to be connected, and that is not possible once the setup is finished and deployed in a field, for example.
We now have the Raspberry Pi acting as a Wi-Fi hotspot; after connecting to the local network, we can access the RBPi using VNC Viewer and interact with the simple guizero GUI. This is fine for continued testing and development, but once we distribute the sensor to some test users, the VNC solution is not optimal, as it allows too much snooping around on the RBPi.
To solve this, we were thinking a nice solution would be to somehow link our Python script to a web page hosted on the RBPi. A user could then walk up to the sensor, connect to the local wireless network, type an IP into a browser, and the page that loads would allow starting/stopping the sensor, downloading measured data, managing archived data, ...
Googling points in many directions (Django, Flask, ...) and I'm too much of a beginner to choose the path to take (and to understand the (dis)advantages of all these frameworks/libraries/...).
Can someone point me in the correct direction? (We know more Python than we know HTML or PHP, so if the solution could be friendly in that sense, that's a plus.)
If you are familiar with Python, I would advise you to set up a Django application on your RasPi (their beginner tutorial covers everything you need and the whole framework is documented really well). From there you could go two ways:
Either create a single view (basically a function that gets called when a certain URL is requested) that renders some HTML with buttons you can connect to some Python code on your system,
Or create one view per function (e.g. /api/start_sensor, /api/download_data) and connect these API calls to a web view.
The latter variant would also allow controlling the sensor programmatically over the network; a rough sketch of it follows.
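Something like this, where the sensor_control module and its start()/stop() functions are hypothetical stand-ins for your existing measurement script:

# views.py in your Django app
from django.http import JsonResponse
import sensor_control  # hypothetical wrapper around your existing script

def start_sensor(request):
    sensor_control.start()  # kick off the measurement loop
    return JsonResponse({"status": "started"})

def stop_sensor(request):
    sensor_control.stop()
    return JsonResponse({"status": "stopped"})

# urls.py - map the API endpoints to the views
from django.urls import path
from . import views

urlpatterns = [
    path("api/start_sensor", views.start_sensor),
    path("api/stop_sensor", views.stop_sensor),
]

The web page itself then only needs buttons whose JavaScript calls these URLs.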
My project consists of a website where a user inputs a MusicXML file and receives a video (similar to Synthesia) based on that file. I am using Python to parse the XML file and get all the useful information. With that information, I am using PyOpenGL with GLUT to create animations, and OpenCV to save each frame to a video.
I am able to run the program locally and it works. Now I am trying to use the program within my WAMP server. So my question is: how would I go about doing this? My plan was to call the program with PHP's shell_exec(), but nothing seems to happen. I've tested shell_exec() on simple test files that return a string, and that works. I have done some research and found I can use Xvfb for headless server rendering. Any idea how I can implement this with PyOpenGL/GLUT? Also, is it OK to use PHP's shell_exec(), or should I be using something else to call my Python program?
First and foremost, decide whether you need/want GPU acceleration. There's little use in trying to get GPU-accelerated OpenGL context creation working if your target system doesn't even have a GPU.
Next you should come to terms with the fact that you'll no longer be able to use GLUT, because GLUT was designed around creating on-screen windows.
If you can live without a GPU and rely on software rasterization, you should look into OSMesa: https://mesa3d.org/osmesa.html
If you need GPU acceleration, check what GPU you'll be running on. If it's going to be an NVidia one, check out their excellent blog post on how to create a headless rendering context with EGL: https://devblogs.nvidia.com/egl-eye-opengl-visualization-without-x-server/
If it's an AMD or Intel GPU, then EGL should in theory work as well. However, using DRM+GBM will usually yield better results. There's an example project for that at https://github.com/eduble/gl
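For the software-rasterization route, creating an OSMesa context from Python might look roughly like this; this is a sketch assuming a PyOpenGL build with OSMesa support (the 640x480 size is arbitrary):

import os
os.environ["PYOPENGL_PLATFORM"] = "osmesa"  # must be set before importing OpenGL

from OpenGL import arrays
from OpenGL.osmesa import OSMesaCreateContext, OSMesaMakeCurrent, OSMESA_RGBA
from OpenGL.GL import GL_UNSIGNED_BYTE, GL_RENDERER, glGetString

ctx = OSMesaCreateContext(OSMESA_RGBA, None)    # off-screen context, no X needed
buf = arrays.GLubyteArray.zeros((480, 640, 4))  # RGBA buffer to render into
OSMesaMakeCurrent(ctx, buf, GL_UNSIGNED_BYTE, 640, 480)
print(glGetString(GL_RENDERER))  # should report a Mesa software renderer

From here you can issue normal GL draw calls and hand the buffer to OpenCV, replacing the window that GLUT would otherwise have created.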
I have to set up a program which reads in some parameters from a widget/GUI, calculates some stuff based on database values and the input, and finally sends some ASCII files via FTP to remote servers.
In general, I would suggest a Python program for these tasks: write a Qt widget as a GUI (interactively changing views, putting numbers into tables, setting up checkboxes, switching between various layers; I've never done something this complex in Python, but I have some experience in IDL with event handling etc.), and set up data classes that have functions both to create the ASCII files with the given convention and to send the files via FTP to some remote server.
However, since my company is a bunch of Windows users, each sitting at their own desktop, installing Python and all the necessary libraries on each individual machine would be a pain in the ass.
In addition, in a future version the program is supposed to become smart and do some optimization 24/7. Therefore it makes sense to put it on a server. As I personally prefer Linux, the server is already set up with Ubuntu Server.
The idea is now to run my application on the server. But how can the users access and control the program?
The easiest way for everybody to access something like a common control panel would be a browser, I guess. I have to make sure only one person at a time is sending signals to the same units, but that should be doable via flags in the database.
After some googling, next to QtWebKit, Django seems to be the first choice for such a task. But...
Can I run a full-fledged Python program underneath my web application? Is Django the right tool to do so?
As mentioned previously, in the intermediate future (~1 year) we might have to implement some computationally expensive tasks. Would it then also be possible to utilize C, as within a normal Python program?
Another question I have is about development. In order to become productive, we have to advance in small steps. Can I first create regular Python classes which can later be imported into my web application? (The same question applies to widgets/Qt.)
Finally: Is there a better way to go? Any standards, any references?
Django is a good candidate for the website, however:
It is not a good idea to run heavy functionality from a website; it should happen in a separate process.
All functions should be asynchronous, i.e. you should never wait for something to complete.
I would personally recommend writing a separate process with a message queue; the website would only ask that process for statuses and always display a result immediately to the user.
You can use Ajax so that the browser always has the latest result.
ZeroMQ or Celery are useful for implementing this functionality; a sketch with Celery follows.
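As a rough sketch of that pattern, assuming Celery with a Redis broker (the run_optimization task and its contents are hypothetical placeholders for your own heavy code):

# tasks.py - the heavy work runs in a separate Celery worker process
from celery import Celery

app = Celery("worker", broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")

@app.task
def run_optimization(params):
    ...  # long-running computation, safely outside the web process
    return {"status": "done"}

# views.py - the Django site only enqueues work and polls for status
from django.http import JsonResponse
from tasks import run_optimization

def start(request):
    result = run_optimization.delay(dict(request.GET.items()))
    return JsonResponse({"task_id": result.id})

def status(request, task_id):
    return JsonResponse({"state": run_optimization.AsyncResult(task_id).state})

The status view is exactly what the Ajax calls mentioned above would poll.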
You can implement functionality in C pretty easily. I recommend, however, that you write that functionality as pure C with a SWIG wrapper rather than writing it as an extension module for Python. That way the functionality will be portable and not dependent on the Python website.
This is part of some preliminary research, and I am having a difficult time figuring out what options might be available, or whether this is even a situation where a solution exists.
Essentially, we have an existing Python-based simulation that we would like to make available to people via the web. It can be pretty processor-intensive, so while we could just run the sim server-side and write a client that connects to it, this would not be ideal.
Writing a UI in Flash/Flex or HTML5 is not a problem. However, is there any way to keep the core simulation logic in Python without having it live server-side? Is there any existing way to embed Python modules in either of these technologies?
Thanks all.
Pyjamas: Python -> JavaScript, plus a set of widgets for use in a browser or on the desktop
Skulpt: Python implemented in JavaScript
Emscripten: C/C++ -> LLVM -> JavaScript
Empythoned: based on Emscripten and CPython; the stdlib is a work in progress, and there are bugs to file
We have begun upgrading hardware and software to a 64-bit architecture using Apache with mod_jk and four Tomcat servers (the new hardware). We need to be able to test this equipment with a large number of simultaneous connections while still actually doing things in the app (logging in, etc.)
I am currently using Python with the Mechanize library to do this, but it's just not cutting it. Python threads don't run truly in parallel (because of the GIL), and multiprocessing makes the local box work harder than the machines we are trying to test, since it has to load so much into memory for Mechanize.
The bottom line is that I need something that will really hammer this thing's connections and hold a session, to make sure the sticky sessions in mod_jk are working. I need to be able to code it quickly, it needs to be lightweight, and true multithreading would be a perk. Other than that, I am open-minded.
Any input will be greatly appreciated. Thanks.
Open Source Testing Tools
Not knowing the full requirements makes it difficult; however, something from the list might fit the bill.
In order to accomplish what I wanted, I just went back to basics. Mechanize is somewhat bulky, and there was a lot of bloat involved in the main functionality tests I had before. So I started with a clean slate and just used cookielib.CookieJar and urllib2 to build a linear test, then ran it in a while 1 loop. This provided enough strain on the Apache system to see how it would react in the new environment, and for the record, it did VERY well.
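The skeleton of that approach looks roughly like this (Python 2, since cookielib and urllib2 are the modules involved; the URLs and credentials are hypothetical placeholders):

import cookielib
import urllib2

# one CookieJar per simulated user: holding the session cookie is what
# lets mod_jk keep routing us to the same Tomcat (sticky session)
jar = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(jar))

# log in once; the session cookie lands in the jar
opener.open("http://target/app/login", "user=test&password=test")

# then replay authenticated requests as fast as possible
while 1:
    opener.open("http://target/app/some_action").read()

Several copies of this script can be run in parallel as separate processes to multiply the load.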