The company I work for is looking to run some live dashboards in the reception area to show performance, geographical location of users, and so on. They wouldn't need to be interactive, but they would need to update relatively quickly (maybe every 5 minutes?) and would be on permanently.
Is Bokeh a reasonable solution for this? I've used it in the past for personal dashboards, and the results look nice and are pretty customisable, so it instantly came to mind.
Thanks in advance for any advice...
The apps/dashboards at http://demo.bokeh.org (e.g. one specific one: https://demo.bokeh.org/sliders) have been running continuously without any intervention since 0.12.3 was released, nearly one year ago (early October 2016).
In fact, they run a bit slowly because they are still running on 0.12.3. I have not had an opportunity to re-deploy them using newer, faster Bokeh versions.
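For the non-interactive, auto-refreshing setup described in the question, a Bokeh server app with a periodic callback is usually all that's needed. A minimal sketch, where the update logic and the 5-minute interval are placeholders for your own data source:

    # dashboard.py -- run with: bokeh serve dashboard.py
    from random import random

    from bokeh.io import curdoc
    from bokeh.models import ColumnDataSource
    from bokeh.plotting import figure

    source = ColumnDataSource(data=dict(x=[], y=[]))

    fig = figure(title="Live performance", sizing_mode="stretch_both")
    fig.line(x="x", y="y", source=source)

    def update():
        # Placeholder: replace with a real query against your metrics store.
        source.stream(dict(x=[len(source.data["x"])], y=[random()]), rollover=500)

    curdoc().add_root(fig)
    # Refresh every 5 minutes (300000 ms), as suggested in the question.
    curdoc().add_periodic_callback(update, 300000)

Pointing the reception screen's browser at the served app URL then gives you the always-on, auto-updating view.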
I work as a data scientist in a very small company, so the whole DS team is very "young" in this field.
We are currently experiencing issues with collaboration, specifically when writing code together.
We've tried VS Code Live Share, which is a great extension, but due to our PCs' limitations it becomes hard to use when we are working with big dataframes.
I was looking at Deepnote, which sounds really great, but it has no support for MS SQL Server.
So, any alternatives? We're also thinking about cloud migration, e.g. Azure or AWS, but I was unable to find a proper way to do it, or to tell whether we could co-edit in real time there.
So, any help or advice?
I was recently wondering about Time Travel Debugging in relation to Python.
I found information about tools like:
RevPDB - unfortunately the last recorded activity is from 2016
timetravelpdb - unfortunately the last recorded activity is from 2015
Since these projects were last updated so long ago, I was wondering whether the tooling used for TTD has changed in the meantime.
I'm hoping for a constructive discussion, and for advice and suggestions on what to use now.
It's all about sharing knowledge.
General Overview of TTD Research
At the moment, the available solutions are those listed in the question, plus PyTrace.
As far as RevPDB and timetravelpdb are concerned, I haven't tested them at all, since the last activity in these projects was registered a few years ago, so I assumed that getting support in case of problems would be difficult.
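For intuition only, the core trick behind all of these recorders is to snapshot program state at every step so execution can be replayed backwards later. A toy sketch of that recording side using nothing but the standard library's sys.settrace hook (this is not how PyTrace or RevPDB work internally, just an illustration of the idea):

    import sys

    history = []  # list of (function name, line number, locals snapshot)

    def recorder(frame, event, arg):
        if event == "line":
            # Copy locals so later mutation does not rewrite the past.
            history.append((frame.f_code.co_name, frame.f_lineno, dict(frame.f_locals)))
        return recorder

    def demo():
        total = 0
        for i in range(3):
            total += i
        return total

    sys.settrace(recorder)
    demo()
    sys.settrace(None)

    # "Travel back in time" by replaying the recorded states in reverse.
    for name, lineno, snapshot in reversed(history):
        print(name, lineno, snapshot)

The real tools add a UI for stepping through that history and are far more efficient about what they record, but the principle is the same.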
How to start working with it?
To start with, it is worth using an interactive demo to learn about the basic functionalities and the way it works:
PyTrace Interactive Demo
If you need more information, check this site:
PyTrace Official Site
I am impressed with this project and I'm going to start using it in my daily coding routine, so I will post my thoughts and tips about it later.
Stay tuned!
I have just coded a trading algorithm and some analytics software for the stock market, which in themselves work fine.
Since my computer is not always running and my internet connection is not always reliable, I would like to move the script out and put it on a web server, for example, where it could run all day and night.
Do you guys know how I could do that?
I would also like to build a user interface using Django to monitor live performance.
Does anybody know what would be necessary to implement these steps?
Thanks in advance and kind regards
Marcel Kresse
This is a very general question and the answer is close to "the sky is the limit". As mentioned above, any cloud service provider will do.
Most (if not all) clouds have dedicated images for web servers and Django deployments. Have fun.
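As a very rough sketch of the split you typically end up with on any of those providers: the trading script runs as a long-lived process on the server and periodically writes its metrics somewhere the Django site can read them. Everything below (the run_trading_cycle function, the file path, the 60-second interval) is hypothetical and only illustrates the shape:

    # worker.py -- long-running process on the server (e.g. under systemd, supervisor, or cron @reboot)
    import json
    import time
    from datetime import datetime, timezone

    METRICS_PATH = "/var/lib/trader/metrics.json"  # hypothetical location the Django view reads

    def run_trading_cycle():
        # Placeholder for your actual trading/analytics logic.
        return {"pnl": 0.0, "open_positions": 0}

    while True:
        metrics = run_trading_cycle()
        metrics["updated_at"] = datetime.now(timezone.utc).isoformat()
        with open(METRICS_PATH, "w") as f:
            json.dump(metrics, f)
        time.sleep(60)  # hypothetical interval between cycles

On the Django side, the monitoring view would then just read that JSON (or, more robustly, a database table) and render the live performance page from it.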
I'm fairly new to web development, so this might actually be normal behavior, but when I make logic changes in my views, it can take about an hour for those changes to show up on my production site.
The changes are instant if I fire up the localhost. The server is Windows IIS 7.5. HTML, CSS, and JS changes show up instantly; it's the code in the views that takes a while to filter through. Any ideas on what is causing this and how to fix it?
Have you tried doing a manual reboot of the application pool the site is sitting in within IIS? The documentation might not be exact for your version, but it should explain it well enough to give you an idea of what's going on:
https://technet.microsoft.com/en-us/library/cc753179(v=ws.10).aspx
Basically, if you have the application pool recycle every 3 hours, when you make a change it could take up to 3 hours for the change to take effect. You also don't want it recycling every 5 minutes either. But you can do a manual recycle if you really want to see your changes.
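If you do want to force a recycle by hand instead of waiting for the schedule, and assuming your site sits in a pool called "DefaultAppPool" (substitute your own pool name), the appcmd tool that ships with IIS can do it from an elevated command prompt:

    %windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"DefaultAppPool"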
We have begun upgrading hardware and software to a 64-bit architecture using Apache with mod_jk and four Tomcat servers (the new hardware). We need to be able to test this equipment with a large number of simultaneous connections while still actually doing things in the app (logging in, etc.)
I am currently using Python with the Mechanize library to do this, but it's just not cutting it. Python threads don't give true parallelism (because of the GIL), and multiprocessing makes the local box work harder than the machines we are trying to test, since each process has to load so much into memory for Mechanize.
The bottom line is that I need something that will really hammer this thing's connections and hold a session to make sure that the sticky sessions are working in mod_jk. I need to be able to code it quickly, it needs to be lightweight, and being able to do true multithreading would be a perk. Other than that, I am open-minded.
Any input will be greatly appreciated. Thanks.
Open Source Testing Tools
Not knowing the full requirements makes it difficult; however, something from the list might fit the bill.
In order to accomplish what I wanted to do, I just went back to basics. Mechanize is somewhat bulky, and there was a lot of bloat involved in the main functionality tests I had before. So I started with a clean slate and just used cookielib.CookieJar and urllib2 to build a linear test, then ran it in a while 1 loop. This provided enough strain on the Apache system to see how it would react in the new environment, and for the record, it did VERY well.
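For anyone curious, a rough Python 2-era reconstruction of that kind of loop (the URLs, credentials, and form fields below are placeholders; only the cookielib/urllib2 plumbing reflects the approach described above):

    import cookielib
    import urllib
    import urllib2

    # One CookieJar per simulated user keeps the mod_jk sticky-session cookie alive.
    jar = cookielib.CookieJar()
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(jar))

    LOGIN_URL = "http://target.example.com/login"    # placeholder
    PAGE_URL = "http://target.example.com/app/home"  # placeholder

    while 1:
        # Log in so the app does real work, not just static serving.
        creds = urllib.urlencode({"username": "loadtest", "password": "secret"})
        opener.open(LOGIN_URL, creds).read()
        # Then hit an authenticated page; the jar re-sends the session cookie,
        # which is what exercises mod_jk's sticky sessions.
        opener.open(PAGE_URL).read()

Multiple copies of this can be run in parallel (as separate processes, or even threads, since the work is almost entirely network-bound) to simulate simultaneous users.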