I have a main page in a Python/Pylons project which has multiple different blocks (e.g. news, demo, registration/private zone, ...).
My thought is that each block should be generated in a separate controller.
How can I call another controller method in a main page controller?
What you want to do is HMVC. I'm not sure it is easily doable out of the box with Pylons, since it's MVC.
If you have code that is repeated in multiple controllers, you could move some of this code out of the controller (in the models, or another module).
Also, if you are using Mako templates, you can reuse parts of templates by using inheritance http://www.makotemplates.org/docs/inheritance.html and by using defs http://www.makotemplates.org/docs/defs.html.
This is probably where you start moving chunks of code to library functions, to the /lib part of your Pylons project. "Generated by a separate controller" is probably going too far - you merely need to not repeat yourself. Try using library functions to make sure that the correct data is available, then use Mako's inheritance and namespace features.
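For example, a shared block can live in a Mako def that each page template pulls in through a namespace. This is a sketch only; the file names, the def, and the c.news_items attribute are made up for illustration:

## /templates/blocks.mako
<%def name="news_block(items)">
    <div class="news">
    % for item in items:
        <p>${item.title}</p>
    % endfor
    </div>
</%def>

## /templates/main.mako
<%namespace name="blocks" file="blocks.mako"/>
${blocks.news_block(c.news_items)}

The main page controller would then call a helper from /lib to put c.news_items in place before rendering, rather than delegating to another controller.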
I've found an example: https://medium.com/velotio-perspectives/a-comprehensive-tutorial-to-implementing-opentracing-with-jaeger-a01752e1a8ce
I have a pretty large codebase and I really don't want to modify every function by adding a line like "with tracer.start_span('booking') as span:". Is there any way to do it?
Thanks in advance.
Jaeger is a distributed tracer, inspired by Google's Dapper paper, and so it is mainly used for tracing communication between different processes in a microservices / distributed system architecture, not so much for portions of code inside an application.
The way Jaeger is introduced into most applications is to integrate it into the part of the application that is receiving requests from the network. For example, if your Python application is receiving HTTP requests using Django or Flask, or other types of requests (e.g. gRPC) using some other framework, there will probably be a project somewhere on the internet that lets you hook Jaeger into your framework with a couple of lines of code. For the most popular frameworks, the Jaeger docs point to opentracing-contrib as a good source for these "client libraries".
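As a sketch of what that hook-in usually looks like, assuming a Flask application plus the jaeger_client and flask_opentracing packages (the service name and sampler config here are illustrative):

from flask import Flask
from jaeger_client import Config
from flask_opentracing import FlaskTracing

app = Flask(__name__)

# initialize a Jaeger tracer (sampling everything, for demonstration)
config = Config(
    config={'sampler': {'type': 'const', 'param': 1}},
    service_name='booking-service',
)
tracer = config.initialize_tracer()

# trace every incoming HTTP request without touching the view functions
tracing = FlaskTracing(tracer, trace_all_requests=True, app=app)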
While making extra tracing calls inside an application is not unheard of or discouraged with distributed tracers, it's not something that tends to happen a lot, because distributed tracers are typically used in microservices environments where the interactions between components are more important than what's happening inside the components.
If you do want to create tracing records inside an application, then it would be very unusual to do tracing of every single function. Instead, tracing inside an application would typically be done at the boundary of components in a modular monolith, i.e. when one component calls into another component.
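If you do go down that road, a decorator keeps the per-function noise down. A minimal sketch, assuming a Jaeger tracer has already been registered as the opentracing global tracer (the operation and function names are invented):

import functools
import opentracing

def traced(operation_name):
    """Wrap a component-boundary function in its own span."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            tracer = opentracing.global_tracer()
            with tracer.start_active_span(operation_name):
                return func(*args, **kwargs)
        return wrapper
    return decorator

@traced('booking')
def book_trip(user_id, trip_id):  # hypothetical component entry point
    ...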
Lastly, if what you really want is performance analysis of your single Python application at the level of each function, and you don't care about its interaction with other applications in your system (maybe you only have the one?), then Jaeger is probably not the right tool. In that case, you would probably want to look for an Application Performance Monitoring (APM) tool that works with Python and suits your needs.
I'm studying Tkinter and I've only found tutorials on Tkinter without OOP, but looking at the Python.org documentation, it looks like it's all in OOP. What's the benefit of using classes? It seems like more work, and the syntax looks night-and-day different from what I've learned so far.
This is going to be a really generic answer, and most answers to this will be opinionated anyway. Speaking of which, the answer will likely be downvoted and closed because of this.
Anyway... Let's say you have a big GUI with a bunch of complicated logic. Sure, you could write one huge file with hundreds, if not thousands, of lines, proxy a bunch of stuff through different functions, and make it work. But the logic is messy.
What if you could compartmentalize different sections of the GUI along with all the logic surrounding them, then take those components and aggregate them into the sum that makes up the GUI?
This is exactly what you can use classes for in Tkinter. More generally, this is essentially what classes are for - abstracting things into reusable objects (instances) that provide a useful utility.
Example:
An app I built ages ago with Tkinter, when I first learned it, was a file-moving program. It let you select the source / destination directory, and had logging capabilities, search functions, monitoring of processes for when downloads complete, regex renaming options, unzipping of archives, etcetera. Basically, everything I could think of for moving files.
So, what I did was split the app up like this (at a high level):
1) Have a main class which is the aggregate of the components forming the main GUI
Aggregates were essentially a sidebar, buttons / labels for selecting various options split into their own sections as needed, and a scrolled text area for operation logging + search.
So, the main components were split like this:
2) A sidebar which had the following components
Section which contained the options for monitoring processes
Section which contained options for custom regular expressions or premade ones for renaming files
Section for various flags such as unpacking
3) A logging / text area section with search functionality built in, plus the options to dump (save) log files or view them.
That's a high-level description of the "big" components, which were composed of smaller components that were their own classes. So, by using classes I was able to wrap the complicated logic up into small pieces that were self-contained.
Granted, you can do the same thing with functions, but this way you have "pieces" of a GUI which you can consider objects (classes) that fit together. So, it just makes for cleaner code / logic.
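A stripped-down sketch of that kind of structure (the widgets and names here are invented for illustration, not the original app):

import tkinter as tk
from tkinter import scrolledtext

class Sidebar(tk.Frame):
    """Self-contained component: holds its own widgets and logic."""
    def __init__(self, parent):
        super().__init__(parent)
        tk.Button(self, text="Choose source...").pack(fill="x")
        tk.Button(self, text="Choose destination...").pack(fill="x")

class LogArea(tk.Frame):
    """Another component: a scrolled log with its own helper methods."""
    def __init__(self, parent):
        super().__init__(parent)
        self.text = scrolledtext.ScrolledText(self, height=10)
        self.text.pack(fill="both", expand=True)

    def log(self, message):
        self.text.insert("end", message + "\n")

class App(tk.Tk):
    """The aggregate: composes the components into the full GUI."""
    def __init__(self):
        super().__init__()
        self.sidebar = Sidebar(self)
        self.sidebar.pack(side="left", fill="y")
        self.log_area = LogArea(self)
        self.log_area.pack(side="right", fill="both", expand=True)

if __name__ == "__main__":
    App().mainloop()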
Like what pythonista just said...
OOP makes your GUI code more organized, and if you need to create new windows, e.g. Toplevel(), you will find it extremely useful because you won't need to write all that code again and again and again... Plus, if you have to use variables that live inside another function, you won't need to declare them as global. OOP with Tkinter is the best approach.
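For instance, a reusable window can be a Toplevel subclass that you instantiate on demand (a minimal sketch; the window contents are invented):

import tkinter as tk

class SettingsWindow(tk.Toplevel):
    """Reusable window: instantiate it each time a new one is needed."""
    def __init__(self, parent):
        super().__init__(parent)
        self.title("Settings")
        tk.Label(self, text="Settings go here").pack(padx=20, pady=20)

root = tk.Tk()
tk.Button(root, text="Open settings",
          command=lambda: SettingsWindow(root)).pack()
root.mainloop()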
How does one properly structure a larger django website such as to retain testability and maintainability?
In the best django spirit (I hope) we started out by not caring too much about decoupling between different parts of our website. We did separate it into different apps, but those depend rather directly upon each other, through common use of model classes and direct method calls.
This is getting quite entangled. For example, one of our actions/services looks like this:
def do_apply_for_flat(user, flat, bid_amount):
    assert can_apply(user, flat)
    application = Application.objects.create(
        user=user, flat=flat, amount=bid_amount,
        status=Application.STATUS_ACTIVE)
    events.logger.application_added(application)
    mails.send_applicant_application_added(application)
    mails.send_lessor_application_received(application)
    return application
The function does not only perform the actual business process; it also handles event logging and sending mails to the involved users. I don't think there's anything inherently wrong with this approach. Yet, it's getting more and more difficult to properly reason about the code and even to test the application, as it's getting harder to separate the parts intellectually and programmatically.
So, my question is, how do the big boys structure their applications such that:
Different parts of the application can be tested in isolation
Testing stays fast by only enabling parts that you really need for a specific test
Code coupling is reduced
My take on the problem would be to introduce a centralized signal hub (just a bunch of django signals in a single python file) which the single django apps may publish or subscribe to. The above example function would publish an application_added event, which the mails and events apps would listen to. Then, for efficient testing, I would disconnect the parts I don't need. This also increases decoupling considerably, as services don't need to know about sending mails at all.
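Roughly, the hub would look like this (a sketch; the module names are made up):

# hub.py - the centralized signal hub
import django.dispatch

application_added = django.dispatch.Signal()

# services.py - do_apply_for_flat just publishes:
#     application_added.send(sender=None, application=application)
# instead of calling events/mails directly.

# mails.py - subscribes, and can be disconnected in tests
from hub import application_added

def on_application_added(sender, application, **kwargs):
    send_applicant_application_added(application)
    send_lessor_application_received(application)

application_added.connect(on_application_added)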
But, I'm unsure, and thus very interested in what the accepted practice is for these kinds of problems.
For testing, you should mock your dependencies. The logging and mailing components, for example, should be mocked during unit testing of the views. I would usually use python-mock; this allows your views to be tested independently of the logging and mailing components, and vice versa. Just assert that your views make the right service calls, and mock the return value/side effect of those calls.
You should also avoid touching the database in tests. Instead, try to use as many in-memory objects as possible: instead of Application.objects.create(), defer the save() to the caller, so that you can test the services without actually having the Application in the database. Alternatively, patch out the save() method so it won't actually save, but that's much more tedious.
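A sketch of what such a test could look like with unittest.mock (the module path and factory helpers are hypothetical):

from unittest import mock

from myapp import services  # hypothetical module layout

def test_do_apply_for_flat_notifies_users():
    user, flat = make_user(), make_flat()  # hypothetical in-memory factories
    # Application.objects.create would be patched out too, per the advice above
    with mock.patch.object(services, 'mails') as mails, \
         mock.patch.object(services, 'events') as events:
        application = services.do_apply_for_flat(user, flat, bid_amount=100)
    events.logger.application_added.assert_called_once_with(application)
    mails.send_applicant_application_added.assert_called_once_with(application)
    mails.send_lessor_application_received.assert_called_once_with(application)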
Transfer some parts of your app to different microservices. This will make some parts of your app focused on doing one or two things right (e.g. event logging, emails). Code coupling is also reduced and different parts of the site can be tested in isolation as well.
The microservice architecture style involves developing a single application as a collection of smaller services that usually communicate via an API.
You might need to use a smaller framework like Flask.
Resources:
For more information on microservices, see:
http://martinfowler.com/articles/microservices.html
http://aurelavramescu.blogspot.com/2014/06/user-microservice-python-way.html
First, try to break down your big task into smaller classes. Connect them with usual method calls or Django signals.
If you feel that the sub-tasks are independent enough, you can implement them as several Django applications in the same project. See the Django tutorial, which describes relation between applications and projects.
When reading the Python documentation and various mailing lists, I always come across what looks a little bit like dogma: global variables should be avoided like hell, they are poor design... OK, why not? But there are some real-life situations where I don't know how to avoid such a pattern.
Say that I have a GUI from which several files can be loaded from the main menu.
The file objects corresponding to the loaded files may be used throughout the whole GUI (e.g. an image viewer that will display an image and on which various actions can be performed via different dialogs/plugins).
Is there something really wrong with building the following design:
Menu.py --> the file will be loaded from here
Main.py --> the loaded file objects can be used here
Dialog1.py --> or here
Dialog2.py --> or there
Dialog3.py --> or there
...
Globals.py
where Globals.py will store a dictionary whose keys are the names of the loaded files and whose values are the corresponding file objects. Then, the various parts of the code that need those data would access them via weak references.
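One way to read that design concretely (a sketch; whether a plain dict plus explicit weakrefs or a WeakValueDictionary is meant is my assumption):

# Globals.py
import weakref

# name -> file object; an entry vanishes automatically once the file
# object is no longer referenced anywhere else in the program
loaded_files = weakref.WeakValueDictionary()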
Sorry if my question looks (or is) stupid, but do you see any elegant or global-free alternatives? One way would be to encapsulate the loaded-data dictionary in the main application class of Main.py, treating it as the central access point of the GUI. However, that would also bring some complications, as this class should be easily accessible from all the dialogs that need the data, even though they are not necessarily direct children of it.
Thanks a lot for your help.
Global variables should be avoided because they inhibit code reuse. Multiple widgets/applications can nicely live within the same main loop. This allows you to abstract what you now think of as a single GUI into a library that creates such GUI on request, so that (for instance) a single launcher can launch multiple top-level GUIs sharing the same process.
If you use global variables, this is impossible because multiple GUI instances will trump each other's state.
The alternative to global variables is to associate the needed attributes with a top-level widget, and to create sub-widgets that point to the same top-level widgets. Then, for example, a menu action will use its top-level widget to reach the currently opened file in order to operate on it.
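A minimal Tkinter sketch of that arrangement (the class and attribute names are invented):

import tkinter as tk
from tkinter import filedialog

class AppWindow(tk.Tk):
    """Top-level widget owns the state that would otherwise be global."""
    def __init__(self):
        super().__init__()
        self.open_files = {}   # name -> file object, one dict per GUI instance
        MainMenu(self)

class MainMenu(tk.Menu):
    def __init__(self, top):
        super().__init__(top)
        self.top = top         # every sub-widget keeps its top-level reference
        self.add_command(label="Open...", command=self.open_file)
        top.config(menu=self)

    def open_file(self):
        # shared state is reached through the top-level widget, not a global
        name = filedialog.askopenfilename()
        if name:
            self.top.open_files[name] = open(name)

AppWindow().mainloop()

Because each AppWindow carries its own open_files dict, two GUI instances in the same process no longer trample each other's state.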
I would manage global data by encapsulating the data in one or more classes and implementing the Borg pattern for these classes.
See Why is the Borg pattern better than the Singleton pattern in Python
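A sketch of the Borg approach for the loaded-files example (the class and attribute names are invented):

class LoadedFiles:
    """Borg pattern: every instance shares the same state, so any module
    can just instantiate LoadedFiles() and see the same data."""
    _shared_state = {}

    def __init__(self):
        self.__dict__ = self._shared_state
        self.__dict__.setdefault('files', {})  # name -> file object

# in Menu.py
LoadedFiles().files['photo.png'] = object()   # stand-in for a real file object

# in Dialog1.py
print(LoadedFiles().files.keys())             # sees 'photo.png'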
I'm working on optimizing my design in terms of MVC, intent on simplifying the API of the view, which is quite nested, even though I've built composite widgets (with their own events and/or pubsub messages) in an attempt to simplify things.
For example, I have a main top-level GUI class, a wxFrame, which has a number of widgets including a notebook; the notebook contains a number of tabs, some of which are notebooks that contain composite widgets. So to call the methods of one of these composite widgets from the controller, I would have:
self.gui.nb.sub_nb.composite_widget.method()
To create a suitable abstraction for the view, I have created references to these widgets (whose methods need to be called in the controller) in the view, like so:
self.composite_widget = self.nb.sub_nb.composite_widget
so that in the controller the call is now simplified to
self.gui.composite_widget.method()
Is this an acceptable way to create an abstraction layer for the gui?
Well, that's definitely one way to handle the issue. I tend to use pubsub to call methods the old-fashioned way, though. Some people like pyDispatcher better than pubsub. The main problem with multi-dot method calling is that it's hard to debug if you have to change a method name.
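For reference, the pubsub route looks roughly like this (the topic and handler names are invented), and it avoids the controller knowing the widget path at all:

from pubsub import pub  # PyPubSub, commonly used with wxPython

# in the controller: subscribe once, no self.gui.nb.sub_nb... path needed
def on_widget_updated(value):
    print("controller received", value)

pub.subscribe(on_widget_updated, "widget.updated")

# deep inside the view, the composite widget just publishes
pub.sendMessage("widget.updated", value=42)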