Using Task Scheduler vs Multithreading [closed] - python

Closed. This question is opinion-based. It is not currently accepting answers. Closed 8 years ago.
I needed to perform a heavy operation while using a Tkinter GUI, and the GUI would stop responding as soon as the operation began. So I had two choices (or that's what I think, as I'm new to Python and to programming as well): multithreading or Schtasks.
I chose the easier of the two, i.e. Schtasks, as I'm working on a deadline (and I don't know much about multithreading).
What I'm doing is accessing a Python file from a different project.
I have Schtasks run a batch file in that other project, which in turn runs the Python file I need.
Now the constraint is that the batch file can only run this Python file as a whole, not a particular method in it (isn't it?), and I need to call only a particular method.
So, my question is:
Is the approach I'm using correct? If not, what do you suggest would be better? Or should I just switch to multithreading?

Your question opens a huge topic: what you are trying to do is generally not simple, and it can have large problems you cannot even foresee unless you know the topic of multitasking very well. One issue, for example, is synchronizing access to the file you mention from different threads, processes, or tasks.
However, if you want to start somewhere and just want to write something that separates your GUI code from your computation code, I recommend you start here: http://docs.python.org/2/library/multiprocessing.html .
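As a starting point, here is a minimal sketch of that separation with the multiprocessing module. `heavy_task` is a hypothetical stand-in for your real operation; in a Tkinter app you would poll the queue from a `root.after(...)` callback instead of blocking, so the GUI keeps responding:

```python
# Sketch: run a heavy computation in a separate process so the GUI
# process stays free to handle events. heavy_task is a placeholder.
import multiprocessing as mp

def heavy_task(n, result_queue):
    # Stand-in for the long-running operation.
    total = sum(i * i for i in range(n))
    result_queue.put(total)

def run_in_background(n):
    queue = mp.Queue()
    proc = mp.Process(target=heavy_task, args=(n, queue))
    proc.start()
    return proc, queue

if __name__ == "__main__":
    proc, queue = run_in_background(10_000)
    # In a Tkinter app, this check would live in a root.after(100, poll)
    # callback, using queue.get_nowait() so the main loop never blocks.
    result = queue.get()
    proc.join()
    print(result)
```

The key point is that the GUI process never does the heavy work itself; it only starts the worker and periodically checks the queue for a result.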

Related

Dashboard for monitoring the results of an iterative program [closed]

Closed. This question needs details or clarity. It is not currently accepting answers. Closed 2 years ago.
I run a Python code in which an iterative process is done. Every few minutes an iteration is performed and the results are stored in a file. Currently, after each iteration I have to run another Python script to plot the recent results to monitor the progress. I want to have a dashboard which plots the recent results whenever the results file is updated.
It's not entirely clear what your question is, but it sounds like you want to monitor the output file for changes and plot them when the file is changed.
If you're using Linux (as the tag suggests), then I'd suggest using inotify, a Linux API that allows you to monitor filesystem events (like file writes!).
There is a Python wrapper around this, also named inotify: https://pypi.org/project/inotify/. You should be able to add a watch on your log file and run your plotting function when it's modified (perhaps by watching for the IN_CLOSE_WRITE event).
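A minimal sketch of that watch loop, assuming the `inotify` PyPI package is installed (Linux only); `replot` and the results path are hypothetical placeholders for your own plotting function and output file:

```python
# Sketch: replot whenever the results file is closed after a write.
# Requires: pip install inotify (Linux only).

def is_write_close(event_types):
    # IN_CLOSE_WRITE fires when a writer closes the file:
    # a safe moment to read it and replot.
    return "IN_CLOSE_WRITE" in event_types

def watch(path, on_change):
    import inotify.adapters  # third-party; imported lazily here
    watcher = inotify.adapters.Inotify()
    watcher.add_watch(path)
    # event_gen yields (header, type_names, watch_path, filename) tuples.
    for _header, type_names, _path, _name in watcher.event_gen(yield_nones=False):
        if is_write_close(type_names):
            on_change()

# watch("results.txt", replot)  # runs forever, replotting after each write
```

Watching for IN_CLOSE_WRITE (rather than IN_MODIFY) avoids replotting from a half-written file, since the iteration's writer has already closed it.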

Could you check if it's possible (Selenium Python automation + PHP) [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 2 years ago.
Our system is developed in PHP, and one of our coworkers developed an Amazon automation program in Python.
I am wondering if it's possible to integrate the two.
If it is, please recommend ways I can do this.
Here's the code of the Amazon automation program:
https://github.com/jasonminsookim/order_automation/blob/master/src/amzn.py
Thank you
There are lots of ways to do this, but I would weigh what you have available to you and go from there. The temp-file solution is the most general and is a common integration pattern between any two or more languages, but you can get more exotic with pipes if performance is a major concern.
Temp-file
I guess the most rudimentary way to do this would be to have the Python program write its output to a file that PHP can read, or vice versa.
Something like creating a directory called /orders where PHP puts order.json files; Python picks each one up, reads it, performs the work, and writes the result back as order-result.json. Essentially a temp-file system for communication between the two.
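A minimal sketch of the Python side of that handoff. The /orders layout and the order.json / order-result.json names are just this answer's assumptions, and the "work" here is a stand-in for the real automation:

```python
# Sketch: pick up order*.json files dropped by PHP, write back
# matching *-result.json files for PHP to collect.
import json
import pathlib

def process_orders(orders_dir):
    orders_dir = pathlib.Path(orders_dir)
    for order_file in sorted(orders_dir.glob("order*.json")):
        if order_file.name.endswith("-result.json"):
            continue  # skip results we already wrote
        order = json.loads(order_file.read_text())
        # Stand-in for the real automation: echo the order id back.
        result = {"order_id": order["order_id"], "status": "done"}
        result_path = order_file.with_name(order_file.stem + "-result.json")
        result_path.write_text(json.dumps(result))
```

In practice you'd run this in a loop or on a schedule, and you may want to write results to a temporary name and rename atomically so PHP never reads a half-written file.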
Pipes
Alternatively, depending on your setup, you could pipe results from Python into PHP using the subprocess module and a PHP CLI script that interfaces with your DB.
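A rough sketch of that pipe approach with the `subprocess` module. Since the PHP side isn't shown in the question, a small Python one-liner stands in here for the PHP CLI script (which would be invoked the same way, e.g. `["php", "handle_order.php"]`):

```python
# Sketch: send JSON to a CLI program over stdin, read JSON back from stdout.
import json
import subprocess
import sys

def send_to_cli(payload):
    # The child echoes the JSON back; a real PHP script would process it.
    child = subprocess.run(
        [sys.executable, "-c",
         "import sys, json; print(json.dumps(json.load(sys.stdin)))"],
        input=json.dumps(payload),
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(child.stdout)
```

The advantage over temp files is that there is no shared directory to clean up and no polling; the trade-off is that both programs must be running on the same machine.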

Get Python script state from another script [closed]

Closed. This question is opinion-based. It is not currently accepting answers. Closed 8 years ago.
I have a python3 program that does some stuff and a Pyramid app that allows users to interact with the program's database. Say, my program may be busy with 2 kinds of activity: 'Task1' and 'Task2'. I would like to be able to get the current activity (better with some other additional info) from my Pyramid app and show it to users. What is the best way to do that? If possible, I would prefer to avoid multithreading and all that stuff.
The only idea I have in mind is a special file on disk where my program writes its activity whenever it changes. But I guess that solution doesn't shine at performance, not to mention other drawbacks.
I think you have three options:
1 - As you said, use a special file (but that's not practical, and kind of a 1990s approach).
2 - Use a dedicated process for inter-thread communication using multithreading (a complicated solution for a simple problem).
3 - Use the database: put a special key whose value holds the program's current activity, and update it whenever the activity changes. Better JSON-formatted. (I think that's the most practical and the easiest solution.)
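A minimal sketch of option 3, using SQLite as a stand-in for whatever database the program already shares with the Pyramid app; the table and key names are assumptions:

```python
# Sketch: the worker stores its current activity as JSON under a
# well-known key; the Pyramid app reads the same key to display it.
import json
import sqlite3

def init(conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS status (key TEXT PRIMARY KEY, value TEXT)"
    )

def set_activity(conn, activity, **extra):
    # extra lets the worker attach additional info, e.g. progress.
    payload = json.dumps({"activity": activity, **extra})
    conn.execute(
        "INSERT INTO status (key, value) VALUES ('current', ?) "
        "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
        (payload,),
    )
    conn.commit()

def get_activity(conn):
    row = conn.execute(
        "SELECT value FROM status WHERE key = 'current'"
    ).fetchone()
    return json.loads(row[0]) if row else None
```

The database handles the concurrency for you, which is exactly why this beats the special-file approach: the worker and the web app never touch the same file directly.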

Python - First Interface with a Program [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 8 years ago.
I have spent the last six months learning python as a way to automate my working environment. So far I have automated data extraction and report downloading from various web-based sources through the use of webcrawlers, interacted with excel files, created visual representations of data through matplotlib, and removed almost all the monotony from bank reconciliation.
I now come to a new task which takes up a large amount of my daily workload. We use an accounts program called Sage 50 Accounts. I effectively want to begin to learn how to manipulate the data contained within this program so that my daily thought patterns can be put into Python code.
Because this hasn't been done, there's no pre-made API. So my question is:
When wishing to interact with a new program through Python, how does a programmer begin such an inquiry?
Please accept that this question is only vague and general because I'm incredibly new to such a task.
SData is Sage's general data access API layer and should suit your purposes.
Otherwise you might need to invest in or obtain a Sage Development SDK.

Python R/W to text file network [closed]

Closed. This question needs to be more focused. It is not currently accepting answers. Closed 9 years ago.
What could happen if multiple users run copies of the same Python script, which reads and writes data to a single text file stored on a network device, at the same time?
Will the processes stop working?
If so, what could be the solution?
Many bad things can happen. I don't think the processes will stop working, at least not because of concurrent access to the file, but what can happen is inconsistent file content: for example, if one process writes hello while another accesses the file concurrently, you might end up with a line like hhelllolo.
One solution I can see is to use a database, as suggested, or to create a mechanism for locking the file against concurrent access (which might be cumbersome because you're working over a network, not on the same computer).
Another solution I can think of is a simple server-side script that handles the requests and locks the file against concurrent access. This is almost the same as using a database: you'd be building a storage system from scratch, so why bother :)
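For completeness, a crude sketch of the lock-file mechanism. The O_CREAT | O_EXCL trick makes lock creation atomic on local disks, but that atomicity is not guaranteed on some network filesystems, which is one more reason to prefer the database or the server-side gatekeeper:

```python
# Sketch: serialize writes to a shared text file via a lock file.
# Acquiring = atomically creating path + ".lock"; releasing = removing it.
import contextlib
import os
import time

@contextlib.contextmanager
def locked(path, timeout=10.0, poll=0.1):
    lock_path = path + ".lock"
    deadline = time.monotonic() + timeout
    while True:
        try:
            # O_EXCL makes this fail if the lock file already exists.
            fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            break
        except FileExistsError:
            if time.monotonic() > deadline:
                raise TimeoutError(f"could not acquire {lock_path}")
            time.sleep(poll)
    try:
        yield
    finally:
        os.remove(lock_path)

def append_line(path, line):
    with locked(path):
        with open(path, "a") as f:
            f.write(line + "\n")
```

Note the obvious weakness: if a process crashes while holding the lock, the stale .lock file blocks everyone until someone removes it, which a real database would handle for you.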
Hope this helps!
