Xcode - run script only when source file is added or removed - python

In my project, I have a Python script that scans the source directory and updates a source file with what it finds. I'd like this script to run only when it needs to. At the moment, I have the script in a Run Script build phase with Input Files set to $(PROJECT).xcodeproj/project.pbxproj and Output Files set to the updated source file. This means that the script runs when I add new files, but it also runs whenever I change project settings. When the script runs unnecessarily, part of the project is recompiled even though none of the source files have changed. This is kind of annoying when all I want to do is tweak some settings.
Is there some way that I can avoid the unnecessary recompilation and run the script only when source files are added to or removed from the project?
I guess I could manually run the script whenever I add or remove a source file.
I think that Xcode is recompiling because the modification date on the file is changed. Python updates the modification date whenever you flush writes to a file. So I guess I could write to the file only when the new output differs from what's already there; I'm pretty sure reading a file won't change the modification date. That seems like a lot of fluffing around, though. If anyone's got a better solution, please let me know!
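For what it's worth, here is a minimal sketch of that idea (the function name is mine, and it assumes the script builds the whole output as a string first):

import os

def write_if_changed(path, new_content):
    # Skip the write when the content is identical, so the file's
    # modification date stays untouched and Xcode sees nothing to rebuild.
    if os.path.exists(path):
        with open(path) as f:
            if f.read() == new_content:
                return
    with open(path, "w") as f:
        f.write(new_content)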

Related

How can I execute .py script in Bitbucket pipeline?

I have a .py script for removing unused tags after a certain time, but I want it to be executable in a Bitbucket pipeline. It would also be nice if someone could explain a bit more about 'images' (Python Docker images): which one should I use, why, and so on.
Roughly, the main things I want to do in the pipeline are:
fetch all
execute the script
move the output to a certain .txt file
Let's say that I know how to do the first and last steps, but the main question is how to execute the .py script. I'm not even sure how to start.
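For reference, a minimal bitbucket-pipelines.yml for this shape of job might look something like the following; the image tag, script name, and output file are assumptions, not details from the question:

image: python:3.10

pipelines:
  default:
    - step:
        name: Run tag-cleanup script
        script:
          - python remove_unused_tags.py >> results.txt

The top-level image key picks the Docker container the steps run in; the official python images on Docker Hub already have Python installed, which is why they are the usual choice for running .py scripts.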

Running a python script and changing git branch

I am trying to find ways to make better use of my time while programming.
I have a Python script that does some heavy work (it can take hours to finish). Most of the work it does is network related, so I have plenty of CPU resources to spare.
If the script were a C binary executable, it would be fine to git checkout a different branch and do extra work; I could even modify the binary on disk, since it has been copied to RAM, so nothing I do until it finishes running will affect the program's output.
But Python scripts are interpreted, not compiled ahead of time. What happens if I start tampering with the source file? Can I corrupt the program's output, or are the text file and associated imports copied to RAM, allowing me to tamper with the source with no risk of changing the behaviour of the running program?
In general, if you have a single Python file which you run as a script, you're fine. When you run the file, it is compiled into bytecode which is then executed. You can change the original script at this point and nothing breaks.
However, we can deliberately break it by writing some horrible but legal code like this:
horrible.py:
from time import sleep
sleep(10)
import silly
silly.thing()
silly.py:
def thing():
    print("Wow!")
You can run horrible.py and while it is running you can edit silly.py on disk to make it do something else. When silly.py is finally imported, the updated version will be loaded.
A workaround is to put all your imports at the top of the file, which you probably do anyway.
When a Python program is run, it is compiled (kind of; more like translated) into bytecode, which the Python interpreter then executes; for imported modules, that bytecode is cached in .pyc files. Changing a source file should NOT affect code that is already running.
Here is a related Stack Overflow question: What will happen if I modify a Python script while it's running?
Why not have another working directory where you make your modifications? Is there a lot of ancillary data or something that makes it hard to set up a working directory? I.e. if your working directory is A, git clone A B, and then work in B. When you're done, you can pull the changes back from B to A:
git remote add B ../B
git pull B master

Sublime/Python: executing a php script via shell in a path that changes

I'm currently using an open-source SCSS compiler; however, every time I make changes to a .scss file, I have to run the compiler manually to produce the output. To combat this, I edited a Sublime package that runs commands on file saves. It goes like this:
import subprocess
import sublime_plugin

class CommandOnSave(sublime_plugin.EventListener):
    def on_post_save(self, view):
        settings = view.settings()
        current_file = view.file_name()
        if current_file.endswith("scss"):
            subprocess.call("php path/to/phpscript", shell=True)
Now, the only issue is that one folder in the path changes with every project, and it would be ideal if there were some way to execute the PHP script dynamically so I don't have to maintain a settings file with multiple paths in it.
Does anyone know of an easy way to do this without a lot of overhead? The PHP script always resides four directories up from the scss file. I've tried ../../../.. but that doesn't work (obviously), since the command isn't resolved relative to the current file's path.
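Something like this sketch might work, rebuilding the path from the saved file at run time (the four-levels-up layout and the script name "phpscript" are assumptions from the question):

import os
import subprocess
import sublime_plugin

class CommandOnSave(sublime_plugin.EventListener):
    def on_post_save(self, view):
        current_file = view.file_name()
        if current_file and current_file.endswith(".scss"):
            # Hypothetical layout: the PHP script lives four directories
            # above the saved .scss file; "phpscript" is a placeholder name.
            scss_dir = os.path.dirname(current_file)
            root = os.path.abspath(os.path.join(scss_dir, "..", "..", "..", ".."))
            subprocess.call(["php", os.path.join(root, "phpscript")])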
Any help is greatly appreciated!

Why does Task Scheduler not run some lines of code in my script?

Python novice here.
I have a Python script that performs some geodatabase management (reconcile/post versions, compress, etc). I have the following line of code in my script:
import datetime
createLog = open(str(datetime.date.today()) + ".txt", "w")
Along each step of the script, I add to the text file with statements like the following:
createLog.write("Database connections blocked.\n")
When I run the script in my IDE (PyCharm) I get the desired result: a text file with each step written to it. When I run it in Task Scheduler, no .txt file is created, and therefore no log. Everything else runs, as far as I can tell; I'm able to track the edits made to the data.
I have experienced things like this with Task Scheduler before but have never been able to resolve the problem.
Any ideas?
I think this is a working directory problem. Python's open function opens a file in the current working directory, NOT in the same folder as the script. This is a common misconception! (Which confused me for ages when learning Python...)
So what is a working directory? Well, to quote my good friend Wikipedia:
In computing, the working directory of a process is a directory of a hierarchical file system, if any, dynamically associated with each process. When the process refers to a file using a simple file name or relative path (as opposed to a file designated by a full path from a root directory), the reference is interpreted relative to the current working directory of the process. So for example a process with working directory /rabbit-shoes that asks to create the file foo.txt will end up creating the file /rabbit-shoes/foo.txt.
(Source: https://en.wikipedia.org/wiki/Working_directory)
So how is this working directory selected?
Well, it is selected by the parent process of that process! When you run a program from a shell like bash, the shell (the parent process) helpfully sets the working directory of the program you are running (the child process) to the directory you are currently in. (That is, the directory you cd'd to.)
Since your IDE is smart and helpful, it starts your Python script and sets the working directory to the folder the script itself is located in. Task Scheduler is less helpful... I have absolutely no idea what it sets the working directory to. However, if you search your system, I am sure you will find the log file lying about somewhere!
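A minimal sketch of the usual fix is to anchor the log file to the script's own directory rather than to whatever the working directory happens to be:

import datetime
import os

# Build the log path next to the script itself, not in the process's
# current working directory (which Task Scheduler sets unpredictably).
script_dir = os.path.dirname(os.path.abspath(__file__))
log_path = os.path.join(script_dir, str(datetime.date.today()) + ".txt")
createLog = open(log_path, "w")
createLog.write("Database connections blocked.\n")

Alternatively, the "Start in (optional)" field of the scheduled task can be set to the script's folder when the task is created.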

OSX: Creating an automator workflow with Python makes the workflow invalid

I'm making a chat client for OSX, and I wanted it to be able to run both as a .app (for my less technologically inclined users) and as a .py file. I made a workflow app that contained two .py files (an auto-updater and the client itself), run by a Python script in the .wflow file. This worked well. However, I couldn't update the updater or the workflow script, and the icon was the Python rocket instead of the icon I had chosen.
Then I combined the client .py file with the updater .py file. This still worked, and now I could update the updater. I still couldn't update the Python script in the workflow, though, and the icon was still wrong.
So I modified the updater to open the .wflow file and split it into a list (based on Python comments in the workflow's script, such as "#Start") of the stuff before the script, the script's modification time, and the stuff after the script. If the modification time isn't the same as the modification time of the remote file (the one that the updater updates from), the updater downloads the remote .py file, replaces the characters (<, >, &) that .wflow files escape ('<' becomes "&lt;", and so on), and opens document.wflow with the "w" (write/replace) flag. Then the stuff that was before the old script, the downloaded script, and the stuff that was after the old script are all written into document.wflow (using file.write(''.join(list))). This should work, but OSX no longer sees document.wflow as an Automator file.
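As a sketch, that escaping step would look something like this (the function name is mine); note that '&' has to be replaced first, or the '&' inside the other entities gets double-escaped:

def escape_for_wflow(text):
    # Replace '&' first so the '&' in '&lt;' and '&gt;' isn't re-escaped.
    return (text.replace("&", "&amp;")
                .replace("<", "&lt;")
                .replace(">", "&gt;"))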
Finder reports the old file as a Workflow, while the new file shows up as a "Microsoft Excel 97-2004 workbook". The IMClient.app (the application that contains document.wflow) gives this message when I try to run it: "The document "IMClient" could not be opened because it is damaged or incomplete." Does anyone know how to fix this?
I'm using python 2.7 and OSX 10.7. The updater is downloading files via FTP.
If clarification is necessary, just ask. I'll post the working and nonworking files if anyone wants them.
EDIT: the file command gives "document.wflow: XML document text" for both the old and new file.
