Run script on multiple files sequentially? - python

I have this code which, along with the rest of the script, runs on a single file in a folder. I want to run it on 11 files in that folder instead. I pass parameters to the whole script via an .sh script I've written.
I've searched on here and found various solutions, but none of them have worked.
import os

def get_m3u_name():
    m3u_name = ""
    dirs = os.listdir("/tmp/")
    for m3u_file in dirs:
        if m3u_file.endswith(".m3u") or m3u_file.endswith(".xml"):
            m3u_name = m3u_file
    return m3u_name

def remove_line(filename, what):
    if os.path.isfile(filename):
        file_read = open(filename).readlines()
        file_write = open(filename, 'w')
        for line in file_read:
            if what not in line:
                file_write.write(line)
        file_write.close()

m3ufile = get_m3u_name()
I have also tried a different approach: deleting the file that was just processed and then looping the script so it runs again on the next file, since doing this manually works. But when I use
os.remove(m3ufile)
I get a "file not found" error. Help with either way of improving my code would be great. I'm just a newbie at this, so pointing me in the right direction would be a big help.
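One way to do this (a minimal sketch only; process_file is a hypothetical placeholder for the existing per-file logic such as remove_line) is to collect every matching file first and then loop over the full paths, so that calls like os.remove() actually find the file:
import os

def get_playlist_files(folder="/tmp/"):
    # Return the full path of every .m3u/.xml file in the folder.
    return [
        os.path.join(folder, name)
        for name in os.listdir(folder)
        if name.endswith((".m3u", ".xml"))
    ]

def process_file(path):
    # Hypothetical placeholder for the existing per-file logic,
    # e.g. remove_line(path, "text to strip").
    pass

for m3ufile in get_playlist_files():
    process_file(m3ufile)
    os.remove(m3ufile)  # full path, so "file not found" no longer occurs
Because each entry already includes the /tmp/ prefix, the same path works for reading, rewriting, and removing the file.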

Related

Correct file path with folders

I created a translating script that searches for a file, for example bot.json, in this folder structure relative to the executed file: data/lang/en/. I use the following code to do it:
with open(r'data/lang/' + functions.json_module.get_config('Config')['Language'] + r'/bot.json', 'r') as f:
    return json.load(f)['Bot']['starting']
functions.json_module.get_config('Config')['Language'] will return en.
What am I doing wrong?
I tried it with '\data\lang...' too, but that didn't work either. I was expecting to get the translation for starting (Starting...).
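The fix isn't given in the question, but a common cause of this kind of failure is that a relative path like data/lang/en/bot.json is resolved against the current working directory, not against the script's location. A minimal sketch (the helper name is hypothetical; the JSON keys are taken from the question) that anchors the path to the executed file:
import json
import os

def load_starting_message(language):
    # Build the path relative to this script's directory instead of
    # whatever directory the interpreter happened to be started from.
    base_dir = os.path.dirname(os.path.abspath(__file__))
    path = os.path.join(base_dir, 'data', 'lang', language, 'bot.json')
    with open(path, 'r') as f:
        return json.load(f)['Bot']['starting']

# e.g. load_starting_message(functions.json_module.get_config('Config')['Language'])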

file upload from container using webdav results into empty file upload

I'm trying to wrap my brain around this: identical code produces two different outcomes, which implies there must be a fundamental difference between the environments in which it runs.
This is the code I use:
from webdav3.client import Client

if __name__ == "__main__":
    client = Client(
        {
            "webdav_hostname": "http://some_address" + "project" + "/",
            "webdav_login": "somelogin",
            "webdav_password": "somepass",
        }
    )
    ci = "someci"
    version = "someversion"
    directory = f'release-{ci.replace("/", "-")}-{version}'
    # Needs to be disabled because the check cannot be performed on the root
    client.webdav.disable_check = True
    f = "a.rst"
    with open(f, "r") as fh:
        contents = fh.read()
    print(contents)
    evaluated = contents.replace("#PIPELINE_URL#", "DUMMY PIPELINE URL")
    with open(f, "w") as fh:
        fh.write(evaluated)
    print(contents)
    client.upload(local_path=f, remote_path=f)
The file a.rst contains some text like:
Please follow instruction link below
#####################################
`Click here for instructions <https://some_website>`_
When I execute this code from macOS, a file with the same contents as a.rst appears on my website.
When I execute it from within a container (a Python 3.9 base image with the webdav dependencies installed), it creates a file on my website, but the content is always empty. I'm not sure why; it may have something to do with running inside a Docker container, which on top of that seems unable to handle the special characters in the file (plain text appears to work, though).
Does anyone have an idea why this is happening and how to fix it?
EDIT:
It seems that the character ":" is causing the problem.
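One way to narrow this down, sketched below with the standard library only (it does not change the upload itself), is to confirm inside the container that the rewritten file really has content on disk immediately before client.upload is called; if the size is already 0 there, the problem is the read/write step rather than WebDAV or the ":" character.
import os

f = "a.rst"

with open(f, "r") as fh:
    contents = fh.read()

evaluated = contents.replace("#PIPELINE_URL#", "DUMMY PIPELINE URL")

with open(f, "w") as fh:
    fh.write(evaluated)

# Inspect what is actually on disk just before the upload call.
print("size on disk:", os.path.getsize(f))
with open(f, "r") as fh:
    print(repr(fh.read()))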

Script can't save data to file

Based on a *.blend file, I have to write a script that gets information about objects and saves it to JSON. The script can either be opened in Blender or run directly; launching it should save the JSON file with the data to the current directory.
So I created this:
import bpy
import json

objects = bpy.context.scene.objects
data = {}
for ob in objects:
    item = {}
    item['location'] = ob.location
    if ob.name == 'Cube':
        item['material_name'] = ob.active_material.name
        data[ob.name] = item
    elif ob.name == 'Camera':
        item['camera_type'] = ob.data.type
        data[ob.name] = item
    elif ob.name == 'Lamp':
        item['lamp_type'] = ob.data.type
        data[ob.name] = item

with open('scene_objects.json', 'w') as json_file:
    json.dump(data, json_file)
However, when I run the script in Blender, I get the following error:
PermissionError: [Errno 13] Permission denied: 'scene_objects.json'
I'm a beginner with Blender, so maybe it's impossible to write to a file from Blender? If it is possible, I'd appreciate advice on how to do it.
Your issue isn't with Blender; the OS is preventing the creation (or writing) of the file based on file system permissions.
The line -
with open('scene_objects.json', 'w') as json_file:
will create a new file (or open an existing one) in the current working directory. When running Blender, that could be one of several places, depending on which OS you are using. It is also possible that starting Blender from a GUI leaves you without a valid CWD, or in a temporary directory that your user does not have permission to write to.
You can use os.chdir() to change the CWD to a directory that you know exists and that you can write to. You can also specify a full path instead of just a filename.
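For example, the script can build a full path next to the saved .blend file instead of relying on the current working directory. A minimal sketch (it assumes the .blend file has already been saved, so bpy.data.filepath is non-empty, and falls back to the home directory otherwise):
import os
import json
import bpy

# 'data' is the dict built earlier in the question's script.
blend_dir = os.path.dirname(bpy.data.filepath)   # '' if the .blend file is unsaved
out_dir = blend_dir or os.path.expanduser("~")   # fall back to a writable location
out_path = os.path.join(out_dir, 'scene_objects.json')

with open(out_path, 'w') as json_file:
    json.dump(data, json_file)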

Python 2.7 to exe using py2exe issue

I successfully created an .exe using py2exe with a simple test script I found on a tutorials website. The script I actually need, however, does not seem to work. My code uses the csv module and DictReader with two .csv inputs.
I run the python setup.py py2exe command and get a brief flash of a command prompt that disappears before I can read anything on it. Once it disappears, I do not get the .csv output file that I would get if I just ran the script in Python.
Can anyone offer any advice or things to try? Or is there a way I could get that pesky cmd window to stay open long enough for me to see what it says?
Thanks. My script is below.
import csv

def main():
    iFileName = 'DonorsPlayTesting.csv'
    oFileName = iFileName[:-4] + '-Output' + iFileName[-4:]
    iFile = csv.DictReader(open(iFileName))
    oFile = csv.writer(open(oFileName, 'w'), lineterminator='\n')
    iDirectory = csv.DictReader(open("DonorsDirectory.csv"))
    oNames = {}
    directory = {}
    for line in iDirectory:
        directory[line['Number']] = line['Name']
    for key in directory.keys():
        oNames[directory[key]] = 0
    out_header = ['Name', 'Plays']
    oFile.writerow(out_header)
    for line in iFile:
        if line['Type'] == "Test Completion":
            if line['Number'] in directory:
                oNames[directory[line['Number']]] += 1
            elif line['Number'] not in directory:
                oNames[line['Number']] = 'Need Name'
    oFile.writerows(oNames.items())

main()
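To keep the console window open long enough to read the error, one option (a sketch in Python 2.7 syntax, since that's what the script targets) is to wrap the call in a try/except that prints the traceback and waits for a key press before exiting:
import traceback

if __name__ == '__main__':
    try:
        main()
    except Exception:
        traceback.print_exc()  # show the real error instead of a vanishing window
    finally:
        raw_input('Press Enter to exit...')  # Python 2.7; use input() on Python 3
Running the generated .exe from an already-open cmd window also keeps the output visible after the program exits.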

weird no such file or directory in python

I'm new to Python, and the following piece of code is driving me crazy. It lists the files in a directory and, for each file, does some processing. I get IOError: [Errno 2] No such file or directory: my_file_that_is_actually_there!
import gzip
import os

def loadFile(aFile):
    f_gz = gzip.open(aFile, 'rb')
    data = f_gz.read()
    # do some stuff...
    f_gz.close()
    return data

def main():
    inputFolder = '../myFolder/'
    for aFile in os.listdir(inputFolder):
        data = loadFile(aFile)
        # do some more stuff
The file exists and it's not corrupted. I do not understand how it's possible that Python first finds the file when it lists the contents of myFolder, and then cannot find it anymore... This happens with any file, but only from the second iteration of my for loop.
NOTE: Why does this exception happen ONLY at the second iteration of the loop? The first file in the folder is found and opened without any issues...
This is because open receives just the local name (as returned by os.listdir); it doesn't know that you mean it should look in ../myFolder. So it gets a relative path and resolves it against the current directory. To fix it, try:
data = loadFile(os.path.join(inputFolder, aFile))
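Putting it together, the loop in main would then look like this (only the loadFile call changes):
def main():
    inputFolder = '../myFolder/'
    for aFile in os.listdir(inputFolder):
        # os.listdir returns bare names, so prepend the folder they came from.
        data = loadFile(os.path.join(inputFolder, aFile))
        # do some more stuff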
