I have made a folder using a python3 script, and to apply multiple attributes (+h +s) to the folder I have to run the ATTRIB command in Command Prompt.
But I want to know how this can be done from the same python3 script.
import os
os.mkdir("C:\\AutoSC")
# Now I want the code to give the same result as if I had opened CMD and written the following command:
# C:\> attrib +h +s AutoSC
# Also show the necessary imported modules in the code.
I want the folder to be created and immediately hidden as a system folder, so that it is not visible even when "show hidden files" is enabled.
Use the subprocess module, or use os.system to send commands directly to the OS.
import subprocess
subprocess.run(["ls", "-l"])  # Linux example; on Windows the command and its arguments will differ
import os
os.system('attrib +h +s AutoSC')
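Putting the folder creation and the attrib call together, a minimal sketch (assuming a Windows system where attrib is available on PATH):
import os
import subprocess

folder = "C:\\AutoSC"
os.mkdir(folder)  # create the folder first
# apply the hidden (+h) and system (+s) attributes in a single call
subprocess.run(["attrib", "+h", "+s", folder], check=True)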
So basically I am running a Python script that executes 5 other .py scripts as well, like this:
exec(open('statcprs.py').read())
exec(open('dragndownf.py').read())
exec(open('lowprbubble.py').read())
exec(open('wshearstr.py').read())
exec(open('slices.py').read())
These .py files use ParaView (another piece of software) to run some tasks, so if I only run "statcprs.py", it will open ParaView's terminal and run the script. The problem is that, going from the first one, "statcprs.py", to the second one, "dragndownf.py", it doesn't interrupt the software; it keeps running it, and the scripts from the two .py files interfere with each other.
I would like to execute the first one, stop, and then start the second one from scratch, with no connection between them. Is this somehow possible?
I think the problem is this line (line 1) which opens the terminal:
#!/usr/bin/env pvpython
The following will execute a list of python scripts in the same folder as the driver script:
from pathlib import Path
import subprocess
import sys

scripts = [
    'statcprs.py',
    'dragndownf.py',
    'lowprbubble.py',
    'wshearstr.py',
    'slices.py',
]

parent = Path(__file__).resolve().parent
for script in scripts:
    script_path = parent / script
    # each script runs in its own interpreter process, one after the other
    subprocess.call([sys.executable, str(script_path)])
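If the scripts need to run under ParaView's pvpython interpreter rather than the regular Python, a sketch of the same loop (assuming pvpython is on your PATH; otherwise use the full path to it) would be:
from pathlib import Path
import subprocess

parent = Path(__file__).resolve().parent
for script in ['statcprs.py', 'dragndownf.py', 'lowprbubble.py', 'wshearstr.py', 'slices.py']:
    # each call starts a fresh pvpython process, so the runs stay independent
    subprocess.call(['pvpython', str(parent / script)])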
How can I write a Python program that runs all Python scripts in the current folder? The program should run on Linux, Windows, and any other OS on which Python is installed.
Here is what I tried:
import glob, importlib
for file in glob.iglob("*.py"):
    importlib.import_module(file)
This returns an error: ModuleNotFoundError: No module named 'agents.py'; 'agents' is not a package
(here agents.py is one of the files in the folder; it is indeed not a package and not intended to be a package - it is just a script).
If I change the last line to:
importlib.import_module(file.replace(".py",""))
then I get no error, but also the scripts do not run.
Another attempt:
import glob, os
for file in glob.iglob("*.py"):
    os.system(file)
This does not work on Windows - it tries to open each file in Notepad.
You need to specify that you are running the script through the command line. To do this, prepend python3 to the name of the file you are running. The following code should work:
import os
import glob

for file in glob.iglob("*.py"):
    os.system("python3 " + file)
If you are using a version other than python3, just change the argument from python3 to python
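A variant that avoids hard-coding the interpreter name (a sketch, assuming you want each script to run under the same interpreter that runs this one) uses sys.executable with subprocess:
import glob
import subprocess
import sys

for file in glob.iglob("*.py"):
    # run each script with the interpreter that is executing this program
    subprocess.run([sys.executable, file])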
Maybe you can make use of the subprocess module; this question shows a few options.
Your code could look like this:
import os
import subprocess

base_path = os.getcwd()
print('base_path', base_path)

# TODO: this might need to be 'python3' in some cases
python_executable = 'python'
print('python_executable', python_executable)

py_file_list = []
for dir_path, _, file_name_list in os.walk(base_path):
    for file_name in file_name_list:
        if file_name.endswith('.py'):
            # add the full path, not just file_name
            py_file_list.append(
                os.path.join(dir_path, file_name))

print('PY files that were found:')
for i, file_path in enumerate(py_file_list):
    print('  {:3d} {}'.format(i, file_path))
    # call the script
    subprocess.run([python_executable, file_path])
Does that work for you?
Note that the docs for os.system() even suggest using subprocess instead:
The subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using this function.
If you have control over the content of the scripts, you might consider using a plugin technique; this brings the problem more into the Python domain and thus makes it less platform dependent. Take a look at pyPlugin as an example.
This way you could run each "plugin" from within the original process, or, using Python's multiprocessing library, you could still seamlessly use sub-processes.
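A minimal sketch of the sub-process idea with multiprocessing, assuming the scripts live next to this driver and that each exposes a main() entry point (a hypothetical convention, not something pyPlugin requires):
import importlib
import multiprocessing

def run_plugin(module_name):
    # import the script as a module and call its (assumed) main() entry point
    module = importlib.import_module(module_name)
    module.main()

if __name__ == '__main__':
    for name in ['agents']:  # module names without the '.py' suffix
        process = multiprocessing.Process(target=run_plugin, args=(name,))
        process.start()
        process.join()  # run one at a time, each in its own process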
I am trying to use the C&C NLP library on my Mac, and it uses the terminal as its interface, so naturally I'm trying to run the command from my Python script, but here's what happens:
candc:could not open model configuration file for reading:models/config
It turns out candc should not be called from inside its own directory; it should be called from outside the binary folder, as something like "bin/candc".
How can I make this work?
This is my code:
cmd="candc/bin/candc --models models"
subprocess.check_output('{} | tee /dev/stderr'.format( cmd ), shell=True)
Pass the cwd argument with your desired working directory.
For example, if you want to run it as bin/candc from the candc directory:
import os
import subprocess

cmd = "bin/candc --models models"
subprocess.check_output('{} | tee /dev/stderr'.format(cmd), shell=True, cwd=os.path.abspath('candc'))
(I'm not sure whether you actually need os.path.abspath. Do test both with and without it.)
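If you don't need the tee /dev/stderr part, a sketch of the same cwd trick without shell=True (assuming the candc directory sits next to your script) would be:
import os
import subprocess

output = subprocess.check_output(
    ["bin/candc", "--models", "models"],  # argument list, so no shell is involved
    cwd=os.path.abspath("candc"),
)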
Use the full path in cmd:
cmd = "/home/your-username/python-programs/cnc/candc/bin/canc --models models
Whatever that full path might be. You can use (if you're on linux) pwd inside the candc directory to find out what it is.
I wish to write a Python script that allows me to navigate to and git pull multiple repositories. Basically, the script should type the following on the command line:
cd
cd ~/Desktop/Git_Repo
git pull Git_Repo
I am not sure if there is a python library already out there that can perform such a task.
Use subprocess, os, and shlex. This should work, although you might require some minor tweaking:
import subprocess
import shlex
import os
# relative dir seems to work for me, no /'s or ~'s in front though
dir = 'Desktop/Git_Repo'
# I did get fetch (but not pull) to work
cmd = shlex.split('git pull Git_Repo')
# you need to give it a PATH so it can find git; this lets you do that
env = os.environ
subprocess.Popen(cmd, cwd=dir, env=env)
Also, you'll need your login preconfigured.
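For several repositories, a minimal sketch (the folder names below are hypothetical; the remotes are assumed to be configured already) could simply loop over their paths:
import os
import subprocess

# hypothetical repo folders under your home directory; adjust to your layout
repos = ['Desktop/Git_Repo', 'Desktop/Another_Repo']

for repo in repos:
    repo_path = os.path.expanduser(os.path.join('~', repo))
    # run 'git pull' with the repository as the working directory
    subprocess.run(['git', 'pull'], cwd=repo_path, check=True)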
I want to implement a userland command that will take one of its arguments (a path) and change the directory to that dir. After the program completes, I would like the shell to be in that directory. So I want to implement the cd command, but as an external program.
Can this be done in a Python script, or do I have to write a Bash wrapper?
Example:
tdi#bayes:/home/$>python cd.py tdi
tdi#bayes:/home/tdi$>
Others have pointed out that you can't change the working directory of a parent from a child.
But there is a way you can achieve your goal: if you cd from a shell function, it can change the working dir. Add this to your ~/.bashrc:
go() {
cd "$(python /path/to/cd.py "$1")"
}
Your script should print the path to the directory that you want to change to. For example, this could be your cd.py:
#!/usr/bin/python
import sys, os.path
if sys.argv[1] == 'tdi': print(os.path.expanduser('~/long/tedious/path/to/tdi'))
elif sys.argv[1] == 'xyz': print(os.path.expanduser('~/long/tedious/path/to/xyz'))
Then you can do:
tdi#bayes:/home/$> go tdi
tdi#bayes:/home/tdi$>
That is not going to be possible.
Your script runs in a sub-shell spawned by the parent shell where the command was issued.
Any cding done in the sub-shell does not affect the parent shell.
cd is (almost?) exclusively implemented as a shell built-in command, because an external program cannot change the parent shell's CWD.
As codaddict writes, what happens in your sub-shell does not affect the parent shell. However, if your goal is to present the user with a shell in a different directory, you could always have Python use os.chdir to change the sub-shell's working directory and then launch a new shell from Python. This will not change the working directory of the original shell, but will leave the user with one in a different directory.
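A minimal sketch of that idea (assuming a POSIX system where the SHELL environment variable points at the user's shell):
import os
import sys

# change this process's working directory, then replace the process with a new shell there
target = sys.argv[1] if len(sys.argv) > 1 else os.path.expanduser('~')
os.chdir(target)
shell = os.environ.get('SHELL', '/bin/sh')
os.execvp(shell, [shell])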
As explained by mrdiskodave in "Equivalent of shell 'cd' command to change the working directory?", there is a hack to achieve the desired behavior in pure Python.
I made some modifications to the answer from mrdiskodave to make it work in Python 3:
The pipes.quote() function has moved to shlex.quote().
To mitigate the issue of user input during execution, you can delete any previous user input with the backspace character "\x08".
So my adaptation looks like the following:
import fcntl
import shlex
import termios
from pathlib import Path

def change_directory(path: Path):
    quoted_path = shlex.quote(str(path))
    # Remove up to 32 characters previously entered by the user.
    backspace = "\x08" * 32
    cmd = f"{backspace}cd {quoted_path}\n"
    for c in cmd:
        # push each character into the terminal's input queue
        fcntl.ioctl(1, termios.TIOCSTI, c)
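A usage sketch, called from an interactive Bash session (the injected text then appears as if typed at the prompt; note that TIOCSTI may be restricted on some systems for security reasons):
from pathlib import Path

change_directory(Path.home() / "Desktop")  # injects: cd /home/<user>/Desktop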
I shall try to show, in a fairly easy way, how to set a Bash terminal's working directory to whatever path a Python program wants.
Only Bash can set its own working directory, so routines are needed on both the Python and the Bash side. The Python program has a routine defined as:
fob = open(somefile, "w")
fob.write(dd)
fob.close()
"Somefile" could for convenience be a RAM disk file. Bash "mount" would show tmpfs mounted somewhere like "/run/user/1000", so somefile might be "/run/user/1000/pythonwkdir". "dd" is the full directory path name desired.
The Bash file would look like:
#!/bin/bash
# pysync --- the command ". pysync" will set the Bash dir to what Python recorded
cd `cat /run/user/1000/pythonwkdir`