Execute sh in subdirectory from main.py script? - python

I am trying to do the following thing.
I have a python project with this dir-tree:
Project dir
|
|___main.py
|___module
    |
    |__a.py
    |__a.sh
a.py has a class with a method that runs the script:
import os

class A():
    def run(self):
        os.system('a.sh')
And a.sh creates a file:
touch a.txt
And main.py instantiates an object of class A and calls run():
from module.a import A

a = A()
a.run()
When main.py is run, I get an error saying that the script a.sh is not found. I get why that happens: the working directory is the project root, not the module directory. But how can I make it work so that the created file still ends up in the root path?
I want to call main.py and end up with this configuration.
Project dir
|
|___a.txt
|___main.py
|___module
    |
    |__a.py
    |__a.sh
I could replace the call to a.sh in a.py with
os.system('module/a.sh')
and it would work, but it does not look clean to me.

Code in the a.py module can extract the name of the directory it is in and use that in the os.system() call. This is what I mean:
File a.py:
import os

class A():
    def run(self):
        # Build an absolute path to a.sh from the location of a.py itself,
        # so the script is found regardless of the current working directory.
        my_directory = os.path.dirname(__file__)
        script_path = os.path.join(my_directory, 'a.sh')
        os.system(script_path)

# Demo call; in your project, main.py would do this instead.
a = A()
a.run()
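
If you prefer not to shell out with os.system, the same idea carries over to pathlib and subprocess. This is only a sketch, assuming Python 3.5+; the working directory is left untouched, so a.txt still lands in the project root:

# module/a.py -- hedged alternative sketch, not the answer's exact code
import subprocess
from pathlib import Path

class A:
    def run(self):
        # Absolute path to a.sh, derived from the location of this file.
        script_path = Path(__file__).resolve().parent / 'a.sh'
        # Run it through sh so the executable bit is not required; the file
        # it touches is created wherever main.py was started from.
        subprocess.run(['sh', str(script_path)], check=True)

Calling it from main.py stays the same: instantiate A and call run().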

Related

How do I specify inheritance behavior on initialization of an object in Python?

I'm trying to write a class with a method that behaves differently based on data given at initialization of the object. I want to pass this object code to run that is stored in a different file. The behavior should be as follows:
foo = Myclass(config="return_a.py")
bar = Myclass(config="return_b.py")
foo.baz() # returns a
bar.baz() # returns b
# where return_a.py and return_b.py are files in the directory
The closest I've come to fixing it so far is using exec and having the configured Python code write to a file which I then read from. I don't know how I'd do this in memory.
You can use importlib to import the files dynamically.
Let's say your project has the structure:
.
├── main.py
├── return_a.py
└── return_b.py
you can put this code in main.py:
import importlib

class Myclass:
    def __init__(self, config) -> None:
        config = importlib.import_module(config)
        self.baz = config.baz

foo = Myclass(config="return_a")
bar = Myclass(config="return_b")
foo.baz()
bar.baz()
This is assuming you have the function baz in your return_a and return_b files. For example:
# return_a.py
def baz():
    print("I am A")

# return_b.py
def baz():
    print("I am B")
Now if you execute main.py you will get:
I am A
I am B
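
If you would rather keep the .py extension from the question and load the code by file path instead of by module name, importlib can do that too. A rough sketch, assuming return_a.py sits next to main.py (the spec/loader calls are standard importlib.util; the rest is illustrative):

import importlib.util
from pathlib import Path

class Myclass:
    def __init__(self, config):
        # config is a path like "return_a.py"; load the module from that file.
        path = Path(config)
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        self.baz = module.baz

foo = Myclass(config="return_a.py")
foo.baz()  # prints "I am A"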

Import in a module fails because __name__ is __main__

Here is my project structure:
Project
    main.py
    myPackage/
        subfolder1/
            __init__.py
            script11.py
            script12.py
        subfolder2/
            __init__.py
            script2.py
        __init__.py
In main.py I import script2.py the following way:
from myPackage.subfolder2 import script2 as script2
Then I call a function from script2.py in main.py with:
bar = script2.foo()
and in script2 I need to import a function from script11:
from ..subfolder1.script11 import my_function
and it breaks with the error:
attempted relative import with no known parent package
I have inspected the __name__ variable and indeed it has the value __main__. How can I manage that properly?
All you should have to do is change your import in main.py to from myPackage.subfolder2 import script2. I set up a directory and some files in this way, using that import, and the script runs as expected:
main.py
myPackage/
    subfolder1/
        script11.py
    subfolder2/
        script2.py
script11.py

def bar():
    return 10

script2.py

from ..subfolder1.script11 import bar

def foo():
    x = bar()
    print('foo is', x)

main.py

from myPackage.subfolder2 import script2 as s2

s2.foo()
Running:
>>> py .\main.py
foo is 10
Some notes:
I'm assuming Python 3, since Python 2 reached end of life at the beginning of this year
In Python 3, the __init__.py files aren't necessary to make a package, but having them doesn't hurt anything. You can leave them out if you want.
The as script2 part in from myPackage.subfolder2 import script2 as script2 is redundant. It will already be imported as script2.

How to inject different environment variable in dev, test, prod in Python

I used to work with Flask which offers an easy way to configure the application running in different modes. (dev, test, prod, ...)
class BaseConfig:
    MY_PATH = "Something"

class DevelopmentConfig(BaseConfig):
    MY_PATH = "Something else"

# ...
I am trying to build something similar but without using Flask. Here is the structure of the most simple code I could find:
- src
    - main.py
    - zip2h5
        - __init__.py
        - foo.py
- test
    - __init__.py
    - test_foo.py
The class Foo in foo.py has a method path which outputs "path/to/dev" when in dev mode and "path/to/test" when in test mode. Writing if statements in the code would be messy and hard to test properly. Using environment variables seems much better. How and where do I set the configuration the way Flask does it?
# foo.py
class Foo():
    def __init__(self, name):
        self.name = name

    def path(self):
        return "path/in/dev"

# test_foo.py
import unittest

class TestFoo(unittest.TestCase):
    def test_path(self):
        boo = Foo("Boo")
        expected = "path/in/test"
        self.assertEquals(boo.path(), expected)
Please, do not tell me I can patch the method. As I have said, this is just an example.
The environment for your process is available via the os module.
You can simply inject different environment variables for the path in your dev and test cases. I'm not sure how you're running your tests, but usually you can do something like MY_PATH='path/in/test' ./tests.sh to accomplish what you need.
I use the dotenv package and keep .env files in my project root to manage this. I have a base test class that loads .env.test instead of .env for the testing configuration.
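
A minimal sketch of the environment-variable approach, assuming a variable named MY_PATH (made up for the example, echoing the Flask config above) with the dev path as the default:

# foo.py -- read the path from the environment instead of hard-coding it
import os

class Foo():
    def __init__(self, name):
        self.name = name

    def path(self):
        # MY_PATH is an assumed variable name; fall back to the dev path
        # when it is not set.
        return os.environ.get("MY_PATH", "path/in/dev")

The test run then becomes something like MY_PATH='path/in/test' python -m unittest, or the test's setUp can set os.environ["MY_PATH"] itself.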
Do it the same way Flask does it. Have multiple Config classes, then pass env as a parameter, e.g.:
class Foo():
    def __init__(self, name, env):
        self.name = name
        self.env = env

    def path(self):
        if self.env == 'TEST':
            # initialize TestConfig class here
            return TestConfigPath
test_foo.py
class TestFoo(unittest.TestCase):
    def test_path(self):
        boo = Foo("Boo", env='TEST')
        expected = "path/in/test"
        self.assertEquals(boo.path(), expected)
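
A hedged sketch of what the class-based config could look like in full, with an environment variable (APP_ENV is an assumed name, not something Flask defines) choosing the config class once, so Foo never needs an if on the environment:

# config.py -- hypothetical module mirroring Flask's class-based config
import os

class BaseConfig:
    MY_PATH = "path/in/dev"

class TestConfig(BaseConfig):
    MY_PATH = "path/in/test"

class ProdConfig(BaseConfig):
    MY_PATH = "path/in/prod"

_configs = {"dev": BaseConfig, "test": TestConfig, "prod": ProdConfig}
# APP_ENV is an assumed variable name; default to dev when it is unset.
config = _configs[os.environ.get("APP_ENV", "dev")]

# foo.py would then just do:
#     from config import config
#     ...
#     def path(self):
#         return config.MY_PATH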

Python - Intra-package importing when package modules are sometimes used as standalone?

Sorry if this has already been answered using terminology I don't know to search for.
I have one project:
project1/
    class1.py
    class2.py
Where class2 imports some things from class1, and each has its own if __name__ == '__main__' block that uses its respective class, which I run frequently. But then I have a second project which creates a subclass of each of the classes from project1. So I would like project1 to be a package, so that I can import it into project2 nicely:
project2/
    project1/
        __init__.py
        class1.py
        class2.py
    subclass1.py
    subclass2.py
However, I'm having trouble with the importing in this setup. If I make project1 a package, then inside class2.py I would want to import the class1.py code using from project1.class1 import Class1. This makes the project2 code run correctly. But when I try to use project1 not as a package, just running code directly from within that directory, the project1 code fails (since it doesn't know what project1 is). If I set it up for project1 to work directly within that directory (i.e. the import in class2 is from class1 import Class1), then this import fails when trying to use project1 as a package from project2.
Is there a way to have it both ways (use project1 both as a package and not as a package)? If there is, is it discouraged and should I be restructuring my code anyway? Other suggestions on how I should be handling this? Thanks!
EDIT
Just to clarify, the problem arises because subclass2 imports class2, which in turn imports class1. Depending on which way class2 imports class1, the import will fail from project2 or from project1, because one sees project1 as a package while the other sees it as the working directory.
EDIT 2
I'm using Python 3.5. Apparently this works in Python 2, but not in my current version of python.
EDIT 2: Added code to class2.py to append its own directory to sys.path, to comply with how Python 3 module imports work.
import sys
import os
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
Removed relative import of Class1.
Folder structure:
project2
- class3.py
- project1
    - __init__.py
    - class1.py
    - class2.py
project2/project1/class1.py
class Class1(object):
    def __init__(self):
        super(Class1, self).__init__()
        self.name = "DAVE!"

    def printname(self):
        print(self.name)

def run():
    thingamy = Class1()
    thingamy.printname()

if __name__ == "__main__":
    run()
project2/project1/class2.py
import sys
import os
sys.path.append(os.path.dirname(os.path.abspath(__file__)))

from class1 import Class1

class Class2(Class1):
    def childMethod(self):
        print('Calling child method')

def run():
    thingamy = Class2()
    thingamy.printname()
    thingamy.childMethod()

if __name__ == "__main__":
    run()
project2/class3.py
from project1.class2 import Class2
from project1.class1 import Class1

class Class3(Class2):
    def anotherChildMethod(self):
        print('Calling another child method')

def run():
    thingamy = Class3()
    thingamy.printname()
    thingamy.anotherChildMethod()

if __name__ == "__main__":
    run()
With this setup each of class1, 2 and 3 can be run as standalone scripts.
You could run class2.py from inside the project2 folder, i.e. with the current working directory set to the project2 folder:
user@host:.../project2$ python project1/class2.py
On windows that would look like this:
C:\...project2> python project1/class2.py
Alternatively you could modify the Python path inside class2.py:
import sys
sys.path.append(".../project2")

from project1.class1 import Class1
Or modify the PYTHONPATH environment variable similarly.
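
To avoid hardcoding the ".../project2" path, class2.py could also derive it from its own location. A sketch, assuming the layout above (class2.py living in project2/project1/):

# project2/project1/class2.py -- compute the path to project2 dynamically
import os
import sys

# __file__ is .../project2/project1/class2.py, so two dirname() calls give
# the project2 folder; with that on sys.path, project1 is importable as a package.
project2_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(project2_dir)

from project1.class1 import Class1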
To be able to extend your project and import, for example, something in subclass1.py from subclass2.py, you should consider always starting the import paths with project2, for example in class2.py:
from project2.project1.class1 import Class1
Of course you would need to adjust the paths I just showed to match.

Python - Plugins to Main Program

I need to make a script that calls every .py file in a specific directory. These are plugins to the main program. Each plugin script must be able to access classes and methods from the calling script.
So I have something like this:
mainfile.py:
class MainClass:
    def __init__(self):
        self.myVar = "a variable"
        for f in os.listdir(path):
            if f.endswith(".py"):
                execfile(path + f)

    def callMe(self):
        print self.myVar

myMain = MainClass()
myMain.callMe()
And I want to be able to do the following in callee.py
myMain.callMe()
Just using import will not work because mainfile.py must be the program that is running; callee.py can be removed and mainfile will still run on its own.
import os

class MainClass:
    def __init__(self):
        self.myVar = "a variable"
        self.listOfLoadedModules = []
        for f in os.listdir("."):
            fileName, extension = os.path.splitext(f)
            # Skip non-Python files and this script itself.
            if extension == ".py" and fileName != "mainfile":
                self.listOfLoadedModules.append(__import__(fileName))

    def callMe(self):
        for currentModule in self.listOfLoadedModules:
            # Each loaded module is expected to define a callMe(main) function.
            currentModule.__dict__.get("callMe")(self)

myMain = MainClass()
myMain.callMe()
With this code you should be able to call the callMe function of any Python file in the current directory, and that function will have access to MainClass, as we are passing the instance as a parameter to callMe.
Note: if you call callMe of MainClass inside callee.py's callMe, that will create infinite recursion and you will get:
RuntimeError: maximum recursion depth exceeded
So, I hope you know what you are doing.
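
On Python 3, where execfile no longer exists, the same pattern can be written with importlib and getattr. A sketch under the same assumptions as the answer above (mainfile.py is run from the directory that contains the plugins, and each plugin defines a module-level callMe(main) function):

# mainfile.py -- Python 3 sketch of the same plugin loader
import importlib
import os

class MainClass:
    def __init__(self):
        self.myVar = "a variable"
        self.plugins = []
        for f in os.listdir("."):
            name, ext = os.path.splitext(f)
            # Skip non-Python files and this file itself.
            if ext != ".py" or name == "mainfile":
                continue
            self.plugins.append(importlib.import_module(name))

    def callMe(self):
        print(self.myVar)
        for plugin in self.plugins:
            hook = getattr(plugin, "callMe", None)
            if callable(hook):
                hook(self)  # each plugin receives the MainClass instance

if __name__ == "__main__":
    MainClass().callMe()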
