Below is the folder structure of my application:
rootfolder
/subfolder1/
/subfolder2
/subfolder3/test.py
My code is inside subfolder3, but I want to write the output of the code to subfolder1.
script_dir = os.path.dirname(__file__)
full_path = os.path.join(script_dir,'/subfolder1/')
I would like to know how I can do this without hardcoding the full path to the directory.
It sounds like you want something along the lines of
project_root = os.path.dirname(os.path.dirname(__file__))
output_path = os.path.join(project_root, 'subfolder1')
Here project_root is the folder above your script's parent folder (rootfolder in your layout), which matches your description, and output_path then points to subfolder1 under it.
I would also rewrite the import as
from os.path import dirname, join
That shortens your code to
project_root = dirname(dirname(__file__))
output_path = join(project_root, 'subfolder1')
I find this version to be easier to read.
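For example, to write a file into that folder (output.txt is just an illustrative name, and the folder is assumed to already exist):
import os

# abspath guards against __file__ being a relative path
project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
output_path = os.path.join(project_root, 'subfolder1')

with open(os.path.join(output_path, 'output.txt'), 'w') as f:
    f.write('works!')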
The best way to get this done is to turn your project into a package. Python uses an __init__.py file to recognize this setup, so you can simply create an empty __init__.py file in the root directory. The structure would look like:
rootfolder
/subfolder1/
/subfolder2
/subfolder3/test.py
__init__.py
Once that is done, you can reference any subfolders like the following:
subfolder1/output.txt
Therefore, your script would look something like this:
with open("subfolder1/output.txt", "w+") as f:
    f.write("works!")
Related
This is a question about how to handle settings files and relative paths in Python (and probably also about best practice).
I have coded a small project that I want to deploy to a Docker image. Everything is set up, except that when I try to run the Python task (through cron) I get the error: settings/settings.yml not found.
tree .
├───settings
│   └───settings.yml
└───main.py
And I am referencing the yml file as
with open('settings/settings.yml', 'r') as f:
    config = yaml.load(f, Loader=yaml.FullLoader)
I can see this is what is causing the problem, but I am unsure how to fix it. I eventually want to invoke the main file through the entry_points of setuptools, so my quick fix of cd'ing into the directory before running python main.py will not be a lasting solution.
Instead of hardcoding a path as a string, you can find the directories and build the file path with os.path. For example:
import os
import yaml

current_dir = os.path.dirname(os.path.abspath(__file__))
settings_dir = os.path.join(current_dir, "settings")
filename = "settings.yml"
settings_path = os.path.join(settings_dir, filename)

with open(settings_path, "r") as infile:
    settings_data = yaml.load(infile, Loader=yaml.FullLoader)
This way it can be run on any file system, and the Python file can be called from any directory.
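An equivalent sketch with pathlib, in case you prefer it (an alternative I am adding, not part of the os.path version above):
from pathlib import Path
import yaml

# Path(__file__).resolve().parent is the directory containing this script
settings_path = Path(__file__).resolve().parent / "settings" / "settings.yml"

with settings_path.open("r") as infile:
    settings_data = yaml.load(infile, Loader=yaml.FullLoader)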
I have a script (dump_ora_shelve.py) which retrieves the data from shelve-storage by specifying the key, e.g.:
def get_shelve_users(field):
    import shelve
    db = shelve.open('oracle-shelve')
    for key in db:
        if key == field:
            return db[key]
    db.close()
The data is retrieved just fine:
print(get_shelve_users('db_users'))
> {'SYS': 'sysdba'}
print(get_shelve_users('oratab'))
> ['orcl:/u01/app']
There is another script which should do the same thing (retrieve the data for a specified key) and imports dump_ora_shelve, but the value returned is None:
from before_OOP.dump_ora_shelve import get_shelve_users
print(get_shelve_users('db_users'))
> None
print(get_shelve_users('oratab'))
> None
The file being imported is located one level above the file that imports it.
Note that if I copy both files to the same location, the import and the function work just fine.
You could provide the full pathname to shelve.open. Remember that inside a module, __file__ is the path of the source file itself, so you can use it to construct the full pathname.
Typically you will have something like this in the module (note the use of os.path.dirname and os.path.abspath):
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
Then use os.path.join to concatenate the directory and the filename:
shelve_db = os.path.join(BASE_DIR, 'oracle-shelve')
db = shelve.open(shelve_db)
This assumes that the oracle-shelve file is in the same folder as the module in which the get_shelve_users function is defined (dump_ora_shelve.py).
Don't forget the __file__; that is what makes the whole thing tick, i.e. it insulates your program from whatever its current working directory happens to be.
When running the second script, your working directory will be the directory where that script is located. That working directory is kept even when you import and use a file from a different package/directory.
So if your dump_ora_shelve.py script and the shelve file are located in a different directory/package, it will not open the correct file.
If you provide the full path to 'oracle-shelve' in dump_ora_shelve.py, it should work.
Update:
In your 'dump_ora_shelve.py' file:
ABS_DIR = os.path.dirname(os.path.abspath(__file__))
This gives you the absolute path of the directory containing 'dump_ora_shelve.py'. Join it with the name of your DB:
shelve_db = os.path.join(ABS_DIR, 'oracle-shelve')
And finally:
db = shelve.open(shelve_db)
This assumes that your 'oracle-shelve' is in the same directory as 'dump_ora_shelve.py'.
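Putting the pieces together, dump_ora_shelve.py might end up looking something like this (a sketch that assumes 'oracle-shelve' sits next to the script; I have also used db.get and try/finally so the shelve is always closed):
import os
import shelve

ABS_DIR = os.path.dirname(os.path.abspath(__file__))
shelve_db = os.path.join(ABS_DIR, 'oracle-shelve')

def get_shelve_users(field):
    # open the shelve by its absolute path so the caller's
    # working directory no longer matters
    db = shelve.open(shelve_db)
    try:
        return db.get(field)
    finally:
        db.close()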
I have a folder structure like:
Project/Main.py
Project/Module/Data.py
Project/Config/config.ini
Edit: Main.py uses Data.py. Only Data.py uses config.ini. The application is run from Main.py but also from Data.py. The problem is that every time I run it from one of these separate scripts, I need to change the relative path: from Main.py it is Config/config.ini, from Data.py it is ../Config/config.ini.
How can I run from both Main.py and Data.py and use the same piece of code to locate config.ini?
Thanks
Put in your Main.py:
import os.path
BASE_DIR = os.path.dirname(__file__)
CONFIG_DIR = os.path.join(BASE_DIR, 'Config', 'config.ini')
And in your Data.py
import os.path
BASE_DIR = os.path.dirname(os.path.dirname(__file__))
CONFIG_DIR = os.path.join(BASE_DIR, 'Config', 'config.ini')
Now you have CONFIG_DIR defined in both scripts, pointing to your config.
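With that in place, either script can read the file the same way, for example with configparser (this part is just an illustration, not something the original snippets require; the section and key names are placeholders):
import configparser

config = configparser.ConfigParser()
config.read(CONFIG_DIR)  # CONFIG_DIR is the full path built above

# value = config['some_section']['some_key']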
I have this problem and do not know how to solve it efficiently.
I have this file structure
NEITHER THE NUMBER OF FOLDERS NOR THEIR NAMES ARE GIVEN, IT'S ALL UNKNOWN
app/
    __init__.py
    modules/
        __init__.py
        ModuleA/
            __init__.py
            file.py
            otherfile.py
            config.ini
        ModuleB/
            __init__.py
            file.py
            otherfile.py
            config.ini
        ModuleC/
            __init__.py
            file.py
            otherfile.py
            config.ini
        *arbitrary number of modules with the same structure*
As you can notice, app is the main package of my app, but I need an efficient way to import the modules folder and its content.
*My actual solution*
from app import modules as mods

def load_modules_from_packages(self, pkg=mods):
    pkgname = pkg.__name__
    pkgpath = dirname(pkg.__file__)
    for loader, name, ispkg in pkgutil.walk_packages(pkg.__path__, pkgname + '.'):
        if ispkg is True:
            __import__(name, globals(), locals(), [], 0)
        elif ispkg is False:
            __import__(name, globals(), locals(), [], 0)
This works since pkgutil iterates the structure with the dotted notation for names, so the import works well.
But now I want to load the info in the config file when I am in one of the module folders (the one with its own __init__.py and config.ini).
I want to do this to recreate the structure of the modules package and output it as a JSON representation for another purpose.
*My other solution, which does not work*
def load_modules_from_packages(directory):
    dir_path = dirname(directory.__file__)
    dir_name = directory.__name__
    for filename in glob.glob(dir_path + '/**/*.ini', recursive=True):
        plugin = {}
        plugin['name'] = filename.split('/')[-2]
        plugin['path'] = dirname(filename)
        plugin['config_file'] = filename
        for pyname in glob.glob(dirname(filename) + '/**/*.py', recursive=True):
            importlib.import_module(pyname)
I can't use the solution posted in this thread:
How to import a module given the full path?
since I do not know the module name, as pointed out (without a solution) in the comments.
spec = importlib.util.spec_from_file_location('what.ever', 'foo.py')
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
I know 'foo.py', but I can't figure out 'what.ever' the way pkgutil.walk_packages does.
In fact, the modules imported this way have the package and name entries wrong. With this approach I can't figure out where I am in the file structure in order to build the modules dictionary and the corresponding modules (for the JSON output).
Any help?
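One possible direction (a sketch of my own, not a tested answer): derive the dotted name from the file's path relative to the package root, and pass that name to spec_from_file_location. The helper names below are hypothetical.
import os
import importlib.util

def dotted_name(path, pkg_root_dir, pkg_root_name):
    # e.g. path = .../app/modules/ModuleA/file.py -> 'app.modules.ModuleA.file'
    rel = os.path.relpath(os.path.splitext(path)[0], pkg_root_dir)
    parts = [pkg_root_name] + rel.split(os.sep)
    if parts[-1] == '__init__':
        parts = parts[:-1]
    return '.'.join(parts)

def import_by_path(path, pkg_root_dir, pkg_root_name):
    name = dotted_name(path, pkg_root_dir, pkg_root_name)
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module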
There is a file a.py.
The location is /home/user/projects/project1/xxx/a.py.
If I call os.getcwd(), it gives me /home/user/projects/project1/xxx/. But I want to reach /home/user/projects/project1. How can I do this in Python?
Edit: I think I must be more clear. I want this for my Django project.
I use this code in my settings.py:
PROJECT_PATH = os.path.abspath(os.path.dirname(__file__))
Then I use the following code to specify where my static file folder is:
os.path.join(PROJECT_PATH,'statics'),
My settings.py file is under /home/user/projects/project1/xxx/settings.py.
My static file folder is in the same directory as settings.py.
Now I want to move this folder to /home/user/projects/project1.
What should I do with the code in settings.py?
Thank you.
from os.path import dirname
print(dirname(dirname(__file__)))
Each time you call dirname, it gives you the parent directory. Call it as many times as necessary.
Alternatively you can do the following (normpath and join also come from os.path):
normpath(join(path1, '..', '..'))
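Applied to the settings.py from the question, a sketch might look like this (assuming the statics folder has been moved up to /home/user/projects/project1, and that STATICFILES_DIRS is the setting the original snippet belongs to):
import os

# settings.py lives in /home/user/projects/project1/xxx/
PROJECT_PATH = os.path.abspath(os.path.dirname(__file__))

# one more dirname gives /home/user/projects/project1
PROJECT_ROOT = os.path.dirname(PROJECT_PATH)

STATICFILES_DIRS = [
    os.path.join(PROJECT_ROOT, 'statics'),
]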
>>> import os
>>> os.getcwd()
'/tmp/test'
>>> os.chdir('..')
>>> os.getcwd()
'/tmp'
>>>
The dot dot (..) represents the parent directory, because relative path names specify a path starting from the current directory.
See the documentation of os.chdir.