What I want
I'm using Visual Studio Code with Python 3.7.0, and I'm trying to import a Python file from another folder into my own file.
Details
Here is my folder structure
root/
    dir1/
        data.txt
        task11.py
        task12.py
    dir2/
        data.txt
        task21.py
        task22.py
    Helpers/
        FileHelper/
            ReadHelper.py
So a short explanation:
I use the same function in every "task"-file
Instead of putting the function in every "task"-file, I've created a helper file where the function exists
I want to import the helper file "ReadHelper.py" into my task files
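For context, the helper can be pictured as something like this (a simplified, hypothetical sketch; the actual function body is not shown in this question):

# ReadHelper.py (hypothetical sketch - the real helper function is not shown here)
def read_lines(path):
    """Return the lines of a data file without trailing newlines."""
    with open(path) as f:
        return [line.rstrip("\n") for line in f]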
What I've tried
e.g. in the file task11.py:
from Helpers.FileHelper.ReadHelper import *

import os, sys
parentPath = os.path.abspath("../../")
if parentPath not in sys.path:
    sys.path.insert(0, parentPath)
from Helpers.FileHelper.ReadHelper import *

import os, sys
sys.path.append('../../')
from Helpers.FileHelper.ReadHelper import *
None of the above solutions works as I always end up with the error:
ModuleNotFoundError: No module named 'Helpers'
I've also tried:
from ..Helpers.FileHelper.ReadHelper import *
But it ends up with the error: ValueError: attempted relative import beyond top-level package
So how can I import the file ReadHelper.py to my task files?
P.S
There are some similar questions to this but they are really old and the answers have not helped me.
Update 1
There is an option in Visual Studio Code to run the file in a Python Interactive window. If I run the file that way, with the import from Helpers.FileHelper import ReadHelper, then no errors are generated and the code executes perfectly.
One downside is that this interactive window is slow to start and it cannot handle input.
I tried the answer of @Omni as well:
$> python -m root.dir1.task11
And it worked! But as he said, there is a downside: it is slow to type in the terminal.
So I tried to create a task in Visual Studio Code that could execute the above shell command for the file that I'm currently in, but did not succeed.
Do you know how to create a task in vscode to run the above command?
I've also tried adding __init__.py files under every directory so they would be seen as packages (Python3 tutorial - 6.4 Packages), but this didn't help and the same error occurred.
Update 2
I came up with a way to keep a folder structure like this and still get the imports to work correctly in the terminal.
Basically what I did was:
created a Python script
created a task in Visual Studio Code
With this, I can now run my python files, with the imports, by only pressing cmd + shift + B.
Explanation
The Visual Studio Code task:
{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "Run python file",
            "type": "shell",
            "command": "python3 /PATH_TO_ROOT_FOLDER/run_python_file.py ${file}",
            "group": {
                "kind": "build",
                "isDefault": true
            },
            "presentation": {
                "reveal": "always",
                "panel": "new",
                "focus": true
            }
        }
    ]
}
The part that we want to focus on is this one:
"command": "python3 /PATH_TO_ROOT_FOLDER/run_python_file.py ${file}",
This part runs the new Python file I created at the root folder and passes the path of the currently active file as a parameter.
The python script:
import os, sys

# This is an argument given through a shell command
PATH_TO_MODULE_TO_RUN = sys.argv[1]
ROOT_FOLDER = "root/"

def run_module_gotten_from_shell():
    # Here I take only the part of the path that is needed
    relative_path_to_file = PATH_TO_MODULE_TO_RUN.split(ROOT_FOLDER)[1]
    # Creating the shell command I want to run
    shell_command = createShellCommand(relative_path_to_file)
    os.system(shell_command)

# Returning "python3 -m PATH.TO.MODULE"
def createShellCommand(relative_path_to_file):
    part1 = "python3"
    part2 = "-m"
    # Here I change the string "dir1/task11.py" => "dir1.task11"
    part3 = relative_path_to_file.replace("/", ".")[:-3]
    shell_command = "{:s} {:s} {:s}".format(part1, part2, part3)
    return shell_command

run_module_gotten_from_shell()
This Python script gets the path to the active file as a parameter.
Then it creates a shell command from that path (the shell command is like the one in @kasper-keinänen's answer).
Then it runs that shell command.
With these modifications, I can run any file inside the root directory with imports from any file inside the root directory.
And I can do it by only pressing cmd + shift + B.
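A possible refinement (not part of my original setup) would be to replace os.system with subprocess and pin the working directory, since the python3 -m form only resolves the root-level packages when it is run from the root folder. A minimal sketch:

# Sketch: a variant of run_module_gotten_from_shell() that pins the working directory.
# (Hypothetical refinement; the version above uses os.system instead.)
import subprocess, sys
from pathlib import Path

ROOT = Path(__file__).resolve().parent  # run_python_file.py sits in the root folder

def run_module_from_root(relative_path_to_file):
    # "dir1/task11.py" -> "dir1.task11"
    module_name = relative_path_to_file.replace("/", ".")[:-3]
    # cwd=ROOT makes sure the -m lookup always starts from the root folder
    subprocess.run([sys.executable, "-m", module_name], cwd=ROOT, check=True)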
You could try running the script with the -m option, which allows modules to be located using the Python module namespace (docs.python.org).
If you run the task11.py script, then (from the root folder):
$ python3 -m dir1.task11
And in the task11.py do the import like:
from Helpers.FileHelper.ReadHelper import *
Adding the full absolute path to the sys.path variable should make it work.
import sys
sys.path.append('/full/path/to/Helpers/FileHelper/')
from ReadHelper import *
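If hard-coding the absolute path is undesirable, the same idea can be written relative to the importing file, so it keeps working when the project moves (a sketch, assuming task11.py sits in root/dir1/ as in the question):

import os, sys

# Build the path to root/Helpers/FileHelper from this file's own location (root/dir1/task11.py)
_here = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(_here, "..", "Helpers", "FileHelper"))

from ReadHelper import *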
If you're only trying to do this in VS Code, and not at normal run time, you can add the path in .vscode/settings.json:
{
    "python.analysis.extraPaths": [
        "${workspaceFolder}/webapp"
    ],
}
NOTE: This does not solve standard Python importing. My use case was specific to a monolithic project where all editor config files were in the root, so I couldn't open 'webapp/' as a workspace itself.
a) Execute the task modules as scripts within an environment that knows about the helper functions. That way the code in the task modules does not have to know anything about the package structure present. It imitates the builtins of the Python interpreter.
# cli argument #1 is the task module to execute
import sys
task_to_execute = sys.argv[1]
from Helpers.FileHelper.ReadHelper import *
exec(open(task_to_execute).read())
b) Use relative imports correctly. In order to do so, you have to execute the task code via the command below (this might be a disadvantage of this solution).
$> python -m root.dir1.task11
The problem is your file/folder structure. I would suggest creating a sort of 'control' file in your root folder, which can then work from the top-down to reference all your other modules.
So let's say you had a file in your root folder called MasterTask.py; it could look like this:
from dir1.task11 import *
from dir1.task12 import *
from dir2.task21 import *
from dir2.task22 import *
from Helpers.FileHelper.ReadHelper import *

class Master:
    # Do your task work here
    pass
One other option would be to move the Helpers folder into your Python37\Lib\site-packages folder, which would also allow the use of from Helpers.FileHelper.ReadHelper import * as is - assuming that you are not planning on this being used on machines other than your own.
Related
I have a directory that contains sub directories of code that I reuse.
MyBaseDirectory
    \genericcodedir1
        reuse1.py
    \simpleapp1
        app1.py
app1.py has the following line
import reuse1
Visual Studio Code will fail to run this, since it says it can't find the library.
On Windows I simply added genericcodedir1 to the PYTHONPATH environment variable and all is well.
What should I do on the Raspberry Pi to allow this to run?
error message:
Exception has occurred: ModuleNotFoundError
No module named 'reuse1'
File "/home/pi/Desktop/Mybasedirectory/simpleapp1/app1.py", line 5, in <module>
import reuse1
I assume you have a file structure like this and that you open the Test folder in VS Code.
You can specify the path by adding the following code above the import statement in app.py:
import sys
sys.path.append("./genericcodedir1")
import reuse1
In addition, you can add the following configuration to the settings.json file to make VS Code recognize reuse1:
{
    "python.analysis.extraPaths": [
        "./genericcodedir1"
    ]
}
So if your files look like:
|_genericcodedir
    |_reuse1.py
|_simpleapp1
    |_app1.py
you need to add an empty file called __init__.py in your genericcodedir.
Another noteworthy thing is your working directory (the directory your terminal runs in); you may need to append to the module search path (sys.path) depending on where you are when launching the program.
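To remove that working-directory dependence, the path can be derived from app1.py's own location instead of a relative string (a sketch, assuming the MyBaseDirectory layout shown in the question):

# Hypothetical variant for the top of app1.py
import sys
from pathlib import Path

base_dir = Path(__file__).resolve().parent.parent   # MyBaseDirectory
sys.path.append(str(base_dir / "genericcodedir1"))

import reuse1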
So basically I am running a Python script that executes 5 other .py scripts, like this:
exec(open('statcprs.py').read())
exec(open('dragndownf.py').read())
exec(open('lowprbubble.py').read())
exec(open('wshearstr.py').read())
exec(open('slices.py').read())
These .py files use Paraview (another piece of software) to run some stuff, so if I only run "statcprs.py" it will open Paraview's terminal and run the script. The problem is that from the first one, "statcprs.py", to the second one, "dragndownf.py", it doesn't interrupt the software; it keeps running it, interfering with scripts from both .py files.
I would like to execute the first one, stop, and then start the second one from scratch, without any connection between them. Is this somehow possible?
I think the problem is this line (line 1) which opens the terminal:
#!/usr/bin/env pvpython
The following will execute a list of Python scripts in the same folder as the driver script, each one in its own interpreter process, so the scripts cannot interfere with one another:
import os
from pathlib import Path
import subprocess
import sys
scripts = [
    'statcprs.py',
    'dragndownf.py',
    'lowprbubble.py',
    'wshearstr.py',
    'slices.py',
]

parent = Path(__file__).resolve().parent

for script in scripts:
    script_path = parent / script
    subprocess.call([sys.executable, script_path])
Here's my file structure
test/
    -dir1
        -thing.py
    -dir2
        -__init__.py
        -thing2.py
I am using python 3.7 and windows 10.
In thing.py, I'm trying to import a function called foo from thing2.py and have it execute when I run thing.py. My code works perfectly in PyCharm when I press run. However, when I run thing.py from the terminal directly or through code runner in VSCode, I get the following error:
from dir2.thing2 import foo
ERROR: ModuleNotFoundError: No module named 'dir2'
Is the issue something to do with my PYTHONPATH or something else?
Based on the information you provided, I reproduced the problem you described, and you could use the following methods to solve it:
Please add the following code at the beginning of the "thing.py" file; it adds the opened folder to the module search path so that "from dir2.thing2 import foo" can find "foo":
import os, sys
sys.path.append('./')
If you don't want to add code, you could add the following setting in "launch.json", which adds the path of the project when debugging the code:
"env": {
"PYTHONPATH": "${workspaceFolder}"
}
I'm trying to get a job in crontab to run twice per day at different times. It is a Python script that calls other Python scripts and a bash script as functions. All of the scripts are located in the path given in the crontab. The crontab looks like this:
PATH=/home/test/Desktop/UntitledFolder/ContinuousTest
0 08 * * 1,2,3,4,5 /home/test/Desktop/UntitledFolder/ContinuousTest/automated.py
46 10 * * * /home/test/Desktop/UntitledFolder/ContinuousTest/automated.py
The code looks like this
#!/usr/bin/env python
import curses
import os

def Move():
    os.system("cd /home/test/Desktop/UntitledFolder/ContinuousTest")

def Upgrade():
    os.system("python upgrade.py")
    os.system("python upgrade.py")

def Setup():
    os.system("python setup.py")
    os.system("python setup2.py")

def Throughput():
    os.system("./test.sh")

def Sleep():
    os.system("sleep 320")

Move()
Setup()
Upgrade()
Sleep()
Throughput()
I see that when the script is run from the cronjob, I get this error:
/usr/bin/env: python: No such file or directory
What could be the problem?
/usr/bin/env must search PATH to find the python executable to run. Since you completely replace PATH with a single directory and don't include the usual /bin and /usr/bin paths, env cannot find python to run.
The solution is to either set PATH=/bin:/usr/bin:/home/test/Desktop/UntitledFolder/ContinuousTest, or just dispense with env altogether and put #!/usr/bin/python (or python3 if that is the intention) at the top of your script.
Another reasonable solution would be to not set PATH in your crontab, but put PATH modifications inside the script as necessary instead - that might lead to fewer surprises down the road if you add additional jobs to your crontab.
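A minimal sketch of that last option, assuming the usual system locations; environment changes made before the os.system calls are inherited by the shells they spawn:

#!/usr/bin/python
# Hypothetical top of automated.py: extend PATH in the script instead of replacing it in the crontab
import os

EXTRA_DIR = "/home/test/Desktop/UntitledFolder/ContinuousTest"
os.environ["PATH"] = EXTRA_DIR + os.pathsep + os.environ.get("PATH", "/usr/bin:/bin")

os.system("./test.sh")   # subprocesses started after this point see the extended PATH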
My problem is that the cronjob seems to be running fine, but it is not executing the code within the .sh files properly; please see below for details.
I type crontab -e, to bring up cron:
In that file:
30 08 * * 1-5 /home/user/path/backup.sh
45 08 * * 1-5 /home/user/path/runscript.sh >> /home/user/cronlog.log 2>&1
backup.sh:
#!/bin/sh
if [ -e "NEW_BACKUP.sql.gz" ]
then
mv "NEW_BACKUP.sql.gz" "OLD_BACKUP.sql.gz"
fi
mysqldump -u username -ppassword db --max_allowed_packet=99M | gzip -9c > NEW_BACKUP.sql.gz
runscript.sh:
#!/bin/sh
python /home/user/path/uber_sync.py
uber_sync.py:
import keyword_sync
import target_milestone_sync
print "Starting Sync"
keyword_sync.sync()
print "Keyword Synced"
target_milestone_sync.sync()
print "Milestone Synced"
print "Finished Sync"
The problem is, it seems to do the print statements in uber_sync, but not actually execute the code from the import statements... Any ideas?
Also note that keyword_sync and target_milestone_sync are located in the same directory as uber_sync, namely /home/user/path
Thank you for any help.
Your import statements fail because Python cannot locate your modules. Add them to your search path and then import your modules, like this (add this to uber_sync.py):
import sys
sys.path.append("/home/user/path")
import keyword_sync
import target_milestone_sync
Python looks for modules in the current directory (the directory the code is executed in), in the $PYTHONPATH environment variable, and in config files. This all ends up in sys.path, which can be edited like any list object. If you want to learn more about the reasons a certain module gets imported or not, I suggest also looking into the standard module imp.
In your case you tested your code in /home/user/path via python uber_sync.py and it worked, because your modules were in the current directory. But when you execute it in some/other/dir via python /home/user/path/uber_sync.py, the current dir becomes some/other/dir and your modules are not found.