I am working on a bunch of Python 3 scripts that are organized like so:
/workspace
    /project_one
        script_a.py
        script_b.py
    /project_two
        script_c.py
There are many project folders, and every once in a while a new one is added, with varying numbers of .py files inside. I want to be able to import any script from any folder into any other script, but since I switched to VS Code, I have no idea how to do that. I tried adding __init__.py files to every directory and various syntaxes for the import statements.
From what I've read, this way of importing is not actually supported in Python 3 (which I find weird; isn't it one of the most common use cases of the import system?), yet it was really easy in Eclipse with PyDev, where I could just go into the project options and select "referenced" projects. I was then able to write e.g. import project_one in script_c.py and it worked perfectly. I now assume that the absolute paths were stored in some Eclipse project file. Does VS Code have such a feature? If not, how would I continue programming without resorting to the "dirty hacks" I've read about that supposedly enable this?
I've tried looking online and I'm honestly lost at this point.
I'm trying to find out if there's a way to import Python scripts and run them AFTER my Python program has been compiled.
For example, let's say I have a main.py like this:
from modules.NewModule import NewModuleClass
a = NewModuleClass()
a.showYouWork()
then I compile main.py with PyInstaller so that my directory looks like:
main.exe
modules/NewModule.py
My end goal is to make a program that can dynamically read new Python files in a folder (this folder will be coded in) and use them (the part I'm struggling with). I know it's possible, since that's how add-ons work in Blender 3D, but I've struggled for many hours to figure this out. I think I'm just bad at choosing the correct terms in Google.
Maybe I just need to convert all of the Python files in the modules directory to .pyc files? Then, how would I use them?
Also, if this is a duplicate on here (it probably is), please let me know. I couldn't find this issue on this site either.
You may find no detailed answer simply because there is no problem. PyInstaller does not really compile Python scripts into machine-code executables. It just assembles them into a folder along with an embedded Python interpreter, or alternatively creates a compressed single-file executable that automatically unpacks itself at run time into a temporary folder with that same layout.
From then on, you have an almost standard Python environment, with normal .pyc files which can contain normal Python instructions, like calls to importlib to dynamically load other Python modules. You just have to append the directory containing the modules to sys.path before importing them. Another possible caveat is that PyInstaller only bundles the required modules, not a full Python installation, so you must either make sure that the dynamic modules do not rely on missing standard modules, or be prepared to face an ImportError.
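A minimal sketch of that idea, assuming the plug-ins live in a modules folder next to the executable (the directory layout and the module/class names below are placeholders, not something PyInstaller prescribes):
import importlib
import os
import sys

# Directory next to the frozen executable (or next to this script when not frozen).
base_dir = os.path.dirname(sys.executable if getattr(sys, 'frozen', False) else os.path.abspath(__file__))
sys.path.append(os.path.join(base_dir, 'modules'))

try:
    plugin = importlib.import_module('NewModule')  # loads modules/NewModule.py
    plugin.NewModuleClass().showYouWork()
except ImportError as exc:
    # PyInstaller only bundles what main.py uses, so a plug-in may still hit a missing standard module.
    print('Could not load plug-in:', exc)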
I have a pretty specific problem regarding my Python setup. I need to reference Python libraries made of .py files which are on my hard drive, but under a peculiar form of cloud-based version control. It is not possible to move those files elsewhere, and I cannot add them to a Python solution from VS (using VS2017). Basically, these are historically grown .py files sitting right next to each other that reference each other. I would like to use VS2017 to work on and execute these Python files and be able to reference the "neighboring" files without including them in a solution.
When I add these files to a python solution test.sln and adapt the search paths, everything works perfectly fine. I can reference anything, intellisense works, all good.
The modifications to the search paths are, as far as I can see, exclusive to test.sln. I added the source directories to the PYTHONPATH environment variable and disabled the "ignore global paths" option in VS, but still, referencing the .py files from each other without adding them to a solution does not work.
I can't find solution-independent reference search paths for VS, which would solve my problem. Is there a way to add default search paths for VS, or something like that?
# references:
import os           # works
import numpy as np  # works
import custom_file  # throws ModuleNotFoundError

# do_stuff...
Example above.
Thank you for your help.
Just add sys.path.extend(['path/to/custom_file_dir']) (after import sys) before trying to import custom_file.
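Spelled out, that looks like this; the path is a placeholder for wherever custom_file.py actually lives:
import sys
sys.path.extend(['path/to/custom_file_dir'])

import custom_file  # now resolvable because its directory is on sys.path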
I am working with PyCharm and am trying to create a module from code I've created so that I can import it into new files. In IntelliJ you can start the module creator but in PyCharm this option does not seem to exist.
Without a module when I type:
import my_code
I receive a warning saying "No module named my_code".
I've tried creating packages to replace the module but this does not work.
How do you repackage code in PyCharm so you can import it into a new file?
The project structure is quite simple. I have a number of files I've created as part of a tutorial. I want to make one of the files, "Importing_Files" a module so that I can import it into another file, i.e., "Import_Tester". I've added a picture below to show the tree.
Here's what I would suggest. It looks like you've already tried to set things up correctly, but you need to organize things in PyCharm a bit differently. I ran into a similar problem, which is why I think having an answer to this question is useful.
Your .idea directory is within the package, which makes things awkward. Try this:
Create a new PyCharm project based on the top level of the project.
Make src and test directories within that project, and set them as source root and test root, respectively.
Move the HelloWorld package into src (make sure it's still recognized as a package).
Create new files in src with main sections for any functions you need to run from the command line, add imports for your package, and move your main code into it.
For any main functions that define tests, do the same thing: create files with main logic in the test directory. Unit tests are a better way to do that, but this directory structure should work.
Remove the old project (delete the .idea directory in HelloWorld).
The final project layout should look something like this:
CompletePythonMasterClassUdemy
    .idea
    src
        command_line_main.py
        HelloWorld
            __init__.py
            ...
    test
        test_account.py
This is a better way to organize things that should work both within and outside of PyCharm. Unlike the Java world, Python doesn't have as many common conventions for setting up projects correctly. There are very likely better ways to do things, but this works well for me. It should work well for people getting started with Python library development.
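For concreteness, a hypothetical command_line_main.py in that layout could be as small as the following; the imported name is a placeholder for whatever the HelloWorld package actually exposes:
from HelloWorld import some_function  # placeholder import from your package

def main():
    some_function()

if __name__ == '__main__':
    main()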
This might be a more broad question, and more related to understanding Python's nature and probably good programming practices in general.
I have a file, called util.py. It has a lot of different small functions I've collected over the past few months that are useful when doing various machine learning tasks.
My thinking is this: I'd like to continue adding important functions to this script as I go. As such, I will want to import util often, now and in the future, in many unrelated projects.
But Python seems to feel like I should only be able to access the code in this file if it lives in my current directory, even if the functions in this file are useful for scripts in different directories. I sense some reason behind the way this works that I don't fully grasp; to me, it seems like I'll be forced to make unnecessary copies often.
If I have to create a new copy of util.py every time I'm working within a new directory, on a different project, it won't be long until I have many different versions/iterations of this file scattered all over my hard drive, in various states. I don't desire this degree of modularity in my programming; for the sake of simplicity, repeatability, and clarity, I want only one file in only one location, accessible to many projects.
The question in a nutshell: What is the argument for Python to seemingly frown on importing from different directories?
If your util.py file contains functions you're using in a lot of different projects, then it's actually a library, and you should package it as such so you can install it in any Python environment with a single line (python setup.py install), and update it if required (Python's packaging ecosystem has several features to track and update library versions).
An added benefit is that right now, if you're doing what the other answers suggested, you have to remember to have manually put util.py on your PYTHONPATH (the "dirty" way). If you try to run one of your programs and you haven't done that, you'll get a cryptic ImportError that doesn't explain much: is it a missing dependency? A typo in the program?
Now think about what happens if someone other than you tries to run the program(s) and gets those error messages.
If you have a library, on the other hand, trying to set up your program will either complain in clear, understandable language that the library is missing or out of date, or (if you've taken the appropriate steps) automatically download and install it so things are ready to roll.
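For illustration, a minimal setup.py along those lines might look like this; the distribution name and version are placeholders, not something the answer prescribes:
from setuptools import setup

setup(
    name='myutils',          # placeholder distribution name
    version='0.1.0',
    py_modules=['util'],     # ships util.py as a top-level module
    install_requires=[],     # list any runtime dependencies here
)
After running python setup.py install (or pip install .) in a project's environment, import util then works from any directory without copying the file.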
On a related topic, having a file/module/namespace called "util" is a sign of bad design. What are these utilities for? It's the programming equivalent of a "miscellaneous" folder: eventually, everything will end up in it and you'll have no way to know what it contains other than opening it and reading it all.
Another way is to add the directory/you/want/to/import/from to the path from within the scripts that need it.
You should have an __init__.py file in the same folder where utils.py lives, to tell Python to treat the folder as a package. The __init__.py file may be empty, or it can define other things.
Example:
/home/marcos/python/proj1/
    __init__.py
    utils.py
/home/marcos/school_projects/final_assignment/
    my_script.py
And then inside my_script.py
import sys
sys.path.append('/home/marcos/python/')
from proj1 import utils
MAX_HEIGHT = utils.SOME_CONSTANT
a_value = utils.some_function()
First, define an environment variable. If you are using bash, for example, then put the following in the appropriate startup file:
export PYTHONPATH=/path/to/my/python/utilities
Then put your util.py and any of your other common modules or packages in that directory. Now you can import util from anywhere and Python will find it.
Possible Duplicate:
Gather all Python modules used into one folder?
I don't think this has been asked before. I have a folder that has lots of different .py files. The script I've made only uses some, but some call others, and I don't know all the ones being used. Is there a program that will gather everything needed to make that script run into one folder?
Cheers!
Since Python is not a statically linked language, this task would be rather challenging, especially if some of your code uses eval(...) or exec(...).
If your script is not very big, I would just move it out, make sure that your python.exe does not load modules from that directory, and then run the script, adding missing modules until it works.
If you have multiple scripts like this, then this manual work is not really the way to go. But in that case, having lots of different .py files in a directory is not a good deployment technique anyway, and you should think about packaging them into installable modules and installing them into your Python site-packages.
Still, you may use the snakefood package to find out the dependencies (it has already been discussed here). Again, it just cannot be 100% accurate, but it should give you an easy start.
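If you prefer to stay in the standard library, modulefinder gives a similar (also imperfect) overview; a rough sketch, with the script name as a placeholder:
from modulefinder import ModuleFinder

finder = ModuleFinder()
finder.run_script('my_script.py')  # placeholder: the script to analyse

# Print every module the script pulls in, with its file path where known.
for name, module in finder.modules.items():
    print(name, getattr(module, '__file__', None))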
You should be able to extract the needed information from a so-called call graph.
See for example
http://pycallgraph.slowchop.com/ or
http://blog.prashanthellina.com/2007/11/14/generating-call-graphs-for-understanding-and-refactoring-python-code/
Also, py2exe converts a Python script into an executable, and in this process it gathers all used modules. I think py2exe is cross-platform.
I'd try 2½ solutions, one elaborate and 1½ quick-and-dirty:
elaborate: a custom import hook, logging all imports (see the sketch after this list)
quick and dirty, part a: os.utime the *.py[co]? (re notation, not glob) files to have access times of yesterday, then run the program and collect all recent access times. Prerequisite: a filesystem that records access times (by itself and by its mount options).
quick and dirty, part b: remove all *.py[co] files (same in glob and re notation), run the program, see which have been created. Prerequisite: user should have write access to the folder.
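A minimal sketch of the import-hook idea: wrap builtins.__import__ and record every module name requested while the script runs. The script name below is a placeholder.
import builtins

imported = set()
_original_import = builtins.__import__

def logging_import(name, *args, **kwargs):
    imported.add(name)  # record the requested module name
    return _original_import(name, *args, **kwargs)

builtins.__import__ = logging_import

import my_script  # placeholder: the script whose dependencies you want to collect

print(sorted(imported))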