Python sub package imports

[Edit]: My problem turned out to be unrelated to what I'm asking here. My machine has Python 2.7 and 3.6 installed, and the requests module was installed in the 2.7 site-packages but not in 3.6's.
I'm trying to import requests in a file inside a subdirectory, and I'm not sure of the best way to do this.
My folder structure is
Project
- project1
- sub1
- sub1.py <-- trying to use requests here
- tests
But in the future, if I want to use requests across the whole project, what's the most effective way to import it?

If you're importing a third-party module, it's fine to just have e.g. `import requests` at the top of any .py file where you want to use it. As long as it's installed in your project's environment (e.g. you've run pip install requests), you can use it wherever you want.
If you're importing your own code from one place in your project to use in another place, that can require a bit more thought, but I don't think that's your question here.
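A quick way to see this in practice, using only the standard library: `importlib.util.find_spec` reports whether a name is importable in the current environment, which is exactly the lookup Python performs for every file in your project, regardless of how deeply nested that file is.

```python
import importlib.util

def can_import(name):
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None

# Stdlib modules are always importable.
print(can_import("json"))      # True
# A third-party package like requests becomes importable from ANY
# file in your project once installed (pip install requests) --
# imports resolve against the environment, not the file's folder.
print(can_import("requests"))
```

The second call prints True or False depending on whether requests is installed; the point is that the answer is the same for every file in the project.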

Related

How to use a utils module across projects and ensure it stays updated

I have made a few Python projects for work that all revolve around extracting data, performing Pandas manipulations, and exporting to Excel. Obviously, there are common functions I've been reusing. I've saved these into utils.py, and I copy paste utils.py into each new project.
Whenever I change utils.py, I need to make the same change in my other projects, which is an error-prone process.
What would you suggest?
Currently, I create a new directory for each project, so
/PyCharm Projects
--/CollegeBoard
----/venv
----/CollegeBoard.py
----/Utils.py
----/Paths.py
--/BoxTracking
----/venv
----/BoxTracking.py
----/Utils.py
----/Paths.py
I'm wondering if this is the most effective way to structure/version control my work. Since I have many imports in common, too, would a directory like this be better?
/Projects
--/Reporting
----/venv
----/Collegeboard
------/Collegeboard.py
------/paths.py
----/BoxTracking
------/BoxTracking.py
------/paths.py
----/Utils.py
I would appreciate any related resources.
Instead of putting a copy of utils.py into each of your projects, make utils.py into a package with its own dedicated repository/folder somewhere. I'd recommend renaming it to something less generic, such as "zhous_utils".
In that dedicated repository for zhous_utils, you can create a setup.py file and you can use that setup.py file to install the current version of the zhous_utils into your python install. That way you can import zhous_utils into any other python script on your PC, just like you would import pandas or any other package you've installed to your computer.
Check out this stackoverflow thread: What is setup.py?
Once you understand setup.py, you will understand how to make and install your own packages so that you can import them in any Python script on your PC. All of the source code for zhous_utils is then centralized in one folder, which you can update whenever you want and re-install.
Now, of course, there are some potential downsides to this. If you install zhous_utils and then import and use it in one of your other projects, you've made zhous_utils a dependency of that project. That means that if you want to share that project with other people, whether to work on it or just to use it, they will need to install zhous_utils too. This won't be an issue if you're the only one developing the projects you import zhous_utils into.
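As a sketch of what that dedicated repository could contain (the package name zhous_utils is from the answer above, but the version, description, and dependency list are assumptions for illustration):

```python
# setup.py -- minimal sketch for the hypothetical zhous_utils repo
from setuptools import setup, find_packages

setup(
    name="zhous_utils",
    version="0.1.0",
    description="Shared data-extraction and Excel helpers",
    packages=find_packages(),        # discovers the zhous_utils/ package dir
    install_requires=["pandas"],     # assumption: the utils use pandas
)
```

Running `pip install -e .` in that repo does an "editable" install, so edits to the source are picked up by every project immediately, without re-installing after each change.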

Questions regarding python setup script

I have a few questions regarding Python setup scripts, or rather how to properly set up a module (as I am doing this for the first time and kind of struggling).
For simplicity I just post a link to the corresponding github repository rather than explaining the project in detail. I am fully aware that the project as it is will not work (e.g. the file constants.py is missing) but for starters I would like the "structure" to work.
There are two main components in this project, i.e. pymap and agb, which depend on each other (that should not be a problem, I guess). I would also like to use the scripts located in the bin/ directory, which of course use the pymap and agb modules. For installation I use sudo ./setup.py develop, which does install the modules, as I can now use them from a python3 shell. The line import pymap.pymap_gui will throw an error (since constants.py is not yet in the project), but the import itself can be resolved.
When, on the other hand, I call the script pymap.py, the same import cannot even be resolved:
ModuleNotFoundError: No module named 'pymap.pymap_gui'; 'pymap' is not a package
How can this be even though from a python3 shell the import works perfectly fine?
Furthermore, where could I improve my project structure? Is my setup the way to go (not minding the messy code and the not-yet-working project itself; I am viewing this from a structural perspective)?
The problem was simply that the module had the same name as the script (pymap and pymap.py). Sorry for bothering!
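The failure mode is easy to reproduce: a plain script that shares a package's name wins the sys.path search, and since a single-file module has no submodules, any dotted import under it fails with exactly this error. A minimal sketch (the name pymap is from the question; the temp-directory setup is just scaffolding to stand in for the script):

```python
import importlib
import os
import sys
import tempfile

# Create a plain script named pymap.py and put its directory first on
# sys.path -- mimicking how running `python pymap.py` puts the script's
# own directory at the front of the module search path.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "pymap.py"), "w") as f:
    f.write("# a plain module, not a package\n")
sys.path.insert(0, tmp)

import pymap                          # resolves to tmp/pymap.py, shadowing any package
try:
    importlib.import_module("pymap.pymap_gui")
except ModuleNotFoundError as e:
    print(e)  # No module named 'pymap.pymap_gui'; 'pymap' is not a package
```

Renaming either the script or the package, as the poster did, removes the shadowing.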

Intellij keeps re-ordering my `import os`

I'm having trouble understanding Intellij's import policy for python for import os. As far as I know, the import order is supposed to be standard library first, then third party packages, then company packages, and finally intra-package or relative imports. For the most part Intellij orders everything correctly, but keeps pushing import os into third party packages. Am I missing smth? Isn't import os a standard library package?
It might happen if the corresponding module comes from a virtual environment that itself is located inside a project directory, and it confuses the detection of the right import group. There was a similar request in the tracker, but it was fixed quite a while ago. Which version of Python plugin do you use? Would you mind creating a dedicated issue in YouTrack so that we could investigate the problem further there?
The answer I got from a co-worker a couple of years ago is that os was originally a third-party package; IntelliJ left it where it is for some backward-compatibility issue.

Proper way of adding module search path on Windows for Python standalone apps?

I am developing a plugin for a multi-platform Python program (Deluge). My plugin needs to import some modules which aren't available by default (in my case, the requests module).
On Linux, everything works flawlessly assuming the required modules are installed beforehand (e.g. via pip).
On Windows, the program uses a python27.dll that comes as part of the installer, so importing modules, even those available in the local Python installation (verified via the interpreter), yields an import error.
I've seen the answers to this question, but I'd like to know if there is a proper way of adding module search paths for Python on Windows specifically. Is it safe to assume C:\Python27\Lib\site-packages will point me to the local Python installation's modules?
EDIT: Is there a different method I could incorporate for using "external" modules? Could I perhaps package other modules into my final .egg file? Not just plain Python, but more sophisticated modules like requests which need to be properly built and may even rely on other modules.
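One way to avoid assuming a fixed path like C:\Python27\Lib\site-packages is to ask the local interpreter where its site-packages directories actually are. A sketch (note that `site.getsitepackages` describes the interpreter the code runs under, and behaves differently inside virtual environments):

```python
import site
import sys

# Ask the interpreter for its own site-packages directories instead of
# hard-coding a Windows path like C:\Python27\Lib\site-packages.
for path in site.getsitepackages():
    if path not in sys.path:
        sys.path.append(path)
```

This only helps when the embedded interpreter and the local installation agree on version; bundling dependencies with the plugin (as the EDIT suggests) sidesteps that problem entirely.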

Python / Git / Module structure best practice

We have a lot of small projects that share common utility "projects".
Example:
utility project math contains function add
project A and project B both need math.add
project A has nothing to do with project B
so is it a good idea to have 3 git repositories (project_A,project_B and math) and clone them locally as
/SOMWHERE/workspace/project_A
/SOMWHERE/workspace/math
and have in /SOMWHERE/workspace/project_A/__init__.py something like
import sys
sys.path.append('../math')
import math
math.add()
I have read Structuring Your Project but that doesn't handle SCM and sharing modules.
So to sum up my question: is
sys.path.append('../math')
import math
good practice or is there a more "pythonic" way of doing that?
As you said in your comments, Git submodules are a suboptimal way of sharing modules. A better way is to use the tools offered by your language of choice, i.e. Python.
First, create virtualenvs to isolate every project python environment. Use pip to install packages and store dependencies in a requirements.txt file.
Then, you can create a dedicated package for each of your utility libraries using distutils and share it on PyPI.
If you don't want to release your packages into the wild, you can also host your own PyPI server.
Using this setup, you will be able to use different versions of your libraries and work on them without breaking compatibility with older code bases. You will also avoid using submodules, that are difficult to use with git.
All of what you describe (3 projects) sounds fine, except that you shouldn't mess around with sys.path. Instead, set the PYTHONPATH environment variable.
Also, if you were not aware of distutils, I am guessing you may be new to Python development and may not know about virtualenv. You should use that too (it allows you to develop against a "clean" Python version that has no packages, or only the packages you install for that env).
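One concrete reason `sys.path.append('../math')` is fragile, beyond packaging hygiene: relative entries on sys.path are resolved against the current working directory, not against the file that appended them, so the import breaks as soon as the program is launched from a different directory. (Naming the utility `math` also collides with the standard-library module.) A small illustration:

```python
import os

# A relative sys.path entry like '../math' is interpreted relative to
# wherever the process was started, not relative to project_A/__init__.py.
relative_entry = "../math"
print(os.path.abspath(relative_entry))  # changes when you cd elsewhere

# If you must use a path hack, anchor it to the importing file instead:
here = os.path.dirname(os.path.abspath(__file__)) if "__file__" in globals() else os.getcwd()
anchored = os.path.join(os.path.dirname(here), "math")
print(anchored)  # stable regardless of the working directory
```

Installing the shared code as a package (as the answers suggest) removes the need for either variant.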
