What's the best way to install two Python modules with the same name? I currently depend on two different Facebook libraries: pyfacebook and Facebook's new python-sdk. Both of these libraries install themselves as the module 'facebook'. I can think of a bunch of hacky solutions, but before I go and hack away I was curious whether there is a Pythonic way of dealing with this situation.
I'm using virtualenv and pip.
(Yes, I will eventually deprecate one of them, but I had two different engineers working on two different problems, and they didn't realize they were using different modules until integration.)
First, I'd suggest you guys go over what other libraries you're all using so you can reach a consensus on how you're building your application.
To support this type of thing, place each module within its own folder and put an __init__.py file in each; then you can do this:
import Folder1.facebook as pyfacebook
import Folder2.facebook as facebooksdk
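For example, a hypothetical layout (the folder names here are placeholders) could look like this:

myproject/
    Folder1/
        __init__.py
        facebook.py    # vendored copy of pyfacebook's 'facebook' module
    Folder2/
        __init__.py
        facebook.py    # vendored copy of the python-sdk's 'facebook' module

With myproject's root directory on sys.path, the two import statements above bind each copy to its own distinct name.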
The easiest solution would be to include one (or both) of the modules in your project instead of installing it. Then, you can have more control over the module name and importing.
I want to know if I can create a Python script with a folder in the same directory containing all the assets of a Python module, so that when someone wants to use it, they would not have to pip install the module, because it would be imported from the directory.
Yes, you can, but it doesn't mean that you should.
First, ask yourself who is supposed to use that code.
If you plan to give it to consumers, it would be a good idea to use a tool like py2exe to create an executable file that bundles all the modules and doesn't allow the code to be changed (a sketch follows below).
If you plan to share it with another developer, you might want to look into virtual environments and a requirements.txt file instead.
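For the py2exe route, a minimal setup.py sketch could look like this (the script name is a placeholder; you build with python setup.py py2exe):

from distutils.core import setup
import py2exe  # importing py2exe registers its command with distutils

setup(console=["myscript.py"])  # "myscript.py" is your application's entry point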
There are multiple reasons why bundling modules this way is a bad idea:
It is harder to update the modules later, at least without upgrading the whole project.
It uses more space in version control, which can create issues on huge projects with hundreds of modules and branches.
It might be illegal, as some licenses specifically forbid including their code in your source code.
The pip install of some module might do different things depending on the operating system version or installed packages. The modules on your machine might be suboptimal on someone else's machine, and in some instances might not even work.
And there are probably more that I can't think of right now.
The only situation where I saw this being unavoidable was when a module didn't support the Python implementation the application was running on. The module was changed, and its source was put under a lib folder with the rest of the libraries.
I think you can add the directory with the Python modules to PYTHONPATH. Then anyone who wants to use those modules just needs to have this environment variable set.
https://docs.python.org/3/using/cmdline.html#envvar-PYTHONPATH
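As a minimal sketch (the directory path and module name below are made-up placeholders), launching Python with PYTHONPATH set makes that directory's modules importable:

# launch with, e.g.:  PYTHONPATH=/path/to/shared_modules python app.py
import os, sys

print(os.environ.get("PYTHONPATH"))           # "/path/to/shared_modules"
print("/path/to/shared_modules" in sys.path)  # True when launched as above
import my_shared_module                       # hypothetical module living in that directory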
I'm building an application that depends on some_package (which is rather large) as installed through pip or conda. I would like to reuse parts of some_package directly in the application; to that end, I have forked some_package, installed it locally, and modified its functionality as needed. The application now depends on two (diverging) versions of the same package with the same name for different functionality.
How do I refer to the pip/conda-managed ~/anaconda3/envs/my_env/lib/python3.7/site-packages/some_package/ for the internal dependency, and the modified ~/my_project/dependencies/some_package/ for use in my application?
There are several questions on Stack Overflow, but they are either quite old or not the same question:
Python: Two packages with the same name; how do you specify which is loaded?
Is it possible to use two Python packages with the same name?
Importing from builtin library when module with same name exists
What I've tried:
conda develop <local package path>: in this case, the site-packages copy is not visible, which breaks internal dependencies
changing the name of the local package folder and importing: there are internal references to the package name that would mean renaming everywhere, and it would create a management mess whenever I wish to pull new code into the fork
a comment suggested import some_package as package_dev: this obviously won't work, as I have no way to refer to both packages in the first place
In the linked questions (and others), there are a number of hacks that will kind of work but break the import system in subtle ways (reload for package updates, etc.). Is there a "pythonic"/recommended way to accomplish this?
I am quite new to Python and Django. I have a problem integrating a Python package (openpyxl) into my Django app. I'd like to use the methods of these files in my views.py file.
My problem is, first, that I don't know where the best place is in my file hierarchy to put the openpyxl folder containing all the files.
My hierarchy looks like this:
http://imgur.com/t4iOX98
Is it well placed? Should I put it outside the international folder? Inside the carte_interactive folder?
And my biggest problem is inside the __init__.py of openpyxl. I get errors on lines like this one:
from openpyxl.xml import LXML
Here the reference to LXML cannot be resolved, even though it is actually defined in openpyxl's xml module.
Is it my bad file placement that causes this? Or is it Django? Or is it openpyxl's fault? Does anyone have an idea?
You can see openpyxl's source files here, where I downloaded them:
https://bitbucket.org/openpyxl/openpyxl/src
If you need any more details, please ask!
Thanks in advance!
I applaud your enthusiasm for wanting to learn Django while being new to Python. That said, the way you have things set up right now will make your life unnecessarily difficult.
I would first recommend reading up on best practices for setting up a Django project. Just doing a quick google search for "Django project layout best practices" will give you a lot of resources, but they'll all essentially tell you to do what's in the SO answer above.
The second very basic thing is using pip to install and use other Python packages. This is especially important for a Django project, where you often have a lot of dependencies outside of Django. Pip is a program for installing additional Python packages. They get installed into a directory on your Python path, which is just a list of file paths on disk where Python will look for packages; on a *NIX system, this is usually something like /usr/lib/python2.7/. Once you have something on your Python path, any piece of your code can use the library via the import system. Essentially, all the import system does is look through each location on your Python path for the library you're trying to import.
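For instance, you can print exactly which paths Python searches; this sketch uses only the standard library:

import sys

for path in sys.path:  # the import system checks these entries in order
    print(path)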
Finally, with regard specifically to lxml, you will want to install it via apt or some other package installer (e.g. on Ubuntu: apt install python-lxml).
In order to keep track of all your external Python dependencies, list them in a file named requirements.txt in the top-level directory. This is a pretty standard thing to do for Django projects, and it means you don't have to ship the code for ALL your dependencies inside the project.
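A minimal requirements.txt might look like this (the package names and version pins are purely illustrative):

Django==1.11.4
openpyxl==2.4.8

Anyone else can then recreate your environment with pip install -r requirements.txt.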
Thanks to all of you! I'm using JetBrains PyCharm, and when I wrote import openpyxl, it gave me the choice to install the package. I suppose it does it with pip, which would certainly have worked the same way. And I put the package in requirements.txt, so that other users would only have to install this requirement!
It works now! And thanks for the link on the best practices. I'll read that!
We have a lot of small projects that share common utility "projects".
Example:
utility project math contains function add
project A and project B both need math.add
project A has nothing to do with project B
So is it a good idea to have three git repositories (project_A, project_B, and math) and clone them locally as
/SOMWHERE/workspace/project_A
/SOMWHERE/workspace/math
and have in /SOMWHERE/workspace/project_A/__init__.py something like
import sys
sys.path.append('../math')  # relative path: resolved against the current working directory
import math
math.add()
I have read Structuring Your Project, but that doesn't cover SCM or sharing modules.
So to sum up my question: is
sys.path.append('../math')
import math
good practice or is there a more "pythonic" way of doing that?
Submodules are a suboptimal way of sharing modules, like you said in your comments. A better way would be to use the tools offered by your language of choice, i.e. Python.
First, create virtualenvs to isolate each project's Python environment. Use pip to install packages, and store dependencies in a requirements.txt file.
Then, you can create a specific package for each of your utility libraries using distutils and share them on PyPI.
If you don't want to release your packages into the wild, you can also host your own PyPI server.
Using this setup, you will be able to use different versions of your libraries and work on them without breaking compatibility with older code bases. You will also avoid using submodules, which are difficult to work with in git.
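For example, a minimal setup.py for the math utility could look like this (the distribution name is an illustrative assumption; setuptools is the modern successor to distutils):

from setuptools import setup, find_packages

setup(
    name="mycompany-mathutils",  # avoid calling it "math": that shadows the standard library
    version="0.1.0",
    packages=find_packages(),    # picks up any package directories with an __init__.py
)

project_A and project_B can then list it in their requirements.txt and pip install it like any other dependency.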
All of what you describe (3 projects) sounds fine, except that you shouldn't mess around with sys.path. Instead, set the PYTHONPATH environment variable.
Also, if you were not aware of distutils, I am guessing you may be new to Python development and may not know about virtualenv. You should use that too (it allows you to develop against a "clean" Python version that has no packages, or only the packages you install for that env).
New to Python, so excuse my lack of specific technical jargon. Pretty simple question really, but I can't seem to grasp the concept.
It seems that a lot of modules require using pip or easy_install and running setup.py to "install" into your Python installation or your virtualenv. What is the difference between installing a module and simply taking it and importing it into another script? It seems that you access the modules the same way.
Thanks!
It's like the difference between:
Uploading a photo to the internet
Linking the photo URL inside an HTML page
Installing puts the code somewhere Python expects those kinds of things to be, and the import statement says "go look there for something named X now, and make the data available to me for use".
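A quick sketch of what that looks like in practice (requests here is just an illustrative third-party package):

# after running: pip install requests
import requests           # found on sys.path (e.g. in site-packages), no path fiddling needed
print(requests.__file__)  # shows where the installed copy actually lives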
For a single module, it usually doesn't make any difference. For complicated webs of modules, though, an installation program may do many things that wouldn't be immediately obvious. For example, it may copy data files into locations where the new modules can find them, put executables (binary libraries, or DLLs on Windows, for example) where the new modules can find them, do different things depending on which version of Python you have, and so on.
If deploying a web of modules were always easy, nobody would have written setup programs to begin with ;-)