Cannot import name 'market_candles' - python

So I'm trying to use this GitHub repository. I put it in my site-packages folder and tested this example, but I got the error cannot import name 'market_candles'. What could be causing this problem? I already made sure that TA-Lib, Pandas and Matplotlib are installed, so where could the problem be? I'm looking at the __init__.py and it seems fine.

Cloning the repository directly into your site-packages will result in nested pyttrex folders. Copy only the inner pyttrex/pyttrex directory into site-packages (see the quick check below the tree):
pyttrex
├── LICENSE
├── README.md
├── pyttrex
│   ├── ADX.py
│   ├── __init__.py
│   ├── average_n
│   ├── average_true_range.py
│   ├── backtest.py
│   └── test.py
└── tgnotifier.py
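As a rough sanity check, assuming the inner package is the one that defines market_candles (the shortened tree above does not show that module), you can confirm which directory Python is actually importing:

import pyttrex
print(pyttrex.__file__)             # should end in .../site-packages/pyttrex/__init__.py
from pyttrex import market_candles  # fails if the outer, module-less clone was copied instead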

Related

Trouble organizing Python library for import to behave in expected manner

I've had a lot of trouble figuring out a key point about how the import mechanism works, and how this relates to organizing packages.
Suppose I've written two or more unrelated, reusable libraries. (I'll use "library" informally as a collection of code and resources, including tests and possibly data, as opposed to a "package" in the formal Python sense.) Here are two imaginary libraries in a parent directory called "my_libraries":
my_libraries/
├── audio_studio
│   ├── src
│   │   ├── distortion.py
│   │   ├── filter.py
│   │   └── reverb.py
│   └── test
│       └── test_audio.py
└── picasso_graphics
    ├── src
    │   ├── brushes.py
    │   ├── colors.py
    │   └── easel.py
    └── test
        └── test_picasso.py
I'm hoping to accomplish all three of the following, all of which seem to me to be normal practice or expectation:
1. MAIN LIBRARY CODE IN SUBDIRECTORY
For neatness of library organization, I want to put the library's core code in a subdirectory such as "src" rather than at the top-level directory. (My point here isn't to debate whether "src" in particular is a good naming approach; I've read multiple pages pro and con. Some people appear to prefer the form foo/foo, but I think I'd have the same problem I'm describing with that too.)
2. ADD TO $PYTHONPATH JUST ONCE
I'd like to be able to add "my_libraries" to $PYTHONPATH or sys.path just once. If I add a new library to "my_libraries", it's automatically discoverable by my scripts.
3. NORMAL-LOOKING import STATEMENTS
I'd like to be able to import from these libraries into other projects in a normal-looking way, without mentioning the "src" directory:
import picasso_graphics.brushes
OR
from picasso_graphics import brushes
HOW TO DO THIS?
Despite much reading and experimentation, I haven't been able to find a solution which satisfies all three of these criteria. The closest I've gotten is to create a picasso_graphics/__init__.py file containing the following:
import os
import sys

base_dir = os.path.dirname(__file__)
src_dir = os.path.join(base_dir, "src")
sys.path.insert(0, src_dir)
This almost does what I want, but I have to break up the imports into two statements, so that the __init__.py file executes with the first import:
import picasso_graphics
import brushes
Am I making a wrong assumption here about what's possible? Is there a solution which satisfies all three of these criteria?
What you want, Sean, is most likely what is called a namespace project. Use a tool called pyscaffold to help with writing the boilerplate. Each project should have a setup.cfg with all your project dependencies. Once you have that, create a virtual environment for your project, then install each project (inside the environment) with:
pip install -e audio-studio
pip install -e picasso-graphics
Installing your project into your virtual environment will cause your imports to behave as you want -- between projects.
This is a bit of overhead to get started, I know, but these are skills you want to have sooner or later. setup.cfg, virtual environments, and pip install -e form a pattern that just makes things work where other approaches will drive you mad.
Below is a simple example project I created using pyscaffold. Notice that there is a package below src and that src itself does not have an __init__.py. That is a decision made by the pyscaffold folks to help ease import confusion; you should likely adopt it.
my_libraries/
├── audio-studio
│   ├── AUTHORS.rst
│   ├── CHANGELOG.rst
│   ├── LICENSE.txt
│   ├── README.rst
│   ├── requirements.txt
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── audio_studio
│   │       ├── __init__.py
│   │       └── skeleton.py
│   └── tests
│       ├── conftest.py
│       └── test_skeleton.py
└── picasso-graphics
    ├── AUTHORS.rst
    ├── CHANGELOG.rst
    ├── LICENSE.txt
    ├── README.rst
    ├── requirements.txt
    ├── setup.cfg
    ├── setup.py
    ├── src
    │   └── picasso_graphics
    │       ├── __init__.py
    │       └── skeleton.py
    └── tests
        ├── conftest.py
        └── test_skeleton.py
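For orientation, a minimal setup.cfg for such a src layout might look roughly like this (a hedged sketch, not what pyscaffold literally generates; the name, version, and dependency are placeholders):

[metadata]
name = audio-studio
version = 0.1.0

[options]
package_dir =
    = src
packages = find:
# placeholder dependency list
install_requires =
    numpy

[options.packages.find]
where = src

With something like that in place, pip install -e audio-studio makes import audio_studio behave the same inside and outside the project.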

How to install python package namespace from private bitbucket-git repository

I have several related projects that I think will be a good fit for Python's namespace packages. I'm currently running Python 3.8, and have created the following directory structure for testing.
├── namespace-package-test.package1
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg1
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg1_cli.py
│   │       └── __init__.py
│   └── tests
├── namespace-package-test.package2
│   ├── AUTHORS.rst
│   ├── CHANGELOG.rst
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg2
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg2_cli.py
│   │       └── __init__.py
│   └── tests
The entire project is on a private Bitbucket (cloud) server at:
git@bitbucket.org:<my-company>/namespace-package-test.git
I would like to install, locally, only package 1. I've tried every iteration I can imagine of the following, but nothing seems to get me there. I either get a repository not found error or a setup.py not found error.
pip install git+ssh://git@bitbucket.org:<my-company>/namespace-package-test.package1.git
Is this possible?
Is my project structure correct for what I am doing?
What should the pip install command look like?
Bonus, what if I only want to install a specific spec using pipx?
pipx install "namespace-package-test.package1[cli] # git+ssh://git#bitbucket.org:<my-company>/namespace-package-test.package1.git"
I think I figured it out ... for posterity's sake
Pip install (into virtual environment)
pip install git+ssh://git@bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1
pipx install - with spec
pipx install "namespace-package-test.package1[cli] # git+ssh://git#bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1"

ModuleNotFoundError with package installed from github

I installed a package in my anaconda environment by entering the following line:
pip3 install -e git+https://github.com/gauravmm/jupyter-testing.git#egg=jupyter-testing
I keep getting ModuleNotFoundError: No module named 'testing' on line: from testing.testing import test. I have no idea why this is happening, and believe it has something to do with the way my directory structure is set up.
My directory tree looks like this:
├── hw1_get_started.ipynb
├── requirements.txt
└── src
    └── jupyter-testing
        ├── jupyter_testing.egg-info
        │   ├── dependency_links.txt
        │   ├── PKG-INFO
        │   ├── SOURCES.txt
        │   └── top_level.txt
        ├── LICENSE
        ├── README.md
        ├── setup.py
        └── testing
            ├── __init__.py
            └── testing.py
I am trying to use this module: https://github.com/gauravmm/jupyter-testing.git#egg=jupyter-testing to do some testing in an online class.
I appreciate any help and explanation as to what I am doing wrong! :)

Unit test packages Maven style convention

I want to create a pybuilder project with unit tests and packages. As an example, I modified the simple Python app example, with "helloworld" moved to the package "hello".
My first instinct was to match the package structure with "main" and "unittest" sources:
+---src
    +---main
    |   \---python
    |       \---hello
    |               helloworld.py
    |               __init__.py
    |
    \---unittest
        \---python
            \---hello
                    helloworld_tests.py
                    __init__.py
This does not work because of conflicting "hello" package.
BUILD FAILED - 'module' object has no attribute 'helloworld_tests'
I see pybuilder itself just skips the top-level pybuilder package in unittests, but that won't work if there are multiple top-level packages.
My second guess would be to create extra top level package for unittests.
\---unittest
    \---python
        \---tests
            |   __init__.py
            \---hello
                    helloworld_tests.py
                    __init__.py
Is there a better solution or established convention how to organize python tests in packages?
Probably nothing really new for the OP, but I just wanted to collect all options that I could come up with in one place:
1) Just append _tests to names of top-level packages
The easiest way to mirror the structure of src/main/python in the src/test/python almost 1:1 would be by simply appending _tests to the names of the top-level packages. For example, if I have only one top-level package rootPkg, then I can add the corresponding rootPkg_tests to the test/ subdirectory:
src
├── main
│   ├── python
│   │   └── rootPkg
│   │       ├── __init__.py
│   │       ├── pkgA
│   │       │   ├── __init__.py
│   │       │   └── modA.py
│   │       └── pkgB
│   │           ├── __init__.py
│   │           └── modB.py
│   └── scripts
│       └── entryPointScript.py
└── test
    └── python
        └── rootPkg_tests
            ├── __init__.py
            ├── pkgA
            │   ├── __init__.py
            │   └── modA_tests.py
            └── pkgB
                ├── __init__.py
                └── modB_tests.py
This seems to work nicely with PyBuilder 0.11.15 and the unittest plugin (notice that I've deviated from PyBuilder's convention and put tests in test instead of unittest; you probably shouldn't do this if you intend to use multiple testing frameworks).
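A hedged build.py sketch for that layout, assuming PyBuilder's dir_source_unittest_python and unittest_module_glob properties (the project name is a placeholder):

from pybuilder.core import init, use_plugin

use_plugin("python.core")
use_plugin("python.unittest")

name = "rootPkg"
default_task = "publish"

@init
def initialize(project):
    # tests live under src/test/python instead of the default src/unittest/python
    project.set_property("dir_source_unittest_python", "src/test/python")
    # test modules end in _tests.py
    project.set_property("unittest_module_glob", "*_tests")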
2) If there is only one package: do it like PyBuilder
PyBuilder is itself built with PyBuilder. This is what its source directory looks like (drastically reduced, unnecessary details omitted):
src
├── main
│   ├── python
│   │   └── pybuilder
│   │       ├── __init__.py
│   │       ├── cli.py
│   │       ├── core.py
│   │       └── plugins
│   │           ├── __init__.py
│   │           ├── core_plugin.py
│   │           └── exec_plugin.py
│   └── scripts
│       └── pyb
└── unittest
    └── python
        ├── cli_tests.py
        ├── core_tests.py
        ├── plugins
        │   ├── exec_plugin_tests.py
        │   ├── __init__.py
        │   ├── python
        │   │   ├── core_plugin_tests.py
        │   │   ├── __init__.py
If I understand it correctly, the tree in unittest/python mirrors the tree in main/python, but the directory for the top-level package pybuilder is omitted. That's what you have described in your question as the first workaround. The drawback is that it doesn't really work if there are multiple top-level packages.
3) Add one additional tests top-level package
That's what you have proposed as a workaround: mirror the tree in main, but wrap everything in an additional tests-package. This works with many top-level packages in /src/main/python and prevents any package name collisions.
I'm not aware of any convention. Upvote one of the comments below if you have an opinion on that matter.

Installation of MySQL-python in shared hosting

I'm using Hostgator as a testing environment and I had a problem installing MySQL-python, after using:
pip install MySQL-python
The following error is raised:
unable to execute gcc: Permission denied
error: command 'gcc' failed with exit status 1
I asked technical support about this and they answered:
This script requires a compiler, which shared accounts do not have
access to. You would need to upload any Python scripts that you want
to use as a precompiled script. You should be able to compile it
elsewhere and then upload to the account to use it.
This is my first project using Python and I have no idea how to do this.
Thanks
======
UPDATE
As André proposed, what I did was the following: on my Linux machine I created two virtual environments (using virtualenv), one with and one without MySQL-python installed.
Comparing the file structures, the missing files were:
.
├── MySQLdb
│   ├── connections.py
│   ├── connections.pyc
│   ├── constants
│   │   ├── CLIENT.py
│   │   ├── CLIENT.pyc
│   │   ├── CR.py
│   │   ├── CR.pyc
│   │   ├── ER.py
│   │   ├── ER.pyc
│   │   ├── FIELD_TYPE.py
│   │   ├── FIELD_TYPE.pyc
│   │   ├── FLAG.py
│   │   ├── FLAG.pyc
│   │   ├── __init__.py
│   │   ├── __init__.pyc
│   │   ├── REFRESH.py
│   │   └── REFRESH.pyc
│   ├── converters.py
│   ├── converters.pyc
│   ├── cursors.py
│   ├── cursors.pyc
│   ├── __init__.py
│   ├── __init__.pyc
│   ├── release.py
│   ├── release.pyc
│   ├── times.py
│   └── times.pyc
├── _mysql_exceptions.py
├── _mysql_exceptions.pyc
├── MySQL_python-1.2.5-py2.7.egg-info
│   ├── dependency_links.txt
│   ├── installed-files.txt
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   └── top_level.txt
└── _mysql.so
So I copied those files to:
/venv/lib/python2.7/site-packages/
Where /venv/ is the folder of the virtual environment created on the hosting account.
Thanks again
There is a really simple solution to this. If the root user on the shared hosting has the MySQLdb Python module installed, then you can create a user-specific virtual environment by using the --system-site-packages flag. This creates a virtual environment that sees all the modules of the system Python in addition to what you install into the local venv.
virtualenv --system-site-packages venv
You can look up: Make virtualenv inherit specific packages from your global site-packages
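A short hedged sketch of the whole flow, assuming the system Python on the host really does have MySQLdb installed:

virtualenv --system-site-packages venv
source venv/bin/activate
python -c "import MySQLdb; print(MySQLdb.__file__)"   # should print a path if the module is inherited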
You don't have permission to compile things using gcc. You will have to install MySQL-python in another place and then move the files back onto your server.
See py_compile for compiling python scripts
If by "precompiled" they mean an executable, and your HostGator account is running Windows, then you can use py2exe to create one. py2exe makes it so you can run your script on other computers without having to install Python.
First create a setup.py that names your script and all of its dependencies, then run python setup.py py2exe; it will create two folders. You only need the dist folder, which contains the executable.
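A minimal hedged sketch of such a setup.py (classic py2exe usage; myscript.py is a placeholder for your actual script):

from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

setup(console=["myscript.py"])  # or windows=[...] for a GUI script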
There are many nice tutorials on how to do this, good luck!
