Python built-in module for handling Excel files

I know similar questions have been popular in the past, but none of them addresses my problem. I'm looking for a way to read data from an Excel file in Python, but I'm strongly against using non-built-in modules.
The reason is that in my case Python is a component of another piece of software, so incorporating an additional module would require every user to know how to use pip, which Python installation on their PC the module should be installed into, and so on. The solution must not require any additional actions from the user.
I can read CSV files easily with Python's built-in csv module, so that could work, but how can I convert Excel to CSV in the first place? Or is there a way to read Excel directly?
Edit: It is Python 2 that is used in this software.
Edit2:
Would anyone mind explaining the downvote? I think this isn't a question about a ready-made solution or module, but rather about a method, and it is well detailed. It is not always possible to use external modules, so this is an actual problem. If it is not possible at all, though, then I would simply expect an answer saying so instead of a -1.

Not the prettiest solution, but you could download the complete source of one of the Excel-handling packages for Python (openpyxl, for example) and put those files in the same directory as the Python script you're going to run. You can then import these local package files in your script.
Note: if the Excel-handling package depends on other packages, you'll need to download those as well.
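For what it's worth, here is a minimal sketch of that local-package approach, assuming you have copied the openpyxl source (and its dependencies) into a folder named vendor next to your script; the folder name and the data.xlsx file name are placeholders for illustration only:
import os
import sys

# Make the vendored copies importable ahead of anything else on sys.path.
HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(HERE, "vendor"))

import openpyxl  # now resolved from ./vendor instead of site-packages

wb = openpyxl.load_workbook(os.path.join(HERE, "data.xlsx"))
for row in wb.active.iter_rows():
    print([cell.value for cell in row])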

Related

Dynamically importing .py files after compiling

I've tried looking online and I'm honestly lost at this point.
I'm trying to find out if there's a way to import Python scripts and run them AFTER my Python program has been compiled.
For example, let's say I have a main.py such that:
import modules.NewModule
a = modules.NewModule.NewModuleClass()
a.showYouWork()
then I compile main.py with pyinstaller so that my directory looks like:
main.exe
modules/NewModule.py
My end goal is to make a program that can dynamically read new Python files in a folder (this will be coded in) and use them (the part I'm struggling with). I know it's possible, since that's how add-ons work in Blender 3D, but I've struggled for many hours to figure this out. I think I'm just bad at choosing the correct terms in Google.
Maybe I just need to convert all of the Python files in the modules directory to .pyc files? Then, how would I use them?
Also, if this is a duplicate on here (it probably is), please let me know. I couldn't find this issue on this site either.
You may find no detailed answer simply because there is no problem. PyInstaller does not really compile Python scripts into machine-code executables. It just assembles them into a folder along with an embedded Python interpreter, or alternatively creates a compressed single-file executable that automatically uncompresses itself at run time into a temporary folder containing the same thing.
From then on, you have an almost standard Python environment, with normal .pyc files which can contain normal Python instructions, like calls to importlib to dynamically load other Python modules. You just have to append the directory containing the modules to sys.path before importing them. Another possible caveat is that PyInstaller only bundles the required modules and not a full Python installation, so you must either make sure that the dynamic modules do not rely on missing standard modules, or be prepared to face an ImportError.
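As a rough sketch of what this describes (the "modules" folder name and the discovery loop are assumptions for illustration, not anything PyInstaller mandates):
import importlib
import os
import sys

# Next to the frozen executable, look for a plain "modules" folder of .py files.
plugin_dir = os.path.join(os.path.dirname(sys.executable), "modules")
sys.path.append(plugin_dir)

for filename in os.listdir(plugin_dir):
    if filename.endswith(".py"):
        module = importlib.import_module(filename[:-3])
        print("loaded plugin:", module.__name__)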

Universalizing my program/Making it accessible to other users

So I am creating a program that takes input, processes the data, then puts it in Excel. In order to do this, I am using the "xlwt" package (and possibly xlrd). How do I then give this program to other people without making them download Python and the packages associated with my program? I considered utilizing an online Python interpreter and giving the username/password to my coworkers, but xlwt isn't on any of the ones I've tried, and they don't offer a way (that I can see) to download new packages.
You would have to compile the code into an .exe file. The py2exe library can help you out with this.
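For reference, a classic py2exe build is driven by a small setup script; the sketch below assumes your entry point is named main.py and is not the only way to configure it:
# setup.py -- run with:  python setup.py py2exe
from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

setup(
    console=["main.py"],  # build a console .exe from main.py
)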

Manually adding libraries

Given that every library is Python code itself, I think it's logical that instead of using the import statement, we could actually copy the whole code of that library and paste it at the top of our main.py.
I'm working on a remote PC where I cannot install libraries; can I use a library by just doing this?
Forgive me if this is a very silly question.
Thanks
In some cases yes, you can. But there are (a lot of) libraries that have some of their functionality written in C and compiled to binary (e.g. the famous numpy). You can't just paste those.
Another thing that pasting might introduce is naming collisions. If you use
import module
then any name in the module module can be safely used in the importing module as module.name, even if the name name is already defined somewhere in the importing module. If you just paste the code, this won't work.
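A tiny made-up example of that collision point (the module helpers and the function process are hypothetical names chosen only for illustration):
# helpers.py
def process(data):
    return [x * 2 for x in data]

# main.py
import helpers

def process(data):  # a local name that happens to clash with the library's
    return sum(data)

print(helpers.process([1, 2, 3]))  # [2, 4, 6] -- the library's version, safely namespaced
print(process([1, 2, 3]))          # 6         -- the local version
# If helpers.py were pasted into main.py instead, one definition of process
# would silently overwrite the other.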
While pasting the entire library at the top of your main file can work, I don't think it's the best way to go about solving your problem.
One option is to move the library into the same folder as your main.py file. The import system checks the directory containing the script you run (it sits at the front of sys.path) before looking elsewhere.
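A quick way to check that this search order is what you expect (the somelib name below is a hypothetical stand-in for whatever library you copied):
import sys

print(sys.path[0])  # the directory containing the script you ran; it is searched first
# After copying the library next to main.py, confirm which copy was picked up:
# import somelib
# print(somelib.__file__)  # should point into your project folder, not site-packages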
Another option is to use a virtual environment (virtualenv) and then install all the required libraries within it. I'm not sure this will work for you, since you said you cannot install libraries on this machine and virtualenv requires pip. If you are interested in learning more about Python virtual environments, take a look here.
Many modules are at least partly written in C, like Pygame for example; CPython itself is written in C. You can't jump to conclusions, but if the library is pure Python, I'd suggest copying the package into your project directory and importing it, rather than copying and pasting code snippets.

Difference between installing and importing modules

New to Python, so excuse my lack of specific technical jargon. Pretty simple question really, but I can't seem to grasp or understand the concept.
It seems that a lot of modules require using pip or easy_install and running setup.py to "install" into your Python installation or your virtualenv. What is the difference between installing a module and simply taking it and importing it into another script? It seems that you access the modules the same way.
Thanks!
It's like the difference between:
Uploading a photo to the internet
Linking the photo URL inside an HTML page
Installing puts the code somewhere Python expects those kinds of things to be, and the import statement says "go look there for something named X now, and make the data available to me for use".
For a single module, it usually doesn't make any difference. For complicated webs of modules, though, an installation program may do many things that wouldn't be immediately obvious. For example, it may also copy data files into locations where the new modules can find them, put executables (binary libraries, or DLLs on Windows, for example) where the new modules can find them, do different things depending on which version of Python you have, and so on.
If deploying a web of modules were always easy, nobody would have written setup programs to begin with ;-)
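One hedged way to see the difference for yourself is to ask Python where a module came from and which directories it searches:
import csv  # any module you can import will do
import sys

print(csv.__file__)  # the file the module was actually loaded from (the stdlib here)
for entry in sys.path:
    print(entry)     # the search order: script directory first, then installed locations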

Gather all Python modules used into one folder? [duplicate]

I don't think this has been asked before. I have a folder that has lots of different .py files. The script I've made only uses some of them, but some call others, and I don't know all the ones being used. Is there a program that will gather everything needed to make that script run into one folder?
Cheers!
Since Python is not a statically linked language, this task is rather challenging, especially if some of your code uses eval(...) or exec(...).
If your script is not very big, I would just move it out, make sure that your python.exe does not load modules from that directory, then run the script and add missing modules until it works.
If you have multiple scripts like this, this manual work is not really the way to go. But in that case, having lots of different .py files in a directory is not a good deployment technique either; you should think about packaging them into installable modules and installing them into your Python site-packages.
Still, you may use the snakefood package to find out the dependencies (it has already been discussed here). Again, it cannot be 100% accurate, but it should give you an easy start.
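If you prefer to stay in the standard library, modulefinder can give a similar (also imperfect) static picture; this sketch assumes the entry script is named main.py:
from modulefinder import ModuleFinder

finder = ModuleFinder()
finder.run_script("main.py")

for name, module in finder.modules.items():
    print(name, module.__file__)  # __file__ is None for built-in modules
if finder.badmodules:
    print("could not resolve:", sorted(finder.badmodules))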
You should be able to extract the needed information from a so-called call graph.
See, for example:
http://pycallgraph.slowchop.com/ or
http://blog.prashanthellina.com/2007/11/14/generating-call-graphs-for-understanding-and-refactoring-python-code/
Also, py2exe converts a Python script into an executable, and in the process it gathers all the modules used (note that py2exe itself targets Windows).
I'd try 2½ solutions, one elaborate and 1½ quick-and-dirty:
elaborate: a custom import hook, logging all imports (a sketch follows after this list)
quick and dirty, part a: use os.utime to set the *.py[co]? (re notation, not glob) files' access times to yesterday, then run the program and collect all recent access times. Prerequisite: a filesystem that records access times (by itself and by its mount options).
quick and dirty, part b: remove all *.py[co] files (same in glob and re notation), run the program, and see which ones have been created. Prerequisite: the user should have write access to the folder.
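A minimal sketch of the import-hook idea from the first bullet: wrap the built-in __import__ and record every module name requested (written for Python 3; on Python 2 use the __builtin__ module instead of builtins):
import builtins

imported_names = set()
_original_import = builtins.__import__

def logging_import(name, *args, **kwargs):
    imported_names.add(name)
    return _original_import(name, *args, **kwargs)

builtins.__import__ = logging_import

# ... import or run the script under inspection here, then:
# print(sorted(imported_names))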
