Hudson unable to navigate relative directories - python

I have a Python project building with Hudson. Most unit tests work correctly, but any tests that require writing to the file system (I have a class that uses tarfiles, for example) can't find the tmp directory I have set up for intermediate processing (my tearDown methods remove any files under the relative tmp directory).
Here is my project structure:
src
tests
    fixtures (static files here)
    unit (unit tests here)
tmp
Here is an example error:
OSError: [Errno 2] No such file or directory: '../../tmp'
I assume this is happening because Hudson is not processing the files while in the directory unit, but rather some other working directory.
What is Hudson's working directory? Can it be configured? Can relative paths work at all?

Each job in Hudson has its own working directory, at /path/to/hudson/jobs/[job name]/workspace/
For individual jobs, you can set the "Use custom workspace" option (under "Advanced Project Options") to define where the workspace will be.
I guess it would depend on how your tests are being run, but if you inspect the job's workspace you should be able to find where Hudson is writing the files to.

I don't know how you're initializing your workspace, but typically it's done by checking your project out of version control into the workspace. If this is true in your case, the easiest thing to do is to add your tmp directory to version control (say, with a README file in it, if your version control system doesn't support directories). Then, the tmp directory will get checked out into your workspace and things should work again.

I don't know anything about Hudson, but this is what I do to ensure that relative paths resolve correctly: change the working directory to the script's own location first.
import os
import sys

os.chdir(os.path.dirname(os.path.abspath(sys.argv[0])))
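In the same spirit, the test paths can be anchored to the test module itself instead of changing directory at all. A minimal sketch, assuming the tests/unit layout from the question (the base parameter is only there so the helper can be exercised outside that layout):

```python
import os

def tmp_dir(base=None):
    """Return an absolute path to the project tmp directory, anchored to this
    file's location (or an explicit base) rather than the runner's working
    directory, and create it if a fresh checkout doesn't include it."""
    here = base or os.path.dirname(os.path.abspath(__file__))
    path = os.path.abspath(os.path.join(here, '..', '..', 'tmp'))
    if not os.path.isdir(path):
        os.makedirs(path)
    return path
```

Because the path no longer depends on the current working directory, the tests behave the same under Hudson as they do locally.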

Related

How to bundle Python apps with asset dependencies for end users?

I have a Tkinter app that uses images included in the same folder as the .py file. pyinstaller script.py produces an executable that runs but does not open any windows. This is because it is looking for images that don't exist in the same subdirectory. When I copy the important images to the dist folder Pyinstaller creates, the application runs correctly.
However, I would like to have a single executable that I can share with other users that doesn't also require them to have the images stored. The images should be bundled with the software somehow, like how commercial software (usually) doesn't require you to download assets separately from the program itself.
Is there a way to bundle Python programs and the assets they use into single-click applications?
Note that I am using Python 3 on Linux Mint. I am also something of a novice, so don't be surprised if I'm missing something obvious here.
It appears I've solved my own problem.
Instead of having the images included in the same folder as main.py and using the resulting short relative filepath to reach them, install the images in an appropriate space in the system directory tree (I used /home/$USER$/.$PROGRAMNAME$/) and have the program access the files using the absolute path to that directory. This will allow you to copy the program anywhere on your computer you want and have it run without a problem.
However, if you want to share it with someone else, you'll need to also include an installation script that places the assets in the correct directory on their computer.
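An alternative to shipping an installation script is to let PyInstaller bundle the assets into the executable itself. The sketch below shows the commonly used resource-path helper pattern, assuming the images live in an images/ subfolder next to the script; when run unfrozen it falls back to the current directory:

```python
import os
import sys

def resource_path(relative_path):
    """Return the absolute path to a bundled asset.

    In a PyInstaller one-file executable, assets added with --add-data are
    unpacked to a temporary folder exposed as sys._MEIPASS; during normal
    development we fall back to the current directory.
    """
    try:
        base_path = sys._MEIPASS
    except AttributeError:
        base_path = os.path.abspath(".")
    return os.path.join(base_path, relative_path)

# Build on Linux (':' is the --add-data separator there; Windows uses ';'):
#   pyinstaller --onefile --add-data "images:images" script.py
# Then load assets via resource_path('images/logo.png').
```

With this approach the images travel inside the single executable, so nothing extra needs to be installed on other users' machines.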

Search through all files including dependencies

I have a folder opened in Visual Studio Code, and a file in that folder that I am debugging. I am trying to search for a specific function that has been imported from a different project. Apologies for the lack of proper terms; for a better explanation, see below:
- folder-with-my-projects
  - project1
    - file-I-am-debugging.py (function X is imported)
  - project2
    - file-containing-function-X.py
I want to be able to search for function X when I have project1 opened in the IDE. These are all Python projects, if that matters. Right now, the search only looks through files in the project1 folder and completely ignores everything else. For example, PyCharm's Find in Files does exactly what I need. Is there a way to get this functionality in Visual Studio Code?
Open VS Code in a directory that is a parent to both directories and search from there.
Alternatively, you should be able to use multi-root workspaces to add a folder outside your current directory into your project space, which should allow you to search it. https://code.visualstudio.com/docs/editor/multi-root-workspaces
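For the multi-root approach, a workspace file can be saved next to the projects. A minimal sketch, with folder names taken from the question's layout (the filename projects.code-workspace is just an example):

```json
{
    "folders": [
        { "path": "project1" },
        { "path": "project2" }
    ]
}
```

Opening that workspace file in VS Code makes both folders searchable from a single window.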

Is there a PyCharm-based workflow for sharing run/test configurations?

Chris and Pat collaborate on a Python project. The source code is under version control. They each choose their own checkout location on their respective drives. The collaborative application resides in <root>/app.py.
They both use PyCharm, and since the .idea folders in their checkouts represent some amount of invested effort, they are reluctant to leave them out of the repository.
Yet because the checkout locations are distinct, they cannot share one .idea folder. They have resolved the problem by storing their .idea folders in <root>/pycharm/chris/.idea and <root>/pycharm/pat/.idea. Both folders are checked into the repository. They can go home with the confidence that if their desktops' disks fail, the effort they spent on their PyCharm configuration is safe.
But something is missing. They write (pytest-based) tests for their own respective subsets of the code, and they would like to share those tests.
Is there a workflow, from within PyCharm or without, that enables them to work on a unified set of test configurations? I may be missing something basic. It may be the case, for example, that they ought to check out into exactly the same paths, use venvs located at exactly the same paths, and share the same .idea all the way. If you've found that that's the only sensible solution, arguing for it would also be an answer.
They should just have one .idea folder in the git repo and .gitignore the parts they want to keep unique to their systems.
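A minimal sketch of that .gitignore approach (the exact file list is a judgment call; workspace.xml and tasks.xml are the usual per-user state):

```
# .gitignore - share .idea, but keep per-user IDE state out of the repo
.idea/workspace.xml
.idea/tasks.xml
.idea/shelf/
```

Everything else under .idea (run configurations, code style, inspection profiles) then stays shared through version control.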
They can also mark a given Run Configuration as "sharable" by enabling "Share" option in it (top right corner of the configuration window). This way it will be stored in .idea/runConfigurations.
You can check pytest-related configuration into git. Pytest configuration can be stored in a pytest.ini file, which can be version controlled.
The pytest.ini file should live at the root of the project (above the test directories); pytest picks up all of the run configuration from that file at test time.
Example config for increasing pytest log verbosity:
# pytest.ini
[pytest]
addopts = -vv
Documentation here: https://docs.pytest.org/en/latest/reference.html#ini-options-ref

How to package a Pycharm python project for upload to AWS Lambda?

Using PyCharm on Windows.
I have created a zip file for upload to AWS Lambda the manual way:
1) Install the modules manually into a directory other than the default directory.
2) Create my .py code file
3) Zip the contents of the project folder
4) Upload that zip folder to Lambda
I am new to PyCharm, and in a project I see that there are a whole bunch of files and folders that I do not understand.
I tried to zip the entire Pycharm project contents and upload - that did not work. It looks like I need to run some kind of setup that creates the proper folder structure and files that have the correct content.
Any help would be appreciated.
For all those still stuck on this, I have a few suggestions which could resolve the issue altogether:
Use pip's -t option to specify the application directory
With pip's -t option, you can specify the installation directory. It's better than using PyCharm's package installer, because you control exactly where the packages land.
Zip the complete application directory (answers your question)
Go inside your PyCharm project directory -> select all -> right-click -> Send to -> Compressed (zipped) folder. This may result in the inclusion of some unneeded directories (__pycache__, .idea), but it should not affect program execution. If needed, you can skip those two directories while creating the zip.
I believe you were zipping the project directory itself, rather than compressing the contents of the project directory.
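If you'd rather script the packaging than use Explorer, here is a sketch using only the standard library (directory and package names are placeholders). It zips the contents of the project directory, not the directory itself, so the handler module lands at the archive root where Lambda expects it:

```python
import os
import zipfile

def zip_contents(src_dir, zip_path):
    """Zip the contents of src_dir (not the directory itself) so that
    the handler .py file and its dependencies sit at the archive root."""
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(src_dir):
            # Prune local clutter that Lambda doesn't need
            dirs[:] = [d for d in dirs if d not in ('__pycache__', '.idea')]
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, src_dir))
```

Install the dependencies into the same directory first, e.g. pip install -t my_lambda_dir requests, then call zip_contents('my_lambda_dir', 'deploy.zip') and upload deploy.zip.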
As I also answered here, JetBrains now offers the AWS Toolkit, which allows local and remote development of Lambda functions.
Despite some lingering issues, it works quite well. I'm still finding my way with it.
It includes packaging and deploying.
Toolkit page on Jetbrains website

Python Project Structure for Esky

My question is essentially, "How should I structure the files and folders of my frozen, deployed Python-based Windows application." To understand my situation, here's some background:
I'm building a desktop application with Python 2.7 for my workplace. It is a GUI-based application built on PyQt. I am building the application with Esky which is a cross-platform freezing and updating framework. Esky basically wraps/calls py2exe, py2app, bb_freeze, or whatever tool you have installed that is appropriate for the current platform. Esky creates a zipped package that looks like this:
prog.exe - esky bootstrapping executable
appdata/ - container for all the esky magic
    appname-X.Y.platform/ - specific version of the application
        prog.exe - executable(s) as produced by freezer module
        library.zip - pure-python frozen modules
        pythonXY.dll - python DLL
        esky-files/ - esky control files
            bootstrap/ - files not yet moved into bootstrapping env
            bootstrap-manifest.txt - list of files expected in bootstrap env
            lockfile.txt - lockfile to block removal of in-use versions
        ...other deps...
    updates/ - work area for fetching/unpacking updates
These zipped packages can then be placed on a file server which Esky looks to for updates. A number of methods are provided for managing updates including a very simple auto_update(). When an update occurs, the appname-X.Y.platform folder is essentially replaced with the next version folder... so the myApp.0.1.win32 folder is replaced by a myApp.0.2.win32 folder.
The other aspect of background you should know is that I am distributing the application to my coworkers, who do not have Python installed. I'm not distributing a Python package or library, I'm deploying a desktop application (my coworkers don't particularly care what it's written in, just that it works). I've built an Inno installer which installs the application, provides an uninstaller, and various shortcuts. Because everyone on the team has essentially the same Windows 7 64-bit environment, I'm pretty safe building for just that platform.
So, back to the issue of structure. I've read guides that recommend a certain format for a project skeleton, such as Learn Python the Hard Way, Exercise 46 or the Hitchhiker's Guide to Packaging. However these guides are oriented toward Python package developers, not compiled application developers.
I've also run into problems with Esky's appname-X.Y.platform folder, since it changes names every time the program is updated (to reflect the version number). Because I want some shortcuts in the Start Menu to always refer to documentation, changelog, etc, I have the installer place some of those files under the appdata folder. When the program updates, I have some code to check for newer versions of those files I want to be externally "visible" and copy the newer versions out of the appname-X.Y.platform folder and overwrite the copies in the appdata folder. I then also needed a means of storing persistent user settings, so the program generates and uses an appdata\settings folder (otherwise the settings would be wiped with each update).
Should I continue the process of having the application push new files out to the appdata folder post-update? Should I build my own structure of Docs, Examples, Settings, etc., and let the program populate those folders with newer files whenever necessary? Should I attempt to alter or take better advantage of Esky's behavior to better fit my usage? Perhaps I should rework my application to be distributable as both a Python package and an end-user application?
This question relates to this one about static files with Esky, this one about Python deployed application structure, and numerous generic questions about Python project structure which don't specifically address using Esky. Some videos discussing Esky are also available here and here.
I'm seeking recommendations for "best practice" methods to handle these challenges. If this doesn't fit the StackOverflow Question format, I'll gladly attempt to reword or narrow the focus of my question.
So here's my solution to the problem mentioned above about making files available to shortcuts at a static location, despite the fact that Esky's auto-updating changes the name of my application folder with every update. The function below sits within a class definition for a QMainWindow.
Logging statements could be replaced with print statements if your application doesn't use the logging module, though I highly recommend logging, especially if deploying a standalone application like this.
import os
import sys
import time
import shutil
import logging

def push_updated_files(self):
    """
    Manually push auto-updated files from the application folder up to the appdata folder.

    This enables shortcuts and other features on the computer to point to these files, since
    the application directory name changes with each update.
    """
    logger = logging.getLogger(__name__)
    #Verify whether we are running as a script or as a frozen/deployed application
    if getattr(sys, 'frozen', False):
        logger.info("Verifying Application Integrity...")
        #Files which should be copied to the appdata directory so shortcuts etc. can reference them
        data_files = ['main.ico',
                      'uninstall.ico',
                      'ReadMe.txt',
                      'changelog.txt',
                      'WhatsNew.txt',
                      'copyright.txt',
                      'Documentation.pdf']
        #Application top directory (the versioned appname-X.Y.platform folder)
        logger.debug("  App Path: {0}".format(self._app_path))
        #External appdata directory
        logger.debug("  AppData Directory: {0}".format(self._appdata_path))
        for a_file in data_files:
            #Internal file path (inside the versioned application folder)
            int_path = os.path.join(self._app_path, a_file)
            logger.debug("  Internal File Path: {0}".format(int_path))
            #Get the internal file's modification time
            mtime_int = os.stat(int_path).st_mtime
            logger.debug("  Internal File Modified Time: {0}".format(time.ctime(mtime_int)))
            #External file path (in the appdata folder)
            ext_path = os.path.join(self._appdata_path, a_file)
            if os.path.exists(ext_path):
                mtime_ext = os.stat(ext_path).st_mtime
                logger.debug("  External File Modified Time: {0}".format(time.ctime(mtime_ext)))
                if mtime_int > mtime_ext:
                    logger.debug("  Replacing external file with new file...")
                    try:
                        os.remove(ext_path)
                        shutil.copy(int_path, ext_path)
                    except Exception:
                        logger.error("  Failed to replace the external file...", exc_info=True)
                else:
                    logger.debug("  External file is newer than internal file - all is well.")
            else:
                logger.debug("  Copying file to appdata to be externally accessible")
                shutil.copy(int_path, ext_path)
Also related to this, when dealing with user settings (which currently is only a history.txt file used to populate a recent files list) I have a settings folder under appdata but outside the application folder so that settings aren't lost each update. I may make similar folders for documentation and icons.
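That settings location can be derived once at startup. A small sketch (function and folder names are illustrative, with appdata_path standing in for the answer's self._appdata_path):

```python
import os

def settings_dir(appdata_path):
    """Return (and create if needed) a per-user settings folder that lives
    directly under appdata, outside the versioned appname-X.Y.platform
    folder, so Esky updates don't wipe it."""
    path = os.path.join(appdata_path, 'settings')
    if not os.path.isdir(path):
        os.makedirs(path)
    return path
```

Files like history.txt then survive every update because the versioned application folder is the only thing Esky replaces.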
