How to package SC instrument for beta testers? - python

I've built a sample instrument using the following architecture:
A Python script reads sample files from a Redis database stored on disk and sends OSC messages to SuperCollider with the path and pitch of a random selection of N samples. On the SC side, key presses from a MIDI interface are mapped to select and play one or more of the corresponding samples.
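For reference, the Python side is along these lines (a simplified sketch, not my exact script, assuming python-osc and redis-py; the key names and OSC address are placeholders):

import random
import redis
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical layout: sample metadata stored as Redis hashes under "sample:*" keys.
r = redis.Redis(host="localhost", port=6379)
client = SimpleUDPClient("127.0.0.1", 57120)  # sclang listens on 57120 by default

N = 8
keys = random.sample(r.keys("sample:*"), N)
for key in keys:
    meta = r.hgetall(key)
    # Send path and pitch to SuperCollider on an assumed OSC address.
    client.send_message("/loadSample", [meta[b"path"].decode(), float(meta[b"pitch"])])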
The prototype is functional, and I would like to release a beta for testers; however, I have no clue how to package it. One option that seems plausible is to wrap it as a VST, but as far as I understand there is no stable VST wrapper for SC, and the safest bet would be to re-code the entire instrument as a VST.
Another option, which seems more viable, would be to wrap it as a standalone instrument. Would the beta testers need to have SC installed, or is there a way to wrap an SC server inside an executable?
Any ideas on this issue, even if they diverge from my original approach, will be highly appreciated.

Fortunately there are lots of options for this in SuperCollider. You may want to start by reviewing this article from the documentation, in which Making Standalone Applications is discussed rather thoroughly.
Alternatively, there are some pre-built standalones floating around, frequently on GitHub. I often use this repository to package up an installation or instrument and deploy it on a Raspberry Pi.

I'm not super familiar with VST or SuperCollider, but you could try something like Docker, a container-based solution that might meet your needs.
You set up a Dockerfile, which lets you provide instructions for building a container with the SC server. Then let the person using it decide whether they want a Redis instance inside the same Docker container or a separate Redis container.

Related

Utility to manage multiple python scripts

I saw this post on Medium, and wondered how one might go about managing multiple python scripts.
How I Hacked Amazon's Wifi Button
This describes a system where you need to run one or more scripts continuously to catch and react to events in your network.
My question: let's say I had multiple Python scripts that I wanted to run while I work on other things. What approaches are available to manage these scripts? I have to imagine there is a better way than having a large number of terminal windows, each running a script individually.
I am coming back to Python and have no formal training in computer programming, so any guidance you can provide will be greatly appreciated.
Let's say I had multiple python scripts that I wanted to run. What approaches are available to manage these scripts? I have to imagine there is a better way than having a large number of terminal windows running each script individually.
If you have several .py files in a directory that you want to run, without any specific order, you can do:
import glob

py_files = glob.glob('path/*.py')
for py_file in py_files:
    # execfile() was Python 2 only; in Python 3, read and exec() each script instead
    with open(py_file) as f:
        exec(f.read())
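If the scripts should instead run concurrently rather than one after another in the same interpreter, a sketch along these lines (same glob pattern, each script launched as its own process) may be closer to what you want:

import glob
import subprocess

# Start every script as a separate background process.
procs = [subprocess.Popen(['python', py_file]) for py_file in glob.glob('path/*.py')]

# Optionally block until they have all finished.
for proc in procs:
    proc.wait()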
Your system already runs a large number of background processes, with output to the system log or occasionally to a service-specific log file.
A common arrangement for quick and dirty deployments -- where you don't necessarily want to invest in making the scripts robust and well-behaved enough to run as proper services -- is to start the script inside screen or tmux. You can detach when you don't need to be looking at it, and can reattach at any time -- even from a remote login -- to view the output, or to troubleshoot.
Take a look at luigi (I've not used it).
https://github.com/spotify/luigi
These days (five years after the question was asked) a lot of people use Docker Compose, but that's a little heavyweight depending on what you want to do.
I just came across bugy's script server today. Maybe it could be a solution for you or somebody else.
(I am just trying to find a Tampermonkey-style script structure for Python.)

Python program on server - control via browser

I have to set up a program which reads some parameters from a widget/GUI, calculates some stuff based on database values and the input, and finally sends some ASCII files via FTP to remote servers.
In general, I would suggest a Python program for these tasks: write a Qt widget as a GUI (interactively changing views, putting numbers into tables, setting up check boxes, switching between various layers; I have never done something this complex in Python, but I have some experience in IDL with event handling etc.), and set up data classes that have functions both to create the ASCII files with the given convention and to send the files via FTP to some remote server.
However, since my company is a bunch of Windows users, each sitting at their personal desktop, installing Python and all the necessary libraries on each individual machine would be a pain in the ass.
In addition, in a future version the program is supposed to become smart and do some optimization 24/7. Therefore it makes sense to put it on a server. As I personally prefer Linux, the server is already set up with Ubuntu Server.
The idea is now to run my application on the server. But how can the users access and control the program?
The easiest way for everybody to access something like a common control panel would be a browser, I guess. I have to make sure only one person at a time is sending signals to the same units, but that should be doable via flags in the database.
After some googling, next to QtWebKit, Django seems to be the first choice for such a task. But...
Can I run a full-fledged Python program underneath my web application? Is Django the right tool for that?
As mentioned previously, in the intermediate future (~1 year) we might have to implement some computationally expensive tasks. Would it then also be possible to call into C, as one would from a normal Python program?
Another question I have is about development. In order to become productive, we have to advance in small steps. Can I first create regular Python classes which can later be imported into my web application? (Does the same apply to Qt widgets?)
Finally: Is there a better way to go? Any standards, any references?
Django is a good candidate for the website, however:
It is not a good idea to run heavy functionality from a website; it should happen in a separate process.
All functions should be asynchronous, i.e. you should never wait for something to complete.
I would personally recommend writing a separate process with a message queue; the website would only ask that process for status and always display a result immediately to the user.
You can use ajax so that the browser will always have the latest result.
ZeroMQ or Celery are useful for implementing the functionality.
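As a minimal sketch of that pattern, assuming Celery with a Redis broker and hypothetical task and view names (not a drop-in implementation):

# tasks.py
from celery import Celery

app = Celery('jobs', broker='redis://localhost:6379/0', backend='redis://localhost:6379/0')

@app.task
def build_and_upload(params):
    # Heavy work happens here: read the database, build the ASCII files, send them via FTP.
    ...
    return 'done'

# views.py (Django)
from celery.result import AsyncResult
from django.http import JsonResponse
from tasks import build_and_upload

def start_job(request):
    # Kick off the work and return immediately with a task id the browser can poll.
    result = build_and_upload.delay(dict(request.GET.items()))
    return JsonResponse({'task_id': result.id})

def job_status(request, task_id):
    # The AJAX side polls this view until the state reaches SUCCESS or FAILURE.
    return JsonResponse({'state': AsyncResult(task_id).state})

The website process never does the heavy lifting itself; it only enqueues work and reports state, which is exactly the separation described above.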
You can implement functionality in C pretty easily. I recommend, however, that you write that functionality as pure C with a SWIG wrapper rather than as an extension module for Python. That way the functionality will be portable and not dependent on the Python website.

Using python to control a phone with bluetooth

I would like to know if there are any APIs for Python to programmatically control a phone, such as starting and ending calls, but also recording conversations.
I would also like to use the computer's headphones and mic to talk over the phone.
Any info would be great; I tried googling for something, but nothing useful came up.
Be careful when using PyBluez! The results will actually depend on the BT-USB dongle you are using. Depending on the hardware (the BT chip inside), PyBluez will use one or another BT stack; for example, there was one from WIDCOMM. Results will vary, as PyBluez is actually wrapping those stacks, all of which are far from complete.
So, when you have a working project, be sure to know which BT stack you were actually using :)
For Python audio stuff, you could try this.
PyBluez is an effort to create Python wrappers around system Bluetooth resources to allow Python developers to easily and quickly create Bluetooth applications.
Unfortunately I haven't found a page dedicated to its features, but it could be a good starting point: either everything you need is already in its feature set, or you could build your application on top of it by extending it.
http://code.google.com/p/pybluez/
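As a quick sketch of what working with PyBluez looks like (device discovery plus an RFCOMM connection; the address, channel, and AT command below are placeholders, not values for any particular phone):

import bluetooth

# Scan for nearby devices and print their addresses and names.
nearby = bluetooth.discover_devices(duration=8, lookup_names=True)
for addr, name in nearby:
    print(addr, name)

# Open an RFCOMM connection to a chosen device (placeholder address and channel).
sock = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
sock.connect(("00:11:22:33:44:55", 1))
sock.send(b"AT\r")        # e.g. a raw AT command, if the phone exposes a serial profile
print(sock.recv(1024))
sock.close()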

Testing Apache/mod_jk/Tomcat configuration upgrade

We have begun upgrading hardware and software to a 64-bit architecture using Apache with mod_jk and four Tomcat servers (the new hardware). We need to be able to test this equipment with a large number of simultaneous connections while still actually doing things in the app (logging in, etc.)
I am currently using Python with the Mechanize library to do this, but it's just not cutting it. Threading is not "real" in Python (because of the GIL), and multiprocessing makes the local box work harder than the machines we are trying to test, since it has to load so much into memory for Mechanize.
The bottom line is that I need something that will really hammer this thing's connections and hold a session to make sure that the sticky sessions are working in mod_jk. I need to be able to code it quickly, it needs to be lightweight, and being able to do true multithreading would be a perk. Other than that, I am open-minded.
Any input will be greatly appreciated. Thanks.
Open Source Testing Tools
Not knowing the full requirements makes it difficult; however, something from the list might fit the bill.
In order to accomplish what I wanted, I just went back to basics. Mechanize is somewhat bulky, and there was a lot of bloat involved in the main functionality tests I had before. So I started with a clean slate and just used cookielib.CookieJar and urllib2 to build a linear test, then ran it in a while 1 loop. This provided enough strain on the Apache system to see how it would react in the new environment, and for the record, it did VERY well.
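A rough sketch of that approach; cookielib and urllib2 are the Python 2 names (http.cookiejar and urllib.request in Python 3, used here), and the URLs and form fields are placeholders for the app's real login and workflow endpoints:

import http.cookiejar      # cookielib in Python 2
import urllib.parse
import urllib.request      # urllib2 in Python 2

LOGIN_URL = "http://target.example/app/login"
PAGE_URL = "http://target.example/app/some/page"

jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

# Log in once so the JSESSIONID cookie is set and mod_jk pins the session to one Tomcat.
login_data = urllib.parse.urlencode({"user": "test", "password": "test"}).encode()
opener.open(LOGIN_URL, login_data)

# Hammer the app in a tight loop, reusing the same (sticky) session.
while 1:
    opener.open(PAGE_URL).read()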

Accessing a Pantone Huey via Python

I have a Pantone Huey, a monitor calibration probe (a device you attach to the monitor that gives you colour readings), and I want to get readings from the device in Python.
Having never written such a device driver before, I'm not sure where to start.
I've found two open-source C/C++ projects that interface with the Huey: ArgyllCMS and mcalib.
ArgyllCMS comes with a spotread command which returns readings from the device, although it only functions as an interactive command line tool, so running it via subprocess will not (easily) work.
The code ArgyllCMS uses to communicate with the device is in spectro/huey.c
I haven't tried it (I only just found it while writing this question), but mcalib contains much less code, mainly just heuy.cpp. However, it has a worrying number of FIXME comments and incomplete methods, and the code appears to have been automatically generated (unhelpful variable names).
There seem to be three options:
Modify spotread to work without any interactive prompts, call it via subprocess
Create a C-based Python module around huey.c or huey.cpp
Re-implement the interface using something like PyUSB
Being much more familiar with Python, I'm tempted to use PyUSB, but will this be substantially more work than wrapping existing code with the Python C API? Is there anything obvious in either of the C implementations that will not be easily doable in PyUSB?
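To make the question concrete, opening the device with PyUSB would look roughly like the sketch below; the vendor/product IDs are placeholders (the real ones would come from lsusb or huey.c), and the Huey's actual command protocol would still have to be reimplemented on top of this:

import usb.core

# Placeholder vendor/product IDs, not the Huey's real ones.
dev = usb.core.find(idVendor=0x0000, idProduct=0x0000)
if dev is None:
    raise RuntimeError("device not found")

# Detach any kernel driver and claim the default configuration before talking to it.
if dev.is_kernel_driver_active(0):
    dev.detach_kernel_driver(0)
dev.set_configuration()

# From here on, the control/interrupt transfers implemented in spectro/huey.c
# have to be replicated by hand, which is where most of the work would be.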
Given the existence of spotread, the easiest (though perhaps not the best) way to proceed would be to use pexpect. It allows you to interact with other command-line programs.
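A rough sketch of that approach; the prompt and result strings spotread prints are assumptions here and would need to be matched against the real ArgyllCMS output:

import pexpect

# Start spotread and drive its interactive prompts as text.
child = pexpect.spawn("spotread", encoding="utf-8")

# Wait for the "hit a key to take a reading" style prompt (exact wording assumed).
child.expect("key to take a reading")
child.send(" ")                      # trigger a reading

# Wait for the line containing the result (exact wording assumed), then grab it.
child.expect("Result is")
print(child.after + child.readline())
child.sendline("q")                  # quit spotread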
