Hi folks.
I am writing a Python program on a Raspberry Pi 2 that has I2C modules attached. But writing code directly on the Raspberry Pi is frustrating, because the Pi is very slow and I cannot use my favorite editor, Sublime Text 2. If I could emulate I2C on my MacBook Air or Ubuntu laptop, I could write code faster and more efficiently.
Could you kindly suggest a way to do this?
What you really want is a way to deploy to the Raspberry Pi so you can develop locally. There are a number of different solutions (Git push/pull, scp, ftp, etc.). You should look into the FabricLink API, which lets you seamlessly add deployment to your cycle.
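Whatever tool you choose, the core of the deployment step is just copying the project over to the Pi. A minimal sketch using rsync over ssh via Python's subprocess module is below; the hostname and paths are placeholders, not values from the question:

```python
import subprocess

def build_deploy_cmd(local_dir, pi_host, remote_dir):
    # rsync over ssh: copy the project to the Pi, skipping bytecode caches
    return ["rsync", "-avz", "--exclude", "__pycache__",
            local_dir + "/", f"{pi_host}:{remote_dir}/"]

def deploy(local_dir, pi_host, remote_dir):
    # e.g. deploy("myproj", "pi@raspberrypi.local", "/home/pi/myproj")
    subprocess.run(build_deploy_cmd(local_dir, pi_host, remote_dir), check=True)
```

A small wrapper like this can be bound to an editor shortcut or run from a Makefile, so editing locally and running on the Pi becomes one step.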
Related
I am currently developing an automation project in Python using VSCode, to be run on a Raspberry Pi. I am interfacing sensors for data collection, which requires Raspberry Pi libraries and extensions, including some RPi-dedicated libraries. Because of that, I would need to develop the program on the RPi itself if I want to debug it.
I am fairly new to VSCode and RPi automation projects in general. Are there tools or extensions that would let me debug my Python code without running it on the RPi, i.e. run my scripts locally on my computer before deploying to the actual hardware? I thought of just commenting out the RPi-dedicated parts of my code, which I find tedious, so I wonder if there is a VSCode extension that could directly interpret my RPi-dedicated code so I could debug it as-is.
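Rather than commenting out the hardware-dependent parts, one common pattern is to substitute a stand-in object when the Pi-only library is missing, so the rest of the script runs unchanged on a development machine. A rough sketch (assuming the RPi.GPIO library; pin numbers are just examples):

```python
try:
    import RPi.GPIO as GPIO  # the real library, available on the Pi
except ImportError:
    # On a development machine, fall back to a mock that silently
    # accepts any call, so the surrounding logic can be debugged.
    from unittest import mock
    GPIO = mock.MagicMock()

# The rest of the code runs unchanged on both machines.
GPIO.setmode(GPIO.BCM)
GPIO.setup(18, GPIO.OUT)
```

On the Pi the real library is imported; locally the mock swallows the hardware calls, which is usually enough to step through the data-collection logic in a debugger.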
I'm working on an IoT project where multiple users are running a Python package I maintain on a Raspberry Pi Zero. I send the Raspberry Pis out to the users with the software preloaded, but the project is still pretty early in development and updates to the package happen frequently.
The problem is, many of the users are not up to the task of updating a Python package on a Raspberry Pi with a headless OS. I'd like to find a way to set the Pi to automatically upgrade the package with pip whenever I put out a new tagged version.
My initial thought was to use cron or systemd to run "sudo pip3 install my-package --upgrade" on startup. The major downside, though, is that pip takes a long time to run on a Raspberry Pi and using it this way seriously slows down boot time, even when there is no upgrade to install.
Is there a better way I haven't thought of?
You can set up an API or a mounted server that the Raspberry Pi queries to see whether there is a new version available (you have to record which version is currently installed); if there is, apply the update.
It is not a Pythonic solution at all, but it will work for your purpose.
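A way to avoid running pip on every boot is to ask the package index for the latest published version first, and only invoke pip when it differs from what is installed. A rough sketch, assuming the package is on PyPI and uses plain X.Y.Z version numbers (the package name here is a placeholder):

```python
import json
import subprocess
import urllib.request
from importlib import metadata

PACKAGE = "my-package"  # placeholder for your package name

def latest_version(package):
    # Ask PyPI's JSON API for the newest released version.
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["info"]["version"]

def needs_upgrade(installed, latest):
    # Naive numeric comparison; real code should use packaging.version
    # to handle pre-releases and non-numeric segments.
    return tuple(map(int, latest.split("."))) > tuple(map(int, installed.split(".")))

def upgrade_if_newer():
    installed = metadata.version(PACKAGE)
    if needs_upgrade(installed, latest_version(PACKAGE)):
        subprocess.run(["pip3", "install", "--upgrade", PACKAGE], check=True)
```

The version check is a single small HTTP request, so running it from cron or a systemd timer costs almost nothing on the boots where there is no update.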
Is it possible to open files from a Raspberry Pi in Windows for editing (using, for example, Notepad++)?
I am currently using the built-in Python IDE in Raspbian, but I feel it would speed up the development process if I could use a Windows IDE. I have also tried using a git repo to share files between the Pi and Windows, but that is a bit cumbersome too.
Or does anyone have any other ideas about a workflow between Windows and the Raspberry Pi?
You can run a Samba server on your Raspberry Pi and share your Python project folder as a network disk. Then you can use any Windows IDE you like; just open the files on the network disk.
Currently I am using VS2015 + Python Tools for Visual Studio for remote debugging purposes.
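For reference, a share like that is a short section in /etc/samba/smb.conf; a minimal sketch (the share name, path, and user are examples, not anything from the question):

```ini
[pyproject]
   path = /home/pi/projects
   browseable = yes
   writeable = yes
   valid users = pi
```

After editing the config, restart the service with `sudo systemctl restart smbd` and the folder should appear as a network location from Windows.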
Sure. I have tried many approaches, and one of the best I found is WinSCP.
It makes it very easy to edit and update files with Notepad++ right from Windows.
Why not just set up a VM on your Windows machine running Raspbian? Something like this will get you started: http://www.makeuseof.com/tag/emulate-raspberry-pi-pc/
Otherwise, set up a network share between the two, edit files on your Windows computer, and run from the Pi.
At the moment I am running Python scripts on the Pi, triggered by another voice-recognition Python script. I now also want to run these scripts over the internet. From a little research, one way could be to set up a small web server on the Pi, such as lighttpd, with a database on it, and then write another small script that periodically checks a value in the database. That value can be modified over the internet; depending on the value, I would either use the voice-recognition script or use the other values in the database to run the Python scripts.
My question is: is this method efficient, or is there a simpler way to do this? I am fairly competent with Python, but I am totally new to web servers and databases. However, I don't mind spending time learning how to use them.
Thanks in advance!
One route that I personally chose was to configure the Pi as a LAMP stack (Linux, Apache, MySQL, Python). Some great instructions can be found here: http://www.wikihow.com/Make-a-Raspberry-Pi-Web-Server
If this is overkill, have you considered using cron jobs to automate your Python scripts? You could then set the times at which your two scripts run, and with a little inter-process communication you have two entities that are aware of each other. http://www.thesitewizard.com/general/set-cron-job.shtml
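The "periodically check a value in the database" part of the question is only a few lines with the sqlite3 module from the standard library. A minimal sketch, assuming the web server writes a single control value into an SQLite file (the database path, table, and "voice" value are made up for illustration):

```python
import sqlite3
import time

DB_PATH = "commands.db"  # hypothetical; the web server writes to the same file

def read_command(db_path=DB_PATH):
    # Fetch the single control value that the web front end updates.
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS control (command TEXT)")
        row = conn.execute("SELECT command FROM control LIMIT 1").fetchone()
    return row[0] if row else None

def poll_loop():
    # Check the database every few seconds and dispatch accordingly.
    while True:
        cmd = read_command()
        if cmd == "voice":
            pass  # launch the voice-recognition script here
        time.sleep(5)
```

SQLite avoids setting up MySQL entirely, which keeps the learning curve small for a first web-controlled project; the web server only needs a tiny page that updates that one row.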
I am writing a piece of code that uses the Box.com Python SDK. The SDK uses the requests module to communicate with Box.com as per the API documentation. For my purposes, I need to make several GET and POST requests in a row, some of which could be used to transfer files. The issue that I'm running into is this:
On Linux (Ubuntu 13.10), each request takes a relatively long time (5 to 15 seconds) to get through, though transfer speeds for file transfers are as expected in the context of my network connection.
On Windows 8.1, running the exact same code, the requests go through really fast (sub-second fast).
On both platforms I am using the same version of iPython (1.1.0) and of the requests module (1.2.3) under Python 2.7. This is particularly problematic for me because the code I'm working on will eventually be implemented on Linux machines.
Has anyone encountered this problem before? I would love to hear from anyone with ideas on what the issue might be. I have yet to try it on a different Linux installation to see whether it is a problem with this specific setup.
Thanks.
EDIT 1
So, I decided to check this using virtual machines. Running the same Debian virtual machine under Windows, all the responses were fast, but under Ubuntu they were slow. I then made an Ubuntu 12.04 live USB and ran the code on that, and the responses were fast there as well.
So it's not Python or Linux in general; it's my particular installation, and I have no idea how to diagnose the problem :(
Use a tool such as Wireshark (which needs to be run with sudo on most distributions) to log the individual network packets while your code makes the API requests, and determine what is taking so long.
My guess is that the following possibilities are most likely:
- For some reason your Ubuntu installation is picking up the wrong DNS server list, and DNS lookups are timing out.
- An IPv6 issue (which may appear to be a DNS issue, too). Try disabling IPv6.
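The DNS guess can be checked without Wireshark by timing name resolution on its own; if resolution alone takes several seconds, the delay is DNS (or IPv6 fallback) rather than the API. A small sketch (the hostname is just an example, substitute the API host your code talks to):

```python
import socket
import time

def time_dns(host="api.box.com"):
    # Measure how long name resolution alone takes. A multi-second
    # result points at DNS rather than the HTTP request itself.
    start = time.monotonic()
    try:
        socket.getaddrinfo(host, 443)
    except socket.gaierror:
        return None  # resolution failed outright
    return time.monotonic() - start
```

Comparing this number on the slow Ubuntu install versus the fast live USB would confirm or rule out the first possibility in a few seconds.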