Using BBC MicroBit accelerometer with Python

Posted on Tue 26 January 2016 in Development • Tagged with bbc, microbit, Python, howto

These days I'm having a bit of fun with the BBC MicroBit board and learning how to use the different sensors available. The latest one I wanted to try was the accelerometer. The board can "sense" movement along the three dimensional axes: X, Y and Z. According to the documentation there are four methods available to read these values: microbit.accelerometer.get_values() returns a tuple with all three values, while microbit.accelerometer.get_x(), microbit.accelerometer.get_y() and microbit.accelerometer.get_z() return the individual values.

The documentation on the official website doesn't explain much; for example, I didn't even know the range of the values these methods can return (by the way, it's between -1024 and 1024), so I decided to play with the code directly and write a very simple example. The small example I wrote shows a smile on the board display if you keep it straight and a sad face if you bend it.

This is the result:

and this is all the code the application needs:
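
A minimal sketch of such a program could look like the following (the choice of the Y axis and the 300 threshold are assumptions and may differ from the original snippet):

from microbit import accelerometer, display, sleep, Image

while True:
    # get_y() reads roughly 0 when the board lies flat and approaches
    # +/-1024 as it is bent towards the vertical on the Y axis
    y = accelerometer.get_y()
    if abs(y) < 300:
        display.show(Image.HAPPY)
    else:
        display.show(Image.SAD)
    sleep(100)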

In the coming days I will try to play with more sensors and publish further examples.


How to write a custom Django Middleware

Posted on Sun 23 August 2015 in Development • Tagged with Django, HowTo, middleware, Python, tutorial

To understand how a Django middleware works we need to remember that the basic architecture of Django is built around a request and a response. A middleware is something that sits in the middle. Let's take a look at the following diagram, taken from the official Django documentation:

(middleware diagram from the official Django documentation)

Important things to know

There are four important things to know about middlewares:

  • You need to write a class that simply inherits from object
  • The order in which you place your middleware in settings.py matters: middlewares are processed from top to bottom during a request and from bottom to top during a response (see the settings sketch after this list)
  • You don't need to implement all the available methods of a middleware. For example, you can implement only process_request and process_template_response
  • If you implement process_request and decide to return an HttpResponse, all the other middlewares, views, etc. will be skipped and only your response will be returned
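
As a sketch of the ordering point above, here is roughly how the middleware list can look in settings.py (this uses the MIDDLEWARE_CLASSES setting from the Django versions of that time; the module path of the custom middleware is a hypothetical example):

# settings.py (excerpt)
MIDDLEWARE_CLASSES = (
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    # our middleware goes last: its process_request runs after the ones above
    # during the request, and its hooks run before them during the response
    'myproject.middleware.BenchmarkMiddleware',
)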

Writing a middleware

In my example I wanted to implement a feature that saves the time when a request comes in and the time when the request has been processed, calculates the time delta, and exposes this value in the context so that it is accessible from our templates. How can we implement such a feature using a middleware? Here is my example:

from datetime import datetime


class BenchmarkMiddleware(object):
    def process_request(self, request):
        # save the time at which the request came in
        request._request_time = datetime.now()

    def process_template_response(self, request, response):
        # calculate how long the request took and expose it in the template context
        response_time = datetime.now() - request._request_time
        response.context_data['response_time'] = response_time
        return response

Please don't worry about how I calculated the time: I'm aware that there are better ways to do it, but I wanted to keep things simple and just show how to implement a basic middleware.

If you want to see a complete example of a project that includes and uses this middleware, here you can find the complete source code: https://github.com/andreagrandi/benchmark-middleware-example



Automatically pull updated Docker images and restart containers with docker-puller

Posted on Sat 25 October 2014 in HowTo • Tagged with containers, docker, docker.io, Flask, Python, howto, Linux

If you use docker.io (or any similar service) to build your Docker containers, you may want your Docker host to automatically pull the new image and restart the container once the image has been built.

Docker.io gives you the possibility to set a web hook after a successful build: it performs a POST to a defined URL, sending some information in JSON format.

docker-puller listens for these web hooks and can be configured to run a particular script when a specific hook is received. It's a very simple service I wrote using Python/Flask. It's also my first Flask application, so if you want to improve it, feel free to send me a pull request on GitHub.

Note: this is not the only existing service able to do this task. I took inspiration from this article http://nathanleclaire.com/blog/2014/08/17/automagical-deploys-from-docker-hub/ and I really tried to customise https://github.com/cpuguy83/dockerhub-webhook-listener for my own needs, but the problem is that dockerhub-webhook-listener is not ready to be used as is (you have to customise it) and I'm not good enough with Golang (yet) to do that in little time. This is why I rewrote the service in Python (which is my daily language). I want to thank Brian Goff for the idea and all the people in #docker @ FreeNode for the support.

How to use docker-puller

Setting up the service should be quite easy. After you clone the repository from https://github.com/glowdigitalmedia/docker-puller, there is a config.json file where you define the host, the port, a token and the list of hooks you want to react to. For example:

{
    "host": "localhost",
    "port": 8000,
    "token": "abc123",
    "hooks": {
        "hello": "scripts/hello.sh"
    }
}

Create a bash script (in this case it's called hello.sh), put it under the scripts folder, and write the instructions to be executed to pull the new image and restart the container, for example:

docker pull andreagrandi/test:latest
docker stop test
docker rm test
docker run --name test -d -p 8000:80 andreagrandi/test:latest

Once configured, I suggest setting up an Nginx entry (instructions not covered here) that, for example, redirects yourhost.com/dockerpuller to localhost:8000 (I would advise enabling SSL too, otherwise people could sniff your token). The service can be started with "python app.py" (or you can set up a Supervisor script).

At this point docker-puller is up and running. Go to docker.io automatic build settings and setup a webhook like this: http://yourhost.com/dockerpuller?token=abc123&hook=hello

Every time docker.io finishes building and pushing your image to the Docker registry, it will POST to that URL. docker-puller will catch the POST, check for a valid token, get the hook name and execute the related script.
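
This is not the actual docker-puller code, but a minimal Flask sketch of the same idea (the route name, the hard-coded configuration and the way the script is invoked are assumptions made for illustration):

from subprocess import call
from flask import Flask, request, abort

app = Flask(__name__)

# in docker-puller these values come from config.json; hard-coded here for brevity
TOKEN = "abc123"
HOOKS = {"hello": "scripts/hello.sh"}

@app.route("/dockerpuller", methods=["POST"])
def handle_hook():
    # reject requests that don't carry the expected token
    if request.args.get("token") != TOKEN:
        abort(401)
    # look up the script associated with the requested hook and run it
    script = HOOKS.get(request.args.get("hook"))
    if script is None:
        abort(404)
    call(["/bin/bash", script])
    return "OK"

if __name__ == "__main__":
    app.run(host="localhost", port=8000)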

That's all! I hope this very simple service can be useful to other people and once again, if you want to improve it, I will be glad to accept your pull requests on GitHub.


How to configure Edimax EW-7811UN Wifi dongle on Raspbian

Posted on Tue 02 September 2014 in HowTo • Tagged with howto, Linux, RaspberryPI, WIFI, networking, Debian

If you want to connect your RaspberryPi to your home network and avoid cables, I suggest using the Edimax wifi adapter. This device is quite cheap (around £8 on Amazon) and it's very easy to configure on Raspbian (I assume you are using a recent version of Raspbian; I'm using the one released on 20/06/2014).


Configure the wifi adapter

Edit /etc/network/interfaces and insert these configuration values:

auto lo
iface lo inet loopback
iface eth0 inet dhcp

allow-hotplug wlan0
auto wlan0

iface wlan0 inet dhcp
wpa-ssid YOURESSID
wpa-psk YOURWPAPASSWORD

Power management issue

There is a known "issue" with this adapter's default configuration that causes it to turn off if the wlan interface is not in use for a few minutes. To avoid this you have to customise the parameters used to load the kernel module. First check that your adapter is using the 8192cu module:

sudo lsmod | grep 8192
8192cu 551136 0

Create the file /etc/modprobe.d/8192cu.conf and insert the following lines inside:

# prevent power down of wireless when idle
options 8192cu rtw_power_mgnt=0 rtw_enusbss=0

I also suggest creating a little entry in crontab to make the RaspberryPi ping your router every minute. This will ensure that your wifi connection stays alive. To edit crontab just type (from the pi user, you don't need to be root):

crontab -e

and insert this line at the end:

*/1 * * * * ping -c 1 192.168.0.1

where 192.168.0.1 is the IP of your router (of course, substitute this value with the IP of your own router).

Keep Alive Script

I created a further script to keep my WIFI alive. This script pings the router (change the IP to the one of your router) every 5 minutes and, if the ping fails, it brings down the wlan0 interface and the wifi kernel module, then brings them up again.
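
A minimal sketch of such a script could look like this (the router IP and the 8192cu module name follow the examples above; the exact commands in the original script may differ):

#!/bin/bash
# ping the router once; if it doesn't answer, restart the wifi interface and driver
ROUTER_IP=192.168.0.1

if ! ping -c 1 "$ROUTER_IP" > /dev/null 2>&1; then
    ifdown --force wlan0
    rmmod 8192cu
    modprobe 8192cu
    ifup wlan0
fi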

Just put this script in /root/wifi_recover.sh and then, as root, execute:

chmod +x wifi_recover.sh
crontab -e

Insert this line inside the crontab editor:

*/5 * * * * /root/wifi_recover.sh

Conclusion

The configuration is done. Just reboot your RaspberryPi and enjoy your wifi connection.