Basic Application

Alright, there's one thing you need to know about me: I'm quite stupid :) So I will start from the very beginning, play around with Kivy's basics, and build up from there. NB: this post may be a bit redundant, since Kivy already offers a Quickstart, from which I draw most of the content of this post.

First you'll have to install the Kivy library; there is an installation guide for Ubuntu as well (that's what I use). I've had some problems with the graphics library that prevented some of the examples included with Kivy from working. This is due to a bug, but luckily there is a patch that solves the issue for now. Installing the patch takes a while, but it's worth it :)

Then we're ready to start coding. Kivy's Quickstart offers a simple example to run:
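It goes roughly like this (the minimal app from the Quickstart: build() returns a single Button; save it as main.py):

from kivy.app import App
from kivy.uix.button import Button

class TestApp(App):
    def build(self):
        # the root widget of the application: one big button
        return Button(text='Hello World')

TestApp().run()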


You should not have any problem running this on your system. You can do it from your terminal by running $ python ./main.py in the folder where you saved your Python file (personally, I use Geany for coding, so I can run it from there), and you should see something like this on your screen:
The window is a black canvas with a large grey button (with rounded corners). The window decoration is not part of the code; it is taken care of by your window manager. There may be a way to prevent this decoration by changing a property of the Kivy application (I am not sure of that just yet), but it could also be done by addressing the WM directly (by setting window decoration rules in Compiz, for instance).
You can hover and click the button like this:
Now don't forget that Kivy is made for touch-screen technology; there are three touch events: Down, Move, and Up. From what I understand, touch events are passed in a top-down fashion, i.e. from the top-level parent widget down to its children. When a touch event matches a listener of any of the widgets standing in its way, those widgets launch their respective callbacks. Events are better explained in the documentation. The Down and Up touch events match the two default event listeners of the Button widget, namely on_press() and on_release(). The default callbacks of the Button widget affect its appearance: the background changes from grey to blue.
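For instance, you can hook your own callbacks onto those listeners with bind(). Here is a minimal sketch of my own (the my_press / my_release names are just made up for this example):

from kivy.app import App
from kivy.uix.button import Button

def my_press(button):
    # called on the Down touch event
    print 'pressed:', button.text

def my_release(button):
    # called on the Up touch event
    print 'released:', button.text

class TestApp(App):
    def build(self):
        btn = Button(text='Hello World')
        # attach our callbacks to the default on_press / on_release listeners
        btn.bind(on_press=my_press, on_release=my_release)
        return btn

TestApp().run()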

If your application does not shut down, you may be affected by the bug mentioned above. The next post will pick up from here, tweak this basic application, and see what more can be done with App() and Button().



Using microphone peak as input

I'm currently on a project that involves disabled people, audio, and Kinect. These boys and girls make a lot of loud sounds, so the idea is to use their sound as a trigger. We can use GStreamer to make that work quite easily, because it has everything we need: an audio source and a level calculator.

import pygst
pygst.require('0.10')
import gst, gobject
gobject.threads_init()
 
# capture from PulseAudio, convert to raw 32-bit stereo samples,
# compute the sound level every 10ms, then throw the audio away
pipeline = gst.parse_launch(
    'pulsesrc ! audioconvert ! '
    'audio/x-raw-int,channels=2,rate=44100,endianness=1234,'
    'width=32,depth=32,signed=(bool)True ! '
    'level name=level interval=10000000 ! '
    'fakesink')
 
level = pipeline.get_by_name('level')
bus = pipeline.get_bus()
bus.add_signal_watch()
 
def show_peak(bus, message):
    # filter only on level messages
    if message.src is not level:
        return
    if not message.structure.has_key('peak'):
        return
    # read peak
    print 'peak', message.structure['peak'][0]
 
# connect the callback
bus.connect('message', show_peak)
 
# start the pipeline
pipeline.set_state(gst.STATE_PLAYING)
 
ctx = gobject.gobject.main_context_default()
while ctx:
    ctx.iteration()

The output should look something like this:

peak -35.2370719856
peak -35.0252114393
peak -10.8591229158
peak -4.6007387433
peak -4.85102463679
peak -6.45292575653
peak -6.83102903668
peak -7.39486319074
peak -13.9852340247
peak -17.423901433
peak -35.0852178272
peak -35.8725208237

Next, we can use that information to record their sound and use it in some scenario. So, instead of using fakesink, we can use appsink. This element allows you to read the buffers pushed by the previous element, so we can put these buffers into a list and use them when needed :)

The state machine handles three phases:

  1. Wait for a peak above -30 dB
  2. Record the sound, and stop when the peak drops below -32 dB
  3. Replay the last recorded sound

Note: the -30 / -32 values come from my tests. If you have more background noise, you'll need to adjust these thresholds.

And here is the final example:

import pygst
pygst.require('0.10')
import gst, gobject
gobject.threads_init()
 
pipeline_play = None
# same capture pipeline as before, but ending in an appsink
# so we can pull the raw buffers ourselves
pipeline = gst.parse_launch(
    'pulsesrc ! audioconvert ! '
    'audio/x-raw-int,channels=2,rate=44100,endianness=1234,'
    'width=32,depth=32,signed=(bool)True ! '
    'level name=level interval=10000000 ! '
    'appsink name=app emit-signals=True')
 
state = 'wait'
peak = -99
buffers = []
level = pipeline.get_by_name('level')
app = pipeline.get_by_name('app')
bus = pipeline.get_bus()
bus.add_signal_watch()
 
def show_peak(bus, message):
    global peak
    # filter only on level messages
    if message.src is not level:
        return
    if not message.structure.has_key('peak'):
        return
    # read peak
    peak = message.structure['peak'][0]
 
def enqueue_audio_buffer(app):
    # keep a copy of every raw buffer pushed by the appsink
    buffers.append(str(app.emit('pull-buffer')))
 
def play_sample(sample):
    global pipeline_play
    with open('audio.dat', 'wb') as fd:
        fd.write(sample)
    if pipeline_play is None:
        # play back the raw samples dumped to audio.dat, slightly amplified
        pipeline_play = gst.parse_launch(
            'filesrc location=audio.dat ! '
            'audio/x-raw-int,channels=2,rate=44100,endianness=1234,'
            'width=32,depth=32,signed=(bool)True ! '
            'audioamplify amplification=2 ! autoaudiosink')
    pipeline_play.set_state(gst.STATE_NULL)
    pipeline_play.set_state(gst.STATE_PLAYING)
 
# connect the callback
bus.connect('message', show_peak)
app.connect('new-buffer', enqueue_audio_buffer)
 
# start the pipeline
pipeline.set_state(gst.STATE_PLAYING)
 
# main loop
ctx = gobject.gobject.main_context_default()
while ctx:
    ctx.iteration()
    print state, peak
 
    # wait for somebody to make a sound
    if state == 'wait':
        if peak > -30:
            state = 'record'
            continue
        # discard any buffer
        buffers = []
 
    # record the current audio, until the peak is going down
    elif state == 'record':
        if peak < -32:
            state = 'replay'
            continue
 
    # replay last sound
    elif state == 'replay':
        play_sample(''.join(buffers))
        state = 'wait'

Usage: just make a sound… and it will be replayed right after.

Kivy on Android, part 2

Hi guys,

Looks like people are following my blog and waiting for the Android version of Kivy.
We already have a launcher that you can use. Check the:

Maybe during the next release, or a little bit after, I'll release a tool to create an Android package from a Kivy application. The code is already on Launchpad, but it's still a work in progress. As soon as I have finished, I'll publish it on the kivy-dev mailing list. If you haven't subscribed yet, do it now! :)

More to come by the end of the week…!

Kivy (next PyMT) on Android, step 1 done!

Tonight is a wonderful night.

I know that I haven't announced Kivy officially yet, but I'll do it in another blog post very soon. You just need to know that Kivy is the next version of PyMT. For the last two years, Thomas and I have regularly had doubts and reflections about using Python for PyMT. I've started to look more at the future, and I became deeply convinced that, for our own sake, we must be able to run in a web browser. The goal is simple: the same code for every platform, at least the ones we use every day: Linux / Windows / Mac OS X / Android / iOS.

Android and iOS are new operating systems, and we thought that, short of running in a web browser, we would never be able to run on them. So we started to target a future with fewer dependencies, OpenGL ES 2.0 compatibility, and so on. This vision has been named Kivy. These last few days, I've removed the numpy and pyopengl dependencies. Pygame is now the only library required to run an application with widgets (minimal doesn't mean full-featured).

And I've started to look at the Android platform, since Tom from the Ren'Py library has delivered a Pygame subset for Android. He did an awesome job. My part was just to understand how it works and get Kivy to compile.

For now, here is what I've got:

OK, but what did I get exactly?

  • Python/Pygame running, from renpytom's project
  • A failed attempt to use numpy on Android
  • A Kivy adaptation for Android (OpenGL debug mode, removing numpy and pyopengl, linking against OpenGL ES 2.0...)
  • Pygame changes to create an OpenGL ES 2.0 context
  • Various patches to the build system

And here is my step 2:

  • Send all the build-system patches upstream
  • Resolve symbol conflicts when two compiled modules have the same name (kivy.event and pygame.event... nice naming.)
  • Add a way of detecting the Android platform from Python
  • Add multitouch support to Pygame and/or Kivy
  • Add Android sleep/wakeup support to Kivy
  • Write documentation about how to compile a Kivy application for Android

For now, sleep time! Enjoy.