Kivy: Tight Mac OS X Integration (GSoC Conclusion)

With this Google Summer of Code’s firm pencils-down deadline approaching rapidly, I thought I’d write a status report on what will mostly constitute the ‘official’ result of my work. I won’t drop dead right after the deadline passes or this posting is published, but this is one of the formalities I have to take care of, and the sooner the better.

During the course of this Google Summer of Code, I worked extensively on OS X support for the Kivy framework. As I have explained previously, Kivy’s feature set is built on the concept of providers: every piece of functionality the framework needs is encapsulated in a provider that defines an abstract interface for the framework’s users (i.e. developers). Each such interface then has a number of specialized implementations, which in almost all cases use a third-party or system-supplied software library designed for the task the respective provider handles.
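To illustrate the idea, here is a minimal, hypothetical sketch of such a provider pattern in Python. The class and function names are mine for illustration, not Kivy’s actual internals: an abstract interface, a registry of implementations, and a selection function that picks the first implementation that works.

```python
# Hypothetical sketch of the provider pattern described above;
# names are illustrative, not Kivy's real internal API.

from abc import ABC, abstractmethod


class ImageProvider(ABC):
    """Abstract interface a core task exposes to framework users."""

    @abstractmethod
    def load(self, filename):
        """Load an image and return its (width, height)."""


# Registry of available implementations, tried in order of preference.
_image_providers = []


def register_image_provider(cls):
    """Class decorator that adds an implementation to the registry."""
    _image_providers.append(cls)
    return cls


@register_image_provider
class StubImageProvider(ImageProvider):
    """Placeholder backend; a real one would wrap e.g. Quartz or SDL."""

    def load(self, filename):
        return (0, 0)


def get_image_provider():
    """Return an instance of the first implementation that initializes."""
    for cls in _image_providers:
        try:
            return cls()
        except Exception:
            continue  # backend unavailable on this platform, try the next
    raise RuntimeError("no image provider available")
```

In this scheme, adding a new backend (say, a Core Graphics-based one on OS X) is just a matter of registering another class; user code only ever talks to the abstract interface.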

Now, my task for this GSoC was titled “Tight Mac OS X Integration”, meaning we wanted to reduce the number of external (non-system) libraries used by Kivy, as those have to be compiled for and bundled with our OS X executable. Not only does this add to the overall storage footprint, it also requires extra steps for maintenance and deployment.

In concrete terms, during this GSoC I implemented the following providers:

  • Window (based on SDL; this was an internal decision tito and I made as this would be reusable on iOS as well)
  • Image (based on Apple’s Core Graphics and Quartz APIs; I actually provided two versions of this, see below)
  • Text (based on Apple’s Quartz APIs; I also provided two implementations for this, see below)
  • Audio (based on Apple’s Cocoa APIs)
  • Video (based on Apple’s QTKit APIs)

Now, for the image and text providers, I wrote two versions each: a Python-based version that uses PyObjC (available by default on OS X) and an Objective-C-based version that I bridge to using Cython. The advantage of the PyObjC versions is that they’re just single Python files that run on any recent Mac without requiring any bundling, additional tools, or even compile-link cycles. The drawback is that I cannot use them on iOS: PyObjC is far too bloated for what I need, not well maintained, and not functional on iOS anyway. That’s why I have a second version of those providers that actually works on iOS, as can be seen in my last blog posting. I will branch these off into the iOS support branch and merge them back into master once Kivy’s iOS support is official.
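Choosing between two such versions of a provider can be done at import time. Here is a hedged sketch of that fallback logic; the module names below are hypothetical placeholders, not Kivy’s real provider modules:

```python
# Illustrative import-time fallback between a preferred (e.g. compiled
# Cython) provider module and a pure-Python (e.g. PyObjC-based) one.
# Module names are hypothetical.

import importlib


def select_text_provider(candidates=("text_quartz_cython",
                                     "text_quartz_pyobjc")):
    """Try each candidate module in order; return the first that imports,
    or None if no candidate is available on this platform."""
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue  # module not built/installed here, try the next one
    return None
```

The same mechanism lets an iOS build ship only the Cython-bridged module while a desktop Mac falls back to the PyObjC one.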

So what have we got now? Seven new provider implementations covering five different core tasks. One of them (Window, SDL) will actually be reusable on all our supported platforms, not just OS X. The other four use native system APIs that already exist on any Mac, so they add absolutely no storage footprint, and as soon as this hits master we will see a dramatic reduction in the size and bloat of the Kivy installer (it will also make my life as the OS X maintainer a whole lot easier). In numbers (a back-of-the-envelope calculation), we’ll probably be reducing the size from 100 MB uncompressed to about 10 MB uncompressed. The providers also benefit from functionality that OS X inherently provides, such as audio and video codecs (a list the user can extend by installing things like Perian).

Since the midterm, that means we’ve gained audio and video providers, a new audio example, PyObjC-based text and image providers, significant visual improvements in text rendering, and under-the-hood changes for text and image display. This is a before/after shot of text rendering. I’ve also gotten rid of some other artifacts around the characters. Together with a couple of fixes from tito, this now also makes text rendering work in the text input widget.

Here’s a video showing the video provider in action (the stutter comes from the screen capture; the video plays smoothly on my Mac).

This was the first time I’d worked with Objective-C or Apple’s APIs, so naturally a lot of work went into researching the different APIs and learning how Objective-C works and how I can make use of it (I will write another posting describing an alternative approach to using Objective-C from Python that I came up with). I take a great sense of personal accomplishment from teaching myself new things in this area, and that is one of the major things I really like about Google Summer of Code.

As a side note, I recently moved to the US for the remainder of the year to write my master’s thesis, and I had to take care of a huge amount of (paper)work for that (it is something of a pioneer project). So some glitches remain, which I’ll certainly look into as soon as I get the time, after which I’ll merge all of this new code into master and make it available to users.

For instance, among other things, the text display isn’t yet a pixel-perfect match with the other platforms, and the video provider has problems with larger files that GStreamer handles properly (on the other hand, it handles some formats that GStreamer doesn’t; I just don’t want to break existing apps that rely on GStreamer-supported formats at this point).

Anyway, I’ll continue to dig into these remaining tasks before and after the deadline, and I’d like to take a moment to thank a few people for what has been a terrific Google Summer of Code (probably my last as a student, unfortunately): Paweł Sołyga for being my mentor, Christian Moore for making sure this project could exist under the NUIGroup organisation umbrella, and Mathieu Virbel for all the discussions and help.