Kivy: Tight Mac OS X Integration (GSoC Conclusion)

With this Google Summer of Code's firm pencils-down deadline approaching rapidly, I thought I'd write a status report on what is mostly going to be the 'official' result of my work. I won't drop dead right after the deadline passes or this post is published, but this is one of the formalities I have to take care of, and the sooner, the better.

During the course of this Google Summer of Code, I've been working extensively on the OS X support of the Kivy framework. As I have explained previously, Kivy's feature set is based on the concept of providers: each piece of functionality required by the framework is encapsulated in a provider that defines an abstract interface for the users (i.e. developers) of the framework. Each such interface then has a number of specialized implementations, using (in almost all cases) a third-party or system-supplied library designed for the task that the respective provider covers.
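To make this concrete, here is a minimal, hypothetical sketch of the pattern. The class and method names are made up for illustration and are not Kivy's actual API; the external-library backend assumes Pillow is installed.

    # Hypothetical sketch of the provider idea: one abstract interface,
    # several interchangeable backends. Names are illustrative, not real API.
    from abc import ABC, abstractmethod


    class ImageProviderBase(ABC):
        """Abstract interface every image provider has to implement."""

        @abstractmethod
        def load(self, filename):
            """Return (width, height, rgba_bytes) for the given file."""


    class PILImageProvider(ImageProviderBase):
        """Backend built on an external library (Pillow), as on most platforms."""

        def load(self, filename):
            from PIL import Image  # third-party dependency
            img = Image.open(filename).convert('RGBA')
            width, height = img.size
            return width, height, img.tobytes()


    class CoreGraphicsImageProvider(ImageProviderBase):
        """Backend built on a native OS X API; sketched further down in this post."""

        def load(self, filename):
            raise NotImplementedError('see the PyObjC sketch below')

The rest of the framework only ever talks to the abstract interface, which is what makes it possible to swap an external library for a native API without touching the code that uses the abstraction.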

My task for this GSoC was titled “Tight Mac OS X Integration”, which means that we wanted to reduce the number of external libraries (as opposed to system libraries) used in Kivy, as those have to be compiled for and bundled with our OS X executable. Not only does this add to the overall storage requirements, it also requires extra steps when it comes to maintenance and deployment.

In concrete terms, during this GSoC I implemented the following providers:

  • Window (based on SDL; this was an internal decision tito and I made as this would be reusable on iOS as well)
  • Image (based on Apple’s Core Graphics and Quartz APIs; I actually provided two versions of this, see below)
  • Text (based on Apple's Quartz APIs; again, I provided two implementations for this)
  • Audio (based on Apple's Cocoa APIs; a small sketch follows this list)
  • Video (based on Apple’s QTKit APIs)
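To give an idea of what "based on Apple's Cocoa APIs" means for the audio item above: Cocoa's NSSound class, reachable from Python through PyObjC, can already decode and play the formats OS X knows about without any bundled library. This is only a simplified sketch of the approach, not the provider's actual code:

    # Simplified sketch: playing a sound via Cocoa's NSSound through PyObjC.
    # Illustrates the idea behind a Cocoa-based audio provider; not Kivy's code.
    import time

    from AppKit import NSSound


    def play_sound(path, volume=1.0):
        """Decode and play an audio file using the codecs OS X ships with."""
        sound = NSSound.alloc().initWithContentsOfFile_byReference_(path, True)
        if sound is None:
            raise IOError('Could not load %s' % path)
        sound.setVolume_(volume)
        sound.play()
        # Block until playback finishes; a real provider would not block.
        while sound.isPlaying():
            time.sleep(0.1)


    if __name__ == '__main__':
        play_sound('/System/Library/Sounds/Glass.aiff')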

Now for the image and text providers, I wrote two versions each: one Python-based version that uses PyObjC (available by default on OS X) and one ObjC-based version to which I bridge using Cython. The advantage of the PyObjC versions is that they're just single Python files that can be run on any recent Mac without requiring any bundling, the presence of additional tools, or even compile-link cycles. The drawback is that I cannot use them on iOS, as PyObjC is way too bloated for what I need, not well maintained, and not functional on iOS anyway. That's why I have a second version of those providers that actually works on iOS, as can be seen in my last blog post. I will branch these off into the iOS support branch and bring them back into master later, once iOS support in Kivy is official.
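To give an impression of what the PyObjC flavour looks like, here is a stripped-down sketch of how an image provider can decode a file through CoreGraphics/ImageIO from plain Python. It is deliberately simplified and not the provider's actual code; the real thing also normalises the pixels to RGBA and uploads them to a texture:

    # Stripped-down sketch: decoding an image with CoreGraphics/ImageIO via
    # PyObjC. Not the actual provider; the pixel layout here is whatever the
    # source image decoded to, so a real provider would convert to RGBA first.
    from Foundation import NSURL
    from Quartz import (
        CGImageSourceCreateWithURL,
        CGImageSourceCreateImageAtIndex,
        CGImageGetWidth,
        CGImageGetHeight,
        CGImageGetDataProvider,
        CGDataProviderCopyData,
    )


    def load_image(path):
        """Return (width, height, raw pixel data) for an image file."""
        url = NSURL.fileURLWithPath_(path)
        source = CGImageSourceCreateWithURL(url, None)
        if source is None:
            raise IOError('Cannot open %s' % path)
        image = CGImageSourceCreateImageAtIndex(source, 0, None)
        width = CGImageGetWidth(image)
        height = CGImageGetHeight(image)
        # CFData is bridged to NSData, which supports Python's buffer protocol.
        data = CGDataProviderCopyData(CGImageGetDataProvider(image))
        return width, height, bytes(data)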

So what have we got now? Seven new provider implementations for five different core tasks. One of them (Window, SDL) will actually be reusable on all our supported platforms, not just OS X. The other four use native system APIs that already exist on any Mac, so there is absolutely no additional storage footprint, and as soon as this hits master, we will see a dramatic reduction in the size and bloat of the Kivy installer (and it will make my life as the OS X maintainer a whole lot easier). In numbers (and this is a back-of-the-envelope calculation), we'll probably be reducing the size from roughly 100 MB (uncompressed) to about 10 MB (uncompressed). The new providers also benefit from the functionality that OS X inherently provides, such as audio and video codecs (a list the user can extend by installing things like Perian).

Since midterm, this means we've gained audio and video providers, a new audio example, PyObjC-based text and image providers, significant visual improvements in text rendering, and under-the-hood changes for text and image display. This is a before/after shot of text rendering. There were also some other artifacts around the characters that I've gotten rid of. This, together with a couple of fixes from tito, now also makes it work in the text input widget.

Here's a video showing the video provider in action (the stutter comes from the screen capture; the video plays smoothly on my Mac).

This was the first time I've worked with Objective-C or Apple's APIs, so naturally a lot of work went into researching the different APIs, learning how Objective-C works and how I can make use of it (I will write another post describing an alternative approach to using Objective-C from Python that I came up with). I get a great sense of personal accomplishment from teaching myself new things in this area, and that is one of the major things I really like about Google Summer of Code.

As a side note, I recently moved to the US for the remainder of the year to write my master's thesis, and I had to take care of a huge amount of (paper)work for that (this is kind of a pioneer project). So there are some glitches remaining that I'll certainly look into as soon as I get the time, after which I'll merge all of this new code into master and make it available to users.

For instance, amongst other things, the text display isn't a pixel-perfect match with the other platforms yet, and the video provider has problems with some larger files that gstreamer handles properly (on the other hand, it handles some formats that gstreamer doesn't; I just don't want to break existing apps that rely on gstreamer-supported formats at this point).

Anyway, I'll continue to dig into these remaining tasks before and after the deadline, and I'd like to take a moment to thank a few people for what has been a terrific Google Summer of Code (probably my last as a student, unfortunately): Paweł Sołyga for being my mentor, Christian Moore for making sure this project could exist under the NUIGroup organisation's umbrella, and Mathieu Virbel for all the discussions and the help.

Python on iPhone & iPad

I recently had the opportunity to do some research with the goal of being able to run Python on any iOS device (iPhone, iPad, iPod touch). The idea is to write only Python code (and nothing else) and deploy it unchanged to different platforms (e.g. Windows, Linux, Mac OS X, Android, iOS).

If you're interested, here's a preview/draft document that very roughly summarizes, at a high and easy-to-understand level, what had to be done.

Now I'm not saying that this is THE way to develop cross-platform software, especially for devices such as tablets. The goal was just to see whether or not it's technically possible and feasible to write applications for iOS using Python only. Fortunately, it seems possible, and the programs actually run pretty snappily. They also use the GPU for rendering via OpenGL ES 2.0. Also, no jailbreak was necessary.

Consider this a work in progress. There are still many things on the TODO list; I just wanted to share the early results with you and let you know that it is in fact possible. The code is on GitHub and I'm using the Kivy framework. I'm looking for opportunities to present this in much more depth in a journal or at a conference. If you know of any, please send me a mail (address in the PDF).

Python on iPad

Update: I mentioned that the code is on GitHub, but didn't provide any actual links as I was in a hurry when I wrote this post. Here they are:
  • Python for iOS repo (compiles Python 2.7 for ARM, based off of cobbal's repo): https://github.com/dennda/python-for-iphone
  • Kivy iOS support branch: https://github.com/tito/kivy/tree/ios-support
  • Objective-C test app that embeds Python and runs a Kivy example: https://github.com/dennda/python-for-iphone-test
You will also need SDL 1.3.

And last but not least, I'd like to repeat what I wrote in the PDF and thank my friend Mathieu Virbel (from the Kivy team) for all the help. I especially enjoyed the hack session we had at UDS.

Kivy: Tight Mac OS X Integration

I am very happy that I am allowed to participate in yet another Google Summer of Code in 2011. This will be my fourth consecutive and probably last GSoC, at least as a student. I am participating on behalf of the Kivy project for the second time now, and my objective is to improve the integration of Kivy on Mac OS X.

Let me explain what that means: In order for Kivy to function with its full set of features, which includes things like audio and video playback, window creation with OpenGL content, image display, spelling correction and more, we need to make use of external software libraries dedicated to the respective tasks, such as video decoding. Implementing a cross-platform video decoder, for example, is a lot of work and beyond the scope of our project, which is why we use the work of other brilliant minds instead of reinventing the wheel. That said, there are a lot of different implementations for the kinds of things we require. Since our goal is not to restrict the user to a certain set of those solutions, we abstract from those third-party APIs and provide the necessary functionality through a common interface, both cross-platform and cross-API. This approach also enables us to port Kivy to new platforms, such as Android, more easily, as we don't have to modify the code that uses the abstractions; we just add a new implementation (which we refer to as a 'provider') for the new platform.

Taking video as an example, we currently rely on gstreamer for the implementation. While gstreamer has proven to be a good choice on Linux (as it's often included in Gnome-based distributions anyway), it's not an ideal choice, from my point of view, for other platforms like Mac OS X. It adds packaging and maintenance overhead and also significantly increases the size of the Kivy distribution, as we have to compile and ship gstreamer with Kivy for the Mac (which is not easy anyway) because it doesn't exist there by default. This is especially unfortunate because there is a native API on the Mac that does the same things you'd use gstreamer for. That API is always available, and using it would mean that we can still provide video playback support without shipping gstreamer. In fact, this is not only true for video, but for all of Kivy's so-called 'core providers'.

Long story short, the goal of this GSoC is to use as many native Mac APIs (as is reasonable) instead of external libraries; a small sketch of how such a per-platform choice could be expressed follows the list below. Here is a list of Kivy's core providers:
  • Window
  • OpenGL
  • Image
  • Text
  • Audio
  • Video
  • Spelling
  • Clipboard
  • Camera
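As mentioned above, here is a rough, hypothetical sketch of how a per-platform backend choice could be expressed. The registry, names and preference order are made up for illustration and do not mirror Kivy's internals:

    # Hypothetical sketch of per-platform provider selection; names and
    # preference order are illustrative only, not Kivy's actual machinery.
    import sys

    # Ordered preference per platform: try native backends first and fall
    # back to third-party libraries where nothing native is available.
    VIDEO_BACKENDS = {
        'darwin': ['qtkit', 'gstreamer'],  # prefer the native API on the Mac
        'linux': ['gstreamer'],            # 'linux2' on Python 2
        'win32': ['gstreamer'],
    }


    def select_video_provider(registry):
        """Pick the first working backend for this platform.

        `registry` maps backend names to zero-argument factories that raise
        an exception when their underlying library is unavailable.
        """
        for name in VIDEO_BACKENDS.get(sys.platform, ['gstreamer']):
            factory = registry.get(name)
            if factory is None:
                continue
            try:
                return factory()
            except Exception:
                continue  # backend not usable here, try the next one
        raise RuntimeError('No usable video provider found')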
OpenGL already works, and so does Spelling, because I had already taken care of those previously. The first things I tackled during this GSoC were the window and image providers. Obviously you need a window to display the other things and to get some feedback while developing, so this was a pretty obvious starting point. The original plan was to base the new window provider on Cocoa's APIs directly, but while I was looking into that, Mathieu pointed out that it might be preferable to start by basing it on the development version of SDL instead (which is also what pygame uses), as they already have this abstraction for every platform under the sun, so other platforms might benefit as well. Also, SDL is pretty easy to compile and package.

So after some hacking, what do we have right now? There is a working SDL window provider. You can use it to open a window and display OpenGL content. Still missing is the handling of certain SDL events, such as proper mouse/touch integration. While you can already draw a line with it, it's not finished the way it should be yet. That is a TODO item.

Next up was the image provider. Why? Simply because Kivy loads a default texture for use in its GL operations, so we figured that instead of patching that out temporarily, I might as well implement the image provider directly. What we have right now is a working image provider based on CoreGraphics that should be able to handle all image formats that the CG APIs natively handle. There are only two things missing: firstly, it doesn't seem to handle alpha channels properly yet, and secondly, it currently reports a fake list of supported formats. These are TODO items.

After finishing what I pointed out before, next up would be the text and video providers. Starting tomorrow (May 30th) I will be in the US for a couple of days to visit the university where I'll hopefully be doing my Master's thesis. Unfortunately, I had to sign a form stating that I will not engage in any GSoC-related coding activities while I am in the US (you have to do that if you're non-US based), so my next report will have to be a little delayed.

This is what it currently looks like with the pictures example: Kivy on OS X via SDL and CoreGraphics. The code is available here and is right now very much a work in progress.
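Regarding the alpha-channel TODO just mentioned: a common CoreGraphics recipe is to redraw the decoded CGImage into an RGBA bitmap context so you always end up with a known pixel layout. Below is a simplified sketch of that step via PyObjC; it is an assumption about how the fix could look, not the provider's actual code:

    # Simplified sketch: normalising a decoded CGImage to premultiplied RGBA
    # by redrawing it into a bitmap context (via PyObjC). Not Kivy's code.
    from Quartz import (
        CGColorSpaceCreateDeviceRGB,
        CGBitmapContextCreate,
        CGBitmapContextCreateImage,
        CGContextDrawImage,
        CGImageGetWidth,
        CGImageGetHeight,
        CGImageGetDataProvider,
        CGDataProviderCopyData,
        CGRectMake,
        kCGImageAlphaPremultipliedLast,
    )


    def image_to_rgba(image):
        """Return (width, height, premultiplied RGBA bytes) for a CGImage."""
        width = CGImageGetWidth(image)
        height = CGImageGetHeight(image)
        colorspace = CGColorSpaceCreateDeviceRGB()
        # 8 bits per component, 4 bytes per pixel, RGBA, premultiplied alpha.
        context = CGBitmapContextCreate(
            None, width, height, 8, width * 4, colorspace,
            kCGImageAlphaPremultipliedLast)
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), image)
        rgba_image = CGBitmapContextCreateImage(context)
        data = CGDataProviderCopyData(CGImageGetDataProvider(rgba_image))
        return width, height, bytes(data)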