EuroPython 2008 wrap up
EuroPython 2008 was fun. I presented two talks (My God, it's Full of Files -- Pythonic filesystem abstractions and Version Control for Du^H^HDevelopers ) and one lightning talk (RST+S5 for your slides), participated in a bunch of open space sessions, listened to about 13 talks, took a bunch of pictures, but most importantly had interesting hallway conversations with interesting people.
As usual, PyPy
was heavily represented, and seems to be making good
progress toward becoming the featureful default Python
implementation of the future. I especially liked the restricted
execution features and the LLVM
backend. The zc.buildout talk made
me decide to try replacing (part of?) one custom deploy mechanism with
zc.buildout -- most likely I'll end up rewriting most of the
current things as
zc.buildout recipes, but hopefully some of the
pre-existing recipes will be useful, and hopefully I can later
reuse the recipes I create for this setup.
Personally, I think my talks went OK. I understand videos will be available later, once transcoding etc. is finished. I'm anxious to see them myself, as I'm still fine-tuning my public speaking skills. I'm learning, though -- this year I had no trouble staying within my time slot, even while adjusting verbosity on the fly.
For some reason, I felt underprepared for the filesystem API talk, but ultimately people liked the idea of a consistent Pythonic filesystem API enough that we had an open space session on it, and people were enthusiastic about a sprint to prototype the API. That's what we ended up doing, too -- I'll blog separately about the results of that.
My decentralized version control talk
seemed to me to go over more
smoothly; I guess that's just because I've been thinking about version
control and project management a lot lately, so it was easy to talk
about the topic in a relaxed way. On the other hand, it wasn't as
much of a call to action, and it really was overly generic, so I didn't
get as much audience participation there. We did have an interesting
conversation about branch management strategies and such, though. I
consciously tried to keep the talk on a generic level, as I felt a
git talk would have alienated some listeners, but I did end
up feeling restricted by that. There was some interest in a
Teach me git -style session, but what we ended up doing was just talking one
on one about getting started with
git, during the sprints. Sorry
if I missed any of you -- grab me on
#git to continue, or find
me at future conferences ;)
I was requested to organize an open space session for Twisted Q&A, and that is exactly what we did. We went through a bunch of things related to asynchronous programming concepts, Deferreds, working with blocking code and libraries, database interfaces, debugging and unit testing.
I was also pulled into another Twisted open space session, which was
mostly about what greenlets are and how to use them. I tried to
explain the differences between classical Deferreds,
inlineCallbacks, and greenlets, to the best
of my understanding. As a summary, with greenlets any function you
call can co-operatively yield execution (I mean yield in the
scheduling meaning, giving away your turn to run, not in the Python
generator meaning -- interestingly
inlineCallbacks etc actually
make those be the same thing... my kernel instincts make me want to
say "sleep"). Yielding in any subroutine means anything you do may end
up mutating your objects -- which is the root evil behind threading we
wanted to get away from. All the other mechanisms keep the top-level
function in explicit control of yielding. Around that time, most
people left for lunch, but about three of us stayed and talked about
debugging Deferreds and network packet processing with
twisted.pair and friends.
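The distinction above -- the top-level function staying in explicit control of yielding -- can be sketched with plain Python generators. This is a toy round-robin scheduler for illustration only, not Twisted's actual machinery or real greenlets; all names here are made up:

```python
from collections import deque

def scheduler(tasks):
    """Run generator-based tasks round-robin. A task can only give up
    its turn at an explicit `yield`, visible in its own source."""
    queue = deque(tasks)
    order = []
    while queue:
        task = queue.popleft()
        try:
            order.append(next(task))  # run until the task's next yield
            queue.append(task)        # not done yet, requeue it
        except StopIteration:
            pass                      # task finished, drop it
    return order

def task(name, steps):
    for i in range(steps):
        # Only this explicit `yield` switches tasks; nothing a helper
        # function does behind our back can mutate our state mid-step.
        yield (name, i)

print(scheduler([task("a", 2), task("b", 2)]))
# -> [('a', 0), ('b', 0), ('a', 1), ('b', 1)]
```

With greenlets, by contrast, the switch could happen inside any function the task calls, so the "only at my yields" invariant above no longer holds.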
One of the interesting hallway conversations was about what happens when upstream web hosting listed on PyPI is failing. It seems PyPI already does some sort of mirroring, but even that might not be enough. Many companies seem to be bundling eggs of their dependencies in their installation package, which sounds like a good setup for commercial click-to-install deployment. But it would still be good to see a CPAN -style mirror network for PyPI, and at least some people even seemed motivated to donate servers and bandwidth. Personally, I'm mostly spoiled by the combination of Debian/Ubuntu and decentralized version control, and my level of paranoia is too high to automatically install unverified software from the internet anyway. My primary motivation in the conversation was to point out that PyPI already has some sort of mirroring/upload setup, and that you'd really want to specify exact versions and SHA-1 hashes of your dependencies. Optionally, you could delegate the known good hash storage to PyPI (assuming you trusted PyPI not to attack you), but that would require a full Debian-style signature chain from a trusted key, or you'd be owned by anyone capable of MITM attacks, DNS forgery, or cracking a PyPI mirror.
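The kind of check I mean can be sketched with the stdlib's hashlib -- a toy verifier, not anything PyPI or setuptools actually ships, and the names are illustrative:

```python
import hashlib

def verify_sha1(data, pinned_hex):
    """Return True only if data matches the SHA-1 hash pinned in your
    dependency list; a mirror that serves a tampered file fails this."""
    return hashlib.sha1(data).hexdigest() == pinned_hex

# Hypothetical pinned dependency: exact version plus a known-good hash.
pkg = b"SomePackage-1.2.tar.gz contents"
pinned = hashlib.sha1(pkg).hexdigest()

print(verify_sha1(pkg, pinned))          # True: untouched download
print(verify_sha1(b"evil", pinned))      # False: reject and bail out
```

Note that pinning the hash yourself sidesteps trusting the mirror; the delegated-to-PyPI variant mentioned above would additionally need the signature chain to be worth anything.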