Author: Thomas Keller
Simple script to build Dovecot with sieve / managesieve support
One important principle in software engineering is DRY – Don't Repeat Yourself. This can of course also be applied to tasks a system administrator has to do over and over again. One of these tasks for me was recently getting a working Dovecot installation with sieve support.
Unfortunately the authors of Dovecot and its sieve plugin did not make the process of getting there particularly easy, but it is at least straightforward enough to automate. I'd still rather use packaged debs, but I could not find any (stock Debian Lenny debs of course do not support it), so for now the process for me is as easy as picking up the new version strings, editing the following script and hitting run. I hope it's useful for somebody else:
#!/bin/bash
DOVECOT_VERSION=1.2.3
SIEVE_VERSION=0.1.11
MANAGESIEVE_VERSION=0.11.8
BUILDDIR="$(dirname "$(readlink -f "$0")")/build"
rm -rf "$BUILDDIR"
mkdir "$BUILDDIR"
cd "$BUILDDIR"
#
# step 1: fetch dovecot, patch it with managesieve and build it
#
echo "fetching, patching, building and installing dovecot-$DOVECOT_VERSION"
wget -qO - "http://www.dovecot.org/releases/${DOVECOT_VERSION:0:3}/dovecot-$DOVECOT_VERSION.tar.gz" |
tar -xzf -
cd dovecot-$DOVECOT_VERSION
wget -qO - "http://www.rename-it.nl/dovecot/${DOVECOT_VERSION:0:3}/dovecot-$DOVECOT_VERSION-managesieve-$MANAGESIEVE_VERSION.diff.gz" |
gzip -dc | patch -p1 >/dev/null
./configure --enable-header-install >>"$BUILDDIR/build.log"
make install >>"$BUILDDIR/build.log"
cd "$BUILDDIR"
#
# step 2: fetch and build dovecot-sieve
#
echo "fetching, building and installing dovecot-sieve-$SIEVE_VERSION"
wget -qO - "http://www.rename-it.nl/dovecot/${DOVECOT_VERSION:0:3}/dovecot-${DOVECOT_VERSION:0:3}-sieve-$SIEVE_VERSION.tar.gz" |
tar -xzf -
cd dovecot-${DOVECOT_VERSION:0:3}-sieve-$SIEVE_VERSION
./configure --with-dovecot="$BUILDDIR/dovecot-$DOVECOT_VERSION" >>"$BUILDDIR/build.log"
make install >>"$BUILDDIR/build.log"
cd "$BUILDDIR"
#
# step 3: fetch and build dovecot-managesieve
#
echo "fetching, building and installing dovecot-managesieve-$MANAGESIEVE_VERSION"
wget -qO - "http://www.rename-it.nl/dovecot/${DOVECOT_VERSION:0:3}/dovecot-${DOVECOT_VERSION:0:3}-managesieve-$MANAGESIEVE_VERSION.tar.gz" |
tar -xzf -
cd dovecot-${DOVECOT_VERSION:0:3}-managesieve-$MANAGESIEVE_VERSION
./configure --with-dovecot="$BUILDDIR/dovecot-$DOVECOT_VERSION" \
--with-dovecot-sieve="$BUILDDIR/dovecot-${DOVECOT_VERSION:0:3}-sieve-$SIEVE_VERSION" >>"$BUILDDIR/build.log"
make install >>"$BUILDDIR/build.log"
cd "$BUILDDIR"
echo "all done - build log is available in $BUILDDIR/build.log"
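Once the script has run, Dovecot still needs to be told to use the new pieces. As a rough, minimal sketch – the protocols line and the sieve script path below are assumptions for a typical Dovecot 1.2 setup, adjust them to your installation – the relevant bits of dovecot.conf could look like this:
# enable the ManageSieve service next to IMAP
protocols = imap imaps managesieve
protocol lda {
  # load the sieve plugin for local delivery
  mail_plugins = sieve
}
plugin {
  # per-user sieve script (assumed default location)
  sieve = ~/.dovecot.sieve
}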
Surveillance is cool… ahem…
The folks from AK Vorrat Leipzig have put some great videos online which work far better than the recently botched anti-election ad by polit-digital.de / probono.de (even though its source material was really very good):
All the videos are available on the brand-new YouTube channel of AK Vorrat Leipzig. Spreading them further is expressly encouraged!
A bomb-building manual
The long-awaited tightening of criminal law for attending terror camps and publishing bomb-building manuals on the internet is finally here, now that, shortly before the summer recess, the Bundesrat has also given its approval. So that the gentlemen from the BKA and LKA know what to hunt for, here is a little example manual from me:
Stupid boy…
Is cloud gaming ecological?
I've recently stumbled upon a couple of new companies like OnLive or Gaikai (demo) whose primary business model is to stream video games hosted in huge server farms (the "clouds") over broadband networks to everyone's low-powered home computer. This business model makes me think, and not only when I remind myself that today's video platforms like YouTube already take a huge piece of global bandwidth usage (somebody calculated this for YouTube last year, before they started high-quality video streaming, and estimated that they stream about 126 petabytes a month).
No, it also makes me think about ecological issues. Let us compare the likely energy consumption of a "traditional" gamer with that of a (possible) future online gamer who uses one of these services. I won't and can't give you detailed numbers here, but you can probably see where I am heading if you have read Saul Griffith's game plan – it's all about getting the full picture of things.
Let's start with the traditional gamer, who has a stationary PC or laptop with a built-in 3D graphics card, processor and sound system. When he plays video games, all of these components are very busy: the CPU calculates the AI and game logic, the graphics card processes the pixel and vertex shaders, rendering billions of pixels, vertices and polygons every second into a digital image stream, which is then sent to the user's monitor at the highest possible frame rate. A sound system outputs the game's voices, music and sound effects with the help of the computer's built-in sound card. As I said, I can't give you a final number here – every setup differs a little from the next – but you can probably get an idea how much power even an average gamer setup draws: several hundred watts.
How does the online gamer compare to that? Well, at first glance it looks good. The only things this gamer's computer has to process are video and sound, and the video actually only has to be decoded from a regularly encoded digital format. Most PCs, even with lower clock rates, will be able to accomplish this task. The sound will, by today's standards, probably be simple stereo, so no need for a dedicated sound processor or a big sound setup either. I'd guess the usual consumption of this setup would be less than one hundred watts. Sounds great? Maybe, but maybe not.
The thing is that the video signal itself has to be generated first – on a high-end machine or "cloud" of computers. This means the graphics and CPU power consumption has moved from the "client" – the gamer's PC – to a "server" component; it did not simply vanish. It is no longer a single computer consuming energy to let the user play, but potentially a huge park of servers. And the parts of that server park which process the game's contents need extra power. I don't know how much, but I bet it won't be little.
OK, you might say server farms are better suited for these kinds of tasks, because virtualizing these computing-intensive workloads means several server instances could run in parallel and therefore use their power more efficiently… But wait, what gets virtualized here is not a simple web server idling most of the time; we're speaking of game virtualization. Remember how the single user's PC was under full load while computing the game's contents? And how much can the code of a game written to run on a single PC really be virtualized and parallelized? Does each of these online gaming clients need dedicated hardware in the end…?
Now, let's assume the services managed to work around these problems somehow smartly – the online gamer's power consumption footprint has of course already risen, because we learned that his video signal first needs to be created somewhere else, which might cost a lot of power. But we're still not there – the signal is still in the "cloud" – and it's huge! Uncompressed true-color video, even at the – by today's standards – low resolution of 1024 by 768 pixels, takes about 75 megabytes per second for a smooth experience! Hell, if I get a 1 MB/s download rate today I'm already happy…
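For the record, the 75 MB/s figure is just the following back-of-the-envelope calculation, assuming 24-bit true color and roughly 32 frames per second:
# uncompressed bandwidth = width * height * bytes per pixel * frames per second
echo $(( 1024 * 768 * 3 * 32 ))   # 75497472 bytes, i.e. about 75 MB per second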
So, of course the video signal needs to be compressed. While the later decompression is not that costly, the compression is – especially for real-time video – and it takes lots of processing power and a very good codec like H.264. Special, dedicated hardware might do this task faster than the average Joe's PC components, but that hardware still needs extra power, which we have to consider.
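Just to illustrate what such a real-time encode involves, here is a sketch using the stock ffmpeg tool – the raw input file, frame rate, bitrate and client address are made-up placeholders, and the ultrafast/zerolatency settings deliberately trade compression efficiency for encoding speed:
# grab a raw 1024x768 frame stream and push it out as low-latency H.264
ffmpeg -f rawvideo -pix_fmt rgb24 -s 1024x768 -r 32 -i game_frames.raw \
    -c:v libx264 -preset ultrafast -tune zerolatency -b:v 5M \
    -f mpegts udp://client.example.com:1234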
Are we done with the online gamer? Well, almost: the video signal is created, compressed and ready for transport, but it hasn't been transported yet. We need the good old internet for this, sending a constant, huge stream of packets over dozens of hops and myriads of cables to him. All the extra hardware needed for this extra network load again draws hundreds, if not thousands, of watts. Not exclusively for this purpose, of course, but the share of bandwidth and power these components consume surely differs between browsing a website, listening to online radio, and streaming a full-screen video.
As I said multiple times, I can't give you any detailed numbers, but I have the really, really bad feeling that the whole idea of game virtualization is just a big, fat waste of resources – especially energy.
Von der Leyen at the Leipziger Gespräch
…and finally, after about an hour of partly dull talk, I was able to draw attention to the topic of web blocking with my audience question, and things took their course. Unfortunately, some of the "Zensursula" opponents from our circles found it hard to continue the dialogue in a matter-of-fact way, and their bawling left the audience more disgusted (with us) than encouraged to learn and question more. Afterwards we were at least able to have a few conversations and hand out our informational leaflet – without the screeching fools, though, it would probably have been an even bigger success.
Links: short report by FeFe, short report by metafrog, audio from MDR Info, video
Double standards
Spiegel recently reported that the federal government refused to grant state aid or emergency loans to Arcandor, the ailing parent company of Karstadt. By contrast, the money for the HRE bailout flowed unchecked last autumn and still does today. A bank is simply of systemic importance; a department-store conglomerate with tens of thousands of employees apparently is not.
Just so we don't misunderstand each other: I am just as much against taxpayers paying for management failures, but somehow I can't shake the feeling that double standards are being applied here. Could it be that federal, state and local governments (= the public sector) had simply stockpiled too many worthless HRE shares, so that the bank had to be rescued quickly "in any case" to keep some public budgets from going insolvent…?
Too bad – a missed chance to finish off a bank without a sound business model; let's just let it keep mismanaging. The next bull run on the US stock market is just beginning – the guys at HRE can surely speculate along nicely again…
It's also a pity that this government measure will now drive even more people into extremist camps at the federal election. On the street, at any rate, hardly anyone will understand the decision.
Home
A must-see for everyone – especially those from the Western world. No strings attached, the movie speaks for itself.
Fundamental Rights Festival of the AK Vorrat in the Johannapark
The Arbeitskreis Vorratsdatenspeicherung Leipzig invites you to its Fundamental Rights Festival on Saturday, May 23, 2009, in the Johannapark in Leipzig. Between 10 a.m. and 4 p.m. the 60th anniversary of the Grundgesetz will be celebrated with a tasty open-air picnic. On the side, you can use the opportunity to learn about the current state of our fundamental, information and civil rights…
Further details can be found on the planning page for the event; help is still needed in some areas. Thanks to Johannes, Stina, Christian and all the others who are organizing the festival!
Well then, see you on Saturday in the Johannapark!