Faster than Bresenham’s Algorithm?


There’s always a number of graphics primitives you will be using in all kinds of projects. If you’re lucky, you’re working on a “real” platform that comes with a large number of graphics libraries offering abstractions over the graphics primitives of the OS and, ultimately, of its hardware. However, it’s always good to know what’s involved in a particular graphics primitive and how to recode it yourself should you need to, either because you have no library to help you, or because it would badly contaminate your project to include a library only to draw, say, a line within an image buffer.


Lines are something we draw a lot. Perfectly horizontal or vertical lines have very simple algorithms. Arbitrary lines are a bit more complicated to get just right. This week, let us take a look at a few algorithms to draw lines. First, we’ll discuss a naïve algorithm using floating point. We’ll also have a look at Bresenham’s algorithm, which uses only integer arithmetic. Finally, we’ll show that we can do better than Bresenham if we use fixed-point arithmetic.
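To make the comparison concrete, here is a sketch in Python of the two integer-only approaches (the function names, and the choice of a 16.16 fixed-point format, are mine, for illustration): a standard all-octant Bresenham rasterizer next to a fixed-point DDA that replaces the error term with a precomputed fixed-point slope. For gentle slopes the two produce the same pixels.

```python
def bresenham(x0, y0, x1, y1):
    """All-octant Bresenham: integer arithmetic only."""
    points = []
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy                  # running error term
    while True:
        points.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:               # step in x
            err += dy
            x0 += sx
        if e2 <= dx:               # step in y
            err += dx
            y0 += sy
    return points


def fixed_point_line(x0, y0, x1, y1, frac_bits=16):
    """Fixed-point DDA in 16.16 format; assumes x0 < x1 and 0 <= slope <= 1."""
    slope = ((y1 - y0) << frac_bits) // (x1 - x0)   # slope as fixed point
    y = (y0 << frac_bits) + (1 << (frac_bits - 1))  # add 0.5 so >> rounds to nearest
    points = []
    for x in range(x0, x1 + 1):
        points.append((x, y >> frac_bits))          # integer part is the pixel row
        y += slope
    return points
```

The fixed-point version trades Bresenham’s per-pixel conditional error update for one addition and one shift per pixel, which is where the speed advantage claimed in the title would come from; handling the remaining octants is a matter of swapping axes and step signs.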


Suggested Reading: The Complete Manual of Typography


James Felici — The Complete Manual of Typography: A guide to setting perfect type — Adobe Press, 361 pp. ISBN 978-0-321-12730-3

(Buy at Amazon)

With this very well-written book, Felici introduces the main concepts of computer-assisted typography—the only kind that still exists. A large part of the book is devoted to the technological aspects of typesetting, such as software and font formats like TrueType or OpenType, but the most interesting part presents the language of typefaces (and not fonts—there’s a fundamental difference) and how a careful use of typefaces makes a text not only beautiful but easy to read.

This book is to typography what the Gang of Four’s Design Patterns is to computer engineering. You will read many things and realize that you already knew them, if only intuitively, but now you have an extended vocabulary to designate these precise, well-delimited concepts.

Checksums (part I)


I once worked in a company specializing in embedded electronics for industrial applications. In one particular project, the device communicated with the computer through an RS-422 cable and seemed to return weird data once in a while, causing unwanted behavior in the control computer, whose programming did not provide for this unexpected data. So I took it upon myself to test the communication channel, as it seemed that the on-board software was operating properly and did not contain serious bugs. I added a checksum to the data packet, and it turned out that some packets did indeed come in corrupted, despite the supposedly superior electrical characteristics of the RS-422 link.

After a few days’ work, I had implemented a communication protocol that could detect and repair certain errors, reverting to a request to retransmit if the data was too damaged. I then started gathering statistics on error rates, number of retransmits, etc., and the spurious behavior on the controller’s side went away. My (metaphorically) pointy-haired boss opposed the modification because “we didn’t have any of these damn transmission errors until you put your fancy code in there.” Of course, this was an epic facepalm moment. I tried to explain that the errors had always been there, except that now they were caught and repaired. Needless to say, it ended badly.


Notwithstanding this absurd episode, I kept using checksums to validate data whenever no other layer of the protocol took care of transmission errors. So, this week, let us discuss checksums and other error detection algorithms.
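As a minimal illustration of the idea (not the actual protocol from the story, whose packet layout I don’t know), here is a sketch in Python: append a one-byte additive checksum to each packet and recompute it on reception.

```python
def checksum(payload: bytes) -> int:
    """One-byte additive checksum: sum of all payload bytes, modulo 256."""
    return sum(payload) & 0xFF

def make_packet(payload: bytes) -> bytes:
    """Append the checksum byte to the payload before transmission."""
    return payload + bytes([checksum(payload)])

def verify_packet(packet: bytes) -> bool:
    """Recompute the checksum over the payload and compare with the received byte."""
    payload, received = packet[:-1], packet[-1]
    return checksum(payload) == received
```

An additive checksum like this catches any single corrupted byte but misses, for example, two bytes swapped in transit; stronger codes such as CRCs do better, which is exactly the territory of the “other error detection algorithms” mentioned above.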


Linux Symposium 2009: Finalization


The Linux Symposium ended on Friday (yesterday, that is), and I now post our paper. I’ve added it to my publication page, but you can get the PDF of the paper as well as the slides in OpenOffice 3.0 format. The slides are really nothing fancy, but they’re to the point, I think.

I suppose the complete proceedings will appear sometime soon on the Symposium’s Archive page. I think some of the links are broken.

* *

The Symposium is really not what I expected, and it’s all for the better. I thought I would get more “OpenSource or Die!!!1!” talks, but most of the talks were down-to-earth in a very sane way; some were technically deep, others left the real goodies out, but all were interesting in their own way—not all topics interested me, of course. I think I will also attend next year.

Linux Symposium 2009: Day 4


I wanted to attend Load Balancing Using Free Software, by Mathieu Trudel, a special last-minute talk, but I couldn’t. Unfortunately, his paper did not make it into the proceedings, so I’ll have to ask him whether he has one.

I also got the proceedings, and I must say that I am generally pleased with the breadth of subjects as well as the quality of the papers. Makes for good reading. It took me about an hour to read the abstracts and skim through the papers to decide which I will be reading in depth.

* *

While we were discussing the symposium over coffee this morning, my friend Christopher ended up calling it the Linux synopsium for some reason known only to himself.

Linux Symposium 2009: Day 3


I did not attend yesterday (Day 2) either. However, this morning François-Denis delivered a most cromulent speech despite being quite nervous. There were a couple of very interesting talks, like the one on the current state of kernel development, but I must say that the talk about GStreamer ported to the DaVinci DSP series was a bit disappointing, as all the tasty technical details of using SIMD on the DSP for video processing were left out as proprietary—so much for Open Source!

Another thing that quite surprised me is that they had no coffee! How can hackers do anything without coffee? But to do the organizers justice, they have otherwise been very concerned with their guests’ comfort: wireless access was very good, there were water pitchers in every room, and they were quite quick to intervene should some technical difficulty arise.

S for Sneaky


Ensuring that one’s customer base remains loyal, also known as lock-in, is an important part of many software and hardware manufacturers’ business plans. Recently, I came across an especially displeasing example of a sneaky and subtle customer lock-in strategy from our friends at Microsoft.

Sneaky Cat is Sneaky


Linux Symposium 2009: Day 1


The Linux Symposium 2009 opened today in Montréal. Not much excitement so far, as today was a day filled with tutorials and unfortunately I could not attend. Still got my OLS’09 burgundy T-shirt, though. I am also quite looking forward to Wednesday morning, when my co-author and good friend François-Denis (blog) will be presenting our paper on non-privileged user package management. I’ll be putting the paper on my publication page (with the presentation’s slides) sometime next week.

Give it away, Give it away, Give it away now


Old computers are not always ready for the scrap pile the second you don’t have any use for them. Of course there’s always recycling—your local area most certainly has a computer and electronics recycling facility—but there are better things to do with your old computers, provided they’re still functional and usable.

A Commodore c64sx. Photo © Erik S. Klein


Suggested Reading: Infotopia: How Many Minds Produce Knowledge


Cass R. Sunstein — Infotopia: How Many Minds Produce Knowledge — Oxford University Press, 2006, 273 pp. ISBN 978-0-19-534067-9

(Buy at Amazon)

In this short book, Cass Sunstein explains how collaboration and deliberation processes affect the propagation and use of information (especially in reaching a decision) within a group.
