December 23, 2014
In my discrete mathematics class, I often use the Fibonacci rabbits example: here to show how to solve a recurrence, there for a variant where some rabbits go away, here again for rewrite systems.

What are rewrite systems? Not unlike context-free grammars, they provide rules that generate “words” in a “language”. It turns out the Fibonacci rabbit population problem is super-easy to model as a rewrite system.


Leave a Comment » | algorithms, Mathematics, theoretical computer science | Tagged: Fibonacci, mitosis, Rabbits, rewrite systems | Permalink

Posted by Steven Pigeon

December 27, 2011
In a previous installment, about filtering noise, we discussed how to use a moving average to even out irregularities in a sequence. Averaging over a window is tricky: first, the window size must make sense relative to the speed at which the signal changes (in samples); second, the average is overly sensitive to outliers.

One way to limit the influence of the outliers for denoising is to use the median. However, computing the median is usually trickier than computing the average, or mean, and this first post (in a series of three, over the next few weeks) discusses how to compute the median efficiently using the *selection* algorithm.


4 Comments | algorithms, C, C-plus-plus, data structures, programming, theoretical computer science | Tagged: Lomuto, median, QuickSort, selection, Wirth | Permalink

Posted by Steven Pigeon

March 16, 2010
QuickSort, due to Hoare, is one of the best comparison-based sorting algorithms around. There are a number of variations, but the vanilla QuickSort does a great job… most of the time.

Indeed, there are occasions where QuickSort leaves its expected O(n log n) run-time to reach O(n²) run-time, its worst case. But what does it take to push QuickSort into its worst behaviour?


4 Comments | algorithms, hacks, Mathematics, programming, rants, theoretical computer science, wtf | Tagged: bubble sort, C, comparison sort, computational complexity, evil, Hoare, malicious data, QuickSort, sort, sorts, Worst Case, Worst Case Behavior | Permalink

Posted by Steven Pigeon

March 7, 2010
Ulrich Meyer, Peter Sanders, Jop Sibeyn (eds.) — *Algorithms for Memory Hierarchies* — Springer, (Lecture Notes in Computer Science, LNCS #2625), 2003, 428 pp. ISBN 978-3540-00883-5

(Buy at Amazon.com)

This book is a collection of chapters on various topics pertaining to memory hierarchies and their algorithms, written by several different authors without any particular uniformity of tone or topic. That's OK, though, because it gives the reader a broad view of algorithms for memory hierarchies.


Leave a Comment » | algorithms, data structures, Mathematics, Operating System, programming, Suggested Reading, theoretical computer science | Permalink

Posted by Steven Pigeon

March 7, 2010
Peter Brass — *Advanced Data Structures* — Cambridge University Press, 2008, 492 pp. ISBN 978-0521-88037-4

(Buy at Amazon.com)

The first part of the book concentrates on search trees and their variants, whether balanced trees, interval trees, or heaps. Other chapters are dedicated to connected components and similar algorithms, one to algorithms for strings, and one to hash tables. Appendices on computation, cache-oblivious algorithms, etc., follow.


1 Comment | algorithms, data structures, Mathematics, Object Oriented Programming, programming, Suggested Reading, theoretical computer science | Permalink

Posted by Steven Pigeon

November 24, 2009
There’s a theorem that, although its formulation is trivial, is of paramount importance to many things, including data compression. I’m talking about the frivolous theorem of arithmetic, of course. The theorem takes many forms, one of which is:

Almost all natural numbers are very, very, very large.

In other words, there are a lot more big numbers than there are small numbers. Of course, this is trivially self-evident, but this trivial theorem can serve as a brutal reality check for many hypotheses. For example, one can use the frivolous theorem of arithmetic to disprove the existence of a lossless data compression method that compresses *all* inputs to smaller bit strings.


8 Comments | data compression, Mathematics, theoretical computer science | Tagged: bit strings, common sense, counting argument, data compression, frivolous theorem of arithmetic, numbers, pidgeonhole principle | Permalink

Posted by Steven Pigeon

October 13, 2009
Programmers aren’t always the very rational beings they like to believe they are. Very often, we close our eyes and make decisions based on what we *think* we know, and on what we have been told by more or less reliable sources. For example, choosing red-black trees over AVL trees because "they are faster", while not being able to quite justify in detail why it must be so. Programming based on this kind of decision is what I call *cargo cult programming*.

Originally, I wanted to talk about red-black *vs.* AVL trees and how they compare, but instead I’ll talk about the STL `std::map`, which is implemented using red-black trees in G++ 4.2, and `std::unordered_map`, a hash-table-based container introduced in TR1.


3 Comments | algorithms, data structures, Design, hacks, Mathematics, programming, theoretical computer science | Tagged: algorithmics, assumptions, AVL tree, balancing tree, C, cargo, cargo cult, complexity, hash table, magic, magic thinking, red-black tree, Scrabble, Splay Tree, std::map, std::unordered_map, stl, TR1 | Permalink

Posted by Steven Pigeon