Serializing Trees

September 6, 2016

Quite a while ago I discussed storing a tree in a simple flat array, using only address calculations to find a node’s children and parent. The trick wasn’t new (I think it’s due to Williams, 1964 [1]), but it is only really practical for heaps, or otherwise very well balanced trees. If you have a misshapen tree, the trick doesn’t help you, and it doesn’t help you either when you want to serialize that misshapen tree to disk.
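
For reference, the address calculations in question are just index arithmetic on the array; in the usual 0-based formulation they look like this (a quick sketch, not the original post’s code):

    #include <cstddef>

    // implicit tree layout: a node's relatives are found by index arithmetic
    // on the array instead of through pointers (0-based indexing)
    inline std::size_t left_child (std::size_t i) { return 2 * i + 1; }
    inline std::size_t right_child(std::size_t i) { return 2 * i + 2; }
    inline std::size_t parent     (std::size_t i) { return (i - 1) / 2; }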

But what if we do want to serialize arbitrarily-shaped trees to disk? Is it painful? Fortunately, no! Let’s see how.

Making a good random table

April 5, 2016

I am still experimenting with hash functions, and I was toying with the Zobrist hash function [1], best known for its use in chess engines. The hash function is conceptually simple: you need a large table of random numbers, indexed, in a chess application, by the piece and by its position on the board. To compute the hash of a whole board configuration, you simply xor the corresponding random numbers together. The hard part is choosing the random numbers.
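
As a refresher, the mechanics look roughly like this; the piece and square encodings, the table dimensions, and the use of std::mt19937_64 with an arbitrary seed are placeholders of mine, since choosing the numbers well is precisely the point of the post:

    #include <array>
    #include <cstdint>
    #include <random>
    #include <utility>
    #include <vector>

    // one 64-bit random number per (piece, square) pair; the hash of a position
    // is the xor of the numbers of its occupied squares
    struct zobrist_table
    {
        static const int pieces  = 12;    // 6 piece types x 2 colors
        static const int squares = 64;
        std::array<std::array<std::uint64_t, squares>, pieces> t;

        explicit zobrist_table(std::uint64_t seed)
        {
            std::mt19937_64 gen(seed);    // stand-in generator: the post is about doing better
            for (auto & row : t)
                for (auto & x : row)
                    x = gen();
        }
    };

    // position given as (piece, square) pairs
    std::uint64_t zobrist_hash(const zobrist_table & z,
                               const std::vector<std::pair<int,int>> & position)
    {
        std::uint64_t h = 0;
        for (const auto & ps : position)
            h ^= z.t[ps.first][ps.second];   // xor the number for each piece on its square
        return h;
    }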

Storage size from bits

March 29, 2016

Last week, we had a look at the computation of Log2 using templates and constexpr. Of course, I had ulterior motives. In particular, I was interested in allocating just the right number of bits for a field in a bit field but, rather than hard-coding that width, having it deduced from a template argument. Let’s see how we can do that.
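
The kind of thing I mean, sketched with made-up names and under the assumption that we want just enough bits to hold the values 0 to Max:

    #include <cstddef>
    #include <cstdint>

    // number of bits needed to hold the values 0..max
    constexpr std::size_t bits_needed(std::uint64_t max)
    {
        return max ? 1 + bits_needed(max >> 1) : 0;
    }

    // a bit field whose width is deduced from the template argument
    template <std::uint64_t Max>
    struct counter
    {
        std::uint64_t value : bits_needed(Max);
    };

    static_assert(bits_needed(100) == 7, "values 0..100 fit in 7 bits");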

Log2 (with C++ metaprogramming)

March 22, 2016

C++ meta-programming is a powerful tool. Not only can you use it to build generic types (such as the STL’s std::list), you can also use it for compile-time evaluation. Let’s have a look at a simple problem that can be solved in two very different ways: computing the Log base 2 of an integer.
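
Both flavors fit in a few lines; here is a sketch of the two approaches (not necessarily the exact code of the post):

    #include <cstdint>

    // template version: recursive instantiation down to the base case
    template <std::uint64_t N>
    struct log2_t { static const std::uint64_t value = 1 + log2_t<N / 2>::value; };

    template <>
    struct log2_t<1> { static const std::uint64_t value = 0; };

    // constexpr version: a plain, compile-time evaluable function
    constexpr std::uint64_t log2_c(std::uint64_t n)
    {
        return n <= 1 ? 0 : 1 + log2_c(n / 2);
    }

    static_assert(log2_t<64>::value == 6, "both give floor(log2(n))");
    static_assert(log2_c(1000) == 9, "2^9 <= 1000 < 2^10");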

Rotating Arrays (part II)

February 16, 2016

Last week we had a look at Kernighan’s algorithm to rotate an array by k positions and concluded that it might not be optimal, as each element was moved twice. This week, we’ll have a look at another algorithm that still moves some items more than once but that, overall, does fewer than two exchanges per item.
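
One classic algorithm with that property is the cycle-following (“juggling”) rotation, which writes each element once plus one saved temporary per cycle; whether or not it is the algorithm developed in the post, it gives the flavor:

    #include <cstddef>
    #include <vector>

    static std::size_t gcd(std::size_t a, std::size_t b)
    {
        while (b) { std::size_t t = a % b; a = b; b = t; }
        return a;
    }

    // left-rotate v by k positions by following the gcd(n,k) cycles:
    // each element is written once, plus one saved temporary per cycle
    void rotate_left(std::vector<int> & v, std::size_t k)
    {
        const std::size_t n = v.size();
        if (n == 0 || (k %= n) == 0) return;

        for (std::size_t start = 0, cycles = gcd(n, k); start < cycles; ++start)
        {
            int saved = v[start];
            std::size_t cur = start;
            for (;;)
            {
                std::size_t next = (cur + k) % n;
                if (next == start) break;
                v[cur] = v[next];       // pull the next element of the cycle into place
                cur = next;
            }
            v[cur] = saved;             // close the cycle
        }
    }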

Rotating Arrays (Part I)

February 9, 2016

To “rotate” an array k positions to the left (or to the right, it doesn’t really matter), we could repeat a shift by one position k times, using only one temporary variable. This method doesn’t use much auxiliary memory, but it is inefficient: it does kn copies on an array of size n. Surely we can do better.
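
In code, the naive method is just k single-position shifts (a sketch):

    #include <cstddef>
    #include <vector>

    // the naive rotation: k single-position shifts, one temporary, about kn copies
    void rotate_left_naive(std::vector<int> & v, std::size_t k)
    {
        const std::size_t n = v.size();
        if (n == 0) return;
        for (std::size_t s = 0; s < k % n; ++s)
        {
            int first = v[0];                    // the single temporary variable
            for (std::size_t i = 0; i + 1 < n; ++i)
                v[i] = v[i + 1];                 // shift everything one position left
            v[n - 1] = first;
        }
    }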

Selection, Revisited.

February 2, 2016

When we think of searching, we generally think of searching a value in a sorted collection of some sort. For a simple array, this implies the array is sorted and that we use binary search or a variant. But what if we want to search by rank? In a sorted array, that’s not very hard: the kth item is in slot k. But what if the array is not sorted?
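
For scale, the standard library already offers this operation: std::nth_element finds the kth smallest element of an unsorted range in expected linear time. Here is merely the problem stated in code (the post’s own approach may differ):

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // kth smallest element of an unsorted array, without sorting it completely;
    // std::nth_element partially reorders the (copied) array in expected linear time
    int kth_smallest(std::vector<int> v, std::size_t k)   // taken by value: it gets reordered
    {
        std::nth_element(v.begin(), v.begin() + k, v.end());
        return v[k];
    }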
