A Suspicious Series

December 30, 2014

Does the series

\displaystyle \sum_{k=1}^\infty \frac{\sin k}{k}

converge?

At first, you may be reminded of the harmonic series, which diverges, because the divisor k follows the same progression, and you may conclude that this suspicious series also diverges because its terms do not go to zero fast enough. But first we need to investigate how the \sin k part behaves.
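As a quick numerical teaser (my own sketch, not part of the post's argument), here is a minimal Python snippet that prints a few partial sums so you can see whether they appear to settle down; the function name is mine.

import math

def partial_sum(n):
    # n-th partial sum of sin(k)/k for k = 1..n
    return sum(math.sin(k) / k for k in range(1, n + 1))

# Watch the partial sums at increasing n.
for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, partial_sum(n))

Of course, a numerical experiment is not a proof; whether the apparent limit is real is exactly the question the post answers.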



Lissajous Curves

December 9, 2014

Many of this blog’s entries seem … random and unconnected. This one is no exception, even though it is quite connected to some research I’m presently conducting. This week, we discuss Lissajous curves.

[Figure: a Lissajous curve]

We’ll see the formulas, and how to select “nice” parameters.
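To fix ideas before the details, here is a minimal sketch (mine, with example parameters rather than the post's) of the standard parametrization x(t) = \sin(at + \delta), y(t) = \sin(bt):

import math

# Standard Lissajous parametrization; a = 3, b = 2, delta = pi/2 are
# example values, not necessarily the "nice" parameters discussed in the post.
a, b, delta = 3, 2, math.pi / 2
points = [(math.sin(a * t + delta), math.sin(b * t))
          for t in (2 * math.pi * i / 1000 for i in range(1001))]
# 'points' now holds one full trace of the curve, ready to hand to a plotter.

When the frequency ratio a/b is rational, as here, the curve closes on itself after one common period.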



Trigonometric Tables Reconsidered

February 28, 2012

A couple of months ago (already!) 0xjfdube produced an excellent piece on table-based trigonometric function computations. The basic idea was that you can look up the value of a trigonometric function rather than actually computing it, on the premise that computing such functions directly is inherently slow. Turns out that’s actually the case, even on fancy CPUs. He discusses the precision of the estimate as a function of the table size and shows that you don’t need a terribly large number of entries to get decent precision. He muses over the possibility of using interpolation to improve precision, speculating that it might be slow anyway.
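To make the lookup idea concrete, here is a minimal sketch under my own assumptions (Python, nearest-entry lookup over one period); it is not 0xjfdube's code, just the general shape of the approach:

import math

TABLE_SIZE = 1024
# Precompute sin over one full period, one entry per slot.
SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def table_sin(x):
    # Nearest-entry lookup: the error is bounded by the table step,
    # so doubling TABLE_SIZE roughly halves the worst-case error.
    i = int(round(x / (2 * math.pi) * TABLE_SIZE)) % TABLE_SIZE
    return SIN_TABLE[i]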

I started thinking about how to interpolate efficiently between table entries, but then I realized that computing a polynomial approximation of the function directly is not fundamentally more complicated than searching the table and then interpolating between entries.
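As a point of comparison, here is the simplest possible stand-in for the direct-polynomial route, a truncated Taylor series in nested form; a serious implementation would do argument reduction first and would likely fit a minimax polynomial instead:

def poly_sin(x):
    # Degree-7 Taylor polynomial for sin about 0, evaluated in Horner-like form.
    # Accurate only for x near 0; it replaces the table search entirely.
    x2 = x * x
    return x * (1 - x2 / 6 * (1 - x2 / 20 * (1 - x2 / 42)))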



Sohcahtoa!

February 9, 2010

Mathematics can ask you to remember things that have no obvious connection to common sense, either because they are arbitrary (say, the name of a function relative to what it computes) or because you haven’t quite figured out all the details yet. In either case, a few mnemonics are always useful!
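For the record, the mnemonic of the title unpacks, in a right triangle, as

\displaystyle \sin\theta=\frac{\text{opposite}}{\text{hypotenuse}},\quad \cos\theta=\frac{\text{adjacent}}{\text{hypotenuse}},\quad \tan\theta=\frac{\text{opposite}}{\text{adjacent}}.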



Deriving the 1 bit = 6 dB rule of thumb

December 9, 2008

This week, a more mathematical topic. Some time ago, friends and I were discussing the fidelity of various signals, and how many bits were needed for an optimal digitization of a signal, given known characteristics such as its spectrum and signal-to-noise ratio.

Indeed, at some point, adding bits only adds more power to represent the noise in the signal. There’s a rule of thumb that says that for every bit you add, you can represent a signal with \approx 6 dB more signal-to-noise ratio. Let me show you how to derive such a result.
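To anticipate the punchline without spoiling the full derivation, the usual back-of-the-envelope version is that each extra bit doubles the number of quantization levels, and a factor of two in amplitude is

\displaystyle 20\log_{10} 2 \approx 6.02\ \text{dB},

hence roughly 6 dB of signal-to-noise ratio per bit.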
