
Posts tagged with sequences

Count.





One

Counting is generated by the programmer’s successor function ++ and the number one. (You might argue that to get out to infinity also requires repetition. Well, every category comes with composition by default, which includes composition of ƒ∘ƒ∘ƒ∘….)
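Here is a toy sketch in Python of that successor-plus-composition picture (the function names are my own, nothing standard):

    # Toy illustration: counting = the number one, plus repeated
    # composition of the successor function ("++").

    def successor(n):
        return n + 1               # the programmer's ++

    def compose(f, g):
        return lambda x: f(g(x))   # f∘g

    def iterate(f, k):
        """k-fold composite f∘f∘…∘f (identity when k = 0)."""
        g = lambda x: x
        for _ in range(k):
            g = compose(f, g)
        return g

    # Applying the k-fold composite of ++ to 1 reaches every counting number.
    print([iterate(successor, k)(1) for k in range(5)])   # [1, 2, 3, 4, 5]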

But getting to one is nontrivial. Besides the mystical implications of 1, it’s not always easy to draw a boundary around “one thing”. Looking at snow (without the advantage of modern optical science) I couldn’t find “one snow”. Even where it is cut off by a plowed street it’s still from the same snowfall.
a larger ‘thing’ with holes in it ... like the snow has ‘road holes’ in it
And if you got around on skis a lot of your life you wouldn’t care about one snow-flake (a reductive way to define “one” snow), at least not for transport, because one flake amounts to zero ability to travel anywhere. Could we talk about one inch of snow? One hour of snow? One night of snow?

http://2.bp.blogspot.com/_sf5B7n5Avcg/THFKAnY0TCI/AAAAAAAAAmI/x9E4slMs0uQ/s1600/IMGP1553.JPG

Speaking of the cold, how about temperature? It has no inherent units; all of our human scales pick endpoints and define a continuum in between. That’s the same as in measure theory, which gave (along with martingales) at least an illusion of technical respectability to the science of chances. If you use Kolmogorov’s axioms then the difficult (impossible?) questions—what the “likelihood” of a one-shot event (like a US presidential election) actually means or how you could measure it—can be swept under the rug whilst one computes random walks on trees or Gaussian copulæ. Meanwhile the sum-total of everything that could possibly happen, Ω, is called 1.
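For reference (this is the standard textbook statement, not anything special to this post), Kolmogorov’s axioms just declare a non-negative, countably additive measure P and then normalise the whole sample space to one:

\[
P(\Omega) = 1, \qquad P(A) \ge 0, \qquad
P\Big(\bigcup_{i} A_i\Big) = \sum_{i} P(A_i) \ \text{for pairwise disjoint } A_i .
\]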

With water or other liquids as well. Or gases. You can have one grain of powder or grain (granular solids can flow like a fluid) but you can’t have one gas or one water. (Well, again you can with modern science—but with even more moderner science you can’t, because you just find a QCD dynamical field balancing (see video) and anyway none of the “one” things are strictly local.)

And in my more favourite realm, the realm of ideas. I have a really hard time figuring out where I can break off one idea for a blogpost. These paragraphs were a stalactite growth off a blobular self-rant that keeps jackhammering away inside my head on the topic of mathematical modelling and equivalence classes. I’ve been trying to write something called “To equivalence class” and I’ve also been trying to write something called “Statistics for People Who Program Computers”, and as I was talking this out to myself, another rant squeezed out between my fingers, and I knew that if I dropped the other two and pulled this One off, it could be sculpted into a readable microtract. Leaving “To Equivalence Class”, like so many of the harder-to-write things, in the refrigerator—to marinate or to mould, I don’t know which.

But notice that I couldn’t fully disconnect this one from other shared-or-not-shared referents. (Shared being English language and maybe a lot of unspoken assumptions we both hold. Unshared being my own personal jargon—some of which I’ve tried to share in this space—and rants that continually obsess me such as the fallaciousness of probabilistic statements and of certain economic debates.) This is why I like writing on the Web: I can plug in a picture from Wikipedia or point back to somewhere else I’ve talked on the other tangent so I don’t ride off on the connecting track and end up away from where I tried to head.

The difficulty of drawing a firm boundary around "one" to begin the process of counting may be an inverse of the "full" paradox, or it may be that certain things (like liquid) don’t lend themselves to counting in an obvious way—in jargon, they don’t map nicely onto the natural numbers (the simplest kind of number). If that’s a motivation to move from discrete things to continuous when necessary, then I feel a similar motivation to move from Euclidean to Hausdorff, or from line to poset. Not that the simpler things don’t also deserve a place at the table.

We thinkers are fairly free to look at things in different ways—to quotient and equivalence-class creatively or at varying scales. And that’s also a truth of mathematical modelling. Even if maths seems one-right-answer from the classroom, the same piece of reality can bear multiple models—some refining each other, some partially overlapping, some mutually disjoint.





φ, stop it! You’re too crazy, φ! You’re going to get us all in trouble! Calm down now!

Edit: The top is off by +1. Sorry for the typo!






One reason polynomials are interesting is that you can use them to encode sequences.


In fact some of the theory of abstract algebra (the theory of rings) deals specifically with how your perspective changes when you erase all of the x^297 and x^16 terms and think instead about a sequence of numbers, which actually doesn’t represent a sequence at all but one single functional.


When you put that together with observations about polynomials …

  • Every sequence is a functional. (OK, can be made into a functional / corresponds to a functional)
     
  • So plain-old sequences like 2, 8, 197, 1780, … actually represent curvy, warped things.
  • Sequences of infinite length are just as admissible as sequences that finish.
    (After all, you see infinite series all the time in maths: Laurent series, Taylor series, Fourier series, convergent series for pi, and on and on.)
  • Any questions about analyticity, meromorphicity, convergence-of-series, etc, and any tools used to answer them, now apply to plain old sequences-of-numbers.
  • Remember Taylor polynomials? There’s a calculus connection here.
  • Derivatives and integrals can be performed on any sequence of plain-old-numbers. They correspond (modulo k!) to a left-shift and right-shift of the sequence.
  • You can take the Fourier transform of a sequence of numbers.
      
  • How about integer sequences from the OEIS? What do those functions look like? How about once they’re Taylored down? (each term divided by k!.)
      
  • Sequences are lists. Sequences are polynomials. Vectors are lists. Ergo—polynomials are vectors?!
  • Yes, they are: by Taylor’s theorem the monomials 1, x, x², … form a basis, so a sequence-as-vector is just the coordinates of a smooth (strictly, analytic) ℝ→ℝ function.
  • The first question of algebraic geometry arises from this viewpoint as well. A sequence of "numbers" instantiates a polynomial, which has “zeroes”. (The places where the weighted x^1192 terms sum to 0.)

    So middle-school algebra instantiates a natural mapping from one sequence to another. For example (1, 1, −2, −1, 1) ↦ (−1, 1−φ, 1, φ). Look, I don’t make the rules. That correspondence just is there, because of logic.

    Instead of thinking sequence → polynomial → curve on a graph → places where the curve passes through a horizontal line, you can think sequence → sequence. How are sequences-in connected to sequences-out? Here’s an example sequence (0.0, 1.1, 2.2, 3.3, 4.4, 0, 0, 7.7) to start playing with on WolframAlpha. Try to understand how the roots dance around when you change the sequence. (The code sketch after this list plays the same game.)
  • Looking at sequences as polynomials explains the partition function (how many ways can you split up 7?), as explained here.
  • Also, general enumerative-combinatorics problems (http://en.wikipedia.org/wiki/Enumerative_combinatorics), besides the partition example, are often answered by a polynomial-as-sequence.
  • Did I mention that combinatorics are the basis for Algorithms that make computers run faster?
  • Did I mention that Algorithms class is one of the two fundae that set hunky Computer Scientists above the rest of us dipsh_t programmers?
  • There is a connection to knots as well.
  • Which means that group theory / braid theory / knot theory can be used to simplify any problem that reduces to “some polynomial”.
  • Which means that, if complicated systems of particles, financial patterns, whatever, can be reduced to a polynomial, then I can use a much simpler (and more visual) way of reasoning about the complicated thing.
  • I think this stuff also relates to Gödel numbers, which encode mathematical proofs.
  • You can encode all of the outputs of a ℕ→ℕ function as a sequence. Which means you may be able to factor a sequence into the product of other sequences. In other words, maybe you can multiply simple sequences together to get the complicated sequence—or function—you’re looking for.
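A quick way to play with the sequence ↦ roots idea from the list above, using numpy. This is only a sketch: I am reading sequences with the constant term first, which is my own choice of convention, and numpy.roots wants the opposite order.

    # Sketch: a sequence of numbers read as polynomial coefficients
    # (constant term first), its roots, and the derivative-as-shift idea.
    import numpy as np

    seq = [1, 1, -2, -1, 1]            # 1 + x - 2x^2 - x^3 + x^4

    # np.roots takes the highest-degree coefficient first, so reverse the list.
    print(np.roots(seq[::-1]))         # ≈ 1.618, 1, -1, -0.618  i.e.  φ, 1, −1, 1−φ

    # Differentiating is a left-shift of the sequence, weighted by the index:
    # d/dx sum a_k x^k  =  sum (k+1) a_{k+1} x^k.
    # (In the x^k/k! convention of the "Taylored down" bullet, it is a pure shift.)
    deriv = [(k + 1) * a for k, a in enumerate(seq[1:])]
    print(deriv)                       # [1, -4, -3, 4]  i.e.  1 - 4x - 3x^2 + 4x^3

    # The example sequence from the list, same game:
    toy = [0.0, 1.1, 2.2, 3.3, 4.4, 0, 0, 7.7]
    print(np.roots(toy[::-1]))         # watch the roots dance as you edit `toy`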

This is an example of why mathematics, as a kind of language, is quite nice. Every author’s sprawling thoughts, coming from here and going to there while taking a detour to la-la land, are condensed by uniformity of notation. Then by force of reasoning, analogies are held fast, concrete is poured over them, and eventually you can walk across the bridge to Terabithia. Try nailing down parallels between Marx & Engels; it’s much harder.

All of these connections give one an archaeological feeling, like … What exactly am I unearthing here? 




3 × 3 × 3 × 3 × 3 × 3 × 3 × 3 × 3 × 3 × 3 × 3 × 3 … modulo 89
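To generate the sequence in that caption yourself, a quick illustration of my own:

    # Successive powers of 3, reduced modulo 89.
    powers = [pow(3, k, 89) for k in range(1, 89)]
    print(powers[:12])        # [3, 9, 27, 81, 65, 17, 51, 64, 14, 42, 37, 22]

    # 3 is a primitive root mod 89, so the powers cycle through all 88
    # nonzero residues before repeating, which is why the values look
    # scattered rather than obviously periodic.
    print(len(set(powers)))   # 88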

  • An application of group theory to Navy sonar — can you generate a sequence whose 1-differences are random? (whose 1-autocorrelations are nil)


  • Also the patterns look a bit like low-discrepancy sequences:
     

     




If you’re taking Calculus II and just learning about sums of sequences — aka series — here’s how I heuristically guess at a problem before I break it down:
\[
\sum_{n=1}^{\infty} \frac{1}{n} \;=\; \infty
\qquad\qquad
\sum_{n=1}^{\infty} \frac{1}{n^{1.1}} \;<\; \infty
\qquad\qquad
\sum_{n=2}^{\infty} \frac{1}{n \, \log n} \;=\; \infty
\]

The last one is the most surprising. Just remember: log is really, really … really slow!

The Logarithm

It also never stops growing — look at log x / x, i.e. log versus a straight line, i.e. log per unit.

Log x / x

Of course, you already knew that!  Because

D[ log x ] = 1 / x.

So just as the slopes {1/3, 1/4, …, 1/66, …, 1/7293, …} never actually reach zero, log never stops increasing. But all the while, log is increasing ever more slowly.
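If you want to feel this numerically, here is a rough check of my own, truncating each series at N = 10⁶ (the printed values are approximate):

    # Partial sums up to N for the three series above.
    from math import log

    N = 10**6
    print(sum(1 / n for n in range(2, N + 1)))             # ≈ 13.4  diverges, but crawling
    print(sum(1 / n**1.1 for n in range(1, N + 1)))        # ≈ 8.1   heading to a finite limit, ζ(1.1) ≈ 10.6
    print(sum(1 / (n * log(n)) for n in range(2, N + 1)))  # ≈ 3.4   diverges even more slowly

Even at a million terms the two divergent sums have barely moved, which is the whole point about log being slow.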



NOTE TO PEDANTS: You might object that ∞ is “not a number” so I can’t use the equals sign.  To you I say,
(a) consider using hyperreal or surreal numbers;
(b) consider projective geometry;
(c) consider the Riemann sphere.

All three use the point ∞ as an element of the set of numbers.