Interesting Stuff

The Responsibility of Intellectuals

The only paper that tells the truth
 
Thinking about publishing a 1% effect?  Think again!  and again!

Worried that the world will end when the world's largest atom-smasher starts collisions?
Read this.

A good review on Bayesian statistics as applied to cosmology by Roberto Trotta.

Really nice introduction to the Legendre Transform

Non-parametric comparisons of distributions:
Baskerville and Solomon
Conover's "A Kolmogorov Goodness-of-Fit Test for Discontinuous Distributions"
seeks a different (more general?) formulation of the KS-test.
I'm currently working with a method proposed in Williams to non-parametrically
compare supernovae with a training set.  Further, I'm looking at a method proposed
by Boutsia et al.  
I'm a big fan of Bayesian methods, but I think some of the simpler, ad hoc
frequentist approaches have some merit, particularly when it comes to
computational time.
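
For the simple frequentist route, the workhorse is just a two-sample Kolmogorov-Smirnov test.  Here is a toy sketch using the plain continuous-distribution KS test from scipy (not Conover's discontinuous variant, and not the Williams or Boutsia et al. methods); the samples are made up.

# Toy non-parametric comparison of a candidate sample against a training set
# using the ordinary two-sample KS test.  All numbers are illustrative.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
training_set = rng.normal(loc=0.0, scale=1.0, size=500)   # stand-in for the training set
candidate    = rng.normal(loc=0.3, scale=1.0, size=50)    # stand-in for the object to compare

statistic, p_value = ks_2samp(training_set, candidate)
print(f"KS statistic = {statistic:.3f}, p-value = {p_value:.3f}")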

Edwin Pliny Seaver's Mathematical Handbook - includes very old but good integral tables.

Are Ants Conscious?  This paper claims to show mathematically that they aren't!

Information entropy:
Pretty much anything written by Ariel Caticha is interesting - in particular,
I like his review on Information and Entropy, which is a nice overview of
not only entropy but statistical mechanics as well.
Chaundy and McLeod also have a nice derivation of the entropy function itself in "On a Functional Equation".
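
Roughly, and from memory rather than from the paper itself: if one asks for an entropy of the "sum form" S = \sum_i f(p_i) that is additive over independent distributions, the functional equation

    \sum_i \sum_j f(p_i q_j) = \sum_i f(p_i) + \sum_j f(q_j)

forces f(p) \propto -p \ln p, i.e. the Shannon entropy S = -k \sum_i p_i \ln p_i.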

Become a card-carrying Bayesian

"Bayesian Analysis of Multi-Source Data" by P.C. Bhat, H.B. Prosper and S.S. Snyder - one can do a lot of analysis with this paper

Here's a more basic version of what Bhat, Prosper and Snyder did in the "Bayesian Analysis..." paper above, by
R. Barlow and C. Beeston (Computer Physics Communications 77 (1993) 219).  Both papers are worth the read.
The advantage of the Barlow+Beeston paper is that it is analytic - however, it is hard to see how one folds in additional
prior information.
(Beware: the two papers seem to contradict each other when it comes to finding the best estimate for x by maximizing L(x).
Barlow+Beeston say one can find the best estimate for x by simply maximizing L(x).  Bhat et al. say one needs a correction factor
and, in practice, it is necessary to calculate L(x+N), where N is the number of bins.
Barlow+Beeston's method is analytic where Bhat et al.'s is not - I'm not sure why this would make a difference.  Both papers
give good arguments for where the best estimate should be.  I've found Bhat et al.'s method to work, with the correction.
I've never tried Barlow+Beeston's method, but ROOT apparently uses it in TFractionFitter without complaints.)
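
Out of curiosity, here is roughly what the core of such a binned template fit looks like.  This is a toy sketch of my own (a standard Poisson template likelihood with made-up numbers), and it deliberately ignores the finite-Monte-Carlo-statistics machinery that is the actual point of both papers.

# Toy binned template fit: data counts are modeled as a sum of template shapes
# scaled by unknown strengths, and the strengths are found by maximizing a
# Poisson likelihood (i.e. minimizing its negative log).
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(strengths, data, templates):
    """Poisson negative log-likelihood for data modeled as sum_j strengths[j]*templates[j]."""
    prediction = strengths @ templates              # expected counts in each bin
    prediction = np.clip(prediction, 1e-12, None)   # guard against log(0)
    return np.sum(prediction - data * np.log(prediction))

# toy example: two templates, three bins, pseudo-data thrown from known strengths
rng = np.random.default_rng(0)
templates = np.array([[50., 30., 10.],    # shape of source 1 (counts per unit strength)
                      [ 5., 20., 40.]])   # shape of source 2
true_strengths = np.array([1.2, 0.8])
data = rng.poisson(true_strengths @ templates)

result = minimize(neg_log_likelihood, x0=np.ones(2), args=(data, templates),
                  bounds=[(0., None), (0., None)])
print("fitted strengths:", result.x)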

Check out Mathematical Statistics with Applications by Asha Seth Kapadia, Wenyaw Chan and Lemuel Moye.
This book was written to bridge the gap between purely mathematical statistics texts like
Jun Shao's Mathematical Statistics and more application-centered texts like Glen
Cowan's Statistical Data Analysis, D.S. Sivia's Data Analysis: A Bayesian Tutorial,
Sir Harold Jeffreys's classic Theory of Probability, and Edwin T. Jaynes' Probability Theory: The Logic
of Science.  The explanations in Kapadia et al.'s book are clear and the book is loaded with examples.
Plus, it's nicely written -- you could almost read it cover-to-cover!

For an alternative (some may argue dated) point of view on the Bayesian approach,
see A.W.F. Edwards' book, "Likelihood".  Edwards, a student and colleague of Fisher's, gives a
thought-provoking critique of the Bayesian approach.  If you look at the dust jacket, the book is sold as containing
some serious philosophical points on probability - I didn't find anything philosophically
earth-shattering, although this might be because the book was written at a different time.

The Gutenberg Project

Speaking of information criteria, there is an excellent review of the Bayes Factor (i.e. the marginalized likelihood ratio) by Kass and Raftery here.  If you ever wondered where the Bayesian Information Criterion came from, this is your paper.  A more self-contained, extensive derivation
can be found in Burnham and Anderson's Model Selection and Multimodel Inference.  In fact,
forget Kass and Raftery.  Just read Burnham and Anderson's book.
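
For quick reference (my own shorthand of the standard result; both references derive it carefully): for a model with k free parameters and maximized likelihood \hat{L} fit to n data points,

    \mathrm{BIC} = -2\ln\hat{L} + k\ln n, \qquad \ln B_{10} \approx \tfrac{1}{2}\,(\mathrm{BIC}_0 - \mathrm{BIC}_1),

so the difference in BIC between two models approximates twice the log of the Bayes factor.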

Also, here is a paper written by an all-star cast of authors (Sellke, Bayarri and Berger) on relating the Bayes Factor and p-values.

Excellent Review of Hadronic Generators for Extensive Air Showers by J. Knapp et al.

JStor

Discussions of string theory: Peter Woit (anti-string) and Lubos Motl (pro-string) - beware of Lubos' site - I find that
it crashes my Red Hat OS when I load it in an old version of Mozilla.

The Fastest Human Ever

Tommie Smith and John Carlos - 1968 Mexico City

The First Four Minute Mile

Don Connolly's Racing Web Page

Instant Run-Off Voting

Math Genealogy Project:  Who was your advisor's advisor's advisor?  Click here to find out.

Film of Emil Zátopek, the greatest distance runner of all time.

Speaking of greats from Czechoslovakia, here's someone who has been robbed of the literature Nobel Prize for 30 years.


Abraham Wald was one of the early developers of modern sequential analysis techniques, along
with the economist Milton Friedman.  The idea of the technique is as follows.  Suppose you have
an auto assembly line and you want to know the % of defective cars coming out of your factory.  If the % of defects is larger than (say)
10%, you shut the plant down and re-think how you're producing cars.  Now, you could say that you're going
to produce 100 cars and, if the number of defects is greater than 10, shut the plant down - but what if the
first 11 cars are defects?  What's the point of producing more cars?  Also, what do you do about the fact
that you could produce 11/100 defects in one run, but if you produced another 100, there would be 2/100?  That is, what if the
11 were a statistical fluctuation?
Sequential analysis allows one to make a decision to reject or accept the null hypothesis using as few `events' (in this
case, cars) as possible, while accounting for statistical fluctuations in the sample.  The sequential analysis technique Wald developed was so
powerful that it was classified during WWII, where it was used to decide at what point an aircraft had sustained too many
hits and should turn back (or would never return home).  The technique is used in medical studies, where
it is simply unethical not to use it.  You can read his semi-readable paper on his sequential
analysis technique here or buy his even more readable book.  The final pieces of Wald's contribution to
the war effort were declassified in the 90's, and a very historical, highly technical and, moreover,
extremely interesting account of his work is found here and here.  Here's our paper on applying the
method to the problem of finding the origins of ultrahigh energy cosmic rays!
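
Here is a minimal sketch of Wald's sequential probability ratio test for the car-factory example (the defect rates and error rates are toy numbers of my own; the recipe itself is the standard one): accumulate the log-likelihood ratio car by car and stop as soon as it crosses either decision boundary.

# Wald's SPRT for a Bernoulli defect rate: test H0: rate = p0 vs H1: rate = p1.
import math

def sprt(observations, p0=0.05, p1=0.10, alpha=0.05, beta=0.05):
    """Sequentially test H0: defect rate = p0 against H1: defect rate = p1.

    observations: sequence of 0 (good car) / 1 (defective car).
    Returns ("accept H0" | "accept H1" | "undecided", cars inspected).
    """
    upper = math.log((1 - beta) / alpha)   # cross this -> accept H1 (shut the plant)
    lower = math.log(beta / (1 - alpha))   # cross this -> accept H0 (keep producing)
    llr = 0.0
    n = 0
    for n, defective in enumerate(observations, start=1):
        llr += math.log(p1 / p0) if defective else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "undecided", n

# a run of defective cars ends the test long before 100 cars are built
print(sprt([1, 1, 1, 1, 1, 1, 0, 0, 0, 0]))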

Speaking of aircraft, here's an interesting paper about fighter aces.  It turns out
that the best fighter aces of WWI may not have been that great after all, but
merely really lucky.

Coolest name for a paper ever: "Separating Hyperplanes and the Authorship of the Disputed Federalist Papers".
The idea is that the authorship of 12 Federalist Papers was
under dispute.  The Federalist Papers were authored by James Madison, Alexander Hamilton or John Jay.  Madison
and Hamilton authored most of them, so the debate was really over whether the disputed ones were Madison's or Hamilton's.
Mosteller and Wallace wrote a book (the book) about a statistical study they did looking at the frequency of
certain words, and found that they were Madison's (yea).  Anyway, in the "Separating Hyperplanes..." paper,
they re-analyze this result, associating "each paper with a point in 70-dimensional space.
This is done by computing how many times, per 1000 words of text, each
of 70 different function words (commonly used prepositions, adverbs, pronouns,
articles, and the like) appears in each paper.  In the second step of the method,
we search for a hyperplane that has all of the undisputed Hamilton points
on one side and all of the undisputed Madison points on the other side."
In other words, it's sort of a poor man's neural network... but nevertheless interesting.
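
A rough sketch of the two-step recipe quoted above (my own illustration, not the authors' code): turn each paper into a vector of function-word rates per 1000 words, train a linear classifier on the undisputed Hamilton and Madison papers, and classify a disputed paper by which side of the separating hyperplane it lands on.  The word list and all the numbers below are placeholders.

# Toy "separating hyperplane" authorship classifier on function-word rates.
import numpy as np
from sklearn.svm import LinearSVC

function_words = ["upon", "whilst", "on", "by", "also"]   # stand-in for the 70 words

def rates_per_1000(text, words=function_words):
    """Word-rate step: occurrences of each function word per 1000 words of text.
    (Shown for completeness; the toy matrix below simply hard-codes such rates.)"""
    tokens = text.lower().split()
    return np.array([1000.0 * tokens.count(w) / max(len(tokens), 1) for w in words])

# toy training data: rows are papers, columns are word rates (made-up numbers)
X_train = np.array([[3.2, 0.0, 6.1, 11.0, 0.4],    # Hamilton-like papers
                    [2.9, 0.1, 5.8, 10.5, 0.3],
                    [0.2, 0.6, 7.4,  5.1, 1.1],    # Madison-like papers
                    [0.1, 0.5, 7.9,  5.4, 1.0]])
y_train = np.array(["Hamilton", "Hamilton", "Madison", "Madison"])

clf = LinearSVC().fit(X_train, y_train)            # hyperplane step
disputed = np.array([[0.3, 0.4, 7.6, 5.2, 0.9]])   # a made-up "disputed" paper
print(clf.predict(disputed))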

Everyone has heard of MacArthur's farewell speech to Congress, but I think
his farewell speech to West Point is much more poetic.  It's uber-sad, but it's brilliantly written.
Be sure to click on the link and hear the ghostly voice of the old general himself say:

"The shadows are lengthening for me. The twilight is here. My days of old have vanished, tone and tint. They have gone glimmering through the dreams of things that were. Their memory is one of wondrous beauty, watered by tears, and coaxed and caressed by the smiles of yesterday. I listen vainly, but with thirsty ears, for the witching melody of faint bugles blowing reveille, of far drums beating the long roll. In my dreams I hear again the crash of guns, the rattle of musketry, the strange, mournful mutter of the battlefield."
The Simpsons made fun of this speech in the episode "The Secret War of Lisa Simpson".  The commandant says:
"The wars of the future will not be fought on the battlefield or at sea.
They will be fought in space, or possibly on top of a very tall
mountain. In either case, most of the actual fighting will be done by
small robots. And as you go forth today remember always your duty is
clear: To build and maintain those robots. Thank you."

MacArthur says in his speech:
"We speak in strange terms: of harnessing the cosmic energy; of making winds
and tides work for us; of creating unheard synthetic materials to supplement or even replace
our old standard basics; to purify sea water for our drink; of mining ocean floors for new
fields of wealth and food; of disease preventatives to expand life into the hundreds of years;
of controlling the weather for a more equitable distribution of heat and cold, of rain
and shine; of space ships to the moon; of the primary target in war, no longer
limited to the armed forces of an enemy, but instead to include his civil populations; of ultimate conflict
between a united human race and the sinister forces of some other planetary galaxy;
of such dreams and fantasies as to make life the most exciting of all time.

And through all this welter of change and development, your mission
remains fixed, determined, inviolable: it is to win our wars."

Here's my favorite monster truck.  Watch Megasaurus "come to life",
breathe fire on a compact car, and munch on it.

Sloane Tanen is one of my favorite authors -- she documents the lives of chicks.

The fractal dimension has been used to measure the order in a system in many contexts,
from studying the structure of the universe to authenticating Jackson Pollock
paintings.  The latter has caused some debate.  See, for instance, these articles and this article.
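
The fractal dimension involved is typically estimated by box counting.  Here is a minimal toy implementation of my own (not the code behind any of the linked articles): cover a binary image with boxes of side eps, count how many boxes N(eps) contain part of the pattern, and take the slope of log N(eps) versus log(1/eps).

# Toy box-counting estimate of the fractal dimension of a binary image.
import numpy as np

def box_counting_dimension(image, box_sizes=(2, 4, 8, 16, 32)):
    """image: 2-D boolean array, True where the pattern (e.g. paint) is present."""
    counts = []
    for eps in box_sizes:
        n_boxes = 0
        for i in range(0, image.shape[0], eps):
            for j in range(0, image.shape[1], eps):
                if image[i:i + eps, j:j + eps].any():
                    n_boxes += 1
        counts.append(n_boxes)
    # slope of log N(eps) against log(1/eps)
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# sanity check: a completely filled square should come out with dimension ~2
filled = np.ones((256, 256), dtype=bool)
print(box_counting_dimension(filled))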
