Archive for the ‘Timing’ Category.
We organized a Special Session on Time Series in High Energy Astrophysics: Techniques Applicable to Multi-Dimensional Analysis on Sep 7, 2011, at the AAS-HEAD conference in Newport, RI. The talks presented at the session are archived at http://hea-www.harvard.edu/AstroStat/#head2011
A tremendous amount of information is contained within the temporal variations of various measurable quantities, such as the energy distributions of the incident photons, the overall intensity of the source, and the spatial coherence of the variations. While the detection and interpretation of periodic variations is well studied, the same cannot be said for non-periodic behavior in a multi-dimensional domain. Methods to deal with such problems are still primitive, and any attempts at sophisticated analyses are carried out on a case-by-case basis. Some of the issues we seek to focus on are:
* Stochastic variability
* Chaotic and quasi-periodic variability
* Irregular data gaps/unevenly sampled data
* Multi-dimensional analysis
* Transient classification
Our goal is to present some basic questions that require sophisticated temporal analysis in order for progress to be made. We plan to bring together astronomers and statisticians who are working in many different subfields so that an exchange of ideas can occur to motivate the development of sophisticated and generally applicable algorithms for astronomical time series data. We will review the problems and issues with current methodology from an algorithmic and statistical perspective and then look for improvements or for new methods and techniques.
The schedule for the mini-Workshop on Computational AstroStatistics is set: http://hea-www.harvard.edu/AstroStat/CAS2010/#schedule
ARCH (autoregressive conditional heteroscedasticity) is a statistical model that considers the variance of the current error term to be a function of the variances of the previous time periods’ error terms. I heard that this model made Prof. Engle a Nobel prize recipient. Continue reading ‘[MADS] ARCH’ »
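For concreteness, here is a minimal ARCH(1) simulation sketch; the parameter values and function name are illustrative assumptions, not from any particular analysis:

```python
import numpy as np

# Minimal ARCH(1) sketch (illustrative parameters):
#   sigma_t^2 = a0 + a1 * eps_{t-1}^2,   eps_t = sigma_t * z_t,   z_t ~ N(0, 1)
def simulate_arch1(n, a0=0.2, a1=0.5, seed=0):
    rng = np.random.default_rng(seed)
    eps = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = a0 / (1.0 - a1)  # unconditional variance as a starting value
    eps[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = a0 + a1 * eps[t - 1] ** 2  # variance driven by last shock
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps, sigma2

eps, sigma2 = simulate_arch1(10000)
# The sample variance should hover near the unconditional value a0/(1-a1) = 0.4
```

Note the characteristic behavior: a large shock at time t-1 inflates the variance at time t, producing the volatility clustering the model is known for.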
We have seen the word “bipartisan” often during the election and during the on-going recession period. Sometimes, I think that bipartisanship is driven not by politicians but by the media, commentators, and interpreters. Continue reading ‘Bipartisanship’ »
The CfA is celebrating the 100th anniversary of the discovery of the Cepheid period-luminosity relation on Nov 6, 2008. See http://www.cfa.harvard.edu/events/2008/leavitt/ for details.
[Update 10/03] For a nice introduction to the story of Henrietta Swan Leavitt, listen to this Perimeter Institute talk by George Johnson: http://pirsa.org/06050003/
[Update 11/06] The full program is now available. The symposium begins at Noon today.
Avalanches are a common process, occurring anywhere that a system can store stress temporarily without “snapping”. They can happen on sand dunes and in solar flares as easily as in the snow-bound Alps.
Melatos, Peralta, & Wyithe (arXiv:0710.1021) have a nice summary of avalanche processes in the context of pulsar glitches. Their primary purpose is to show that the glitches are indeed consistent with an avalanche, and along the way they give a highly readable description of what an avalanche is and what it entails. Briefly, avalanches result in event parameters that are distributed in a scale-invariant fashion (read: power laws) with exponential waiting-time distributions (i.e., Poisson).
Hence the title of this post: the “Avalanche distribution” (indulge me! I’m using stats notation to bury complications!) can be thought of as having two parameters, both describing the indices of power-law distributions: one controlling the event sizes, a, and one the event durations, b, with the event separations distributed as an exponential decay. Is there a canned statistical distribution that describes all this already? (In our work modeling stellar flares, we assumed that b=0 and found that a>2, or equivalently a<-2 when the index is written with its negative sign, which has all sorts of nice consequences for coronal heating processes.)
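As a sketch, such a two-parameter “Avalanche distribution” could be sampled as below, using inverse-CDF draws from power laws above a lower cutoff and exponential (Poisson) waiting times; every parameter value, the cutoff, and the function name are illustrative assumptions:

```python
import numpy as np

# Hypothetical "Avalanche distribution" sampler:
#   event sizes     ~ power law with index a  (p(x) ~ x^-a, x >= xmin)
#   event durations ~ power law with index b
#   waiting times   ~ exponential (Poisson arrivals)
def sample_avalanches(n, a=2.5, b=2.0, rate=1.0, xmin=1.0, seed=1):
    rng = np.random.default_rng(seed)
    # Inverse-CDF sampling for a power law with lower cutoff (requires index > 1)
    sizes = xmin * (1.0 - rng.random(n)) ** (-1.0 / (a - 1.0))
    durations = xmin * (1.0 - rng.random(n)) ** (-1.0 / (b - 1.0))
    waits = rng.exponential(1.0 / rate, n)  # separations between events
    return sizes, durations, waits
```

The b=0 special case in our flare work would correspond to ignoring the duration distribution altogether and keeping only the size and waiting-time laws.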
[arXiv:0709.3093v1] Short Timescale Coronal Variability in Capella (Kashyap & Posson-Brown)
We recently submitted that paper to AJ, and rather ironically, I did the analysis during the same time frame as this discussion about how astronomers cannot rely on repeating observations was going on. Ironic because the result reported there hinges on the existence of a small but persistent signal found in repeated observations of the same source. Doubly ironic in fact, in that just as we were backing and forthing about cultural differences I seem to have gone and done something completely contrary to my heritage! Continue reading ‘Betraying your heritage’ »
The X-ray summer school is ongoing. Numerous interesting topics were presented, but not much about statistics (the only advice so far: “use the statistics implemented in X-ray data reduction/analysis tools” and “it’s just a tool”). Nevertheless, I happened to talk extensively with two students about their research topics, both involving finding features in light curves. One approach was very empirical, comparing gamma-ray burst trigger times to 24kHz observations; the other was statistical and algorithmic, using Bayesian Blocks. Sadly, I could not give them answers, but the latter drew my attention.
Continue reading ‘Change Point Problem’ »
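This is not Bayesian Blocks itself, but a minimal single-change-point scan in the same spirit: fit a two-rate Poisson model to binned counts and keep the split that maximizes the likelihood. The data, rates, and helper name here are all made up for illustration:

```python
import numpy as np

# Minimal single-change-point scan for binned Poisson counts.
# (Constant log k! terms drop out across splits, so they are omitted.)
def best_changepoint(counts):
    counts = np.asarray(counts, dtype=float)
    n = len(counts)
    best_t, best_ll = None, -np.inf
    for t in range(1, n):
        ll = 0.0
        for seg in (counts[:t], counts[t:]):
            lam = seg.mean()  # MLE rate for the segment
            if lam > 0:
                ll += np.sum(seg * np.log(lam)) - lam * len(seg)
        if ll > best_ll:
            best_t, best_ll = t, ll
    return best_t

# Synthetic example: rate jumps from 2 to 8 at bin 50
rng = np.random.default_rng(0)
counts = np.concatenate([rng.poisson(2.0, 50), rng.poisson(8.0, 50)])
t_hat = best_changepoint(counts)
```

Bayesian Blocks generalizes this idea to an unknown number of change points via dynamic programming; the single-split scan above is just the smallest building block.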
The complete catalogue of gamma-ray bursts observed by the Wide Field Cameras on board BeppoSAX by Vetere, et al.
This paper publicizes the largest data set of Gamma Ray Burst (GRB) X-ray afterglows (light curves after the event), which is available from http://www.asdc.asi.it. It is claimed to be a complete on-line catalog of GRBs observed by the two Wide Field Cameras on board BeppoSAX (click for its Wiki) in the period 1996-2002. It comprises 77 bursts and 56 GRBs with X-ray light curves, covering the energy range 40-700keV. A brief introduction to the instrument, data reduction, and catalog description is given.
This is from the very interesting Ingrid Daubechies interview by Dorian Devins,
www.nasonline.org/interviews_daubechies, National Academy of Sciences, U.S.A., 2004. It is from part 6, where Ingrid Daubechies speaks of her early mathematics paper on wavelets. She tries to put the impact into context:
I really explained in the paper where things came from. Because, well, the mathematicians wouldn’t have known. I mean, to them this would have been a question that really came out of nowhere. So, I had to explain it …
I was very happy with [the paper]; I had no inkling that it would take off like that… [Of course] the wavelets themselves are used. I mean, more than even that. I explained in the paper how I came to that. I explained both [a] mathematicians way of looking at it and then to some extent the applications way of looking at it. And I think engineers who read that had been emphasizing a lot the use of Fourier transforms. And I had been looking at the spatial domain. It generated a different way of considering this type of construction. I think, that was the major impact. Because then other constructions were made as well. But I looked at it differently. A change of paradigm. Well, paradigm, I never know what that means. A change of … a way of seeing it. A way of paying attention.
Jeff Scargle (in person [top] and in wavelet transform [bottom], left) weighs in on our continuing discussion on how well “automated fitting”/”Machine Learning” can really work (private communication, June 28, 2007):
It is clearly wrong to say that automated fitting of models to data is impossible. Such a view ignores progress made in the area of machine learning and data mining. Of course there can be problems, I believe mostly connected with two related issues:
* Models that are too fragile (that is, easily broken by unusual data)
* Unusual data (that is, data that lie in some sense outside the arena that one expects)
The antidotes are:
(1) careful study of model sensitivity
(2) if the context warrants, preprocessing to remove “bad” points
(3) lots and lots of trial and error experiments, with both data sets that are as realistic as possible and ones that have extremes (outliers, large errors, errors with unusual properties, etc.)
Trial … error … fix error … retry …
You can quote me on that.
This illustration is from Jeff Scargle’s First GLAST Symposium (June 2007) talk, pg 14, demonstrating the use of the inverse areas of Voronoi tessellations, weighted by the PSF density, as an automated measure of the density of Poisson gamma-ray counts on the sky.
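A sketch of the underlying idea, with the PSF weighting omitted: estimate a local event density as the inverse area of each event's Voronoi cell. This uses scipy.spatial.Voronoi; the helper name is mine:

```python
import numpy as np
from scipy.spatial import Voronoi

# Local Poisson event density as 1 / (Voronoi cell area).
# Border cells with unbounded regions get NaN, since they have no finite area.
def voronoi_density(points):
    vor = Voronoi(points)
    dens = np.full(len(points), np.nan)
    for i, reg_idx in enumerate(vor.point_region):
        region = vor.regions[reg_idx]
        if len(region) == 0 or -1 in region:
            continue  # unbounded border cell
        x = vor.vertices[region, 0]
        y = vor.vertices[region, 1]
        # Shoelace formula for the area of the cell polygon
        area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
        dens[i] = 1.0 / area
    return dens
```

Dense clumps of events produce small cells and hence large inverse areas, which is why this works as an automated density measure for point data.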
I don’t know why astro-ph thought this article on the statistics of football dynamics (Mendes, Malacarne, Anteneodo 2007; physics/0706.1758) was relevant to me and emailed the abstract, but I’m glad they did, because they deal with a question I have wrestled with for a long time: how to figure out the underlying distribution that controls a stochastic process. In 2002ApJ…580.1118K, we dealt with modeling the photon arrival time differences as due to flares occurring at random times but with a power-law intensity distribution with index alpha. physics/0706.1758 deals with time-between-touches and tries to characterize that distribution itself in terms of a number of “phases” beta. From a quick reading, it appears that their beta are our flares, and they restrict all flares to have the same intensity. Despite the restriction, this is interesting because it is an analytical estimation that points a way towards speeding up our flare distribution fitting process, which currently is based on a Monte-Carlo grid search method, not the fastest way to do things.
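A toy version of that setup can be simulated directly: flares occur at Poisson-random times with power-law distributed intensities, and each flare emits a Poisson number of photons. Every numerical value below (rates, index, offsets) is an illustrative assumption, not the fitted values from the paper:

```python
import numpy as np

# Toy flare model: Poisson flare epochs, power-law intensities (index alpha),
# photons drawn per flare in proportion to its intensity, with a small
# exponential arrival offset standing in for the flare's light curve.
rng = np.random.default_rng(2)
n_flares, alpha = 200, 2.5
t_flares = np.cumsum(rng.exponential(1.0, n_flares))               # flare epochs
intens = (1.0 - rng.random(n_flares)) ** (-1.0 / (alpha - 1.0))    # power law, x >= 1
photons = np.concatenate([
    t + rng.exponential(0.1, rng.poisson(5.0 * f))  # photons tied to each flare
    for t, f in zip(t_flares, intens)
])
dt = np.diff(np.sort(photons))  # photon arrival-time differences
```

The distribution of dt is what carries the imprint of alpha; a forward simulation like this is essentially one cell of the Monte-Carlo grid search mentioned above, which is why an analytical shortcut would be so welcome.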
John Rice is visiting IIC. The meeting and his talk are scheduled for Friday, June 8, at 11:30am (room 403 at 60 Oxford St.).
Title: Event Weighted Tests for Periodicity in a Sequence of Photon Arrival Times: Detecting Gamma-ray Pulsars.
[Added] Another meeting is scheduled at the Statistics Department, located in the Science Center, 4-6pm, Wednesday (June 6th).