User:Richard T. Meyers/Notebook/Phys307l/Poisson Statistics Lab
Steve Koch 20:36, 21 December 2010 (EST):Good primary lab notebook.
The online lab procedure is out of date but still has useful information, linked here. We asked Professor Koch about the lab and found that much of it consists of having the computer count the events over many time intervals, which is easily done through the software. Once the data is gathered, we need only analyze it with histograms. This will be done in the lab.
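The counting step described above can be sketched in Python. This is only an illustration with simulated events (the real counts came from the detector software and a COUNTIF step in Google Docs); the event probability and interval sizes here are made up.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical stand-in for the detector: simulate the number of events
# recorded in each of 500 fixed time intervals.
counts = [sum(random.random() < 0.004 for _ in range(1000)) for _ in range(500)]

# Equivalent of the spreadsheet COUNTIF step: how many intervals saw k events.
histogram = Counter(counts)
for k in sorted(histogram):
    print(k, histogram[k])
```

The histogram of "number of intervals with k events" is exactly what gets compared against the Poisson and Gaussian overlays below.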
Note: Blue is the Data Overlay, Red is the Poisson Overlay, and Orange is the Gaussian Overlay.
I looked up what the Poisson function is:

P(k; λ) = λ^k e^(−λ) / k!

where k is the number of events per time interval and λ is the expected value of the number of events per interval.
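The Poisson formula above can be written as a small Python function; the λ = 3.0 used here is just an illustrative value, not one of the notebook's fitted means.

```python
import math

def poisson_pmf(k, lam):
    """P(k; lam) = lam^k * e^(-lam) / k! -- probability of k events per interval."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

# Sanity check: the probabilities over all k should sum to 1
# (truncated at k = 50 here, so the sum is just under 1).
total = sum(poisson_pmf(k, 3.0) for k in range(50))
print(total)
```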
I looked up what the Gaussian function is:

f(x) = (1 / (σ√(2π))) e^(−(x−μ)² / (2σ²))

where μ is the mean, σ is the standard deviation, and x is the number of occurrences at a point.
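The Gaussian density can likewise be coded directly from the formula above; the standard-normal parameters used in the check are illustrative.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma * sqrt(2 pi))."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# The peak value at x = mu is 1 / (sigma * sqrt(2 pi)), about 0.399 for sigma = 1.
print(gaussian_pdf(0.0, 0.0, 1.0))
```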
Also, for the preamp plots, 40pre and 100pre, I used a Q-Q plot to determine whether they were normally distributed.
The quantiles are defined with n the sample size, N the number of intervals at count k, and k the point in the sample.

The Quantile column is the theoretical quantile at plotting position (k − 1/2)/n.

The Data-Quantile column is the corresponding sorted data value at point k.
I then plotted the Data-Quantile versus the Quantile to see whether they were linear. By definition, taken from Wikipedia (cited below), for the data to be normal (or effectively Poisson) the Q-Q plot should be linear. By observation, neither the 100pre nor the 40pre is linear, so neither is Normal or Poisson. Furthermore, using the standard deviation and mean calculated in the Google Docs spreadsheet, we notice that the 40pre has 6 points outside three standard deviations from the mean and the 100pre has 8 points outside three standard deviations. For a normal distribution only about 0.3% of points should fall beyond three standard deviations; for the 40pre this is within expectation, but for the 100pre the 8 data points represent a probability outside of it. This is unusual.
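The Q-Q construction described above can be sketched as follows. This uses the standard plotting position (k − 0.5)/n against sorted data, which may differ in detail from the spreadsheet columns; the sample values are made up for illustration.

```python
from statistics import NormalDist, mean, stdev

def qq_points(sample):
    """Pair each sorted data value with the theoretical normal quantile at
    plotting position (k - 0.5)/n; a roughly straight line suggests normality."""
    data = sorted(sample)
    n = len(data)
    nd = NormalDist(mean(sample), stdev(sample))
    return [(nd.inv_cdf((k - 0.5) / n), x) for k, x in enumerate(data, start=1)]

# Illustrative sample (the real columns live in the Google Docs spreadsheet).
sample = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]
for q_theory, q_data in qq_points(sample):
    print(round(q_theory, 2), q_data)
```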
Either way I have decided to discard the preamp data for this experiment, mostly because it is not normally distributed and is thus not a Poisson distribution. (Steve Koch 20:31, 21 December 2010 (EST):This is not necessarily true, but for large expected counts, it is.)
The next step is to go through the non-preamp data and see if anything else should be discarded. I see no data that should be discarded; however, it should be noted that there is a large variation in the 800ms data that will introduce error, but because it follows the Poisson distribution, albeit roughly, it can still be used.
From the lab manual we find that for a Poisson distribution the standard deviation should equal the square root of the mean: σ = √μ.
Standard Deviation calculated in Google Docs
The Difference between the square root of the mean and the standard deviation.
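That comparison is a one-liner to check. The counts below are a made-up example standing in for one of the spreadsheet's data columns; the population standard deviation is used, matching the usual spreadsheet STDEVP.

```python
import math
from statistics import mean, pstdev

# Hypothetical counts per interval; the notebook's real values are in Google Docs.
counts = [2, 4, 3, 5, 3, 2, 4, 3, 3, 4, 2, 5, 3, 4, 3]

m = mean(counts)
s = pstdev(counts)
diff = abs(math.sqrt(m) - s)
print(m, s, diff)
```

For a true Poisson sample, diff should shrink toward zero as the number of intervals grows.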
I can see from the graphs of the Poisson overlay of each of the data sets (not the preamp sets) that they follow Poisson distributions. The preamp sets, from a Q-Q graph, are shown not to be Poisson. I also notice that as the time interval increases from 10ms to 800ms we tend to get a graph closer to a Gaussian than a Poisson; this is expected, since a Poisson distribution with large λ approaches a Gaussian with μ = λ and σ = √λ.
I overlaid the Gaussian distribution onto both the 400ms and 800ms plots and noticed that they are both close to the data and to the Poisson distributions. Lastly, the differences between the square root of the mean and the standard deviation increase as the time intervals increase. So we can say the data is closer to a Poisson for short time intervals and closer to a Gaussian for larger time intervals. SJK 20:33, 21 December 2010 (EST)
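The Poisson-to-Gaussian trend observed above can be checked numerically: for a large λ (here λ = 50, an illustrative value rather than one of the measured means), the Poisson probabilities closely match a Gaussian with μ = λ and σ = √λ.

```python
import math

def poisson_pmf(k, lam):
    return lam ** k * math.exp(-lam) / math.factorial(k)

def gaussian_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# For large lambda the two curves nearly coincide, which is why the
# 400 ms and 800 ms histograms look close to normal.
lam = 50.0
for k in (40, 50, 60):
    print(k, poisson_pmf(k, lam), gaussian_pdf(k, lam, math.sqrt(lam)))
```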
1) I got the information for the Q-Q plot here
2) I got the information for the Poisson Distribution here
3) I got the information for the Gaussian Distribution here
1) To Steve Koch for assistance in the lab, specifically telling me about the COUNTIF function.
2) To Katie Richardson for assistance with Google Docs and the amusing Chicago piano tuners problem.
3) To Nathan for assistance in the lab and being a good lab partner.