User:Brian P. Josey/Notebook/Junior Lab/2010/10/25

Revision as of 01:07, 21 December 2010


Poisson Distribution

This week, my lab partner, Kirstin, and I did the Poisson distribution experiment. This is a fairly straightforward experiment used to demonstrate the Poisson distribution, which describes events that occur at random times, independently of the previous occurrence, but with a fixed average rate. Examples of when this is useful are counting the radiation coming off of a sample, or the number of births per day in a maternity ward. For this experiment, we counted the number of background radiation events in the lab. We used a combined scintillator-PMT to detect the events, and counted them using the UCS 30 software on the computer. From this, we were able to generate a series of data sets containing the number of events in a given window of time, and then analyze them.
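As a quick illustration, this kind of counting statistic can be simulated with NumPy's Poisson sampler (a sketch with a made-up rate, not our measured one):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical source: an average of 3 events per counting window
counts = rng.poisson(lam=3.0, size=1000)

# Each entry is the number of events in one window; the individual
# event times are random, but the long-run average rate is fixed
print(counts[:10])
print(counts.mean())  # should land near the rate parameter, 3
```

The sample mean hovers near the rate parameter, while individual windows fluctuate randomly around it.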

Equipment

Power Supply
Detector

Because most of this experiment was performed on the computer, there was very little equipment needed. The first piece of equipment was the combined scintillator-PMT, which we used to detect the background radiation in the lab. When the scintillator absorbs radiation, it fires a pulse of ultraviolet light down the tube to the PMT. The PMT then creates a signal voltage that is picked up by a card in the computer. The card sends this information to the UCS 30 software, which counts the number of radiation events in a given window of time. We also used a Spectech Universal Computer Spectrometer power supply to give a bias voltage to the detector. This voltage determines the sensitivity of the detector.

Set-Up and Procedure

The set-up was exceptionally simple:

  1. Turn on the computer and log in if necessary,
  2. Turn on the Spectech; it has to be turned on before the software,
  3. Double-click the icon for the software on the desktop of the computer.

Like the set-up, the procedure is pretty basic; the only issue is that the user interface on the computer doesn't make much sense. Before collecting the data, you want to set the cut-off voltage fairly high. The step-by-step process for collecting data is as follows:

  1. Under Mode, select "PHA (Amp In)"
  2. Under Settings, select "High Voltage On" and set it to an appropriate value; we used 1200 V. This value adjusts the sensitivity of the detector, and a higher voltage decreases the sensitivity so that only the most energetic radiation is counted
  3. Under Mode, select "MCS (Internal)"
  4. Under Settings, select MCS, and pick an appropriate dwell time, which is the width of each time bin over which events are counted
  5. To collect data, hit the green "Go" button and let it run its course
  6. When it stops, save it to a file or USB drive as "comma separated variable (*.csv)"
  7. Import it into Google Docs

From this point, the procedure is actually in the data analysis. We used every available dwell time between 10 ms and 1 s: 10, 20, 40, 80, 100, 200, 400, and 800 ms, and 1 s.

Data and Results

Here is the data that I collected. Each spreadsheet contains all the individual data points per window on the first page, then the maximum and minimum values, number of windows with a given number of events, averages, and errors on the second page of each table. How I calculated this is described below in the data analysis section, but I included it on these tables for simplicity. The tables are arranged in increasing window size, starting at 10 ms, and concluding with 1 s at the bottom.

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dEpndTZuY0xocjJRbThhUWowQVlSN2c width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dFRYVHRaTDZfbGIyYzZYUTFWdTB5Wnc width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dHlkYy1UVFdjWEVnZWt5TXZPOFhEVmc width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dGI5ZXo3dzhjZjZ4Z0ZJMXpLQVhaamc width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dDBWS3FKbWdwamtKcEYzTzMxTVlMR3c width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dGN3WGpjZXNwZmxqTU50ZEp4NVV1V0E width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dEhLTmkzdF9rdWtwd3NaN3FSeDRjeGc width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dEdfVU55MVdDMmY4VUp2UVNnY0t0MGc width=600 height=325

}}

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dC1xNFAtSXFPckhLN29PODR5SXAwbEE width=600 height=325

}}

Because she was sick the first week and I was sick the second, Kirstin and I will actually work from different data sets even though we worked together on the lab. So our individual data points will not be the same, but our end results should be very similar.

Data Analysis

Our primary goal is to show how our data illustrates a Poisson distribution. To do this, I needed to create histograms of the data and show how they shift as the window size increases. I loaded the results into MATLAB and found their maximums, minimums, averages, and standard deviations. Once I knew the maximum and minimum values, I was able to create a new data series giving the number of times each specific number of events occurred in a window. In my tables, this is marked as "Number of Radiation" and "Number of Windows". For example, if I had six windows that each contained eight radiation events, they would be included in this table as "Number of Radiation: 8" and "Number of Windows: 6". With this new table, I was then able to create histograms of the data. Here is a sample of my code from MATLAB:

wsize=0.01 % size of window in seconds
Amin=min(A) % minimum number of counts
Amax=max(A) % maximum number of counts

Aave=sum(A)/length(A) % Average count per window
stdc=std(A) % Calculated standard deviation
stdt=(Aave)^.5 % Theoretical standard deviation (square root of average)
perr=abs((stdt-stdc)/(stdt)*100) % percent error in calculated over theoretical

Asec=Aave/wsize % Average count per second
stdcs=stdc/wsize % calculated standard deviation per second
stdts=stdt/wsize % theoretical standard deviation per second

In this code, A is simply a vector that contains the data from the "Counts" column. These values were then put into the second sheet of each of the tables above. From the min and max values, I was able to find a range over which to create bins for a histogram. In Google Spreadsheets, I then used the "count" function on the whole range of data to find the number of times each specific number of radiation events occurred. I put this into MATLAB as a vector, and using the plot function I created plots of the number of windows as a function of the number of radiation events. Here are all the plots for my data:

Because I was unable to label these graphs in the wiki code, they represent all of the data sets starting at the smallest window size, 10 ms, and increasing up to the largest window size, 1 s. Going left to right, the graphs are: 10, 20, 40, 80, 100, 200, 400, and 800 ms, and 1 s. While I realize that it is not clear in the above graphs, it is important to note that there is a general trend: as the window size increases, the peaks move to ever larger values, and the distributions flatten and spread out. This is an important feature of a Poisson distribution, and indicates that our measurements of the radiation do in fact fit the model. To better illustrate this, here is a graph of several different window sizes:

As you can see, as the size of the window increases, the distribution spreads out and the peak moves to higher levels. Obviously, the increase in peak value is a direct result of the increasing time, but the change in the general shape is a characteristic feature of a Poisson distribution. This allows us to argue that the radiation data follows this general trend. However, there is a more important result to illustrate the Poisson distribution.
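This spreading behavior can be checked with a short simulation (a sketch assuming a constant background rate of about 30 events per second, in the spirit of our data rather than drawn from it):

```python
import numpy as np

rng = np.random.default_rng(1)
rate = 30.0  # assumed background rate, events per second

# Simulate 5000 windows at three of the dwell times we used
for dwell in (0.1, 0.4, 1.0):
    counts = rng.poisson(rate * dwell, size=5000)
    # Both the peak location (mean) and the absolute spread (std)
    # grow as the window gets longer
    print(dwell, counts.mean(), counts.std())
```

The mean scales linearly with the window size while the spread scales as its square root, so the histograms flatten and shift to the right exactly as described above.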

In my data section above, I calculated both the average and the standard deviation. I calculated the standard deviation twice: first in the standard way, directly from the data, and second as simply the square root of the average value. The reason for this is that a characteristic property of a Poisson distribution is that its standard deviation is the square root of its mean. The average values, the two standard deviations, and the percentage difference between them are summarized here:

{{#widget:Google Spreadsheet

key=0AjJAt7upwcA4dFQ4VjRpSHktbkoxMHVHankwSjRYOWc width=600 height=325

}}
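The comparison in that table can be reproduced in miniature with simulated Poisson data (a sketch, not my measured counts; the variable names mirror my MATLAB code above):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for one data set A: 10000 windows averaging ~30 counts each
A = rng.poisson(30.0, size=10000)

Aave = A.mean()                         # average count per window
stdc = A.std(ddof=1)                    # standard deviation calculated from the data
stdt = np.sqrt(Aave)                    # Poisson prediction: square root of the average
perr = abs((stdt - stdc) / stdt) * 100  # percent difference between the two

print(Aave, stdc, stdt, perr)
```

With this many windows, the two standard deviations typically agree to within a couple of percent, mirroring the small differences in the table.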

Conclusions

From our data we were able to determine two things about the background radiation. The first was that there were on average 30 ± 5 radiation events of significant energy per second in the lab. These could come from high energy particles, like muons from the upper atmosphere, high energy photons, or any number of other natural sources. The second was that the radiation follows a Poisson distribution. We can say this with confidence because our data showed two trends. First, as the window size over which we counted radiation events grew, the distribution of the number of events per window spread out, and its peak decreased in height. This trend is characteristic of the Poisson distribution, and is explained by the second trend: the standard deviation calculated directly from the data was very close to the square root of the average for each window size. The two numbers never differed by more than 2%. This is important because the Poisson distribution is unique in that its standard deviation is the square root of its mean. Because we were able to show this directly from our data, we were able to show that background radiation follows the Poisson distribution.

Acknowledgments and References

Because the lab equipment has been updated recently, Dr. Gold's manual was not as useful for this experiment as it has been for others. However, we did find this page very helpful. I also want to thank Dr. Koch for helping me learn the software and keeping me entertained as it ran, Nathan and Richard for helping me set up the equipment and answering some of my questions, and our T.A. Katie for helping me get out early when I was sick.

As for references beyond word of mouth, I used An Introduction to Error Analysis by John R. Taylor, and the Wikipedia page for the Poisson distribution.

Of course, I also want to thank my lab partner, Kirstin.