Physics307L:Help/Fitting a line


Take home message from this class

There are statistically sound methods for obtaining the maximum likelihood slope and intercept to fit a set of data of the form (x_i, y_i). This really is the take home message: I want you to remember enough to know that you can do it, and to be able to quickly find the resources so you can remind yourself of the necessary assumptions about the data and of the formulas (or algorithms) for calculating the best-fit values along with their uncertainties. Two good resources:

  • Chapter 6 ("Least-squares fit to a straight line") of Bevington and Robinson, Data Reduction and Error Analysis for the Physical Sciences, second edition.
  • Chapter 8 ("Least-squares fitting") of Taylor, An Introduction to Error Analysis, second edition.
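
As a rough illustration, here is a minimal sketch of those closed-form formulas in Python/NumPy. The function name weighted_line_fit and its interface are my own invention; the algebra is the standard weighted least-squares result for a straight line as presented in the chapters above.

  import numpy as np

  def weighted_line_fit(x, y, sigma_y):
      """Maximum-likelihood fit of y = A + B*x, given Gaussian uncertainties sigma_y in y.

      Returns the best-fit intercept A, slope B, and their uncertainties.
      """
      x, y, sigma_y = np.asarray(x), np.asarray(y), np.asarray(sigma_y)
      w = 1.0 / sigma_y**2                  # weights w_i = 1/sigma_i^2
      S = w.sum()
      Sx = (w * x).sum()
      Sy = (w * y).sum()
      Sxx = (w * x * x).sum()
      Sxy = (w * x * y).sum()
      delta = S * Sxx - Sx**2               # common denominator
      A = (Sxx * Sy - Sx * Sxy) / delta     # best-fit intercept
      B = (S * Sxy - Sx * Sy) / delta       # best-fit slope
      sigma_A = np.sqrt(Sxx / delta)        # uncertainty in the intercept
      sigma_B = np.sqrt(S / delta)          # uncertainty in the slope
      return A, B, sigma_A, sigma_B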

In order to leave the class with this confidence (knowing you can do it and where to find material to refresh your memory), you'll need to practice the techniques during your labs! There are plenty of labs (in fact a majority of them) where least-squares fitting to a line can and should be implemented.

Theoretical background

Assumptions

Describing these methods with the fewest possible assumptions is beyond the scope of this class. For example, you can do least-squares fitting when uncertainties in both x and y are important, but here we'll assume uncertainty only in y. We're also only talking about a linear fit (y = A + B*x); extension to quadratic and higher-order fits is not too difficult, but we're not doing that here.

  • Assume that the data follow a linear relationship. You can assess this assumption by examining the residuals of the best-fit line, as in the sketch below.
  • Assume that the uncertainty in each y_i is normally distributed, with a standard deviation of σ_i.
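
To see what examining the residuals might look like in practice, here is a minimal sketch (again Python/NumPy) that uses the hypothetical weighted_line_fit function sketched above on simulated data; the numbers here are made up purely for illustration.

  import numpy as np

  # Simulated data scattered about the line y = 1 + 2x, with known Gaussian noise.
  rng = np.random.default_rng(0)
  x = np.linspace(0.0, 10.0, 20)
  sigma_y = np.full_like(x, 0.5)
  y = 1.0 + 2.0 * x + rng.normal(0.0, sigma_y)

  # weighted_line_fit is the sketch from earlier on this page.
  A, B, sigma_A, sigma_B = weighted_line_fit(x, y, sigma_y)

  # Normalized residuals: if the linear model and the error bars are right,
  # these should scatter randomly about zero with magnitude of order 1,
  # showing no systematic trend versus x.
  residuals = (y - (A + B * x)) / sigma_y

  chi_squared = np.sum(residuals**2)
  dof = len(x) - 2                          # two fitted parameters: A and B
  print(f"A = {A:.2f} +/- {sigma_A:.2f}")
  print(f"B = {B:.2f} +/- {sigma_B:.2f}")
  print(f"reduced chi-squared = {chi_squared / dof:.2f}")   # ~1 for a good fit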