Difference: Atf2LwLogBook20130225NLScanOptimisation (1 vs. 2)

Revision 2 - 25 Feb 2013 - LaurieNevay

META TOPICPARENT name="Atf2LwLogBook"

20130225 Nonlinear Scan Optimisation

  • all files saved in extlw/sim/20130225_nl_scan/
Nonlinear scans often produce a fit uncertainty on sigma of about 0.1um. This is OK; however, better is always better and, more importantly, fitting the same data repeatedly often yields results that differ by more than the quoted uncertainty. Whilst not ideal, this can be understood: as this is a four-parameter fit, it is often possible to minimise successfully with one parameter different from a previous fit by adjusting the others. This usually shows up with data from different scans, as each scan is unique and so, often, is the route taken to the minimum. Therefore, if we can't improve this, we should improve firstly the uncertainty on individual points and secondly (more for time purposes) the number of points and their locations.

More samples correspond to greater statistics but also a greater overall scan length (linear in the number of samples); however, the benefit from the statistics diminishes, as it depends on 1/sqrt(nsamples). At some point, a significantly
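The trade-off above can be sketched numerically (an illustration only; the ~5% per-point uncertainty is the typical figure quoted later in this entry):

```python
import numpy as np

# Combined statistical uncertainty scales as 1/sqrt(n), while the scan
# length grows linearly with n, so each extra sample buys less and less.
point_uncertainty = 0.05              # ~5% per-point uncertainty (typical data)
n = np.arange(10, 201, 10)            # candidate numbers of samples
stat_uncertainty = point_uncertainty / np.sqrt(n)

# Doubling from 50 to 100 samples doubles the scan time but only
# improves the statistics by a factor of sqrt(2) ~ 1.41
gain = stat_uncertainty[n == 50][0] / stat_uncertainty[n == 100][0]
print(round(gain, 3))  # 1.414
```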


Version 1 - used before 20130222

  • Range: 200um
  • Cubic with minimum step size
  • Minimum step size: 0.2um

Version 2 - developed during shift on 20130222

  • Range: 150um
  • Cubic with minimum step size
  • Minimum step size: 0.35um
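The exact point-placement routine used by the scan software is not reproduced here, but a plausible sketch of what "cubic with minimum step size" spacing could look like, using the Version 2 numbers (the function name and details are assumptions), is:

```python
import numpy as np

def cubic_scan_points(centre, scan_range, n, min_step):
    """Hypothetical sketch: cubically spaced scan points, dense near the
    centre and sparse at the edges, with steps below min_step clipped up
    to min_step. Not necessarily the actual scan software's algorithm."""
    u = np.linspace(-1.0, 1.0, n)
    offsets = (scan_range / 2.0) * u ** 3   # cubic spacing over +-range/2
    # enforce the minimum step size between neighbouring points
    for i in range(1, len(offsets)):
        if offsets[i] - offsets[i - 1] < min_step:
            offsets[i] = offsets[i - 1] + min_step
    return centre + offsets

# Version 2 parameters: 150um range, 0.35um minimum step, 50 points
pts = cubic_scan_points(0.0, 150.0, 50, 0.35)
print(len(pts), round(np.diff(pts).min(), 2))
```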

Comparison

Generate model overlap-integral data for different sigmas with the same normalisation and background over the same range. Use 1001 linearly spaced samples for now to see a smooth curve.

from numpy import arange, linspace
from matplotlib.pyplot import plot

# yarray is not defined in the original snippet; assumed here to be the
# 1001 linearly spaced sample points over the scan range mentioned above
yarray = linspace(0, 200, 1001)

sigma = arange(0.6, 1.61, 0.2)
# lwIntegral2 is the local overlap-integral module
data = [lwIntegral2.OISetEV(yarray, 0, 200, s, 1.0, 0.0, 0.0) for s in sigma]
datad = dict(zip(sigma, data))
datad2 = {}
for key, val in datad.items():  # iteritems() is Python 2 only
    newkey = str(round(key, 2))
    datad2[newkey] = val
for key in datad2:
    plot(yarray, datad2[key], label=r'$\sigma_{ex}$ = ' + key + r'$\mu$m')

Saved this data as

different_sigmas.dat

  • Uncertainty on the points is around 5% from typical data so far.
  • Let's aim for 50 points for scanning.
  • Use sigma = 1.0 microns for comparison.

  • It seems important to have a few points close together at the top, in case you miss the middle when specifying where the centre of the scan should be.
  • But you also want more points closer together down the sides.
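The second bullet can be backed up with a quick calculation: for a Gaussian-shaped signal, the sensitivity to sigma vanishes at the centre and peaks on the shoulders (a pure-Gaussian illustration only; the real overlap integral from lwIntegral2 is more complicated):

```python
import numpy as np

# Sensitivity of a Gaussian-shaped signal f(y) to sigma, df/dsigma,
# as a function of position y.
sigma = 1.0  # microns, the comparison value used above
y = np.linspace(-5.0, 5.0, 2001)
f = np.exp(-y ** 2 / (2.0 * sigma ** 2))
dfdsigma = (y ** 2 / sigma ** 3) * f

# The sensitivity vanishes at the centre and peaks at |y| = sqrt(2)*sigma,
# i.e. on the shoulders - which is why extra points down the sides help.
y_peak = y[np.argmax(dfdsigma)]
print(round(abs(y_peak), 2))  # close to sqrt(2) ~ 1.41
```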

 
