LMS Stability, Data Correction and the Radiation Accident within the PrimEx Experiment, by LaRay J. Benton, M.S. Nuclear Physics, May 2006 Graduate, North Carolina A&T State University.


LMS Stability, Data Correction and the Radiation Accident within the PrimEx Experiment, by LaRay J. Benton, M.S. Nuclear Physics, May 2006 Graduate, North Carolina A&T State University. Thomas Jefferson National Laboratory, PrimEx Collaboration. Advised by Dr. Samuel Danagoulian.

One issue that affected data analysis and calibration was the filter wheel position during Phase 2 of data collection. During the experimental run, data collection was done in three phases: Phase 1, pedestal analysis; Phase 2, LMS data; Phase 3, production runs. The phase of the experiment changed periodically throughout the run, so the current phase depended on the type of data being collected at the time. The filter wheel rotated according to the phase, either allowing a signal to enter the LMS trigger or blocking it. During Phase 2, light was allowed in and LMS data was collected. However, the filter wheel has several positions, and depending on its position we recorded LMS data that was not combined into a single signal and that corresponded to the filter wheel position in which it was recorded. As a result, some runs have LMS data and some do not. This absence of LMS data is displayed on our graphs (bottom right) and appears as holes: the larger the hole, the more consecutive runs were taken with the filter wheel closed.

Missing LMS Data

There are a total of 332 runs without LMS data, equating to about 23.85% of the total run (1350 runs), and I labeled these as bad runs in my analysis. This missing data is also confirmed by, and corresponds to, the holes present in Dr. Danagoulian's PMT ratio plots. The same behavior appears in the actual data, as seen to the left: ADC values that deviate drastically from the mean, with a constant value that is the same for all runs lacking LMS data. These bad runs are not initially included in my averaging technique for correcting the LMS data; values for them will be filled in later in the analysis.
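The bad-run flagging described above can be sketched as follows. This is a minimal illustration, not the actual PrimEx analysis code: the function name is mine, and the constant "sentinel" value that no-LMS runs read back is an assumption standing in for whatever constant appears in the real data.

```cpp
#include <vector>

// Flag runs whose ADC readout equals the constant value seen in all
// runs without LMS data.  `sentinel` is a hypothetical stand-in for
// that constant; it is not taken from the PrimEx code.
std::vector<bool> flagBadRuns(const std::vector<double>& adc, double sentinel)
{
    std::vector<bool> bad(adc.size(), false);
    for (std::size_t i = 0; i < adc.size(); ++i)
        bad[i] = (adc[i] == sentinel);   // constant readout => no LMS data
    return bad;
}
```

A run list could then be partitioned into good and bad runs before any averaging is attempted.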

LMS Data

As you can see to the left, the actual LMS data for crystal ID W1005 displays a behavior directly proportional to the filter wheel position: successive sequences of runs alternate between a High, Med, and Low ADC count readout. This supports the fact that there are 3 filter wheel positions in which light can enter the LMS trigger. Thus, when we analyzed the LMS data, particularly its stability over all runs, we got graphs like the one shown above, which displays 3 separate bands instead of one single curve, confirming that our signal is being divided into three parts instead of combined into one single signal. To correct this problem we chose to group every three runs, take the average of each group, and redisplay the results. This was a natural solution, since each run was only giving us 1/3 of the total signal we needed.

Averaged Data

As seen above to the left, when we average every 3 runs we get a single averaged ADC value, as well as a single run number to plot it against. When we averaged all the runs and plotted them, as shown above right, the graph yields a single line of data that better describes the LMS data and its stability over the entire run for this particular ID. However, we encountered situations where an averaged group did not contain 1 Med, 1 High, and 1 Low data set; some had 2 Med and 1 Low, 2 High and 1 Med, etc. This resulted in averaged values that were either above or below the mean of the averaged data set. One such situation is shown above to the left, highlighted in red. Looking back at the previous slide, the averaged group of runs 4148, 4149, and 4150 yields an average above the overall mean, and is displayed as the first point above the mean on the graph shown above. To correct these situations, different algorithms had to be devised and added to the code.
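Spotting the groups that need correction amounts to flagging averaged points that sit too far from the overall mean. The sketch below is my own illustration of that check; the tolerance value is an assumption, not a number from the analysis.

```cpp
#include <cmath>
#include <numeric>
#include <vector>

// Flag averaged points further than `tol` ADC counts from the overall
// mean of the averaged data set; these are the groups that did not
// contain one High, one Med, and one Low run.  `tol` is illustrative.
std::vector<bool> flagOffMean(const std::vector<double>& avgs, double tol)
{
    double mean = std::accumulate(avgs.begin(), avgs.end(), 0.0) / avgs.size();
    std::vector<bool> off(avgs.size(), false);
    for (std::size_t i = 0; i < avgs.size(); ++i)
        off[i] = std::fabs(avgs[i] - mean) > tol;
    return off;
}
```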

Corrected LMS Data

As you can see above, my program corrects the LMS data and fixes any data points that fall outside the mean during the averaging of the data. I edited the program to correct all LMS data and handle all possible combinations of data sets, such as 2 High and 1 Low, or 1 Low, 1 Med, and 1 Low, etc. All incorrect data points are now combined and corrected.

How I Corrected the Data

Instead of setting the value of the averaged group equal to a predetermined group, or a value already calculated, which is a widely used way to correct data, I use the values within the averaged group to correct itself. An example of the code used to correct the data is as follows:

    if (fabs((val[0] + val[1] + val[2]) - (val[0] + val[1] + val[1])) <= 3.0)  // This works
    {
        if ((val[1] - val[2]) == 0.0 && val[0] < val[1])  // This fixes #1
        {
            val[2] = val[0] - (val[1] - val[0]);
            sum = val[0] + val[1] + val[2];
            //cout << sum << endl;        // This prints out the Sum of 3 runs
            cout << sum / 3.0 << endl;    // This prints out the Average of 3 runs
            k = 0;
            sum = 0.0;
        }
    }

This is the code I used to correct the data point mentioned earlier: the average of the group came down from the out-of-range value mentioned on slide 4 to a value of 914, which is well within the mean. This was done by reassigning the value of the 3rd run in the set and recalculating the average of the group. The following shows how this group was corrected, and is equivalent to the code written above:

    Run 3 = Run 1 - (Run 2 - Run 1) = 914 - (925 - 914) = 903
    New Average = (Run 1 + Run 2 + Run 3) / 3.0 = (914 + 925 + 903) / 3.0 = 914
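The correction rule walked through above can be packaged as a small standalone function for checking the arithmetic. This is my own sketch of the rule as described (the function name is mine): when two of the three runs read the same value and the first run is lower, the third run is reassigned by mirroring the step between runs 1 and 2, and the group is re-averaged.

```cpp
// Apply the in-group correction described in the text and return the
// (re)computed group average.  v0, v1, v2 are the three runs' ADC
// values; if v1 == v2 and v0 < v1, run 3 is reassigned as
// Run3 = Run1 - (Run2 - Run1) before averaging.
double correctGroupAverage(double v0, double v1, double v2)
{
    if (v1 == v2 && v0 < v1)
        v2 = v0 - (v1 - v0);   // mirror the step between runs 1 and 2
    return (v0 + v1 + v2) / 3.0;
}
```

For the group in the text, a raw set of {914, 925, 925} reassigns the third run to 903 and yields an average of 914, matching the worked example.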

Radiation Accident

Thus, my program combines and corrects the data graphs, but it does not correct all of the data points for every ID. There are some cases where my program improves the data but cannot correct it to the point where the graphs are as linear and smooth as shown earlier. These particular IDs and graphs are the result of an overexposure of the crystals to radiation during the experimental run. The graphs for one of these exposed IDs follow. As shown by the inverted spike in both graphs, the radiation accident happened around the run where the spike appears. What is even more interesting is that as time passed from run to run, the crystal almost repaired itself, gradually recovering from the radiation damage done to it. To better understand this anomaly and others, the rate dependence of the LMS gain may need to be monitored and analyzed, in order to understand the effects of this radiation exposure and correct the data for all irradiated IDs. This analysis is ongoing.

Other Anomalies

Other anomalies from the graphs that are not yet explained are as follows:

Correction of Missing LMS Data

Data correction for all missing runs, or runs without LMS data, will proceed as follows:

1) A complete list of all IDs having the same general plots must be made and divided into groups according to whether their plots are linear, exponential, etc.

2) For the linear plots, a mean value will be determined, and all ADC values for missing or bad runs will be set equal to that mean. Exponential and other plots will have to be fitted and a function calculated, and all missing or bad ADC values for those IDs will be set from that function in order to fill in all holes present in the data.

3) After all data is corrected and filled in, all data will be re-graphed and will hopefully display a single continuous line of data, depending on the ID, improving our stability plots, histograms, and overall data so that calibration of HyCal can be performed.
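Step 2 above, for the linear IDs, can be sketched as follows. This is an illustrative version under my own naming, assuming the bad runs have already been flagged; exponential IDs would substitute values from a fitted function instead of a mean.

```cpp
#include <vector>

// For a "linear" ID: compute the mean ADC value over the good runs and
// substitute it for every bad/missing run, filling the holes in the data.
std::vector<double> fillBadRunsWithMean(std::vector<double> adc,
                                        const std::vector<bool>& bad)
{
    double sum = 0.0;
    int n = 0;
    for (std::size_t i = 0; i < adc.size(); ++i)
        if (!bad[i]) { sum += adc[i]; ++n; }
    double mean = sum / n;
    for (std::size_t i = 0; i < adc.size(); ++i)
        if (bad[i]) adc[i] = mean;   // fill each hole with the good-run mean
    return adc;
}
```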

In relation to statistics: when the LMS was developed and implemented in the PrimEx experiment, it was calculated that we would need a total of 2000 events of LMS data for the total signal required to properly calibrate the detector and its associated instruments, and to correctly calculate and record the short-term stability of the experiment itself. However, upon analysis of the data, particularly the ADC spectrum of events, it was discovered that we were only accumulating about 700 events instead of the 2000 initially determined. In the setup of the LMS, we have 3 reference PMTs and 1 pin-diode that make up the total LMS trigger, giving the following equation for the LMS trigger:

3 YAP + 1 LED = LMS Trigger

This gives a 3-to-1 ratio of the 3 radioactive YAP sources to the LED, indicating that the data received by the LMS is only 1/3 of the total signal: every run gives a signal of about 700 events (2000 / 3 ≈ 667) instead of the 2000 needed.