First Results from MiniBooNE Eric Prebys, FNAL/BooNE Collaboration
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 2 University of Alabama Los Alamos National Laboratory Bucknell University Louisiana State University University of Cincinnati University of Michigan University of Colorado Princeton University Columbia University Saint Mary’s University of Minnesota Embry Riddle University Virginia Polytechnic Institute Fermi National Accelerator Laboratory Western Illinois University Indiana University Yale University The MiniBooNE Collaboration
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 3 Outline State of neutrino mixing measurements History and background Without LSND LSND and Karmen Experiment Beam Detector Calibration and cross checks Analysis Results Future Plans and outlook
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 4 The Neutrino “Problem” 1968: An experiment in the Homestake Mine first observes neutrinos from the Sun, but there are far fewer than predicted. Possibilities: Experiment wrong? Solar model wrong? (the option favored by most people not involved) Enough created, but perhaps they oscillated (or decayed to something else) along the way. ~1987: There also appeared to be too few atmospheric muon neutrinos, with less uncertainty in the prediction; a similar explanation was proposed. Both results were confirmed by numerous experiments over the years. 1998: SuperKamiokande observes clear oscillatory behavior in signals from atmospheric neutrinos. For most, this establishes neutrino oscillations “beyond a reasonable doubt.”
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 5 Theory of Neutrino Oscillations Neutrinos are produced and detected as weak (flavor) eigenstates (νe, νμ, or ντ). These can be represented as linear combinations of the mass eigenstates (ν1, ν2, ν3). If the mixing matrix is not diagonal and the masses are not equal, then the net weak flavor content will oscillate as the neutrinos propagate. Example: if there is mixing between νe and νμ, then the probability that a νe will be detected as a νμ after a distance L is P(νe -> νμ) = sin²(2θ) sin²(1.27 Δm² L/E), with L in km, E in GeV, and Δm² = |m2² - m1²| in eV². Note that oscillations measure only the magnitude of the difference of the squares of the masses.
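The two-flavor formula above is easy to evaluate numerically. Below is a minimal sketch (not from the talk) that computes the appearance probability; the mixing amplitude, Δm², baseline, and energy passed to it are made-up example values, chosen only to illustrate the MiniBooNE-like regime L/E ~ 1.

```python
import numpy as np

def osc_prob(sin2_2theta, dm2_ev2, L_km, E_GeV):
    """Two-flavor appearance probability P(nu_e -> nu_mu).

    sin2_2theta : mixing amplitude sin^2(2*theta)
    dm2_ev2     : |Delta m^2| in eV^2
    L_km        : baseline in km
    E_GeV       : neutrino energy in GeV
    """
    return sin2_2theta * np.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# MiniBooNE-like baseline/energy (L ~ 0.5 km, E ~ 0.5 GeV => L/E ~ 1)
print(osc_prob(0.004, 1.0, 0.5, 0.5))   # ~0.4% for an LSND-scale signal
```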
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 6 Probing Neutrino Mass Differences Accelerators & reactors: accelerator experiments use π/μ decay beams to probe νμ -> νe directly, while reactors use ν̄e disappearance. Cerenkov detectors directly measure the νμ and νe content of atmospheric neutrinos and fit to νμ/νe mixing hypotheses. Solar neutrino experiments typically measure the disappearance of νe. Different experiments probe different ranges of Δm², set by the ratio of path length to energy (L/E). These regions are also probed by the new generation of “long baseline” accelerator and reactor experiments: MINOS, T2K, etc.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 7 Best Three Generation Picture (all experiments but LSND)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 8 The LSND Experiment (ν̄μ -> ν̄e) Decay-at-rest beam: π+ -> μ+ νμ followed by μ+ -> e+ νe ν̄μ; any ν̄μ -> ν̄e mixing would produce ν̄e, detected via ν̄e p -> e+ n. Signature: Cerenkov ring from the positron plus a delayed γ from neutron capture. Baseline ~30 m; energies of a few tens of MeV.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 9 LSND Result LSND observed an excess of ν̄e events over the expected background; it remains the only exclusive appearance result to date. Problem: the best fit, Δm² ~ 1 eV², is not consistent with the atmospheric results (Soudan, Kamiokande, MACRO, Super-K) and solar results (Homestake, SAGE, GALLEX, Super-K, SNO, KamLAND) under simple three generation mixing.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 10 Possibilities A 4th neutrino? We know from the Z lineshape that there are only 3 active flavors, so it would have to be sterile. CP or CPT violation? More exotic scenarios? LSND wrong? We can’t throw it out just because people don’t like it.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 11 Accommodating a Positive Signal We know from LEP that there are only 3 active, light neutrino flavors. If MiniBooNE confirms the LSND results, it might be evidence for the existence of sterile neutrinos
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 12 Karmen II Experiment: not quite enough Pulsed 800 MeV proton beam (ISIS) 17.6 m baseline 56 tons of liquid scintillator Factor of 7 less statistical reach than LSND -> NO SIGNAL A combined LSND-Karmen analysis still leaves an allowed region.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 13 Role of MiniBooNE Boo(ster) N(eutrino) E(xperiment) Full “BooNE” would have two detectors Primary Motivation: Absolutely confirm or refute LSND result Optimized for L/E ~ 1 Higher energy beam -> Different systematics than LSND E: ~30 MeV -> 500 MeV L: ~30 m -> 500 m Timeline Proposed: 12/97 Approved 1998 Began Construction: 10/99 Completed: 5/02 First Beam: 8/02 Began to run concurrently with NuMI: 3/05 Oscillation results: 4/07
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 14 MiniBooNE Neutrino Beam (diagram, not to scale) 8 GeV protons from the FNAL Booster (~8E16 p/hr max) strike a Be target inside a focusing horn; secondary mesons decay in a 50 m decay region; the neutrinos then pass through ~500 m of dirt to the detector (~1 detected neutrino/minute, L/E ~ 1). A “Little Muon Counter” (LMC) is used to understand the K flux.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 15 Detector 950,000 l of pure mineral oil 1280 PMT’s in inner region 240 PMT’s outer veto region Light produced by Cerenkov radiation and scintillation Light barrier Trigger: All beam spills Cosmic ray triggers Laser/pulser triggers Supernova trigger
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 16 Neutrino Detection/Particle ID (diagrams of the relevant processes) Charged current quasi-elastic scattering via W exchange: νμ n -> μ- p and νe n -> e- p. Neutral current π0 production via Z exchange: ν N -> ν N π0 -- an important background!!!
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 17 Selecting Neutrino Events Collect data from -5 to +15 usec around each beam spill trigger. Identify individual “events” within this window based on PMT hits clustered in time. (Plots: event time in ns relative to the spill, with no cuts, with veto hits < 6, and with veto hits < 6 plus tank hits > 200; after cuts the beam spill stands out clearly.)
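The slide describes grouping PMT hits into time clusters (“subevents”). The sketch below illustrates the general idea with a simple gap-based clustering; the 100 ns gap, the 10-hit threshold, and the toy hit times are arbitrary illustrative values, not the experiment's actual reconstruction parameters.

```python
import numpy as np

def find_subevents(hit_times_ns, gap_ns=100.0, min_hits=10):
    """Group PMT hit times into 'subevents': clusters of hits separated
    by quiet gaps longer than gap_ns.  Thresholds are illustrative."""
    t = np.sort(np.asarray(hit_times_ns))
    clusters, start = [], 0
    for i in range(1, len(t) + 1):
        if i == len(t) or t[i] - t[i - 1] > gap_ns:
            if i - start >= min_hits:
                clusters.append(t[start:i])
            start = i
    return clusters

rng = np.random.default_rng(1)
beam = rng.normal(5000.0, 20.0, 300)      # a burst of hits in the beam window
michel = rng.normal(7000.0, 10.0, 40)     # a later, smaller cluster
print([len(c) for c in find_subevents(np.concatenate([beam, michel]))])
```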
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 18 Delivering Protons The requirements of MiniBooNE greatly exceed the historical performance of the 30+ year old 8 GeV Booster, pushing: the average repetition rate, above-ground radiation, and radiation damage and activation of accelerator components. An intense program to improve the Booster: shielding, loss monitoring and analysis, lattice improvements (a result of Beam Physics involvement), and a collimation system. It is very challenging to continue to operate the 8 GeV line during NuMI/MINOS operation; this was once believed impossible. An element of the lab’s “Proton Plan”: the goal is to continue to deliver roughly 2E20 protons per year to the 8 GeV program even as NuMI intensities ramp up.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 19 Small but Hungry MiniBooNE began taking data in Fall of 2002, and has currently taken more protons than all other users in the history of Fermilab combined (including NuMI and pBar): roughly 63E19 protons in total. The events in this analysis were taken in neutrino mode; the experiment has been running in antineutrino mode since ~1/06.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 20 Modeling the Neutrino Flux Production: GEANT4 model of the target, horn, and beamline; MARS for secondary protons and neutrons; Sanford-Wang fits to production data for π and K, constrained by the HARP experiment. Mesons are allowed to decay in a model of the decay pipe (π+ -> μ+ νμ, K+ -> μ+ νμ, ...); retain the neutrinos which point at the detector. Antineutrino content: 6%. Intrinsic νe + ν̄e content: νe/νμ = 0.5%, from μ+ -> e+ νe ν̄μ (52%), K+ -> π0 e+ νe (29%), K0 -> π e νe (14%), other (~5%).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 21 Neutrino Interactions and Cross Sections Based on the NUANCE v3 Monte Carlo, with other generators (NEUT, NEUGEN) used as cross checks. Theoretical input: Llewellyn Smith free-nucleon cross sections; Rein-Sehgal resonant and coherent cross sections; Bodek-Yang DIS at low Q2; standard DIS parametrization at high Q2; relativistic Fermi-gas model; final-state interaction model. Constraining NUANCE: M_A^eff (effective axial mass) and E_lo^SF (Pauli blocking parameter) from the observed MiniBooNE νμ data; E_B (binding energy) and p_F (Fermi momentum) from electron scattering data.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 22 D. Casper, NPS, 112 (2002) 161 Predicted Event Rates (NUANCE)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 23 Characterizing the Detector Full GEANT 3.21 model of detector Includes detailed (39-parameter!!) optical model of oil and PMT response MC events produced at the raw data level and reconstructed in the same way as real events
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 24 Calibrating the Detector
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 25 Events Producing Pions NC π0 production, ν N -> ν N π0 (~8% of the rate): the π0 decays to 2 photons, which can look “electron-like,” mimicking the signal. <1% of π0s contribute to the background. The Δ resonance also decays to a single photon (Δ -> N γ) with 0.56% probability. CC π+ production, νμ N -> μ- N π+ (~25% of the rate): easy to tag due to its 3 subevents; not a substantial background to the oscillation analysis.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 26 Bottom Line: Signal and Background If the LSND best fit is accurate, only about a third of our observed rate will come from oscillations. Backgrounds come from both intrinsic νe and misidentified νμ events; the energy distribution can help separate them from the signal.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 27 Analysis Philosophy The “golden signal” in the detector is the Charged Current Quasi-Elastic (CCQE) event, ν n -> l- p. With a particular mass hypothesis for the outgoing lepton, the neutrino energy can be reconstructed from the observed particle as E_ν^QE = (2 M_n E_l - m_l² + M_p² - M_n²) / (2 (M_n - E_l + p_l cos θ_l)), neglecting binding energy. If the LSND signal were due to neutrino oscillations, we would expect an excess in the energy range of roughly 300-1500 MeV. The “signal” is therefore an event with a high probability of being a νe CCQE event and a reconstructed energy in the range 300 to 1500 MeV.
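For concreteness, here is a small sketch of the quasi-elastic energy reconstruction written out in code. It implements the simplified formula quoted above (binding energy neglected), with the lepton energy, angle, and mass as inputs; the example values at the bottom are invented for illustration.

```python
import math

M_N, M_P, M_E = 0.93957, 0.93827, 0.000511   # GeV

def enu_qe(E_lep, theta_lep, m_lep=M_E):
    """Reconstructed CCQE neutrino energy (GeV) from the outgoing lepton,
    neglecting nuclear binding energy -- an illustrative simplification."""
    p_lep = math.sqrt(E_lep**2 - m_lep**2)
    num = 2.0 * M_N * E_lep - m_lep**2 + M_P**2 - M_N**2
    den = 2.0 * (M_N - E_lep + p_lep * math.cos(theta_lep))
    return num / den

print(enu_qe(0.5, 0.3))   # e.g. a 500 MeV electron at ~17 degrees
```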
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 28 Two Analyses “Track Based” analysis: Use detailed tracking information to form a particle ID likelihood. “Box” defined by the e likelihood and energy. Backgrounds weighted by observed events outside of the box. “Boosting”: Particle ID based on feeding a large amount of event information into a “boosted decision tree” (details to follow). “Box” defined by the boosted “score” and the mass range. Backgrounds determined by models which are largely constrained by the data.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 29 Common Event Cuts >200 tank hits, <6 veto hits, R<500 cm (algorithm dependent). (Plots show data vs. MC for these quantities.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 30 Blindness In general, the two analyses have sequestered (hidden) data which has A reconstructed neutrino energy in the range 300 to 1500 MeV A high probability of being an electron This leaves the vast majority (99%) of the data available for analysis Individual open boxes were allowed, provided it could be established that an oscillation would not lead to a significant signal. In addition, detector level data (hit times, PMT spectra, etc) were available for all events. Determination of “primary” analysis based on final sensitivity before opening the box
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 31 The goal of both analyses: minimize background & maximize signal efficiency. The “signal range” is approximately 300 MeV < E_ν^QE < 1500 MeV. One can then either look for a total excess (“counting expt”) or fit for both an excess and its energy dependence (“energy fit”). (Plot: MiniBooNE signal examples for Δm² = 0.4, 0.7, and 1.0 eV², as a function of neutrino energy in GeV, arbitrary units.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 32 MiniBooNE is searching for a small but distinctive event signature In order to maintain blindness, Electron-like events were sequestered, Leaving ~99% of the in-beam events available for study. Rule for cuts to sequester events: <1 signal outside of the box Low level information which did not allow particle-id was available for all events. Open Data for Studies:
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 33 Uses detailed, direct reconstruction of particle tracks, and ratio of fit likelihoods to identify particles. Philosophy: This algorithm was found to have the better sensitivity to e appearance. Therefore, before unblinding, this was the algorithm chosen for the “primary result” Analysis 1: “Track-Based” (TB) Analysis
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 34 Track-based analysis
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 35 Each event is characterized by 7 reconstructed variables: vertex (x,y,z), time, energy, and direction (U_x, U_y, U_z). Resolutions for νμ CCQE events: vertex 22 cm, direction 2.8°, energy 11%. νμ CCQE selection: 2 subevents, veto hits < 6, tank hits > 200.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 36 Rejecting “muon-like” events using log(L_e/L_μ) log(L_e/L_μ) > 0 favors the electron-like hypothesis. Note: photon conversions are also electron-like, so this does not separate e from π0. The separation is clean at high energies, where muon-like events produce long tracks. The analysis cut was chosen to maximize the νe sensitivity. (MC plots: νe CCQE vs. νμ CCQE.)
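A sketch of how a likelihood-ratio selection of this kind might be applied in practice. The energy-dependent threshold function here is a placeholder (the real Track-Based cut was tuned on MC to maximize νe sensitivity), and the sample events are invented.

```python
import numpy as np

def pass_e_mu_cut(log_le_over_lmu, E_vis_GeV, cut_fn=lambda E: 0.02 * E):
    """Keep events whose electron/muon log-likelihood ratio exceeds an
    energy-dependent threshold.  The threshold shape is a placeholder."""
    return log_le_over_lmu > cut_fn(E_vis_GeV)

events = np.array([(0.15, 0.8), (-0.30, 1.2)],
                  dtype=[("llr", float), ("E", float)])
print(pass_e_mu_cut(events["llr"], events["E"]))   # [ True False]
```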
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 37 Rejecting “π0-like” events Two handles: a cut on the reconstructed two-photon invariant mass and a cut on log(L_e/L_π). Cuts were chosen to maximize the νe sensitivity. (MC plots: NC π0 vs. νe CCQE for each variable.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 38 Testing e-π0 separation using data Select 1-subevent events, split into e-like (log(L_e/L_μ) > 0) and μ-like (log(L_e/L_μ) < 0) samples, with invariant mass > 50 MeV (high mass) so that the blind signal region is excluded. Data agree well with the Monte Carlo (π0 only) in both log(L_e/L_π) and the invariant mass.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 39 The same comparison at low mass (< 200 MeV), again split into e-like (log(L_e/L_μ) > 0) and μ-like (log(L_e/L_μ) < 0) 1-subevent samples with the blind region excluded (Monte Carlo is π0 only), gives a χ² probability of 69% for the most signal-like slice (mass < 50 MeV). Next: look in the signal region itself.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 40 Summary of Track Based cuts “Precuts,” then log(L_e/L_μ), then log(L_e/L_π) plus the invariant-mass cut. (Plots: signal efficiency and backgrounds after each stage of cuts.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 41 Construct a set of low-level analysis variables which are used to make a series of cuts to classify the events. Philosophy: This algorithm represents an independent cross check of the Track Based Analysis Analysis 2: Boosted Decision Trees (BDT)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 42 Step 1: Convert the “Fundamental information” into “Analysis Variables” Fundamental information from the PMTs: hit position, charge, and timing. Analysis variables: energy, time sequence, event shape, and physics quantities (“physics” = π0 mass, E_ν^QE, etc.).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 43 Examples of “Analysis Variables” Reconstructed quantities which are inputs to E_ν^QE, e.g. U_Z = cos θ_z and E_visible (shown for νμ CCQE events). Resolutions: vertex 24 cm, direction 3.8°, energy 14%.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 44 Step 2: Reduce Analysis Variables to a Single PID Variable Boosted decision trees: “A procedure that combines many weak classifiers to form a powerful committee” (Byron P. Roe, et al., NIM A543 (2005) 577). Hit-level information (charge, time, position) -> analysis variables -> one single PID “score.”
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 45 A Decision Tree A sequential series of cuts based on a MC study: at each node a variable and cut value split the sample into signal-like and background-like branches, starting here from 30,245 signal / 16,305 background training events and ending in increasingly pure signal-like and background-like leaves. This tree is one of many possibilities...
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 46 A set of decision trees can be developed, each re-weighting the events to enhance identification of backgrounds misidentified by earlier trees (“boosting”). For each tree, the event is assigned +1 if it is identified as signal, -1 if it is identified as background. The total over all trees is combined into a “score”: negative is background-like, positive is signal-like.
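The scoring step can be summarized in a few lines. The sketch below combines a toy ensemble of weak classifiers into a single score using generic AdaBoost-style weights; the trees, cut values, and weights are invented, and the actual MiniBooNE boosting procedure is the one described in the Roe et al. reference above.

```python
import numpy as np

def boosted_score(x, trees):
    """Combine an ensemble of weak classifiers into one PID score.

    trees : list of (predict, alpha) pairs, where predict(x) returns
            +1 (signal-like) or -1 (background-like) and alpha is the
            weight assigned to that tree during boosting.  Generic
            AdaBoost-style scheme, not the exact MiniBooNE code.
    """
    return sum(alpha * predict(x) for predict, alpha in trees)

# toy ensemble of three "trees" cutting on two analysis variables
trees = [
    (lambda v: 1 if v[0] > 0.2 else -1, 0.9),
    (lambda v: 1 if v[1] < 50.0 else -1, 0.7),
    (lambda v: 1 if v[0] + 0.01 * v[1] > 0.5 else -1, 0.4),
]
print(boosted_score(np.array([0.3, 40.0]), trees))   # positive => signal-like
```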
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 47 BDT cuts on PID score as a function of energy. We can define a “sideband” just outside of the signal region
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 49 BDT efficiency and backgrounds after cuts. (Plots: the analysis cut on PID score as a function of energy, the signal and background distributions, and the efficiency after precuts.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 50 Errors, Constraints and Sensitivity
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 51 We have two categories of backgrounds (TB analysis): νμ mis-id and intrinsic νe. Predictions of these backgrounds are among the nine sources of significant error in the analysis.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 52 Sources of uncertainty on the νe background (error in %, Track Based / Boosted Decision Tree; checks indicate “checked or constrained by MB data” and/or “further reduced by tying νe to νμ”):
Flux from π+/μ+ decay: 6.2 / 4.3 (√ √)
Flux from K+ decay: 3.3 / 1.0 (√ √)
Flux from K0 decay: 1.5 / 0.4 (√ √)
Target and beam models: 2.8 / 1.3 (√)
ν cross section: 12.3 / 10.5 (√ √)
NC π0 yield: 1.8 / 1.5 (√)
External interactions (“dirt”): 0.8 / 3.4 (√)
Optical model: 6.1 / 10.5 (√ √)
DAQ electronics model: 7.5 / 10.8 (√)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 53 Tying the νe background and signal prediction to the νμ flux constrains this analysis to a strict νe appearance-only search. From the νμ CCQE events we predict the normalization and energy dependence of both the background and the signal. Data/MC ratio: Boosted Decision Tree 1.22 ± 0.29, Track Based 1.32 ± 0.26.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 54 νμ constraint on the intrinsic νe from μ+ decay chains Measure the νμ flux; the π+ -> μ+ νμ decay kinematics connect the observed νμ spectrum to the π+ flux (E_ν ≈ 0.43 E_π). Once the π+ flux is known, the μ+ flux, and hence the νe flux from μ+ -> e+ νe ν̄μ, is determined.
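The 0.43 factor follows from two-body decay kinematics: for a relativistic π+ -> μ+ νμ decay, a neutrino emitted forward in the lab carries a fraction 1 - m_μ²/m_π² of the pion energy. A one-line check (this is just the kinematic limit, not the full spectrum connection used in the analysis):

```python
M_PI, M_MU = 139.570, 105.658   # MeV

# Forward-going neutrino from relativistic pi+ -> mu+ nu_mu decay
frac = 1.0 - (M_MU / M_PI) ** 2
print(round(frac, 3))   # 0.427, i.e. E_nu ~ 0.43 * E_pi
```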
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 55 K+ and K0 decay backgrounds At high energies, above the “signal range,” the νμ and “νe-like” events are largely due to kaon decay. (Plot: signal examples for Δm² = 0.4, 0.7, and 1.0 eV² vs. neutrino energy in GeV, arbitrary units, with the predicted range of significant oscillation signal, 300 < E_ν^QE < 1500 MeV, indicated.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 56 Use of low-signal/high-background energy bins In the Boosted Decision Tree analysis, the low energy bin (200 < E_ν^QE < 300 MeV) constrains the mis-ids: π0, Δ -> Nγ, dirt... In both analyses, the high energy bins (the fits extend up to 3000 MeV) constrain the intrinsic νe background.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 57 We constrain π0 production using NC π0 data from our own detector. Because this constrains the Δ resonance rate, it also constrains the rate of Δ -> Nγ. Reweighting the MC to match the measured π0 rate improves agreement in other variables as well, and reduces the error on predicted mis-identified π0s.
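A schematic of this kind of data-driven reweighting: compute the data/MC ratio in bins of reconstructed π0 momentum and apply it as a per-event weight to the MC. The binning and histogram contents below are invented placeholders, not the measured NC π0 sample.

```python
import numpy as np

def pi0_weights(mc_p_pi0, data_hist, mc_hist, edges):
    """Per-event weights = data/MC ratio in bins of reconstructed pi0
    momentum, applied to MC events.  Inputs are illustrative stand-ins."""
    ratio = np.divide(data_hist, mc_hist,
                      out=np.ones_like(data_hist, dtype=float),
                      where=mc_hist > 0)
    idx = np.clip(np.digitize(mc_p_pi0, edges) - 1, 0, len(ratio) - 1)
    return ratio[idx]

edges  = np.linspace(0.0, 1.5, 7)                    # GeV/c
data_h = np.array([120, 340, 280, 150, 60, 20], float)
mc_h   = np.array([100, 360, 300, 140, 70, 25], float)
print(pi0_weights(np.array([0.3, 0.9]), data_h, mc_h, edges))
```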
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 58 Other Single Photon Sources From Efrosinin, hep-ph/, calculation checked by Goldman, LANL. Neutral current: ν + N -> ν + N + γ. Charged current: ν + N -> μ + N’ + γ, where the presence of the γ leads to mis-identification; found to be negligible. Using events where the μ is tagged by its Michel e-, the mis-identification was studied with the BDT algorithm: < 6 events at 95% CL.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 59 External Sources of Background “Dirt” events: ν interactions outside of the detector. (Plot: event type of dirt events after PID cuts, with enhanced-background cuts.) N_data/N_MC = 0.99 ± 0.15. Cosmic rays: measured from out-of-beam data, 2.1 ± 0.5 events.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 60 Summary of predicted backgrounds for the final MiniBooNE result (Track Based Analysis): (example signal)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 61 Handling uncertainties in the analyses What we begin with: for a given source of uncertainty, errors on a wide range of parameters in the underlying model. What we need: for a given source of uncertainty, errors in bins of E_ν^QE and information on the correlations between bins.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 62 How the constraints enter... Two approaches. TB: reweight the MC prediction to match the measured νμ result (accounting for systematic error correlations). BDT: include the correlations of νμ to νe in the error matrix, via a χ² of the form χ² = Σ_ij (d_i - p_i)(M_ij)^-1 (d_j - p_j), where i, j are bins of E_ν^QE; the systematic (and statistical) uncertainties are included in (M_ij)^-1.
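For illustration, a covariance-based χ² of the form written above can be coded directly; the three-bin data, prediction, and covariance matrix below are toy numbers, not MiniBooNE values.

```python
import numpy as np

def chi2(data, pred, cov):
    """Chi-square with a full covariance matrix over E_nu^QE bins;
    cov carries the systematic (and statistical) uncertainties,
    including any nu_mu -- nu_e correlations."""
    diff = data - pred
    return diff @ np.linalg.solve(cov, diff)

# toy 3-bin example with correlated uncertainties
data = np.array([210.0, 140.0, 95.0])
pred = np.array([200.0, 150.0, 90.0])
cov  = np.array([[400.0, 120.0,  60.0],
                 [120.0, 300.0,  90.0],
                 [ 60.0,  90.0, 250.0]])
print(chi2(data, pred, cov))
```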
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 63 Example: Cross Section Uncertainties
Determined from MiniBooNE νμ QE data: M_A^QE, E_lo^SF: 6%, 2% (stat + bkg only); QE σ normalization: 10%; QE σ shape: function of E_ν; νe/νμ QE σ: function of E_ν.
Determined from MiniBooNE NC π0 data: NC π0 rate: function of π0 momentum; M_A^coh, coherent σ: ±25%; Δ -> Nγ rate: function of momentum, plus 7% on the branching fraction.
Determined from other experiments: E_B, p_F: 9 MeV, 30 MeV; Δs: 10%; M_A^1π: 25%; M_A^Nπ: 40%; DIS σ: 25%.
(Many of these are common to νμ and νe and cancel in the fit.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 64 Example: Optical Model Uncertainties 39 parameters must be varied; the allowed variations are set by the Michel calibration sample. To understand the allowed variations, we ran 70 hit-level simulations with differing parameters: “multisims.”
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 65 Using multisims to convert from errors on parameters to errors in E_ν^QE bins For each error source, “multisims” are generated within the allowed variations by reweighting the standard Monte Carlo; in the case of the optical model, hit-level simulations are used. (Plots: number of events passing cuts in the bin 500 < E_ν^QE < 600 MeV for 1000 multisims of K+ production and for the 70 optical-model multisims, compared to the standard MC.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 66 Error matrix elements: M_ij = (1/M) Σ_α (N_i^α - N_i^MC)(N_j^α - N_j^MC), where N is the number of events passing cuts, MC denotes the standard Monte Carlo, α labels a given multisim, M is the total number of multisims, and i, j are E_ν^QE bins. The total error matrix is the sum from each source. TB: νe-only total error matrix; BDT: combined νμ-νe total error matrix. (Plot: correlations between E_ν^QE bins from the optical model.)
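The error-matrix construction from multisims is a short calculation. This sketch forms the covariance of the multisim excursions about the standard MC prediction, following the formula above; the bin contents and Gaussian excursions are simulated placeholders.

```python
import numpy as np

def multisim_cov(n_multisim, n_standard):
    """Covariance between E_nu^QE bins from a set of multisims.

    n_multisim : array of shape (M, nbins) -- events passing cuts per bin
                 for each of the M parameter variations
    n_standard : array of shape (nbins,)   -- the standard MC prediction
    """
    diff = n_multisim - n_standard            # (M, nbins)
    return diff.T @ diff / len(n_multisim)    # (nbins, nbins)

rng = np.random.default_rng(0)
std = np.array([200.0, 150.0, 90.0])
sims = std + rng.normal(scale=[20.0, 15.0, 10.0], size=(70, 3))
print(multisim_cov(sims, std).round(1))
```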
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 67 As we show distributions in E_ν^QE, keep in mind that the error bars are the diagonals of the error matrix; the effect of correlations between E_ν^QE bins is not shown. However, the E_ν^QE bin-to-bin correlations improve the sensitivity to oscillations, since the oscillation fits are energy dependent.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 68 Sensitivity of the two analyses The Track-Based sensitivity is better, thus this becomes the pre-determined default algorithm. (90% CL sensitivity curves, set using Δχ².)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 69 Comparison to sensitivity goal for 5E20 POT determined by Fermilab PAC in 2003
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 70 The Initial Results
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 71
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 72 Box Opening Procedure After applying all analysis cuts: 1.Fit sequestered data to an oscillation hypothesis, returning no fit parameters. Return the 2 of the data/MC comparison for a set of diagnostic variables. 2. Open up the plots from step 1. The Monte Carlo has unreported signal. Plots chosen to be useful diagnostics, without indicating if signal was added. 3. Report the 2 for a fit to E QE, without returning fit parameters. 4.Compare E QE in data and Monte Carlo, returning the fit parameters. At this point, the box is open (March 26, 2007) 5.Present results two weeks later. Progress cautiously, in a step-wise fashion
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 73 Step 1 Return the χ² of the data/MC comparison for a set of diagnostic variables: 12 variables are tested for TB, 46 for BDT. All analysis variables were returned with good probability except the Track Based E_visible fit, whose χ² probability was 1%. This probability was sufficiently low to merit further consideration.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 74 In the Track Based analysis, we re-examined our background estimates using sideband studies and found no evidence of a problem. However, knowing that backgrounds rise at low energy, we tightened the cuts for the oscillation fit to E_ν^QE > 475 MeV, while agreeing to report events over the original full range, E_ν^QE > 300 MeV.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 75 Step 1: again! Return the χ² of the data/MC comparison for the diagnostic variables: 12 variables for TB (now with E_ν^QE > 475 MeV) and 46 for BDT. χ² probabilities were returned for all variables; the parameters of the oscillation fit were not.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 76 Step 2 Open up the plots from step 1 for approval. Examples of what we saw: E_visible for TB (E_ν^QE > 475 MeV) and the fitted energy (MeV) for BDT, with χ² probabilities of 28% and 59%. The MC contains fitted signal at an unknown level.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 77 Step 3 Report the χ² for a fit to E_ν^QE across the full energy range: TB (E_ν^QE > 475 MeV) χ² probability of fit: 99%; BDT analysis χ² probability of fit: 52%. Leading to... Step 4: open the box.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 78 The Track-Based νe Appearance-only Result Counting experiment, 475 < E_ν^QE < 1250 MeV: data: 380 events; expectation: 358 ± 19 (stat) ± 35 (sys) events; significance: 0.55 σ.
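As a quick cross-check of the quoted significance, the excess divided by the statistical and systematic errors added in quadrature reproduces the 0.55 σ figure (the same arithmetic gives the 3.7 σ low-energy deviation quoted a few slides later):

```python
import math

def excess_significance(n_obs, n_exp, stat_err, sys_err):
    """Naive significance of a counting excess: (obs - exp) over the
    stat and sys errors added in quadrature."""
    return (n_obs - n_exp) / math.hypot(stat_err, sys_err)

print(round(excess_significance(380, 358, 19, 35), 2))   # 0.55
```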
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 79 Track Based energy-dependent fit results: the data are in good agreement with the background prediction. Error bars are the diagonals of the error matrix. Fit errors for E > 475 MeV: normalization 9.6%, energy scale 2.3%. Best fit (dashed): (sin²2θ, Δm²) = (0.001, 4 eV²).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 80 The result of the νe appearance-only analysis is a limit on oscillations. Energy fit over 475 < E_ν^QE < 3000 MeV; χ² probability of the null hypothesis: 93%.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 81 As planned before opening the box, we report the full range 300 < E_ν^QE < 3000 MeV. Background-subtracted, there are 96 ± 17 ± 20 events above background for 300 < E_ν^QE < 475 MeV, a deviation of 3.7 σ, in contrast to the region above 475 MeV, which agrees with the background prediction.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 82 Fit to the > 300 MeV range: best fit (dashed): (sin²2θ, Δm²) = (1.0, 0.03 eV²), χ² probability: 18%. Examples in the LSND allowed range are also shown.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 83 This is interesting, but requires further investigation. A two-neutrino appearance-only model systematically disagrees with the shape as a function of energy. We need to investigate non-oscillation explanations, including unexpected behavior of low energy cross sections. This will be relevant to future νe searches and will be addressed by MiniBooNE and SciBooNE.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 84 Boosted Decision Tree Analysis Counting experiment, 300 < E_ν^QE < 1600 MeV: data: 971 events; expectation: 1070 ± 33 (stat) ± 225 (sys) events; significance: 0.38 σ (a small deficit).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 85 Boosted Decision Tree E_ν^QE data/MC comparison: error bars are statistical and systematic (diagonals of the error matrix). (Lower panel: (data - predicted, no oscillation)/error. The sidebands used for the constraint are not shown.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 86 The Boosted Decision Tree analysis shows no evidence for νe appearance-only oscillations. Energy-fit analysis limits: solid: TB; dashed: BDT. The independent analyses are in good agreement.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 87 Two points on interpreting our limit 1) There are various ways to present limits: single-sided raster scan (historically used, presented here), global scan, and the unified approach (the most recent method). 2) This result must be folded into an LSND-Karmen joint analysis (Church, et al., PRD 66); we will present a full joint analysis soon.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 88 A MiniBooNE-LSND Compatibility Test For each Δm², determine the MB and LSND measurements: z_MB ± δz_MB and z_LSND ± δz_LSND, where z = sin²(2θ) and δz is the 1σ error. For each Δm², form the χ² between the MB and LSND measurements. Find the z0 that minimizes the χ² (the weighted average of the two measurements); this gives χ²_min. Find the probability of χ²_min for 1 dof; this is the joint compatibility probability for this Δm².
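A toy version of this compatibility test, for a single Δm² value. The weighted average and χ²_min follow the procedure described above; the input central values and errors are purely illustrative, not the experiments' actual measurements.

```python
import numpy as np
from scipy import stats

def compatibility(z_mb, dz_mb, z_lsnd, dz_lsnd):
    """Joint compatibility of two measurements of z = sin^2(2*theta)
    at a fixed Delta m^2 (toy version of the test described above)."""
    w = np.array([1.0 / dz_mb**2, 1.0 / dz_lsnd**2])
    z0 = (w[0] * z_mb + w[1] * z_lsnd) / w.sum()      # weighted average
    chi2_min = ((z_mb - z0) / dz_mb) ** 2 + ((z_lsnd - z0) / dz_lsnd) ** 2
    return stats.chi2.sf(chi2_min, df=1)              # p-value, 1 dof

# purely illustrative numbers
print(compatibility(z_mb=0.0005, dz_mb=0.001, z_lsnd=0.0026, dz_lsnd=0.001))
```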
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 89 Maximum joint probability vs. Δm² (eV²): MiniBooNE is incompatible with a νμ -> νe appearance-only interpretation of LSND at the 98% CL.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 90 Plans: A paper on this analysis will be posted to the “archive” (arXiv) and to the MiniBooNE webpage after 5:00 CT today. Many more papers supporting this analysis will follow in the very near future: νμ CCQE production, π0 production, and a MiniBooNE-LSND-Karmen joint analysis. We are pursuing further analyses of the neutrino data, including an analysis which combines TB and BDT, and more exotic models for the LSND effect. MiniBooNE is presently taking data in antineutrino mode.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 91 Conclusions
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 92 Our goals for this first analysis were: a generic search for a νe excess in our beam, and an analysis of the data within a νe appearance-only context.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 93 Within the energy range defined by this oscillation analysis, the event rate is consistent with background. The observed low energy deviation is under investigation.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 94 The observed reconstructed energy distribution is inconsistent with a νe appearance-only model; therefore we set a limit on νμ -> νe appearance.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 95 We Thank: DOE and NSF The Fermilab Divisions and Staff
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 96 Conclusions and Outlook MiniBooNE has been running for over three years, and continues to run well in the NuMI era. The analysis tools are well developed and are being refined to achieve the quality necessary to release the result of our blind analysis. Recent results for CCQE and CC π+ production give us confidence in our understanding of the detector and data. Look forward to many interesting results in 2006.
Backup Slides
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 98 Sterile Neutrinos: Astrophysics Constraints (courtesy M. Shaevitz) Constraints on the number of neutrinos from BBN and CMB: the standard analysis gives an N_ν = 2.6 ± 0.4 constraint; if the 4He systematics are larger, then N_ν = 4.0 ± 2.5; if there is a neutrino lepton asymmetry or non-equilibrium, then the BBN limit can be evaded (K. Abazajian, hep-ph/; G. Steigman, hep-ph/). “One result of this is that the LSND result is not yet ruled out by cosmological observations.” (Hannestad, astro-ph/) Bounds on the neutrino masses also depend on the number of neutrinos (active and sterile): the allowed Σ m_i is 1.4 (2.5) eV for 4 (5) neutrinos (curves for N_ν = 3 solid, 4 dotted, 5 dashed; Hannestad, astro-ph/).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 99 Reactor Experimental Results Single-reactor experiments (Chooz, Bugey, etc.) look for ν̄e disappearance: all negative. KamLAND (a single scintillator detector looking at ALL Japanese reactors): ν̄e disappearance consistent with mixing.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 100 K2K The first “long baseline” accelerator experiment: a beam from the KEK PS to Super-Kamiokande, 250 km away. Look for νμ disappearance (the atmospheric “problem”). Results are consistent with mixing. (Plot: best fit, no-mixing expectation, and the allowed mixing region.)
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 101 Modeling Production of Secondary Pions HARP (CERN): 5% λ beryllium target, 8.9 GeV/c proton beam momentum. The data are fit to a Sanford-Wang parameterization (HARP collaboration, hep-ex/).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 102 Everybody Loves a Mystery 3+2 sterile neutrinos: Sorel, Conrad, and Shaevitz (hep-ph/). MaVaN & 3+1: Hung (hep-ph/). Sterile neutrinos to explain dark energy? Kaplan, Nelson, and Weiner (hep-ph/). CPT violation and 3+1 neutrinos, to explain the matter/antimatter asymmetry: Barger, Marfatia & Whisnant (hep-ph/). Lorentz violation: Kostelecky & Mewes (hep-ph/). Extra dimensions: Pas, Pakvasa, & Weiler (hep-ph/). Sterile neutrino decay: Palomares-Ruiz, Pascoli & Schwetz (hep-ph/).
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 103 Three Generation Mixing (driven by the experiments listed) General mixing parameterization, including a CP violating phase. Quarks: the mixing matrix is almost diagonal, with the third generation weakly coupled to the first two (“Wolfenstein parameterization”). Neutrinos: the mixing is large and there is no easy simplification; think of the mass and weak eigenstates as essentially separate.
HEP Seminar, UT Austin, April 30 th, 2007 – E. Prebys 104 SNO Solar Neutrino Result Looked for Cerenkov signals in a large detector filled with heavy water, focusing on 8B neutrinos. Used 3 reactions: νe + d -> p + p + e- (only sensitive to νe); νx + d -> p + n + νx (equally sensitive to νe, νμ, ντ); νx + e- -> νx + e- (about 6 times more sensitive to νe than to νμ, ντ). Consistent with the initial full SSM flux of νe’s mixing to νμ, ντ, and completely consistent with three generation mixing. (Plots: just SNO, and SNO combined with other experiments.)