1. Faculty:
2. Scientists and Post Docs:
3. Ph.D. Students:
4. Diploma/Master Students:
5. Description of Service work
6. Computing Resources

Last updated: September 19, 2018


IceCube Institutional Memorandum Of Understanding (MOU)

Scope Of Work

 

Niels Bohr Institute, Københavns Universitet

D. Jason Koskinen

Ph.D. Scientists (Faculty Scientist/Post Doc Grads): 4 (2 3 1)

 

WBS L2 column key: 2.1 Program Coordination; 2.2 Detector Maintenance & Operations; 2.3 Computing & Data Management; 2.4 Data Processing & Simulation; 2.5 Software; 2.6 Calibration. All values in FTE.

Labor Cat. | Name                     | WBS L3 Task                                                                           | 2.1  | 2.2  | 2.3  | 2.4  | 2.5  | 2.6  | Grand Total
KE         | KOSKINEN, D. JASON       | 2.1.1 Administration - Pubcom member                                                  | 0.10 |      |      |      |      |      | 0.10
           |                          | 2.1.4 Education & Outreach - Speaking engagements (high school classes, open houses)  | 0.05 |      |      |      |      |      | 0.05
           | KOSKINEN, D. JASON Total |                                                                                       | 0.15 |      |      |      |      |      | 0.15
PO         | RAMEEZ, MOHAMED          | 2.5.3 Reconstruction - Low energy reconstruction                                      |      |      |      |      | 0.15 |      | 0.15
           | STUTTARD, TOM            | 2.4.1 Offline Data Production - OscNext Event Selection                               |      |      |      | 0.10 |      |      | 0.10
           |                          | 2.2.3 Online Filter (PnF) - Oscillation WG co-convenor                                |      | 0.25 |      |      |      |      | 0.25
           |                          | 2.5.1 Core Software - PISA                                                            |      |      |      |      | 0.10 |      | 0.10
           |                          | 2.1.2 Engineering and R&D Support - Upgrade L3 simulation                             | 0.15 |      |      |      |      |      | 0.15
           | NBI PO Total             |                                                                                       | 0.15 | 0.25 |      | 0.10 | 0.25 |      | 0.75
GR         | BOURBEAU, ETIENNE        | 2.6.1 Detector Calibration - Dedicated measurements of coincident noise               |      |      |      |      |      | 0.30 | 0.30
           |                          | 2.6.1 Detector Calibration - Individual DOM efficiency                                |      |      |      |      |      | 0.10 | 0.10
           |                          | 2.2.4 Detector Monitoring - Monitoring shifts                                         |      | 0.05 |      |      |      |      | 0.05
           | NBI GR Total             |                                                                                       |      | 0.05 |      |      |      | 0.40 | 0.45
           | NBI Total                |                                                                                       | 0.30 | 0.30 |      | 0.10 | 0.25 | 0.40 | 1.35
 
Note: Gen-2 contributions not relevant for IceCube M&O are highlighted in blue (Total: 0.15 FTE)

 

Contribution from Master Students

WBS L2 column key: 2.1 Program Management; 2.2 Detector Maintenance & Operations; 2.3 Computing & Data Management; 2.4 Triggering & Filtering; 2.5 Data Quality, Reconstruction & Simulation Tools. All values in FTE.

Labor Cat.      | Names                          | Tasks            | 2.1 | 2.2 | 2.3 | 2.4 | 2.5 | Grand Total
Master Students | Ida Storehaug & Thomas Halberg | MCEq, DirectReco |     |     |     |     | 1.0 | 1.0
                | NBI Master Student Total       |                  |     |     |     |     | 1.0 | 1.0
 


Faculty:

D. Jason Koskinen: PINGU/IC-Upgrade and low-energy simulation, tau neutrino appearance, 100% IceCube

Markus Ahlers: cosmic-ray anisotropy analysis, neutrino sources

[Subir Sarkar, representing Oxford U. on the ICB, was also a Niels Bohr Professor, spending 50% of his time at NBI until September 30, 2018.]


Scientists and Post Docs: 

Tom Stuttard, Mohamed Rameez, Morten Medici


Ph.D. Students:

Etienne Bourbeau: Maintainer of Vuvuzela noise model, 2MRS correlation w/ IceCube multiplets, SNOLab DOM noise

Thesis/Analysis topics: Extended Tau Neutrino Appearance Measurement in DeepCore

 


Diploma/Master Students: 

A total of 1.0 FTE of service work is performed by MSc students:

Lea Halser: A Comprehensive Study of Neutrino Transients with IceCube DeepCore/Upgrade

Ida Storehaug (atm. nu flux systematics and MCEq), Thomas Halberg (DirectReco for DeepCore and IC-Upgrade), and Mia Nielsen (LE transient search using GRECO)

 


Description of Service work

We generated DeepCore/PINGU/IC-Upgrade MuonGun files with our 10-card GPU farm. NBI is responsible for refining and maintaining the correlated noise simulation. Etienne spent months at SNOLab making underground DOM measurements. Tom organized and hosted the week-long IC-Upgrade Sim/Reco workshop. For the IC-Upgrade, Rameez is working on mDOM reconstruction. Ida is working on integrating and verifying MCEq for atmospheric flux systematics. Thomas is working on getting DirectReco tested with DeepCore and expanding it for use with mDOMs in the IC-Upgrade.
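For context, a minimal sketch (in Python) of the kind of MCEq call used to obtain an atmospheric neutrino flux on MCEq's energy grid; the hadronic interaction model and primary cosmic-ray flux chosen below are illustrative assumptions, not necessarily the configuration used in the systematics study:

    # Minimal MCEq usage sketch; model choices below are illustrative assumptions.
    import crflux.models as crf
    from MCEq.core import MCEqRun

    # Set up the cascade-equation solver for a single zenith angle.
    mceq = MCEqRun(
        interaction_model="SIBYLL23C",                  # hadronic interaction model (assumption)
        primary_model=(crf.HillasGaisser2012, "H3a"),   # primary cosmic-ray flux model (assumption)
        theta_deg=0.0,                                  # vertical atmospheric showers
    )

    # Solve the coupled cascade equations through the atmosphere.
    mceq.solve()

    # Atmospheric muon-neutrino flux on the solver's energy grid, scaled by E^3.
    e_grid = mceq.e_grid                                # energies in GeV
    numu_flux = mceq.get_solution("total_numu", mag=3)

Varying the interaction and primary-flux models in a setup like this is one way to assess atmospheric flux systematics for the oscillation analyses.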

The NBI IceCube group also organized a local Masterclass for students from regional high schools. Markus summarized IceCube's particle physics activities at the NBI Discovery Day.

 



Computing Resources
 

         | 2016                  | 2017
         | CPU Cores | GPU Cards | CPU Cores | GPU Cards
IceCube  | 0         |           | 0         |
PINGU    | 0         | 10        | 0         | 10
Gen2     |           |           |           |

 
Due to offsite and remote access issues, additional work is required to set up the glide-ins necessary for automated simprod production.
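As a rough, hedged illustration only (the wrapper script name, resource requests, and use of the HTCondor Python bindings are assumptions, not our production configuration), a GPU simprod-style job that glide-in slots could match against might be submitted like this:

    # Hypothetical sketch: submitting GPU jobs via the HTCondor Python bindings
    # (requires a reasonably recent HTCondor; all names and values are placeholders).
    import htcondor

    submit = htcondor.Submit({
        "executable": "run_simprod.sh",     # hypothetical wrapper around the simulation chain
        "arguments": "$(ProcId)",
        "request_gpus": "1",                # glide-in slots advertising GPUs can match this
        "request_cpus": "1",
        "request_memory": "4GB",
        "output": "simprod.$(ProcId).out",
        "error": "simprod.$(ProcId).err",
        "log": "simprod.log",
    })

    schedd = htcondor.Schedd()
    schedd.submit(submit, count=10)         # queue ten jobs on the local schedd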
 
We have 10 K20 cards with the following setup.
 
- 2x E5-2650v2 (8 cores @ 2.6 GHz, 10% faster than E5-2670)
- 64 GB memory
- Max. 4x Nvidia K10 GPUs, with full bandwidth (PCIe 3 x16) to all GPUs simultaneously
