
     Last updated: September 16, 2019



IceCube Institutional Memorandum Of Understanding (MOU)

 



University of Maryland


Greg Sullivan


Ph.D. Scientists (Faculty Scientist/Post Doc Grads): 6 (3 3 2)


Scope of Work

 

 

 

 

WBS L2 key: WBS 2.1 = Program Coordination; WBS 2.2 = Detector Maintenance & Operations; WBS 2.3 = Computing & Data Management; WBS 2.4 = Data Processing & Simulation; WBS 2.5 = Software; WBS 2.6 = Calibration.

Labor Cat. | Names | WBS L3 | Tasks | Funds Source | WBS 2.1 | WBS 2.2 | WBS 2.3 | WBS 2.4 | WBS 2.5 | WBS 2.6 | Grand Total
KE | SULLIVAN, GREG | Administration | M&O planning | Inst. In-Kind | 0.30 |  |  |  |  |  | 0.30
 |  | Administration | ExecCom member | Inst. In-Kind | 0.20 |  |  |  |  |  | 0.20
 | SULLIVAN, GREG Total |  |  |  | 0.50 |  |  |  |  |  | 0.50
 | HOFFMAN, KARA | Engineering and R&D Support | Detector R&D | Inst. In-Kind | 0.20 |  |  |  |  |  | 0.20
 | HOFFMAN, KARA Total |  |  |  | 0.20 |  |  |  |  |  | 0.20
 | UMD KE | Education & Outreach | E&O | Inst. In-Kind | 0.10 |  |  |  |  |  | 0.10
 | UMD KE Total |  |  |  | 0.10 |  |  |  |  |  | 0.10
SC | BLAUFUSS, ERIK | Online Filter (PnF) | Maintain PnF software and online filters | NSF M&O Core |  | 0.20 |  |  |  |  | 0.20
 |  | Administration | ICC member | Base Grants | 0.05 |  |  |  |  |  | 0.05
 |  | Simulation Production | Simulation production site manager | Base Grants |  |  |  | 0.10 |  |  | 0.10
 |  | Core Software Support | Core Software | NSF M&O |  |  | 0.10 |  |  |  | 0.10
 |  | Online Filter (PnF) | Filter requests, bandwidth, TFT Board member, IceTray | NSF M&O |  | 0.10 |  |  |  |  | 0.10
 | BLAUFUSS, ERIK Total |  |  |  | 0.05 | 0.30 | 0.10 | 0.10 |  |  | 0.55
PO | OLIVAS, ALEX | Detector Maintenance & Operations | SW Coordinator – Detector M&O | NSF M&O |  | 0.35 |  |  |  |  | 0.35
 |  | Administration | ICC | Base Grants | 0.05 |  |  |  |  |  | 0.05
 |  | Computing & Data Management | SW Coordinator – Core Software | NSF M&O |  |  | 0.20 |  |  |  | 0.20
 |  | Data Quality, Reconstruction & Simulation Tools | SW Coordinator – Data Quality, Reconstruction and Simulation | NSF M&O |  |  |  |  | 0.25 |  | 0.25
 |  | Core Software Support | Core Software | Inst. In-Kind |  |  | 0.05 |  |  |  | 0.05
 | OLIVAS, ALEX Total |  |  |  | 0.05 | 0.35 | 0.25 |  | 0.25 |  | 0.90
 | LARSON, MICHAEL | Simulation | Low energy tools | Base Grants |  |  |  | 0.20 |  |  | 0.20
 |  | Reconstruction | Develop & test reconstruction | Base Grants |  |  |  |  | 0.05 |  | 0.05
 |  | Online Filter (PnF) | Near real-time alerts/GRB | Base Grants |  | 0.10 |  |  |  |  | 0.10
 | LARSON, MICHAEL Total |  |  |  |  | 0.10 |  | 0.20 | 0.05 |  | 0.35
GR | EVANS, JOHN (TBD) |  |  | Base Grants |  |  |  |  |  |  | 0.00
 | EVANS, JOHN Total |  |  |  |  |  |  |  |  |  | 0.00
 | FRIEDMAN, LIZ | Engineering and R&D Support | Detector R&D | Base Grants | 0.25 |  |  |  |  |  | 0.25
 |  | Online Filter (PnF) | Real-time & near real-time alerts |  |  | 0.20 |  |  |  |  | 0.20
 |  | Offline Data Production | Neutrino Sources Data Curator | Base Grants |  |  |  | 0.20 |  |  | 0.20
 | FRIEDMAN, LIZ Total |  |  |  | 0.25 | 0.20 |  | 0.20 |  |  | 0.65
 | UMD GR | Detector Monitoring | Monitoring shifts | Base Grants |  | 0.06 |  |  |  |  | 0.06
 | UMD GR Total |  |  |  |  | 0.06 |  |  |  |  | 0.06
UMD Total |  |  |  |  | 1.15 | 1.01 | 0.35 | 0.50 | 0.30 |  | 3.31
 

 

 


UPGRADE


 

WBS key: WBS 1.6 = Data Systems.

Labor Cat. | Names | WBS L3 | Tasks | Funds Source | WBS 1.6 | Grand Total
KE | SULLIVAN, GREG | Administration | Upgrade planning | Inst. In-Kind | 0.04 | 0.04
 | SULLIVAN, GREG Total |  |  |  | 0.04 | 0.04
 | UMD KE Total |  |  |  | 0.04 | 0.04
SC | BLAUFUSS, ERIK | Administration | L2 Manager | NSF M&O Core | 0.45 | 0.45
 | BLAUFUSS, ERIK Total |  |  |  | 0.45 | 0.45
PO | OLIVAS, ALEX | SW Coordinator | Offline software and simulation upgrade coordination | NSF M&O | 0.10 | 0.10
 | OLIVAS, ALEX Total |  |  |  | 0.10 | 0.10
UMD Total |  |  |  |  | 0.59 | 0.59
 

 

 

Faculty:

Greg Sullivan (L,+) – Former Spokesperson, Data Systems, ExecCom, ICB, Institution lead, Outreach, NGIC upgrade coordination

Kara Hoffman – filter development, Radio R&D, Outreach

Jordan Goodman – Coordination with Milagro/HAWC, Outreach

Scientists and Post Docs:

Erik Blaufuss – Former Analysis Coordinator, TFT Board member, PnF, IceTray, SVN repository, Operations Group, ICC, Upgrade L2 for Data Systems, ROC committee

Michael Larson – Online/near real-time GRB analysis; extending the point source sample to the low-energy GRECO data set

Alex Olivas – Tuesday Call co-convener, software management, Software Coordinator

 

Ph.D. Students:  

Liz Friedman – Core Software, datasets for filter testing, Detector R&D, Point Source WG Data Curator

 Thesis/Analysis topics: GRBs, real-time alerts and response

John Evans – Just started this Fall semester; his responsibilities are still being worked out

 Thesis: not yet selected (started Fall 2019)

TBD – 1 student

 

 

 

UMD General M&O (non-science) IceCube Responsibilities and Contributions:

The Maryland Group’s major responsibilities and contributions towards maintenance and operations of the IceCube experiment include:

·   Primary institutional responsibility for the maintenance of the online PnF filter system.

·   Primary institutional responsibility for the maintenance of the IceTray analysis framework, SVN code repository and software package building.

·   Major responsibility for the maintenance of the IceCube simulation package (IceSim).

·   Software Coordinator: Alex Olivas

·   IceCube Upgrade L2 manager for Data Systems and M&O integration

·   The Maryland group maintains a computing cluster of about 750 CPU cores and 48 GPU boards (24 GTX980, 24 GTX1080) with online disk storage of more than 350 TB dedicated to IceCube activities. A minimum of 350 CPU cores and all GPUs are reserved for dedicated simulation production under the coordination of the IceCube simulation production manager. Maryland also provides resources to host and maintain a 64 GPU card system for UW.

Institutional (UMD) resource contribution to Computing:

The maintenance and operation of the computing cluster includes:

1.  High-quality computing space, cooling, and power (provided by UMD)

2.  Networking and high-speed connectivity to the Internet (provided by UMD)

3.  System administration (0.5 FTE sys-admin) (partially supported by UMD)

4.  Hardware maintenance on a 5-year replacement cycle, at $40k/year (partially supported by UMD)

5.  First ½ of the GPU cluster purchased by UMD ($80k) and maintained with help from UMD

6.  Hosting and maintaining an additional 64 GPU cards provided by UW for MC production

1. & 2. Computing Space, cooling and power & Networking and high speed connectivity to the Internet

The University of Maryland provides high-quality space, cooling, and power. The IceCube group is provided essentially unlimited space in a modern HPC computing facility for research computing on campus. The facility is monitored 24/7 by technicians provided by the facility, and the group has 24/7 secure access. The current Maryland-IceCube system occupies 10 rack spaces, with additional space set aside for possible expansion.

Maryland also hosts the UW GPU system: 8 GPU nodes housing 64 GPU cards.

Maryland is a major hub for the Internet-2 backbone in the northeast US. The University provides a 10 Gb/s fiber connection directly from the Internet-2 backbone into our cluster in the research computing facility. In addition, the university provides a dedicated fiber between the research computing facility and our research group in the physics building.

3., 4. & 5. System administration & Hardware maintenance on a 5-year replacement cycle

The University of Maryland provides $40k per year in funding towards these costs. The system administration effort is approximately 0.5 FTE and covers the computing cluster as well as about a dozen workstations used by the PA group. The hardware maintenance for the compute cluster is $40k per year.

Computing Resources

 | 2017 CPU Cores | 2017 GPU Cards | 2018 CPU Cores | 2018 GPU Cards
IceCube | 350 guaranteed of 750 total | 24 (GTX980) + 24 (GTX1080) | 350 guaranteed of 1000 total | 48 (GTX980/1080) + 64 new GPU cards hosted for UW




Maryland_MoU_SOW_2019.0916