April 22, 2019
IceCube Institutional Memorandum Of Understanding (MOU)
Michigan State University
Tyce DeYoung
Ph.D. Scientists: 7 (5 Faculty, 2 Post Docs); Grad Students: 5
Scope of Work
Maintenance and Operations
Labor categories: KE = key personnel (faculty); PO = post doc; GR = graduate student. WBS L2 columns: 2.1 Program Coordination; 2.2 Detector Maintenance & Operations; 2.3 Computing & Data Management; 2.4 Data Processing & Simulation; 2.5 Software; 2.6 Calibration. Entries are FTE fractions.

| Labor Cat. | Name | WBS L3 Task | Funds Source | 2.1 | 2.2 | 2.3 | 2.4 | 2.5 | 2.6 | Grand Total |
|---|---|---|---|---|---|---|---|---|---|---|
| KE | DeYOUNG, TYCE | Outreach | Base Grants | 0.05 | | | | | | 0.05 |
| | | Executive committee | Inst. In-Kind | 0.05 | | | | | | 0.05 |
| | DeYOUNG, TYCE Total | | | 0.10 | | | | | | 0.10 |
| KE | GRANT, DARREN | Collaboration Spokesperson | Inst. In-Kind | 0.50 | | | | | | 0.50 |
| | | Executive committee | Inst. In-Kind | 0.20 | | | | | | 0.20 |
| | | Outreach | Inst. In-Kind | 0.05 | | | | | | 0.05 |
| | GRANT, DARREN Total | | | 0.75 | | | | | | 0.75 |
| KE | KOPPER, CLAUDIO | IceTray framework maintenance | Inst. In-Kind | | | | | 0.05 | | 0.05 |
| | | Maintenance of clsim photon propagation tool | Inst. In-Kind | | | | | 0.10 | | 0.10 |
| | | Diffuse WG co-chair | Inst. In-Kind | | 0.25 | | | | | 0.25 |
| | | Offline Processing Support / pass2 | Inst. In-Kind | | | | 0.10 | | | 0.10 |
| | | GPU computing resources | Inst. In-Kind | | | | 0.05 | | | 0.05 |
| | | IceCube Masterclass | Inst. In-Kind | 0.10 | | | | | | 0.10 |
| | KOPPER, CLAUDIO Total | | | 0.10 | 0.25 | | 0.15 | 0.15 | | 0.65 |
| KE | TOLLEFSON, KIRSTEN | | | | | | | | | 0.00 |
| KE | MAHN, KENDALL | Integration/development of GENIE for low energy systematics | Inst. In-Kind | | | | | 0.05 | | 0.05 |
| | MAHN, KENDALL Total | | | | | | | 0.05 | | 0.05 |
| PO | NISA, MEHR | Simulation production site manager at MSU HPCC | NSF M&O Core | | | 0.25 | | | | 0.25 |
| | | Monitoring shift | Inst. In-Kind | | 0.03 | | | | | 0.03 |
| | NISA, MEHR Total | | | | 0.03 | 0.25 | | | | 0.28 |
| PO | HALLIDAY, ROBERT | In-situ DOM sensitivity / angular response calibration from muon neutrinos | Inst. In-Kind | | | | | | 0.10 | 0.10 |
| | HALLIDAY, ROBERT Total | | | | | | | | 0.10 | 0.10 |
| GR | NEER, GARRETT | | | | | | | | | 0.00 |
| GR | NOWICKI, SARAH | | | | | | | | | 0.00 |
| GR | RYSEWYCK CANTU, DEVYN | IceCube Masterclass | Base Grants | 0.10 | | | | | | 0.10 |
| | RYSEWYCK CANTU, DEVYN Total | | | 0.10 | | | | | | 0.10 |
| GR | SANCHEZ HERRERA, SEBASTIAN | Detector calibration | Inst. In-Kind | | | | | | 0.10 | 0.10 |
| | SANCHEZ HERRERA, SEBASTIAN Total | | | | | | | | 0.10 | 0.10 |
| GR | MICALLEF, JESSIE | Education & Outreach | NSF Grad Fellowship | 0.05 | | | | | | 0.05 |
| | | Core Software | NSF Grad Fellowship | | | | | 0.20 | | 0.20 |
| | MICALLEF, JESSIE Total | | | 0.05 | | | | 0.20 | | 0.25 |
| | MSU Total | | | 1.10 | 0.28 | 0.25 | 0.15 | 0.40 | 0.20 | 2.38 |
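Because the FTE bookkeeping above is spread over six WBS columns, the totals are easy to mis-transcribe. The following is a minimal sketch of an arithmetic cross-check, illustrative only and not part of the MOU process: it encodes the task rows from the table above and verifies the per-WBS MSU totals and the 2.38 FTE grand total.

```python
# Cross-check the M&O FTE table: each entry is
# (name, task, funds source, WBS L2 column, FTE), transcribed from above.
from collections import defaultdict

TASKS = [
    ("DeYOUNG, TYCE", "Outreach", "Base Grants", "2.1", 0.05),
    ("DeYOUNG, TYCE", "Executive committee", "Inst. In-Kind", "2.1", 0.05),
    ("GRANT, DARREN", "Collaboration Spokesperson", "Inst. In-Kind", "2.1", 0.50),
    ("GRANT, DARREN", "Executive committee", "Inst. In-Kind", "2.1", 0.20),
    ("GRANT, DARREN", "Outreach", "Inst. In-Kind", "2.1", 0.05),
    ("KOPPER, CLAUDIO", "IceTray framework maintenance", "Inst. In-Kind", "2.5", 0.05),
    ("KOPPER, CLAUDIO", "clsim maintenance", "Inst. In-Kind", "2.5", 0.10),
    ("KOPPER, CLAUDIO", "Diffuse WG co-chair", "Inst. In-Kind", "2.2", 0.25),
    ("KOPPER, CLAUDIO", "Offline Processing Support / pass2", "Inst. In-Kind", "2.4", 0.10),
    ("KOPPER, CLAUDIO", "GPU computing resources", "Inst. In-Kind", "2.4", 0.05),
    ("KOPPER, CLAUDIO", "IceCube Masterclass", "Inst. In-Kind", "2.1", 0.10),
    ("MAHN, KENDALL", "GENIE integration/development", "Inst. In-Kind", "2.5", 0.05),
    ("NISA, MEHR", "Simulation production site manager", "NSF M&O Core", "2.3", 0.25),
    ("NISA, MEHR", "Monitoring shift", "Inst. In-Kind", "2.2", 0.03),
    ("HALLIDAY, ROBERT", "In-situ DOM calibration", "Inst. In-Kind", "2.6", 0.10),
    ("RYSEWYCK CANTU, DEVYN", "IceCube Masterclass", "Base Grants", "2.1", 0.10),
    ("SANCHEZ HERRERA, SEBASTIAN", "Detector calibration", "Inst. In-Kind", "2.6", 0.10),
    ("MICALLEF, JESSIE", "Education & Outreach", "NSF Grad Fellowship", "2.1", 0.05),
    ("MICALLEF, JESSIE", "Core Software", "NSF Grad Fellowship", "2.5", 0.20),
]

per_wbs = defaultdict(float)
for _name, _task, _funds, wbs, fte in TASKS:
    per_wbs[wbs] += fte

# Expected MSU totals from the bottom row of the table.
expected = {"2.1": 1.10, "2.2": 0.28, "2.3": 0.25, "2.4": 0.15, "2.5": 0.40, "2.6": 0.20}
for wbs, total in expected.items():
    assert abs(per_wbs[wbs] - total) < 1e-9, (wbs, per_wbs[wbs])
assert abs(sum(per_wbs.values()) - 2.38) < 1e-9  # grand total
print("All MSU M&O totals check out.")
```

Running the check confirms the column sums; it is also how the per-person totals (e.g., 0.25 FTE for MICALLEF) reconcile with the 2.38 FTE institutional total.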
IceCube Upgrade
Labor categories: KE = key personnel (faculty); EN = engineer; PO = post doc. WBS L2 columns: 1.1 Project Management; 1.2 Drilling; 1.3 Sensors; 1.4 Comms, Power, Timing; 1.5 Calibration; 1.6 Data Systems. Entries are FTE fractions.

| Labor Cat. | Name | WBS L3 Task | Funds Source | 1.1 | 1.2 | 1.3 | 1.4 | 1.5 | 1.6 | Grand Total |
|---|---|---|---|---|---|---|---|---|---|---|
| KE | DeYOUNG, TYCE | Management | NSF / Inst. In-Kind | | | | 0.50 | | | 0.50 |
| | DeYOUNG, TYCE Total | | | | | | 0.50 | | | 0.50 |
| KE | GRANT, DARREN | MSU mDOM production facility | Inst. In-Kind | | | 0.25 | | | | 0.25 |
| | GRANT, DARREN Total | | | | | 0.25 | | | | 0.25 |
| EN | NG, CHRISTOPHER | Penetrator cable assemblies, main cable mechanical / string hardware | NSF | | | | 0.75 | | | 0.75 |
| | | Design, set up and maintain NTS | NSF | | | | 0.25 | | | 0.25 |
| | NG, CHRISTOPHER Total | | | | | | 1.00 | | | 1.00 |
| EN | FERGUSON, BRIAN | Main cable electrical, procurement support | NSF | | | | 0.25 | | | 0.25 |
| | FERGUSON, BRIAN Total | | | | | | 0.25 | | | 0.25 |
| PO | HALLIDAY, ROBERT | Engineering support | Inst. In-Kind | | | | 0.25 | | | 0.25 |
| | HALLIDAY, ROBERT Total | | | | | | 0.25 | | | 0.25 |
| | MSU Total | | | | | 0.25 | 2.00 | | | 2.25 |
Faculty:
Tyce DeYoung (Institutional Lead) – ExecCom, outreach, Upgrade L2 lead for Comms, Power, Timing (CPT); 100% IceCube
Darren Grant – Spokesperson, outreach, Upgrade mDOM production; 100% IceCube
Claudio Kopper – Diffuse WG co-convener, software development, offline processing support, outreach, 100% IceCube
Kirsten Tollefson – 50% IceCube (50% HAWC)
Kendall Mahn – low energy systematics/GENIE, 5% IceCube (95% GENIE, T2K, DUNE)
Scientists and Engineers:
Chris Ng – engineering support for Upgrade WBS 1.3 and 1.4; 100% IceCube
Brian Ferguson – engineering support for Upgrade WBS 1.4; 25% IceCube
Post Docs:
Mehr Nisa – simprod and distributed computing support (porting simprod to new cluster environment), monitoring shifts
Thesis/Analysis topics: neutrino/multi-messenger astronomy
Robert Halliday – DOM response/simulation calibration using neutrino-induced muons, Upgrade cable design/engineering
Thesis/Analysis topics: diffuse spectrum measurement/global fits
Ph.D. Students:
Garrett Neer
Thesis/Analysis topics: dark matter search using LE contained events
Sarah Nowicki
Thesis/Analysis topics: dark matter search using LE contained events
Devyn Rysewyck Cantu – IceCube Masterclass
Thesis/Analysis topics: Extended Galactic source search, IceACT R&D
Sebastian Sanchez Herrera – DOM response/simulation calibration using neutrino-induced muons
Thesis/Analysis topics: Anti-neutrino tagging at low energy using Michel electrons
Jessie Micallef (funded by NSF Grad Fellowship) – Core Software: software strike team / clsim development and maintenance. Education & Outreach: CUWiP organizer, IceCube Masterclass
Thesis/Analysis topics: Next-generation oscillation analysis
Computing Resources:
2019 Pledge

| | CPU Cores | GPU Cards |
|---|---|---|
| IceCube | 750 | 100 |
The Michigan State IceCube group provides the collaboration with access to several large computing clusters maintained and administered by the Michigan State High Performance Computing Center (HPCC) in the Institute for Cyber-Enabled Research. The HPCC includes more than 21,716 computing cores and 478 GPUs (a mix of Tesla V100, K80, and K20 cards). The IceCube group has purchased 750 cores and 8 GPUs in the cluster.
Cluster scheduling policy allows opportunistic use of all compute resources by campus users for jobs shorter than 4 hours. Jobs using specialized resources, such as GPUs, have priority over all other jobs queued for those nodes. When suitable jobs are available from the IceProd servers, average (peak) allocations of 1000 (3000) CPU cores and 150 (300) GPUs have been obtained.
A new compute environment was rolled out on the HPCC in 2018, and we have not yet finished adapting our local scripts to it. The pledge therefore represents a conservative estimate of the resources that will be available to IceProd jobs of less than 4 hours' duration once that adaptation is complete.
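As an illustration of how a job can be shaped to fit this opportunistic window, the sketch below writes a batch script whose walltime stays under the 4-hour cap and that optionally requests a GPU, so it is prioritized on GPU nodes. This is a minimal sketch under stated assumptions, not MSU/ICER tooling: the directives assume the new HPCC environment uses a SLURM-style scheduler, and the payload command is a hypothetical placeholder (real payloads are handed out by IceProd).

```python
# Illustrative only: build a batch script sized to the opportunistic
# scheduling window described above (jobs under 4 hours run on any idle
# node; GPU requests are prioritized on GPU nodes). Assumes SLURM.
from pathlib import Path

def write_opportunistic_job(script: Path, payload: str, gpus: int = 0) -> None:
    lines = [
        "#!/bin/bash",
        "#SBATCH --job-name=icecube-simprod",
        "#SBATCH --time=03:55:00",   # stay safely under the 4-hour cap
        "#SBATCH --cpus-per-task=1",
        "#SBATCH --mem=2G",
    ]
    if gpus:
        # GPU jobs have priority over other jobs queued for GPU nodes.
        lines.append(f"#SBATCH --gres=gpu:{gpus}")
    lines.append(payload)            # e.g. a task fetched from IceProd
    script.write_text("\n".join(lines) + "\n")

# Hypothetical usage: a one-GPU photon-propagation task.
write_opportunistic_job(Path("simprod.sb"), "echo replace-with-iceprod-task", gpus=1)
```

Keeping the requested walltime just under the 4-hour policy limit is what lets such jobs backfill onto otherwise idle nodes, which is how the average and peak allocations quoted above exceed the purchased 750 cores and 8 GPUs.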