\begin{textblock}{38.0}(4,40)
\begin{block}{High Energy Physics (HEP)}
%\begin{center}
The quest to understand the fundamental building blocks of nature,
and their interactions, is one of the longest-running and most
ambitious of human endeavors. Facilities such as the Large Hadron
Collider (LHC), where we do our research, represent a huge step
forward in our ability to answer these questions. The discovery of
the Higgs boson, the observation of exceedingly rare decays of B
mesons, and exclusion of countless theories beyond the Standard
Model (SM) of particle physics demonstrate that these experiments
deliver results. However, the most interesting fundamental physics
questions remain wide open, amongst them: What is the dark matter
which pervades the universe? Does space-time have additional
symmetries or extend beyond the 3 spatial dimensions we know? What
is the mechanism stabilizing the Higgs mass from enormous quantum
corrections? Are neutrinos, whose only SM interactions are weak,
their own antiparticles? Can the theories of gravity and quantum
mechanics be reconciled? Planned and running HEP experiments
and facilities aim to answer these questions over the next 20 years.
The computing and software challenges of these projects are formidable.
The LHC experiments, for example, use nearly 0.5 exabytes of
storage today, spread across 170 computing centers in 42 countries.
The upgrade to the High-Luminosity Large Hadron Collider (HL-LHC) will
increase the data volume
by more than a factor of 100, with significantly increased data and detector complexity. The resulting computing needs will outpace the expected improvements in computer performance (Moore's Law) by factors of 3 to 30.
\begin{figure}[tbph]
\centering
%\includegraphics[width=0.48\textwidth]{images/0910152_02-A5-at-72-dpi.jpg}
\includegraphics[width=0.41\textwidth]{images/CERN-LHC-cutaway-view-medium.png}
%\includegraphics[width=0.41\textwidth]{images/run204769_evt71902630_VP1Base-half.png}
\includegraphics[width=0.42\textwidth]{images/eemm_run195099_evt137440354_ispy_3d-annotated-2.png}
%\begin{center}
%\end{center}
\end{figure}
{\small \copyright~2009-2016 CERN (License: CC-BY-SA-4.0)}
\end{block}
\end{textblock}
\begin{textblock}{38.0}(44,40)
\begin{block}{S2I2-HEP and the HEP Software Foundation}
Our S2I2-HEP strategic plan will describe how an NSF S2I2, and the
U.S.\ university community, could provide leadership and enable the
science of the HL-LHC era. However, HEP experiments involve
international collaborations and a global software ecosystem, so the
activities of a possible S2I2 for HEP would need to fit into a
larger international context. To that end, we are also working with
the HEP Software Foundation (HSF) to develop on the same time scale
a ``Community White Paper'' (CWP) with a global roadmap for HEP
Software and Computing R\&D for the 2020s. The aim of the CWP is
to identify and prioritize the software research and development
investments required:
\begin{itemize}
\item to achieve improvements in software efficiency, scalability, and performance, and to make use of advances in CPU, storage, and network technologies;
\item to enable new approaches to computing and software that could radically extend the physics reach of the detectors; and
\item to ensure the long-term sustainability of the software through the lifetime of the HL-LHC.
\end{itemize}
Achieving consensus in a large international community is a complex task. We are modeling the CWP process on that used for the HEP ``decadal survey'' (Snowmass) process, using a mix of dedicated general and topical workshops, solicitations for topical white paper contributions, and outreach sessions at pre-existing HEP meetings. Most major HEP stakeholders (experiments, labs, institutions, software projects) are being engaged.
\end{block}
\end{textblock}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%\begin{textblock}{32.0}(22,30)
%\begin{figure}[tbph]
%\centering
%\includegraphics[width=0.90\textwidth]{images/iris-hep-map-V1.png}
%\end{figure}
%%\end{block}
%\end{textblock}
%
%\begin{textblock}{28.0}(54,31)
%\begin{block}{An Intellectual Hub for the HEP Community}
%\begin{textblock}{26.0}(54,33)
%\begin{figure}[tbph]
%\centering
%\includegraphics[width=1.07\textwidth]{images/20230925-arxiv-roadmap.png}
%\end{figure}
%18 U.S. universities provide a hub for larger national and international software collaborations.
%\end{textblock}
%\end{block}
%\end{textblock}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{textblock}{25.0}(2,52)
\begin{block}{A Python Data Science Ecosystem}
\begin{textblock}{25.0}(2,54)
\begin{figure}[tbph]
\centering
\includegraphics[width=1.00\textwidth]{images/scikit-hep-shells-hep.png}
\end{figure}
We develop sustainable analysis tools to extend the physics reach of the HL-LHC experiments by providing greater functionality, reducing time to insight, lowering barriers for smaller teams, and streamlining analysis preservation, reproducibility, and reuse.
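As a minimal sketch of the workflow these tools enable (the file, tree, and branch names below are hypothetical), one might read a ROOT branch with \texttt{uproot} and histogram it with \texttt{hist}:
{\small
\begin{verbatim}
import uproot   # ROOT file I/O (Scikit-HEP)
import hist     # histogramming (Scikit-HEP)

# Hypothetical file, tree, and branch names, for illustration only.
with uproot.open("events.root") as f:
    mass = f["Events"]["dimuon_mass"].array(library="np")

# Histogram the dimuon mass in 60 bins from 60 to 120 GeV.
h = hist.Hist.new.Reg(60, 60, 120, name="mass",
                      label="m [GeV]").Double()
h.fill(mass)
print(h)
\end{verbatim}
}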
\end{textblock}
\end{block}
\end{textblock}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{textblock}{25.0}(29,52.5)
\begin{block}{Innovative Algorithms}
\begin{textblock}{25.0}(29,54.5)
\begin{figure}[tbph]
\centering
\includegraphics[width=0.90\textwidth]{images/0610026_01-A5-at-72-dpi-slice.jpg}
\end{figure}
High-performance software algorithms for real-time processing in the trigger and for the reconstruction of both real and simulated detector data are critical components of HEP's computing challenge; charged-particle tracking is a prime example.
\begin{figure}[tbph]
\centering
\includegraphics[width=0.90\textwidth]{images/trackreco-graphic.png}
\end{figure}
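As a toy sketch of the underlying idea (not the experiments' actual algorithms; the hit positions below are invented), a straight-line track can be fit to detector hits by least squares:
{\small
\begin{verbatim}
import numpy as np

# Toy illustration: fit a straight-line track to five hits by least
# squares.  Real HL-LHC tracking uses sophisticated seeding and
# Kalman-filter fits; the numbers here are invented.
layer_z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # layer positions
hits_x  = np.array([0.11, 0.19, 0.32, 0.41, 0.48])  # measured hits

# Solve x = slope*z + intercept in the least-squares sense.
A = np.vstack([layer_z, np.ones_like(layer_z)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, hits_x, rcond=None)
print(f"track: x = {slope:.3f} z + {intercept:.3f}")
\end{verbatim}
}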
\end{textblock}
\end{block}
\end{textblock}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{textblock}{25.0}(56,54)
\begin{block}{Data Organization, Management, Access (DOMA)}
\begin{textblock}{25.0}(56,56)
\begin{figure}[tbph]
\centering
\includegraphics[width=0.90\textwidth]{images/doma-scale-challenges.png}
\end{figure}
Our DOMA program of work centers on the management of exabyte-scale production datasets and the delivery of data to analysis facilities. Biennial data challenges demonstrate the readiness of these technologies at scale.
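A back-of-the-envelope sketch of the sustained rates such challenges must demonstrate (the dataset size and time window below are illustrative assumptions, not program targets):
{\small
\begin{verbatim}
# Sustained throughput needed to move a dataset in a fixed window.
dataset_pb  = 100             # petabytes to move (hypothetical)
window_days = 14              # challenge duration (hypothetical)

bits    = dataset_pb * 1e15 * 8
seconds = window_days * 24 * 3600
print(f"sustained rate: {bits / seconds / 1e9:.0f} Gb/s")
# -> roughly 660 Gb/s for these assumed numbers
\end{verbatim}
}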
\end{textblock}
\end{block}
\end{textblock}