Commit a3cc214

adding ISC BoF
1 parent d015806 commit a3cc214

1 file changed

Lines changed: 132 additions & 0 deletions

File tree

_bofs/isc25-streaming.html

@@ -0,0 +1,132 @@
---
layout: page
title: ISC 2025 Data Streaming BoF
subtitle: "Real-Time Scientific Data Streaming to HPC Nodes: Challenges and Innovations"
event_date: June 10, 2025
time: "TBD CET"
order: 6
---

<div class="container" data-aos="fade-up" style="padding: 3em 0;">
<div class="row">
<div class="col-lg-8 wg">
<p class="text-center">
<img src="https://isc-hpc.com/wp-content/uploads/2024/02/isc-logo.svg" style="width: 15em;" />
</p>
<p class="subheading text-center">{{page.subtitle}}</p>
<h2 class="mb-4 text-center">{{page.title}}</h2>

<p class="text-center mb-5">
{{page.event_date | date: "%A - %b %d, %Y" }}<br />
{{page.time}}<br />
<!-- Room B213 -->
</p>
<p>
The most common way scientific data arrives at HPC facilities today is through a set of border gateway
nodes connected to some form of shared file system. Dataflow orchestration tools like Globus have made
this approach popular by being programmable, easy to use, and efficient. It works well when the
subsequent compute job for data analysis can afford to wait in the scheduler&rsquo;s queue, with the
overall completion time dominated mostly by that wait.
</p>
<p>
However, as HPC centers and experimental and observational facilities become more integrated, workflows
will demand immediate, real-time feedback, where the latency and performance variability of a shared
file system are no longer acceptable. While a few streaming workflows have emerged recently, many HPC,
data, and network user facilities are not set up to support them out of the gate, for policy, scheduler,
or hardware reasons. Yet these workflows are the cornerstone of modern scientific applications: their
stringent timing requirements drive the ultimate integration of HPC, data, and networks into seamless
compute-in-the-loop workflows for experimental and observational user facilities.
</p>
<p>
This BoF presents this alternative way of using HPC, opening with a science use case that exemplifies
this new class of emerging workflows and addresses a groundbreaking field in supercomputing that has not
previously been featured at ISC. Through a set of lightning talks from user facilities, the BoF will
survey how HPC centers address this need today and what they plan for the near future. These talks aim
to seed the ensuing discussion, in which the audience can ask questions of key HPC facility staff, bring
their own streaming workflows to the attention of the workflow community, or join a workflow challenge:
participants will be divided to represent the various stakeholders of example streaming workflows and
work together to identify requirements, hurdles, and possible solutions.
</p>
<!-- <img src="/images/bof/data_streaming_bof.jpg" style="width: 100%; border-radius: 1em;" /> -->

<h2 class="mb-4">Agenda</h2>

<div class="col-md-12 ftco-animate">
<ul>
<li style="color: rgb(5, 135, 215)">
5min &mdash; Welcome and Interactive Setup
<p style="padding-left: 1em; line-height: 1.2em; color: #666; margin-bottom: 0.5em">
Bjoern Enders &mdash; <span style="font-style: italic; color: #999">National Energy Research
Scientific Computing Center (NERSC)</span><br />
Rafael Ferreira da Silva &mdash; <span style="font-style: italic; color: #999">Oak Ridge
Leadership Computing Facility (OLCF)</span><br />
</p>
</li>
<li style="color: rgb(5, 135, 215)">
15min &mdash; Lightning Talks
<p style="padding-left: 1em; line-height: 1.2em; color: #666; margin-bottom: 0.5em">
Sam Welborn &mdash; <span style="font-style: italic; color: #999">Lawrence Berkeley National
Laboratory (LBNL)</span><br />
</p>
<p style="padding-left: 1em; line-height: 1.2em; color: #666; margin-bottom: 0.5em">
<!-- Network Streaming vs File Transfer<br /> -->
Eli Dart &mdash; <span style="font-style: italic; color: #999">Energy Sciences Network
(ESnet)</span>
</p>
<p style="padding-left: 1em; line-height: 1.2em; color: #666; margin-bottom: 0.5em">
<!-- AIsB &mdash; High Performance Access to ECMWF Weather & Climate Data<br /> -->
Alex Upton &mdash; <span style="font-style: italic; color: #999">Swiss National Supercomputing Centre (CSCS)</span>
</p>
</li>
<li style="color: rgb(5, 135, 215)">
35min &mdash; Interactive Workflow Challenge<br />
<p style="padding-left: 1em; line-height: 1.2em; color: #666; margin-bottom: 0.5em">
<span style="font-style: italic; color: #999">5 min: Group formation and scenario distribution</span><br />
<span style="font-style: italic; color: #999">15 min: Small group discussions and solution development</span><br />
<span style="font-style: italic; color: #999">10 min: Cross-group solution sharing and feedback</span><br />
<span style="font-style: italic; color: #999">5 min: Real-time polling on proposed solutions and implementation challenges</span>
</p>
</li>
<li style="color: rgb(5, 135, 215)">
5min &mdash; Synthesis and Next Steps<br />
<!-- <span style="font-style: italic; color: #999">Planning of a Full Day Workshop on Streaming at an
ASCR User Facility in the Near Future</span> -->
</li>
</ul>
</div>
</div>
<div class="col-lg-4">
<div class="blog-sidbar">
<h2 class="mb-5">Supporters</h2>
<div class="col-md-12 ftco-animate">
<a href="https://www.nersc.gov/" target="_blank">
<img src="https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQzKv5LXRUBcMbGsXmwhEVArUKYup-AJjDs8g&s"
height="30px" class="mb-3 mr-3" />
</a><br />
<a href="https://lbl.gov/" target="_blank">
<img src="https://www.lbl.gov/wp-content/uploads/2022/09/berkeley-logo.svg" height="30px"
class="mb-3 mr-3" />
</a><br />
<a href="https://ornl.gov/" target="_blank">
<img src="https://www.hpcwire.com/wp-content/uploads/2019/07/ORNL-OLCF-logo-700x.jpg"
height="50px" class="mb-3 mr-3" />
</a><br />
<a href="https://www.es.net/" target="_blank">
<img src="https://portal-east.es.net/static/media/esnet-logo.1ae3ec10.png" height="30px"
class="mb-3 mr-3" />
</a><br />
<a href="https://www.cscs.ch/" target="_blank">
<img src="https://upload.wikimedia.org/wikipedia/commons/3/3a/Logo_of_the_Swiss_National_Supercomputing_Centre_CSCS.jpg"
height="50px" class="mb-3 mr-3" />
</a><br />
<a href="https://science.osti.gov/" target="_blank">
<img src="https://science.osti.gov/assets/img/doe-logos/logo.png" height="30px" class="mb-3 mr-3" />
</a>
</div>
</div>
</div>
</div>
</div>
