Opening the Spark UI
etl_daily_pipeline - Run #42 - Succeeded

Output:

    [2026-02-26 18:03:12] Reading delta table...
    [2026-02-26 18:03:14] Filter: status='completed'
    [2026-02-26 18:15:28] Broadcast join completed
    [2026-02-26 18:37:41] GroupBy aggregation done
    [2026-02-26 18:41:33] Pipeline completed. Rows: 2,847
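The timestamps in the run output above are enough to see where the 38 minutes went, even before opening the Spark UI. A minimal sketch in pure Python (the log lines are inlined as sample data copied from the output; the gap after each line is interpreted as the time the next step took):

```python
from datetime import datetime

# Sample log lines copied from the run output above.
log = """\
[2026-02-26 18:03:12] Reading delta table...
[2026-02-26 18:03:14] Filter: status='completed'
[2026-02-26 18:15:28] Broadcast join completed
[2026-02-26 18:37:41] GroupBy aggregation done
[2026-02-26 18:41:33] Pipeline completed. Rows: 2,847
"""

def step_durations(text):
    """Return (message, seconds_until_next_line) for each log line but the last."""
    entries = []
    for line in text.strip().splitlines():
        # "[YYYY-mm-dd HH:MM:SS] message" -> chars 1:20 are the timestamp.
        stamp, msg = line[1:20], line[22:]
        entries.append((datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S"), msg))
    return [
        (entries[i][1], (entries[i + 1][0] - entries[i][0]).total_seconds())
        for i in range(len(entries) - 1)
    ]

for msg, secs in step_durations(log):
    print(f"{secs:7.0f}s  {msg}")
```

Reading the gaps this way, the stretch between "Broadcast join completed" and "GroupBy aggregation done" is about 22 minutes (1,333 s) of the 38-minute run, which is exactly the kind of hotspot the Spark UI's Stages tab will let you drill into.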
Task run:

    Job ID:       913847562
    Job run ID:   42
    Started:      02/26/2026, 6:03:12 PM
    Duration:     38.4 min
    Status:       Succeeded
    Cluster:      etl-production-cluster (md-fleet.16xlarge, 8 workers)
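Everything in the Task run panel is also reachable programmatically through the Databricks Jobs API (`GET /api/2.1/jobs/runs/get?run_id=...`), which is handy when you want to script this lookup instead of clicking through the UI. A minimal sketch of parsing such a response; note the payload below is hand-written sample data shaped like the panel above (the cluster ID in it is invented), not a captured API response:

```python
import json

# Hypothetical sample mirroring the Task run panel. A real call would be
# an authenticated GET against your workspace host, e.g.
#   https://<your-workspace>/api/2.1/jobs/runs/get?run_id=42
sample_response = json.dumps({
    "job_id": 913847562,
    "run_id": 42,
    "state": {"life_cycle_state": "TERMINATED", "result_state": "SUCCESS"},
    "execution_duration": 2304000,  # milliseconds (38.4 min)
    "cluster_instance": {"cluster_id": "0226-180312-abc123"},  # invented ID
})

def summarize_run(payload: str) -> str:
    """One-line summary of a runs/get-style payload."""
    run = json.loads(payload)
    minutes = run["execution_duration"] / 60_000
    return (f"run {run['run_id']} of job {run['job_id']}: "
            f"{run['state']['result_state']} in {minutes:.1f} min")

print(summarize_run(sample_response))
```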
Click the cluster name link in the right panel to open the cluster page, then click the Spark UI tab:
etl-production-cluster (Running)
md-fleet.16xlarge - 8 workers - DBR 14.3 LTS - Photon enabled

Tabs: Configuration | Notebooks | Libraries | Spark UI | Driver Logs | Metrics

Spark Jobs
User: root | Total Uptime: 38.4 min | Scheduling Mode: FAIR

Tabs: Jobs | Stages | Storage | Environment | Executors | SQL / DataFrame
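The pages behind these tabs are also served as JSON by Spark's monitoring REST API, under `/api/v1` relative to the UI root (for example `GET <ui-root>/api/v1/applications/<app-id>/jobs` backs the Jobs tab). A minimal sketch of working with that shape of data; the payload below is a hand-written sample in the form that endpoint returns (job names and task counts are invented), not output from this cluster:

```python
import json

# Hand-written sample shaped like /api/v1/applications/<app-id>/jobs output.
sample_jobs = json.dumps([
    {"jobId": 0, "name": "load",  "status": "SUCCEEDED", "numTasks": 8},
    {"jobId": 1, "name": "join",  "status": "SUCCEEDED", "numTasks": 200},
    {"jobId": 2, "name": "agg",   "status": "RUNNING",   "numTasks": 200},
])

def jobs_by_status(payload: str) -> dict:
    """Count Spark jobs per status, like the summary atop the Jobs tab."""
    counts = {}
    for job in json.loads(payload):
        counts[job["status"]] = counts.get(job["status"], 0) + 1
    return counts

print(jobs_by_status(sample_jobs))
```

This is the same information the Jobs tab renders; scripting against it is useful when you want to alert on long-running or failed jobs rather than watch the UI.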