| Stage Id ▾ | Pool Name | Description | Submitted | Duration | Tasks: Succeeded/Total | Input | Output | Shuffle Read | Shuffle Write |
|---|---|---|---|---|---|---|---|---|---|
| 48 | tenants-pool-506 | Delta: Filtering files for query (`$anonfun$recordDeltaOperationInternal$1 at DatabricksLogging.scala:128`) | 2026/04/20 06:14:33 | 0.2 s | 50/50 | 3.7 KiB | | | |
| 47 | default | `$anonfun$withThreadLocalCaptured$2 at <unknown>:0` | Unknown | Unknown | 0/6 | | | | |

**Stage 48 details**

Job context: replenishmentRunId = 10000000004, tenantId = 8793759653749352888, activityType = BufferDataSnapShot, activityId = 36e0914c-7e4e-3a52-bc93-e306e07299ba, workflowType = KpiPrepareDataSnapshotWorkflow, workflowId = b49fb3c6-88b3-3c61-97e8-cbe3f98f9409, attempt = 1, cornerstoneTenantId = 8445, marketUnit = 10000_OP_PERF, scenario = STANDARD

RDD: Delta Table State #5 - `hdlfs://7da5304c-1f57-47fa-b75a-0f0b4349c280.files.hdl.prod-eu20.hanacloud.ondemand.com:443/crp-dl-stream-service/cornerstone/sap-cic-product-productplant/_delta_log`

Call stack:

```
org.apache.spark.sql.delta.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$1(DeltaLogging.scala:139)
com.databricks.spark.util.DatabricksLogging.recordOperation(DatabricksLogging.scala:128)
com.databricks.spark.util.DatabricksLogging.recordOperation$(DatabricksLogging.scala:117)
org.apache.spark.sql.delta.Snapshot.recordOperation(Snapshot.scala:87)
org.apache.spark.sql.delta.metering.DeltaLogging.recordDeltaOperationInternal(DeltaLogging.scala:138)
org.apache.spark.sql.delta.metering.DeltaLogging.recordDeltaOperation(DeltaLogging.scala:128)
org.apache.spark.sql.delta.metering.DeltaLogging.recordDeltaOperation$(DeltaLogging.scala:118)
org.apache.spark.sql.delta.Snapshot.recordDeltaOperation(Snapshot.scala:87)
org.apache.spark.sql.delta.stats.DataSkippingReaderBase.filesForScan(DataSkippingReader.scala:1207)
org.apache.spark.sql.delta.stats.DataSkippingReaderBase.filesForScan$(DataSkippingReader.scala:1204)
org.apache.spark.sql.delta.Snapshot.filesForScan(Snapshot.scala:87)
org.apache.spark.sql.delta.stats.PrepareDeltaScanBase.$anonfun$filesForScan$1(PrepareDeltaScan.scala:134)
org.apache.spark.sql.delta.util.DeltaProgressReporter.withJobDescription(DeltaProgressReporter.scala:56)
org.apache.spark.sql.delta.util.DeltaProgressReporter.withStatusCode(DeltaProgressReporter.scala:35)
org.apache.spark.sql.delta.util.DeltaProgressReporter.withStatusCode$(DeltaProgressReporter.scala:29)
org.apache.spark.sql.delta.stats.PrepareDeltaScan.withStatusCode(PrepareDeltaScan.scala:308)
org.apache.spark.sql.delta.stats.PrepareDeltaScanBase.filesForScan(PrepareDeltaScan.scala:119)
org.apache.spark.sql.delta.stats.PrepareDeltaScanBase.filesForScan$(PrepareDeltaScan.scala:114)
org.apache.spark.sql.delta.stats.PrepareDeltaScan.filesForScan(PrepareDeltaScan.scala:308)
org.apache.spark.sql.delta.stats.PrepareDeltaScanBase$$anonfun$prepareDeltaScan$1.$anonfun$applyOrElse$1(PrepareDeltaScan.scala:152)
```

**Stage 47 details**

Call stack:

```
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withThreadLocalCaptured$2(SQLExecution.scala:316)
java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(Unknown Source)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
java.base/java.lang.Thread.run(Unknown Source)
```