
I have scheduled several jobs in Databricks, and I am interested in reading the run statistics of two specific jobs. I need to write Databricks notebook code that writes the job statistics (jobName, startTime, endTime, and status) to a Snowflake table.


1 Answer


We can use the following Python code to fetch the run details from the Databricks Jobs API.

Note: the code below has been tested.

import requests

databricks_instance = "<databricks-instances>"

# Fetch the details of a single run by its run_id
url = f"{databricks_instance}/api/2.0/jobs/runs/get?run_id=39347"

headers = {
    'Authorization': 'Bearer <databricks-access-token>',
    'Content-Type': 'application/json'
}

response = requests.get(url, headers=headers).json()
print(response)
print(response['job_id'])
print(response['start_time'])
print(response['end_time'])
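To get from the raw API response to a Snowflake table, one approach is to flatten the response into the four requested columns and then write it with the Spark Snowflake connector that ships with Databricks. A minimal sketch follows; the `sf_options` keys and the `JOB_STATS` table name are assumptions, not taken from the original answer:

```python
from datetime import datetime, timezone

def run_to_row(run: dict) -> dict:
    """Flatten a /api/2.0/jobs/runs/get response into the four columns
    the question asks for. Jobs API timestamps are epoch milliseconds;
    result_state only appears once a run has finished."""
    state = run.get("state", {})
    to_ts = lambda ms: datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
    return {
        "jobName": run.get("run_name"),
        "startTime": to_ts(run["start_time"]),
        "endTime": to_ts(run["end_time"]),
        "status": state.get("result_state", state.get("life_cycle_state")),
    }

# In a Databricks notebook, `spark` is predefined, so the row can be
# written via the built-in Spark Snowflake connector. The connection
# options and target table name below are placeholders:
#
# df = spark.createDataFrame([run_to_row(response)])
# (df.write.format("snowflake")
#    .options(**sf_options)  # sfUrl, sfUser, sfPassword, sfDatabase, sfSchema, sfWarehouse
#    .option("dbtable", "JOB_STATS")
#    .mode("append")
#    .save())
```

Using `mode("append")` lets the notebook be scheduled repeatedly, accumulating one row per run in the Snowflake table.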


Answered 2021-10-26T18:39:36.300