I am using Composer to run my Dataflow pipeline on a schedule. If a job runs longer than a certain amount of time, I want it to be killed. Is there a way to do this programmatically, either as a pipeline option or a DAG parameter?

1 Answer

Not sure how to do this as a pipeline configuration option, but here is an idea.

You could enqueue a Task Queue task with its countdown set to your timeout value. When the task fires, check whether your job is still running:

https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs/list

If it is, you can update it with the JOB_STATE_CANCELLED job state:

https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs/update

https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs#jobstate
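As a minimal sketch of that cancellation step, assuming a `service` client built with `discovery.build('dataflow', 'v1b3', ...)` as in the listing example further down (the helper name here is my own, not from any library):

```python
def cancel_dataflow_job(service, project_id, job_id):
    """Ask Dataflow to cancel a job by setting its requestedState.

    `service` is a googleapiclient discovery client for the
    'dataflow' v1b3 API; the update call returns the updated Job.
    """
    body = {'requestedState': 'JOB_STATE_CANCELLED'}
    return service.projects().jobs().update(
        projectId=project_id,
        jobId=job_id,
        body=body).execute()
```

Cancellation is asynchronous: the job transitions through JOB_STATE_CANCELLING before reaching JOB_STATE_CANCELLED.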

This is done via the googleapiclient library: https://developers.google.com/api-client-library/python/apis/discovery/v1

Here is an example of how to use it:

from google.appengine.api import app_identity
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials


class DataFlowJobsListHandler(InterimAdminResourceHandler):

    def get(self, resource_id=None):
        """
        Wrapper around:
        https://cloud.google.com/dataflow/docs/reference/rest/v1b3/projects.jobs/list
        """
        if resource_id:
            self.abort(405)  # listing only; no per-resource GET
        else:
            credentials = GoogleCredentials.get_application_default()
            service = discovery.build('dataflow', 'v1b3', credentials=credentials)
            project_id = app_identity.get_application_id()
            # Job-state filter from the query string, e.g. 'ACTIVE'
            _filter = self.request.GET.pop('filter', 'UNKNOWN').upper()

            jobs_list_request = service.projects().jobs().list(
                projectId=project_id,
                filter=_filter)
            jobs_list = jobs_list_request.execute()

            return {
                '$cursor': None,
                'results': jobs_list.get('jobs', []),
            }
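To decide which of the returned jobs have exceeded your timeout, you could compare each job's `createTime` (an RFC 3339 timestamp on the Job resource) against the current time. A minimal sketch, with the helper name being my own:

```python
from datetime import datetime, timedelta, timezone


def jobs_over_timeout(jobs, timeout):
    """Return the jobs that have been running longer than `timeout`.

    `jobs` is the 'jobs' list from a projects.jobs.list response;
    each dict is assumed to carry an RFC 3339 'createTime' like
    '2018-10-15T16:27:36.313Z'. `timeout` is a timedelta.
    """
    now = datetime.now(timezone.utc)
    overdue = []
    for job in jobs:
        created = datetime.strptime(
            job['createTime'], '%Y-%m-%dT%H:%M:%S.%fZ'
        ).replace(tzinfo=timezone.utc)
        if now - created > timeout:
            overdue.append(job)
    return overdue
```

Any job this returns could then be cancelled with a projects.jobs.update call setting `requestedState` to `JOB_STATE_CANCELLED`.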
answered 2018-10-15T16:27:36.313