
I have a Cloud Function that calls SCC's list_assets and converts the paged output to a list (to fetch all results). However, since I have a lot of assets in the organization tree, fetching them takes a long time and the Cloud Function times out (max timeout is 540 seconds).

asset_iterator = security_client.list_assets(org_name)
asset_fetch_all = list(asset_iterator)

I tried exporting via the Web UI and it works fine (it takes about 5 minutes). Is there a way to use the API to export assets from SCC directly to a Cloud Storage bucket?


2 Answers


Try something like this: we use it to upload the results into a bucket. Make sure to give the service account the function runs as the correct permissions on the bucket.

def test_list_medium_findings(source_name):
    # [START list_findings_at_a_time]
    from google.cloud import securitycenter
    from google.cloud import storage

    # Create a new Security Command Center client.
    client = securitycenter.SecurityCenterClient()

    # Set query parameters.
    organization_id = "11112222333344444"
    org_name = "organizations/{org_id}".format(org_id=organization_id)
    all_sources = "{org_name}/sources/-".format(org_name=org_name)

    # Query Security Command Center.
    finding_result_iterator = client.list_findings(all_sources, filter_=YourFilter)

    # Set output file settings.
    bucket_name = "YourBucketName"
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    output_file_name = "YourFileName"
    my_file = bucket.blob(output_file_name)

    # Write the findings to a local file, one per line.
    with open('/tmp/data.txt', 'w') as file:
        for i, finding_result in enumerate(finding_result_iterator):
            file.write(
                "{}: name: {} resource: {}\n".format(
                    i, finding_result.finding.name, finding_result.finding.resource_name
                )
            )

    # Upload to the bucket.
    my_file.upload_from_filename("/tmp/data.txt")
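Note that on Cloud Functions /tmp is an in-memory filesystem counted against the function's memory, so for large result sets it may be preferable to build the output in memory and upload it with upload_from_string instead of going through a temp file. A minimal sketch under that assumption; format_finding_row is a hypothetical helper, and the bucket/object names are placeholders:

```python
from io import StringIO


def format_finding_row(index, finding_name, resource_name):
    # Build one newline-terminated output row per finding.
    return "{}: name: {} resource: {}\n".format(index, finding_name, resource_name)


def export_findings_to_gcs(finding_iterator, bucket_name, object_name):
    # Imported lazily so the pure helper above works without GCP libraries.
    from google.cloud import storage

    buffer = StringIO()
    for i, result in enumerate(finding_iterator):
        buffer.write(
            format_finding_row(i, result.finding.name, result.finding.resource_name)
        )

    # upload_from_string writes the whole buffer as a single object.
    storage.Client().bucket(bucket_name).blob(object_name).upload_from_string(
        buffer.getvalue(), content_type="text/plain"
    )
```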
Answered 2020-07-16T17:07:13.347

I developed the same thing in Python, but exporting to BigQuery instead: searching in BigQuery is easier than in a file, and the code is very similar to the Cloud Storage case. Here is my working code for BQ:

import os

from google.cloud import asset_v1
from google.cloud.asset_v1.proto import asset_service_pb2
from google.cloud.asset_v1 import enums

def GCF_ASSET_TO_BQ(request):

    client = asset_v1.AssetServiceClient()
    parent = 'organizations/{}'.format(os.getenv('ORGANIZATION_ID'))
    output_config = asset_service_pb2.OutputConfig()
    output_config.bigquery_destination.dataset = 'projects/{}/datasets/{}'.format(os.getenv('PROJECT_ID'),os.getenv('DATASET'))
    content_type = enums.ContentType.RESOURCE
    output_config.bigquery_destination.table = 'asset_export'
    output_config.bigquery_destination.force = True

    response = client.export_assets(parent, output_config, content_type=content_type)

    # To wait for the export to finish:
    # response.result()
    # then do post-export work here
    return "done", 200


if __name__ == "__main__":
    GCF_ASSET_TO_BQ('')

As you can see, some values come from env vars (ORGANIZATION_ID, PROJECT_ID and DATASET). To export to Cloud Storage instead, you have to change the output_config definition to something like this:

output_config = asset_service_pb2.OutputConfig()
output_config.gcs_destination.uri = 'gs://path/to/file' 
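Putting that together for the original question (assets straight to Cloud Storage), the whole function might look like the sketch below, using the same client-library version as the BQ code above. The env-var usage and gs:// path are assumptions; make_export_parent is a hypothetical helper:

```python
def make_export_parent(organization_id):
    # Asset API parent resource name for an organization-wide export.
    return "organizations/{}".format(organization_id)


def export_assets_to_gcs(organization_id, gcs_uri):
    # Imported lazily so the pure helper above works without GCP libraries.
    from google.cloud import asset_v1
    from google.cloud.asset_v1.proto import asset_service_pb2
    from google.cloud.asset_v1 import enums

    client = asset_v1.AssetServiceClient()
    output_config = asset_service_pb2.OutputConfig()
    output_config.gcs_destination.uri = gcs_uri  # e.g. "gs://your-bucket/assets.json"

    # export_assets is a long-running operation; result() blocks until the
    # object has been written to the bucket.
    operation = client.export_assets(
        make_export_parent(organization_id),
        output_config,
        content_type=enums.ContentType.RESOURCE,
    )
    return operation.result()
```

Because the export runs server-side, the Cloud Function only has to start the operation, which avoids the 540-second timeout from paging through list_assets yourself.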

Here you can find examples in other languages.

Answered 2020-06-12T09:11:26.173