I'm using Python's simple-salesforce package to perform bulk uploads. I'm seeing some inconsistent response errors that I believe could be resolved by changing the "concurrencyMode" to "Serial".

I don't see that option in the documentation. Does anyone know if it's possible to update the source code to add that parameter to the request? I tried updating the headers in api.py and bulk.py, but without success.

Thanks

1 Answer

The simple-salesforce bulk methods use the Salesforce Bulk API 1.0 by POSTing to https://<salesforce_instance>/services/async/<api_version>/job. In bulk.py, the job is created like this:

    def _create_job(self, operation, object_name, external_id_field=None):
        """ Create a bulk job
        Arguments:
        * operation -- Bulk operation to be performed by job
        * object_name -- SF object
        * external_id_field -- unique identifier field for upsert operations
        """

        payload = {
            'operation': operation,
            'object': object_name,
            'contentType': 'JSON'
        }

This corresponds to the following jobInfo payload (shown here in the XML form used by the Bulk API documentation):

<jobInfo
   xmlns="http://www.force.com/2009/06/asyncapi/dataload">
 <operation>...</operation>
 <object>...</object>
 <contentType>JSON</contentType>
</jobInfo>

To explicitly request a serial job, you need to add a concurrencyMode element to the request. The jobInfo fragment should be:

<jobInfo
   xmlns="http://www.force.com/2009/06/asyncapi/dataload">
 <operation>...</operation>
 <object>...</object>
 <concurrencyMode>Serial</concurrencyMode>
 <contentType>JSON</contentType>
</jobInfo>
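If you want to sanity-check this serial jobInfo document before touching the library, it can be generated with Python's standard library alone; a minimal sketch, where the operation and object values are placeholders:

```python
import xml.etree.ElementTree as ET

# Namespace and element names come from the Bulk API 1.0 jobInfo
# fragment above; 'insert' and 'Account' are example values.
NS = 'http://www.force.com/2009/06/asyncapi/dataload'
ET.register_namespace('', NS)  # emit a default xmlns, as in the docs

job = ET.Element('{%s}jobInfo' % NS)
for tag, text in [('operation', 'insert'),
                  ('object', 'Account'),
                  ('concurrencyMode', 'Serial'),
                  ('contentType', 'JSON')]:
    ET.SubElement(job, '{%s}%s' % (NS, tag)).text = text

xml_body = ET.tostring(job, encoding='unicode')
print(xml_body)
```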

Change _create_job to include this extra field:

    def _create_job(self, operation, object_name, external_id_field=None):
        """ Create a serial bulk job
        Arguments:
        * operation -- Bulk operation to be performed by job
        * object_name -- SF object
        * external_id_field -- unique identifier field for upsert operations
        """

        payload = {
            'operation': operation,
            'object': object_name,
            'concurrencyMode': 'Serial',
            'contentType': 'JSON'
        }
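Because contentType is JSON, simple-salesforce serializes this dict as a JSON request body rather than XML. A minimal sketch of the body that would be POSTed; build_serial_job_payload is a hypothetical helper for illustration, not part of simple-salesforce:

```python
import json

# Mirrors the payload dict built in the modified _create_job above,
# with the extra concurrencyMode key set to 'Serial'.
def build_serial_job_payload(operation, object_name):
    return {
        'operation': operation,
        'object': object_name,
        'concurrencyMode': 'Serial',
        'contentType': 'JSON',
    }

# Example values only; the real method receives these as arguments.
body = json.dumps(build_serial_job_payload('insert', 'Account'))
print(body)
```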
Answered 2019-09-16T23:31:56.347