I have a Data Factory with an ADLS Gen2 input (the only storage that is compliant at our company). It works fine. The image below shows the settings of the Copy data activity. As shown there, for storing the log (the skipped-row data) we are forced to use Blob storage or Data Lake Storage Gen1. How can we use ADLS Gen2 for this? It looks like a bottleneck: if this data is stored outside Gen2, we face compliance issues.
1 Answer
This works fine on my side: the logStorageSettings property of the Copy activity can reference an ADLS Gen2 linked service. Try editing your activity's definition JSON directly.
Here is my JSON:
{
    "name": "pipeline3",
    "properties": {
        "activities": [
            {
                "name": "Copy data1",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "source": {
                        "type": "BinarySource",
                        "storeSettings": {
                            "type": "AzureBlobFSReadSettings",
                            "recursive": true
                        },
                        "formatSettings": {
                            "type": "BinaryReadSettings"
                        }
                    },
                    "sink": {
                        "type": "BinarySink",
                        "storeSettings": {
                            "type": "AzureBlobFSWriteSettings"
                        }
                    },
                    "enableStaging": false,
                    "logStorageSettings": {
                        "linkedServiceName": {
                            "referenceName": "AzureDataLakeStorage1",
                            "type": "LinkedServiceReference"
                        }
                    },
                    "validateDataConsistency": false
                },
                "inputs": [
                    {
                        "referenceName": "Binary1",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "Binary2",
                        "type": "DatasetReference"
                    }
                ]
            }
        ],
        "annotations": []
    }
}
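
For reference, AzureDataLakeStorage1 above is an ADLS Gen2 linked service (type AzureBlobFS), which is what logStorageSettings points at. A minimal sketch of such a linked service definition, assuming account-key authentication; the account URL and key below are placeholders, not values from the original post:

{
    "name": "AzureDataLakeStorage1",
    "properties": {
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://<account-name>.dfs.core.windows.net",
            "accountKey": {
                "type": "SecureString",
                "value": "<account-key>"
            }
        }
    }
}

With a Gen2 linked service wired in this way, the Copy activity writes its skipped-row log files to the Gen2 account, so the log data never has to leave the compliant store.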
answered 2020-08-05 at 14:34