
I'm trying to mount an Azure Blob Storage container into a Databricks notebook using a Key Vault-backed secret scope.

Setup:

  1. Created a Key Vault
  2. Created a secret in the Key Vault
  3. Created a Databricks secret scope
    • This is confirmed to be working (see the sketch after this list):
      • Running dbutils.secrets.get(scope = dbrick_secret_scope, key = dbrick_secret_name) returns no errors
      • Viewing the secret in Databricks shows [REDACTED]
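
For reference, a minimal sketch of how those checks can be run from a notebook cell, using the standard dbutils.secrets helpers and the scope/secret names from the cell below:

%python

# List all secret scopes, and the secrets registered under this one
print(dbutils.secrets.listScopes())
print(dbutils.secrets.list("dbricks_kv_dev"))

# Fetching the value succeeds, but printing it is redacted by Databricks
print(dbutils.secrets.get(scope="dbricks_kv_dev", key="scrt-account-key"))  # -> [REDACTED]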

Cell in Databricks:

%python

dbrick_secret_scope = "dbricks_kv_dev"
dbrick_secret_name = "scrt-account-key"

storage_account_key = dbutils.secrets.get(scope = dbrick_secret_scope, key = dbrick_secret_name)
storage_container = 'abc-test'
storage_account = 'stgdev'

dbutils.fs.mount(
    source = f'abfss://{storage_container}@{storage_account}.dfs.core.windows.net/',
    mount_point = f'/mnt/{storage_account}',
    extra_configs = {f'fs.azure.accountkey.{storage_account}.dfs.core.windows.net:{storage_account_key}'}
)

Results:

  • Error: AttributeError: 'set' object has no attribute 'keys', with the mount_point line of dbutils.fs.mount() highlighted in red.
  • Full error:
AttributeError                            Traceback (most recent call last)
<command-3166320686381550> in <module>
      9     source = f'abfss://{storage_container}@{storage_account}.dfs.core.windows.net/',
     10     mount_point = f'/mnt/{storage_account}',
---> 11     extra_configs = {f'fs.azure.accountkey.{storage_account}.dfs.core.windows.net:{storage_account_key}'}
     12 )

/local_disk0/tmp/1625601199293-0/dbutils.py in f_with_exception_handling(*args, **kwargs)
    298             def f_with_exception_handling(*args, **kwargs):
    299                 try:
--> 300                     return f(*args, **kwargs)
    301                 except Py4JJavaError as e:
    302                     class ExecutionError(Exception):

/local_disk0/tmp/1625601199293-0/dbutils.py in mount(self, source, mount_point, encryption_type, owner, extra_configs)
    389                 self.check_types([(owner, string_types)])
    390             java_extra_configs = \
--> 391                 MapConverter().convert(extra_configs, self.sc._jvm._gateway_client)
    392             return self.print_return(self.dbcore.mount(source, mount_point,
    393                                                        encryption_type, owner,

/databricks/spark/python/lib/py4j-0.10.9-src.zip/py4j/java_collections.py in convert(self, object, gateway_client)
    520         HashMap = JavaClass("java.util.HashMap", gateway_client)
    521         java_map = HashMap()
--> 522         for key in object.keys():
    523             java_map[key] = object[key]
    524         return java_map

AttributeError: 'set' object has no attribute 'keys'

It seems to be related to the extra_configs parameter, but I'm not sure what exactly. Can anyone see what I'm missing?


1 Answer


In your case, the real error is that the extra_configs parameter needs to be a dictionary, but you're passing a set: {f'fs.azure.accountkey.{storage_account}.dfs.core.windows.net:{storage_account_key}'}. This happens because the syntax isn't quite right (the ' characters aren't placed correctly). The correct syntax is: {f'fs.azure.accountkey.{storage_account}.dfs.core.windows.net': storage_account_key}
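
To make the difference concrete, here is a small self-contained sketch (plain Python, with placeholder values so it runs outside the notebook) showing why one set of braces builds a set and the other a dict:

storage_account = 'stgdev'      # placeholder values, just to make
storage_account_key = 'xxxx'    # this snippet self-contained

# Colon *inside* the f-string quotes: the braces build a set of one string,
# which is what py4j's MapConverter later chokes on
bad = {f'fs.azure.accountkey.{storage_account}.dfs.core.windows.net:{storage_account_key}'}
print(type(bad))   # <class 'set'>

# Quote closed before the colon: the braces now build a one-entry dict
good = {f'fs.azure.accountkey.{storage_account}.dfs.core.windows.net': storage_account_key}
print(type(good))  # <class 'dict'>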

But in fact, you can't mount with the abfss protocol using a storage account key at all - that's only supported for mounting with the wasbs protocol. With abfss you have to use a service principal, providing its ID and secret, like this (see the documentation):

configs = {"fs.azure.account.auth.type": "OAuth",
          "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
          "fs.azure.account.oauth2.client.id": "<application-id>",
          "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<scope-name>",key="<service-credential-key-name>"),
          "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<directory-id>/oauth2/token"}

dbutils.fs.mount(
  source = "abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/",
  mount_point = "/mnt/<mount-name>",
  extra_configs = configs)
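
Once the mount succeeds it can be sanity-checked from the same notebook; a quick check might look like this (the file path is just an illustrative placeholder):

# List the root of the mounted container
display(dbutils.fs.ls("/mnt/<mount-name>"))

# Or read a file through the mount point
df = spark.read.csv("/mnt/<mount-name>/some-folder/data.csv", header=True)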

Although in theory you can mount ADLS Gen2 storage with the wasbs protocol and a storage key, it's not recommended, because you may run into problems (my personal opinion). Also, using storage keys at all isn't recommended - it's better to use a shared access signature, which is more secure.
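
For reference only, since the paragraph above advises against it: a wasbs-based mount with an account key generally takes the following shape (note the blob.core.windows.net endpoint rather than dfs.core.windows.net; all bracketed names are placeholders):

dbutils.fs.mount(
  source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net",
  mount_point = "/mnt/<mount-name>",
  extra_configs = {
    "fs.azure.account.key.<storage-account-name>.blob.core.windows.net":
      dbutils.secrets.get(scope="<scope-name>", key="<key-name>")
  }
)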

answered Jul 7, 2021 at 10:25