We are trying to connect to a remote Oracle database running as Amazon RDS, using Apache Spark and an SSO wallet configured on our side. We can load the data with the spark-shell utility as described below.
Start spark-shell with the jdbc and oraclepki jars added to the classpath:
spark-shell --driver-class-path /path/to/ojdbc8.jar:/path/to/oraclepki.jar
This is the JDBC URL used:
val JDBCURL="jdbc:oracle:thin:@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCPS)(HOST=www.example.aws.server.com)(PORT=1527))(CONNECT_DATA=(SID=XXX))(SECURITY = (SSL_SERVER_CERT_DN =\"C=US,ST=xxx,L=ZZZ,O=Amazon.com,OU=RDS,CN=www.xxx.aws.zzz.com\")))"
Below is the Spark jdbc call used to load the data:
spark.read.format("jdbc").option("url",JDBCURL)
.option("user","USER")
.option("oracle.net.tns_admin","/path/to/tnsnames.ora")
.option("oracle.net.wallet_location","(SOURCE=(METHOD=file)(METHOD_DATA=(DIRECTORY=/path/to/ssl_wallet/)))")
.option("password", "password")
.option("javax.net.ssl.trustStore","/path/to/cwallet.sso")
.option("javax.net.ssl.trustStoreType","SSO")
.option("dbtable",QUERY)
.option("driver", "oracle.jdbc.driver.OracleDriver").load
But when we try to run it with the spark-submit command, we get the following error:
Exception in thread "main" java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:774)
at oracle.jdbc.driver.PhysicalConnection.connect(PhysicalConnection.java:688)
...
...
...
Caused by: oracle.net.ns.NetException: The Network Adapter could not establish the connection
at oracle.net.nt.ConnStrategy.execute(ConnStrategy.java:523)
at oracle.net.resolver.AddrResolution.resolveAndExecute(AddrResolution.java:521)
at oracle.net.ns.NSProtocol.establishConnection(NSProtocol.java:660)
at oracle.net.ns.NSProtocol.connect(NSProtocol.java:286)
at oracle.jdbc.driver.T4CConnection.connect(T4CConnection.java:1438)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:518)
... 28 more
Caused by: oracle.net.ns.NetException: Unable to initialize ssl context.
at oracle.net.nt.CustomSSLSocketFactory.getSSLSocketEngine(CustomSSLSocketFactory.java:597)
at oracle.net.nt.TcpsNTAdapter.connect(TcpsNTAdapter.java:143)
at oracle.net.nt.ConnOption.connect(ConnOption.java:161)
at oracle.net.nt.ConnStrategy.execute(ConnStrategy.java:470)
... 33 more
Caused by: oracle.net.ns.NetException: Unable to initialize the key store.
at oracle.net.nt.CustomSSLSocketFactory.getKeyManagerArray(CustomSSLSocketFactory.java:642)
at oracle.net.nt.CustomSSLSocketFactory.getSSLSocketEngine(CustomSSLSocketFactory.java:580)
... 36 more
Caused by: java.security.KeyStoreException: SSO not found
at java.security.KeyStore.getInstance(KeyStore.java:851)
at oracle.net.nt.CustomSSLSocketFactory.getKeyManagerArray(CustomSSLSocketFactory.java:628)
... 37 more
Caused by: java.security.NoSuchAlgorithmException: SSO KeyStore not available
at sun.security.jca.GetInstance.getInstance(GetInstance.java:159)
at java.security.Security.getImpl(Security.java:695)
at java.security.KeyStore.getInstance(KeyStore.java:848)
I am fairly new to Spark and may be doing something wrong here. This is how I tried to set up the configuration:
val conf = new SparkConf().setAppName(JOB_NAME)
conf.set("javax.net.ssl.trustStore", "/path/to/cwallet.sso");
conf.set("javax.net.ssl.trustStoreType", "SSO")
conf.set("oracle.net.tns_admin", "/path/to/tnsnames.ora")
conf.set("oracle.net.wallet_location", "(SOURCE=(METHOD=file)(METHOD_DATA=(DIRECTORY=/path/to/ssl_wallet/dir/)))")
conf.set("user", "user")
conf.set("password", "pass")
Below is the spark-submit command used:
spark-submit --class fully.qualified.path.to.main \
--jars /path/to/ojdbc8.jar,/path/to/oraclepki.jar,/path/to/osdt_cert.jar,/path/to/osdt_core.jar \
--deploy-mode client --files /path/to/hive-site.xml --master yarn \
--driver-memory 12G \
--conf "spark.executor.extraJavaOptions=-Djavax.net.ssl.trustStore=/path/to/cwallet.sso -Djavax.net.ssl.trustStoreType=SSO" \
--executor-cores 4 --executor-memory 12G \
--num-executors 20 /path/to/application.jar /path/to/application_custom_config.conf
Also tried adding
--conf 'spark.executor.extraJavaOptions=-Djavax.net.ssl.trustStore=/path/to/cwallet.sso -Djavax.net.ssl.trustStoreType=SSO'
and
--files /path/to/cwallet.sso,/path/to/tnsnames.ora
to the spark-submit command, but without any luck (the combined invocation is sketched below). What exactly am I doing wrong here? I also tried the solution mentioned in this post, but got the same error.
Do I need to make sure the trustStore is accessible on each executor node? If that is the case, why does the spark-shell command work fine? Does that mean spark-shell does not involve any worker nodes when executing the command?
Please advise.
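For reference, the command with those extra flags added looked roughly like this (a sketch; same placeholders as the original command above, with the two --files lists merged into one):
spark-submit --class fully.qualified.path.to.main \
--jars /path/to/ojdbc8.jar,/path/to/oraclepki.jar,/path/to/osdt_cert.jar,/path/to/osdt_core.jar \
--deploy-mode client --master yarn \
--files /path/to/hive-site.xml,/path/to/cwallet.sso,/path/to/tnsnames.ora \
--conf 'spark.executor.extraJavaOptions=-Djavax.net.ssl.trustStore=/path/to/cwallet.sso -Djavax.net.ssl.trustStoreType=SSO' \
--driver-memory 12G --executor-cores 4 --executor-memory 12G \
--num-executors 20 /path/to/application.jar /path/to/application_custom_config.conf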
Update:
It looks like you are using the 12.1.0.2 JDBC driver. Please upgrade to 18.3, which you can download from oracle.com/technetwork/database/application-development/jdbc/... We made some changes to make it easier to use wallets. --@Jean de Lavarene
After making the changes suggested by @Jean de Lavarene, the initial error went away, but below is what I get now:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, example.server.net, executor 2): java.sql.SQLException: PKI classes not found. To use 'connect /' functionality, oraclepki.jar must be in the classpath: java.lang.NoClassDefFoundError: oracle/security/pki/OracleWallet
at oracle.jdbc.driver.PhysicalConnection.getSecretStoreCredentials(PhysicalConnection.java:3058)
at oracle.jdbc.driver.PhysicalConnection.parseUrl(PhysicalConnection.java:2823)
When I run it in Spark local mode (--master local[*]) it works fine, but it fails in yarn mode.
I am already using the --jars option with a comma-separated list of jars. What I have found so far:
1) --jars expects local paths and then copies them to an HDFS path
2) Prefixing the paths with file:/// does not work
3) If I do not specify --jars at all, the program complains that the JDBC driver class is missing. Once I pass ojdbc8.jar via --jars, that error goes away and it starts failing with the oraclepki.jar-not-found error instead. I do not know why that happens (a small check of the executor classpath is sketched after this list)
4) Also tried using : as the separator when specifying multiple jars, but without any luck
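To narrow down item 3, here is a small diagnostic sketch (assuming a SparkSession named spark is in scope) that checks on an executor whether the class named in the error can actually be loaded:
// Sketch: run one task on an executor and try to load the class from the
// oraclepki error message; on failure, print the executor's classpath.
val check = spark.sparkContext.parallelize(Seq(1), 1).map { _ =>
  try { Class.forName("oracle.security.pki.OracleWallet"); "oraclepki found on executor" }
  catch { case _: Throwable => "oraclepki missing; executor classpath: " + System.getProperty("java.class.path") }
}.collect()
check.foreach(println)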
Update 2
I was able to get past the oraclepki.jar-not-found exception by using
--driver-class-path /path/to/oraclepki.jar:/path/to/osdt_cert.jar:/path/to/others.jar
But as soon as we switch to --master yarn mode, the following exception shows up:
Caused by: oracle.net.ns.NetException: Unable to initialize the key store.
at oracle.net.nt.CustomSSLSocketFactory.getKeyManagerArray(CustomSSLSocketFactory.java:617)
at oracle.net.nt.CustomSSLSocketFactory.createSSLContext(CustomSSLSocketFactory.java:322)
... 32 more
Caused by: java.io.FileNotFoundException: /path/to/cwallet.sso (No such file or directory)
My understanding is that when the job is launched on the worker nodes, the cwallet.sso file path is not available on those nodes. We tried specifying an HDFS path for the wallet, but the utility expects a local path when the wallet is created.
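One variant we are considering (not verified yet) is to ship the wallet with --files, since files distributed that way land in each YARN container's working directory, and then point the wallet location at that directory instead of an absolute local path, roughly:
// Sketch (unverified): with
//   --files /path/to/cwallet.sso,/path/to/tnsnames.ora
// on spark-submit, the files end up in each container's working directory,
// so the wallet location could reference that directory.
spark.read.format("jdbc")
  .option("url", JDBCURL)
  .option("user", "USER")
  .option("password", "password")
  .option("oracle.net.wallet_location", "(SOURCE=(METHOD=file)(METHOD_DATA=(DIRECTORY=.)))")
  .option("dbtable", QUERY)
  .option("driver", "oracle.jdbc.driver.OracleDriver")
  .load()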
So do we need to manually copy the wallet files to every worker node, or is there a better way to achieve this?
Please advise.