I'm trying to write a PySpark DataFrame to a MySQL table in AWS RDS, but I keep getting the error
pyspark.sql.utils.IllegalArgumentException: requirement failed: The driver could not open a JDBC connection. Check the URL: jdbc:mysql:mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com
My code looks like this:
import os
import sys
from pyspark.sql import SparkSession
spark = SparkSession.builder\
.appName('test-app')\
.config('spark.jars.packages', 'mysql:mysql-connector-java:8.0.28')\
.getOrCreate()
properties = {'user':'admin', 'password':'password', 'driver':'com.mysql.cj.jdbc.Driver'}
# DataFrameWriter.jdbc() performs the write itself, so the save mode is passed here
# rather than chained after the call
resultDF.write.jdbc(url='jdbc:mysql:mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com',
                    table='mcm_objects', mode='append', properties=properties)
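My understanding is that this call is equivalent to the option-based form of the writer, sketched below with the same URL and credentials, in case that is relevant:

# option-based form of the same JDBC write (sketch, same URL and credentials as above)
resultDF.write.format('jdbc')\
    .option('url', 'jdbc:mysql:mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com')\
    .option('dbtable', 'mcm_objects')\
    .option('user', 'admin')\
    .option('password', 'password')\
    .option('driver', 'com.mysql.cj.jdbc.Driver')\
    .mode('append')\
    .save()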
I also tried the URL 'jdbc:mysql://mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com', but then I get the error:
java.sql.SQLException: No database selected
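From the Connector/J URL format (jdbc:mysql://host[:port]/database) I assume the URL also has to name the database that holds the mcm_objects table, so roughly the sketch below, where my_database is a placeholder for the actual schema name and 3306 is the default MySQL port:

# my_database is a placeholder; the actual schema that contains mcm_objects goes here
jdbc_url = 'jdbc:mysql://mtestdb.ch4i3d3jc0yc.eu-central-1.rds.amazonaws.com:3306/my_database'
resultDF.write.jdbc(url=jdbc_url, table='mcm_objects', mode='append', properties=properties)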
Not sure what I'm doing wrong. Any help would be appreciated.