I am new to Druid. I want to query a remote Druid cluster from my Java application. I read in the druid-user Google group that we can use io.druid.client.DirectDruidClient. Can someone help me or point me to a resource with an example of this?
4 Answers
Here is a simple Spring Boot Java application that queries Druid data using the Avatica JDBC driver and prints the first row of the result.
This assumes Druid is running locally and that you already have data in a table named "druid_table" with a column sourceIP.
FlinkDruidApplication.java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.types.Row;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class FlinkDruidApplication {

    public static void main(String[] args) {
        SpringApplication.run(FlinkDruidApplication.class, args);
        Logger log = LoggerFactory.getLogger("FlinkDruidApplication");

        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
        DataSet<Row> dbData =
            env.createInput(
                JDBCInputFormat
                    .buildJDBCInputFormat()
                    // Avatica is the JDBC driver for Druid SQL.
                    .setDrivername("org.apache.calcite.avatica.remote.Driver")
                    .setDBUrl("jdbc:avatica:remote:url=http://localhost:8082/druid/v2/sql/avatica/")
                    .setUsername("null")
                    .setPassword("null")
                    .setQuery("SELECT sourceIP FROM druid_table")
                    .setRowTypeInfo((RowTypeInfo) Types.ROW(Types.STRING))
                    .finish()
            );
        try {
            log.info("Printing first IP :: {}", dbData.collect().iterator().next());
        } catch (Exception e) {
            log.error(e.getMessage());
        }
    }
}
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.1.8.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.shashank</groupId>
<artifactId>FlinkDruid</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>FlinkDruid</name>
<description>Flink Druid Connection</description>
<properties>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-core -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.9.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-java -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.12</artifactId>
<version>1.9.0</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-java -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.9.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-clients -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.12</artifactId>
<version>1.9.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-jdbc -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-jdbc_2.12</artifactId>
<version>1.8.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.calcite.avatica/avatica-core -->
<dependency>
<groupId>org.apache.calcite.avatica</groupId>
<artifactId>avatica-core</artifactId>
<version>1.15.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
The native way to query Druid is JSON over HTTP. They do have a Java client at https://github.com/implydata/druid-client, but it does not look actively developed. (The last commit, as of this writing, was two years ago.) So it may not support all the features of the native JSON query language.
If you are trying to query Druid from a Java application, the better approach is to build the JSON query in Java and then send it over HTTP.
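As a minimal sketch of that approach, the program below builds a native timeseries query as a JSON string and POSTs it to the Broker's /druid/v2 endpoint. The datasource name (wikipedia), the interval, and the Broker URL are placeholders you would replace with your own; a real application should use a JSON library such as Jackson instead of string templating.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class DruidNativeQueryExample {

    // Builds a native timeseries query that counts rows in the given
    // datasource over the given ISO-8601 interval.
    static String buildCountQuery(String dataSource, String interval) {
        return "{"
                + "\"queryType\":\"timeseries\","
                + "\"dataSource\":\"" + dataSource + "\","
                + "\"granularity\":\"all\","
                + "\"intervals\":[\"" + interval + "\"],"
                + "\"aggregations\":[{\"type\":\"count\",\"name\":\"rows\"}]"
                + "}";
    }

    // POSTs the query JSON to the Broker's native query endpoint
    // and returns the raw response body.
    static String postQuery(String brokerUrl, String queryJson) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(brokerUrl + "/druid/v2").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(queryJson.getBytes(StandardCharsets.UTF_8));
        }
        try (Scanner scanner = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            return scanner.hasNext() ? scanner.next() : "";
        }
    }

    public static void main(String[] args) throws IOException {
        String query = buildCountQuery("wikipedia", "2015-09-12/2015-09-13");
        System.out.println(query);
        // Uncomment once a Broker is reachable at this address:
        // System.out.println(postQuery("http://localhost:8082", query));
    }
}
```

Nothing here is Druid-specific on the Java side, which is the point: any HTTP client works, so you avoid pinning your application to an unmaintained client library.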
Here is another example of setup and testing.
A Druid JDBC connection test with a simple Java program:
JdbcTest.java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class JdbcTest {
    public static void main(String[] args) {
        // Connect to /druid/v2/sql/avatica/ on your Broker.
        String url = "jdbc:avatica:remote:url=http://ec2-15-206-160-168.ap-south-1.compute.amazonaws.com:8082/druid/v2/sql/avatica/";
        Properties connectionProperties = new Properties();
        connectionProperties.setProperty("user", "vaibhav");
        connectionProperties.setProperty("password", "Qwe@2019");

        try (Connection connection = DriverManager.getConnection(url, connectionProperties);
             Statement statement = connection.createStatement();
             ResultSet rs = statement.executeQuery("select count(*) from wikipedia")) {
            while (rs.next()) {
                System.out.println("The Count Result is=" + rs.getString(1));
            }
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
How to use this program:
1) Create a Java file named JdbcTest.java with the contents above, replacing the required details [HostName:PortNo, User, Password, DataSource Name].
2) Compile: javac JdbcTest.java
   e.g. $ javac JdbcTest.java
3) Download the Avatica JDBC driver. The driver can be downloaded from the following Imply repository: https://static.imply.io/support/avatica-1.12.0.jar. Copy the avatica jar to your machine.
4) Execute: java -cp .:<path-to>/avatica-1.12.0.jar JdbcTest
   e.g. $ java -cp .:/Users/vaibhav/Downloads/avatica-1.12.0.jar JdbcTest
   SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
   SLF4J: Defaulting to no-operation (NOP) logger implementation
   SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
   The Count Result is=24433
Expected result: a fully successful execution should return the count(*) of the datasource [in this case my datasource is wikipedia], like this: The Count Result is=24433
public static void Connect() {
    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
    DataSet<Row> dbData =
        env.createInput(
            JDBCInputFormat
                .buildJDBCInputFormat()
                .setDrivername("org.apache.calcite.avatica.remote.Driver")
                .setDBUrl("jdbc:avatica:remote:url=http://hostname:8082/druid/v2/sql/avatica/")
                .setUsername("null")
                .setPassword("null")
                .setQuery("SELECT __time,account,empname FROM temp where name ='abc'")
                .setRowTypeInfo((RowTypeInfo) Types.ROW(Types.SQL_TIMESTAMP, Types.STRING, Types.STRING))
                .finish()
        );
    try {
        ArrayList<Row> list = (ArrayList<Row>) dbData.collect();
        System.out.println("List Size: " + list.size());
        for (Row row : list) {
            java.sql.Timestamp time = (java.sql.Timestamp) row.getField(0);
            System.out.print("Time :" + time);
            String account = (String) row.getField(1);
            System.out.print(" Account :" + account);
            String empname = (String) row.getField(2);
            System.out.println(" empname :" + empname);
        }
        //System.out.println("Account:" + dbData.collect().iterator().next());
    } catch (Exception e) {
        e.printStackTrace();
    }
}