I'm getting events from Kafka and storing them into Cassandra. I parse the JSON, which contains the fields eventID, sessionID, timestamp and userID, to create columns for a Cassandra table that looks like this:
cassandra@cqlsh> CREATE TABLE mydata.events (
... "event_date" date,
... "eventID" text,
... "userID" text,
... timestamp timeuuid,
... "sessionID" text,
... "fullJson" text,
... PRIMARY KEY ("event_date", timestamp, "sessionID")
... );
and in code:
import java.util.UUID
import com.datastax.driver.core.LocalDate

case class cassandraFormat(
  eventID: String,
  sessionID: String,
  timeuuid: UUID,        // timestamp as timeuuid
  userID: String,
  event_date: LocalDate, // YYYY-MM-dd format
  fullJson: String       // full json from Kafka
)
I need to add the timestamp column as a timeuuid. Since I'm parsing from JSON, I extracted all the values from the header and created the columns in this fashion:
val allJson = rdd.
  map(x => {
    implicit val formats: DefaultFormats.type = org.json4s.DefaultFormats
    // use json4s default serialization to write the Map as a JSON string
    (x, Serialization.write(x))
  }).
  filter(x => x._1 isDefinedAt "header").
  map(x => (x._1("header"), x._2)).
  filter(x => (x._1 isDefinedAt "userID") &&
    (x._1 isDefinedAt "eventID") &&
    (x._1 isDefinedAt "sessionID") &&
    (x._1 isDefinedAt "timestamp")).
  map(x => cassandraFormat(x._1("eventID").toString,
    x._1("sessionID").toString,
    com.datastax.driver.core.utils.UUIDs.startOf(x._1("timestamp").toString.toLong),
    x._1("userID").toString,
    com.datastax.driver.core.LocalDate.fromMillisSinceEpoch(x._1("timestamp").toString.toLong),
    x._2))
This part:
com.datastax.driver.core.utils.UUIDs.startOf(x._1("timestamp").toString.toLong)
is generating the error:
java.lang.NumberFormatException: For input string: "2019-05-09T09:00:52.553+0000"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
I even tried:
java.util.UUID.fromString(x._1("timestamp").toString)
which also generates the same error.
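My guess is that the timestamp in the JSON header is an ISO-8601 string rather than epoch millis, so it would first have to be parsed into millis before building the timeuuid. Something like this untested sketch is what I have in mind (the date pattern and the use of UUIDs.startOf are my own assumptions):

import java.time.OffsetDateTime
import java.time.format.DateTimeFormatter
import com.datastax.driver.core.utils.UUIDs

// assumed pattern for strings like "2019-05-09T09:00:52.553+0000"
val isoFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSZ")

// hypothetical helper: parse the ISO-8601 string to epoch millis,
// then build a time-based UUID from those millis
def toTimeuuid(ts: String): java.util.UUID = {
  val millis = OffsetDateTime.parse(ts, isoFormat).toInstant.toEpochMilli
  UUIDs.startOf(millis) // smallest timeuuid for that instant -- not sure this is the right call for inserts
}

I'm not sure whether this is the correct approach, though.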
How do I properly cast/convert the timestamp to a timeuuid and insert it into Cassandra via the Spark job?