I am new to Spark and am trying to understand whether (and how) it is possible to register DataFrames as temp tables in the Spark Thrift Server. To clarify, this is what I am trying to do:
- Submit an application that generates a dataframe and registers it as a temporary table
- Connect from a JDBC client to the Spark ThriftServer (running on the master) and query the temporary table, even after the application that registered it has completed.
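For reference, the first step is roughly this (a sketch assuming Spark 2.x; the app name, data, and view name are placeholders):

```scala
import org.apache.spark.sql.SparkSession

object RegisterTempTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("register-temp-table")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._
    val df = Seq((1, "a"), (2, "b")).toDF("id", "value")

    // Temporary views are scoped to this application's session,
    // which seems to be why they vanish once the application exits.
    df.createOrReplaceTempView("my_temp_table")

    spark.stop()
  }
}
```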
So far I've had no success with this - the Spark ThriftServer is running on the Spark master, but I'm unable to actually register any temp table with it.
Is this possible? I know I can use HiveThriftServer2.startWithContext to serve a dataframe via JDBC, but that requires the application to keep running forever, and it requires me to launch additional applications.
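For completeness, this is the startWithContext approach I'm trying to avoid (again a sketch assuming Spark 2.x; names are placeholders). The application must stay alive for the served view to remain queryable over JDBC, which is exactly my problem:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object ServeDataFrame {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("serve-dataframe")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._
    val df = Seq((1, "a"), (2, "b")).toDF("id", "value")
    df.createOrReplaceTempView("my_temp_table")

    // Start an embedded Thrift server bound to this application's
    // SQLContext; JDBC clients can query my_temp_table while it runs.
    HiveThriftServer2.startWithContext(spark.sqlContext)

    // Keep the application alive; if it exits, the view disappears.
    Thread.currentThread().join()
  }
}
```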