I am trying to find a way to automatically update a PostgreSQL table with data from a CSV file.
The CSV file holds LabVIEW test data; it is stored on the local computer and is rewritten on every test run.
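For reference, test1 is a minimal single-column table; for the examples below it can be assumed to look like this:
CREATE TABLE test1 (
    column1 text    -- same single text column as the foreign table below
);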
So far I am able to import data from the CSV into table 'test1' using two methods:
COPY command
COPY test1 FROM 'C:\Users\..\Test.csv' DELIMITER E'\t' CSV;
Foreign Data Wrappers (file_fdw)
CREATE SERVER labview FOREIGN DATA WRAPPER file_fdw;
CREATE FOREIGN TABLE labview_test (
    column1 text
) SERVER labview
OPTIONS (filename 'C:\Users\..\Test.csv', format 'csv');
From there I can insert data into test1 using
INSERT INTO test1 (column1) SELECT column1 FROM labview_test;
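To keep the manual refresh to a single call, I could also wrap this in a small function (the name refresh_from_csv is just a placeholder I picked):
CREATE OR REPLACE FUNCTION refresh_from_csv() RETURNS void AS $$
BEGIN
    -- file_fdw re-reads Test.csv every time the foreign table is scanned,
    -- so this pulls in whatever the latest LabVIEW run wrote
    INSERT INTO test1 (column1)
    SELECT column1 FROM labview_test;
END;
$$ LANGUAGE plpgsql;

-- manual refresh
SELECT refresh_from_csv();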
This is fine for a single update, but I need the process to be automated. I tried using a trigger and a trigger function: I am able to create the trigger on the foreign table, but test1 does not update when I run a new LabVIEW test.
CREATE FUNCTION copy_test()
RETURNS TRIGGER AS $$
BEGIN
    -- copy the inserted/updated row from the foreign table into test1
    INSERT INTO test1 (column1) VALUES (NEW.column1);
    RETURN NULL;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER copy_trigger
AFTER INSERT OR UPDATE ON labview_test
FOR EACH ROW
EXECUTE PROCEDURE copy_test ();
I read in a forum post that file_fdw does not support triggers; can anyone confirm whether this is still the case in the latest version of PostgreSQL (9.6)?
Does anyone know of a way to automatically update tables from a foreign table or from the .csv file directly?
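In case it helps to know what I would settle for: if triggers cannot do this, I am considering a scheduled refresh along these lines (the file name, database name, and the decision to replace rather than append are just my assumptions):
-- scheduled_refresh.sql, run periodically from Windows Task Scheduler with
--   psql -d testdb -f scheduled_refresh.sql
TRUNCATE test1;                       -- assuming each refresh should replace the old data
INSERT INTO test1 (column1)           -- the foreign table re-reads Test.csv on every scan
SELECT column1 FROM labview_test;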