Basically, all I want is to copy the prepared tables from Keboola storage into 1:1 tables in the application database.
The easiest way to solve this is obviously to use the PostgreSQL writer and write directly to the application database, but that means local development of the application requires either a remote database or dumping and restoring the data from production. And if you want to change or add something, there is no easy way to test it without going through the production database first, which I dislike very much.
Right now, we think the best solution is to use the Snowflake writer to provision a database with all the data, and then have a process in the application that can “pull” the data.
This solves both production and local development in the same way, and we won’t have to maintain any hacks or alternative data-sync methods to work around the fact that Keboola simply cannot write to a developer’s localhost database. Even if it could (through a VPN, for example), it would mean maintaining a writer for production and a copy of that writer for each developer, which I don’t like very much.
The disadvantage is that we won’t be using any of Snowflake’s advantages; it will be just dumb storage we connect to through a JDBC driver. We’re also considering running our own PostgreSQL data warehouse that Keboola can write into, so the applications can pull the data from there into their own databases, in case the Keboola-provisioned Snowflake becomes too expensive.
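For context, the “pull” process we have in mind is nothing more than a batched copy over two DB-API connections. Here is a minimal sketch; the function name `pull_table` is ours, and I’m using in-memory sqlite3 databases as stand-ins for the Snowflake (source) and Postgres (target) connections — with real drivers (e.g. snowflake-connector-python and psycopg2) the calls are the same, except the parameter placeholder style differs per driver (`?` vs `%s`, see each driver’s `paramstyle`):

```python
import sqlite3

def pull_table(src_conn, dst_conn, table, columns, batch_size=1000):
    """Copy all rows of `table` from src to dst in batches.
    Assumes the target table already exists with the same columns,
    and that table/column names come from trusted config (they are
    interpolated into SQL directly)."""
    col_list = ", ".join(columns)
    # sqlite3 uses "?" placeholders; psycopg2/snowflake use "%s".
    placeholders = ", ".join("?" for _ in columns)
    src_cur = src_conn.cursor()
    dst_cur = dst_conn.cursor()
    # Truncate-and-reload keeps the 1:1 copy semantics simple.
    dst_cur.execute(f"DELETE FROM {table}")
    src_cur.execute(f"SELECT {col_list} FROM {table}")
    while True:
        rows = src_cur.fetchmany(batch_size)
        if not rows:
            break
        dst_cur.executemany(
            f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})", rows
        )
    dst_conn.commit()

# Demo with in-memory stand-ins for the two databases.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
src.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "a"), (2, "b")])
dst.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
pull_table(src, dst, "customers", ["id", "name"])
```

The nice property is that the same code runs against the provisioned Snowflake in production and against whatever database a developer points it at locally.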
We have also considered using the Keboola Storage API directly, but to get the data out we first have to run a job that exports it somewhere and then download a bunch of CSV files, which defeats the purpose of having the data easily queryable with SQL.
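To illustrate the job-based flow that puts us off: each table export is an async job you trigger and poll before you can download anything. A sketch of just the trigger step, using only the standard library — the `export-async` path and header name are taken from the Storage API docs as I understand them, so verify them (and the region-specific host) before relying on this; the table ID and token are placeholders:

```python
import urllib.request

KBC_HOST = "https://connection.keboola.com"  # host varies by region/stack

def export_request(table_id, token):
    """Build the POST request that starts an async export job for a
    Storage table. The response (not shown) describes a job you must
    poll; the finished job then points at CSV file(s) to download."""
    url = f"{KBC_HOST}/v2/storage/tables/{table_id}/export-async"
    return urllib.request.Request(
        url,
        method="POST",
        headers={"X-StorageApi-Token": token},
    )

# Placeholder table ID and token, for illustration only.
req = export_request("in.c-main.customers", "my-token")
```

So every read becomes trigger-job → poll → download CSVs → load, rather than a single SQL query, which is exactly the overhead we’d like to avoid.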
What other options do we have? How else can I get the data from Keboola to application databases?