What I’m trying to do is dump one table to CSV so that I can specify the columns to import using COPY FROM into another table (which has an extra generated column).
Importing from JSON does not seem to allow for extra columns…? (I should note the other table is on another instance elsewhere, otherwise I’d use INSERT INTO () SELECT * FROM blah.)
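For reference, this is roughly the import I have in mind on the target instance; a sketch only, with placeholder table/column names, and assuming I’m reading the COPY FROM docs correctly in that it accepts a column list for CSV files:

```sql
-- Placeholder names throughout. The target table has one extra
-- generated column, so only the base columns are listed here and
-- CrateDB should compute the generated one on insert.
COPY target_table (id, name, created_at)
FROM 'file:///tmp/source_table.csv'
WITH (format = 'csv');
```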
I’ve read the various tutorials, but most seem to assume the CSV is generated elsewhere.
I’ve tried CSV export within DBeaver, but this blows up with memory issues (9M+ rows in the table to export).
Yes, that’s right: COPY TO only supports JSON.
There is also a feature request on GitHub (export to csv · Issue #8551 · crate/crate · GitHub), but it is a bit light on the use case, i.e. on why post-processing JSON to CSV isn’t an option.
Converting the exported JSONL to CSV could be done e.g. using jq:

```bash
jq -r -s '(map(keys) | add | unique) as $cols | map(. as $row | $cols | map($row[.])) as $rows | $cols, $rows[] | @csv' <YOUR_FILE_NAME>.json
```
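For example, assuming the table was exported with COPY … TO DIRECTORY and the resulting JSON files live under /tmp/export (paths and file names here are just examples), the whole conversion could look like:

```bash
# Concatenate all exported JSON-lines files and convert them to one CSV.
# -s slurps every object into a single array so the filter can build a
# unified header from the union of all keys; missing keys become empty fields.
cat /tmp/export/*.json \
  | jq -r -s '(map(keys) | add | unique) as $cols
      | map(. as $row | $cols | map($row[.])) as $rows
      | $cols, $rows[] | @csv' \
  > /tmp/export/export.csv
```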
Oh, that’s odd: the import was not doing anything, no matter what I tried, so I assumed it was due to not specifying the columns and/or the table schema being different from the import file.
I’ll have to try again, as it must be something else.
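One thing I’ll try is running the import with RETURN SUMMARY, which (as far as I understand) reports per-node success/error counts and sample error messages instead of failing silently; names and paths below are placeholders again:

```sql
-- RETURN SUMMARY shows how many rows were imported or rejected per
-- node, plus example errors, which should explain why "nothing happens".
COPY target_table (id, name, created_at)
FROM 'file:///tmp/export/export.csv'
WITH (format = 'csv')
RETURN SUMMARY;
```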