Motivation
...
- `url=https://analytics.broadinstitute.org/Metrics?type=vvp`: where the file should be sent
- `filepathRegex=(.*)/(.*AspAllOutputQC.csv)`: which file names should be picked
- `outer_columns=field1=$1,field2=$2,field3=XYZ`: parses fields out of the filepathRegex capture groups and injects them into the JSON (useful when bits of data are encoded in the filename; see the first sketch after this list)
- `dryrun=true`: runs all the delta capturing and regex parsing and shows the data without actually pushing the file
- `delta=<delta specification>`: specifies how files are to be picked (see the second sketch after this list):
  - Pick one (or multiple) specific files:
    `delta=FILES_CSV /seq/tableau_files/VVPVolumeQC/20221205_RACK_QC_083303_AspAllOutputQC.csv`
  - Pick files timestamped between one timestamp and another:
    `delta=FILES IN FOLDER /seq/tableau_files/VVPVolumeQC TIMESTAMPED BETWEEN 2019-10-19 13:11:46 AND 2019-10-19 13:11:46`
  - Pick files timestamped in the last 10 hours:
    `delta=FILES IN FOLDER /seq/tableau_files/VVPVolumeQC TIMESTAMPED BETWEEN -10h AND NOW`
  - Pick files timestamped between a persisted-in-file timestamp and now (production setup):
    `delta=FILES IN FOLDER /seq/tableau_files/VVPVolumeQC TIMESTAMPED BETWEEN /seq/tableau_files/VVPVolumeQC/VVP_etl_timestamp.txt AND NOW`
    The file merely contains a timestamp (2019-10-19 13:11:46) and should be created manually.
  - Adjustment: adjust the delta so that you can mitigate clock-discrepancy problems:
    `delta=FILES IN FOLDER /seq/tableau_files/VVPVolumeQC TIMESTAMPED BETWEEN /seq/tableau_files/VVPVolumeQC/VVP_etl_timestamp.txt MINUS 5 MINUTES AND NOW`
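As a rough illustration of how the `filepathRegex` capture groups and the `outer_columns` spec could combine into the JSON payload, here is a minimal Python sketch; the helper name and parsing details are assumptions, not the tool's actual code:

```python
import json
import re

# Hypothetical sketch: map filepathRegex capture groups onto the outer_columns
# spec and inject them into the JSON payload. Names and behaviour are
# assumptions, not the tool's actual implementation.
filepath_regex = r"(.*)/(.*AspAllOutputQC.csv)"
outer_columns = "field1=$1,field2=$2,field3=XYZ"

def build_outer_fields(path: str) -> dict:
    match = re.match(filepath_regex, path)
    if match is None:
        raise ValueError(f"{path} does not match filepathRegex")
    fields = {}
    for spec in outer_columns.split(","):
        name, value = spec.split("=", 1)
        if value.startswith("$"):            # $1, $2, ... refer to capture groups
            value = match.group(int(value[1:]))
        fields[name] = value                 # literals (e.g. XYZ) pass through as-is
    return fields

path = "/seq/tableau_files/VVPVolumeQC/20221205_RACK_QC_083303_AspAllOutputQC.csv"
print(json.dumps(build_outer_fields(path), indent=2))
# {
#   "field1": "/seq/tableau_files/VVPVolumeQC",
#   "field2": "20221205_RACK_QC_083303_AspAllOutputQC.csv",
#   "field3": "XYZ"
# }
```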
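And a minimal sketch of how a delta window such as `TIMESTAMPED BETWEEN <timestamp file> MINUS 5 MINUTES AND NOW` might be resolved against file modification times; again, this is an assumption about the mechanics, not the tool's implementation:

```python
from datetime import datetime, timedelta
from pathlib import Path

# Hypothetical sketch of delta-window resolution: read the persisted timestamp,
# back it off by the MINUS adjustment, and pick files whose mtime falls in
# [start, now]. The real tool's behaviour may differ.
folder = Path("/seq/tableau_files/VVPVolumeQC")
timestamp_file = folder / "VVP_etl_timestamp.txt"
adjustment = timedelta(minutes=5)            # MINUS 5 MINUTES

start = datetime.strptime(timestamp_file.read_text().strip(),
                          "%Y-%m-%d %H:%M:%S") - adjustment
now = datetime.now()

picked = [
    f for f in folder.glob("*AspAllOutputQC.csv")
    if start <= datetime.fromtimestamp(f.stat().st_mtime) <= now
]
print(f"{len(picked)} file(s) picked between {start} and {now}")
```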
The final production-grade command would look like this:
...
For testing purposes, an individual file can also be manually pushed by using curl
(ATTN: LAST_MODIFIED will not be populated in this case)
On Linux:

```bash
curl -F file="@/seq/tableau_files/VVPVolumeQC/20221205_RACK_QC_083303_AspAllOutputQC.csv" "https://analytics.broadinstitute.org/Metrics?type=vvp"
```
...
- JSON queries appear to be sensitive to the Oracle 19.8.0 Bug 31532339 - ORA-600 [koksccda1]. DBAs are working to address this by upgrading SEQPROD to v19.15.0.
- The “analytics” VM is in our private network, so it can’t be directly accessed from Google Cloud. However, an “on-prem” script can easily read from GC and push to the “analytics” VM (a sketch follows below).
- True: the cost of JSON parsing on Tableau’s side would be higher compared to old-fashioned Oracle tables. However, for small-scale applications this cost would be negligible.
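As an illustration of that point, an on-prem bridge could be as small as the sketch below; the bucket name, blob path, and the use of the `google-cloud-storage` and `requests` libraries are assumptions for illustration, and only the Metrics URL comes from the setup above:

```python
import requests
from google.cloud import storage

# Hypothetical on-prem bridge: download a file from a Google Cloud Storage
# bucket and push it to the "analytics" VM, which GC cannot reach directly.
BUCKET = "example-vvp-qc-exports"            # assumed bucket name
BLOB = "VVPVolumeQC/20221205_RACK_QC_083303_AspAllOutputQC.csv"   # assumed blob path
METRICS_URL = "https://analytics.broadinstitute.org/Metrics?type=vvp"

local_path = "/tmp/" + BLOB.split("/")[-1]
storage.Client().bucket(BUCKET).blob(BLOB).download_to_filename(local_path)

# Push as a multipart form field named "file", matching the curl example above.
with open(local_path, "rb") as fh:
    resp = requests.post(METRICS_URL, files={"file": fh})
resp.raise_for_status()
print("pushed", local_path, "->", METRICS_URL, resp.status_code)
```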