I’m thinking about implementing a Live Data CSV export that runs as a job so that it can handle large amounts of data. Basically, this would be similar to the Livetable Exporter Macro, but built-in and using a job to create the CSV file.
From a user’s point of view, the menu we have in Live Data should contain a “CSV export” item that leads to a job view with a progress bar and, at the end, a download link.
From a technical point of view, my current idea is to add a new REST endpoint that takes basically the same parameters as the entries resource, but instead of returning entries, it would create the Live Data config, store it in a job, and return a job id along with a URL for a status page. The Live Data UI would call that endpoint with the same parameters it uses to fetch the entries and would then redirect to the job status page. The job would paginate through the Live Data entries and write everything to a CSV file, updating the job status with every processed page (or every X items). At the end, the status page would display a download link. Ideally, there should also be a way to cancel the export job, so the progress UI needs a cancel action as well. I’m also wondering whether the export progress should be displayed on a separate (traditional) page or inside the Live Data UI, updating the job status via the REST API; in the latter case, the download itself should probably also happen via some REST endpoint.
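To make the job side a bit more concrete, here is a rough, self-contained Java sketch of the paginate-and-write loop with per-page progress updates and cancellation. This is not actual XWiki Job framework code; the names (`ExportStatus`, `EntrySource`, `fetchPage`) are hypothetical stand-ins for whatever the real job status and Live Data entry store would provide:

```java
import java.io.IOException;
import java.io.Writer;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

public class CsvExportJobSketch {
    // Hypothetical status object that the REST status endpoint would expose
    // and that the cancel action would flip.
    static class ExportStatus {
        volatile long processed;
        final AtomicBoolean canceled = new AtomicBoolean();
    }

    // Hypothetical page fetcher standing in for the Live Data entries source.
    interface EntrySource {
        List<List<String>> fetchPage(long offset, int limit);
    }

    static void export(EntrySource source, Writer out, ExportStatus status, int pageSize)
            throws IOException {
        long offset = 0;
        // Stop early if the user canceled the job via the progress UI.
        while (!status.canceled.get()) {
            List<List<String>> page = source.fetchPage(offset, pageSize);
            if (page.isEmpty()) {
                break;
            }
            for (List<String> row : page) {
                out.write(toCsvLine(row));
            }
            offset += page.size();
            // Update the job status once per page rather than per row.
            status.processed = offset;
        }
        out.flush();
    }

    // Minimal CSV quoting: wrap each field in quotes and double embedded quotes.
    static String toCsvLine(List<String> row) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < row.size(); i++) {
            if (i > 0) {
                sb.append(',');
            }
            sb.append('"').append(row.get(i).replace("\"", "\"\"")).append('"');
        }
        return sb.append('\n').toString();
    }
}
```

The real implementation would of course write to a temporary file served by the download endpoint and use the job framework's own progress and cancellation mechanisms instead of these ad-hoc fields.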
Any opinions on this idea? I’m also wondering whether we have any good examples of such export jobs; currently I’m only aware of the one in the LaTeX exporter.