Saturday, January 26, 2019

Most efficient way to get CSV data to the backend

I wanted to see if I could get some input from the community regarding the upload of a CSV file. These files are huge; we're talking multiple GB. Due to server restrictions, I am unable to use chunked uploading.

So let's say I have a very standard file input:

<input type="file" class="form-control-file" name="fileToUpload">

On change or submit, I can then pass this via a route to my backend. The controller function might look something like this:

public function uploadFilePost(Request $request){
    // Reject the request unless a file was actually submitted.
    $request->validate([
        'fileToUpload' => 'required|file',
    ]);

    // Store the upload on the default disk, under csvFiles/.
    $request->fileToUpload->store('csvFiles');

    return back()
        ->with('success','You have successfully uploaded the file.');
}
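
For reference, the route I would point the form at might be wired up like this (the URI and controller name here are just placeholders on my part):

Route::post('/upload-csv', 'UploadController@uploadFilePost');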

The big issue here is that with files of multiple GB this process takes hours, which is a problem because the browser has to stay open for the entire request.

So really my first question is this: I am able to parse the file on the client using something like PapaParse, and that literally takes a few seconds. With the file now parsed into a JSON string or array, would that reduce the size of the data in any way?

My second question: say parsing does nothing to the size. I now have all of this CSV data in a local variable. Is there any way for me to send it to the backend without the user having to wait for hours? I have looked at queues; would they be an option, or would they only help once the data is already on the backend?
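
To illustrate what I mean by queues: as far as I can tell, a queued job could only start once the file is already stored on the server. A rough sketch of the kind of job I had in mind, with placeholder table and column names, would be:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Storage;

class ProcessCsvFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $path;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function handle()
    {
        // Stream the stored CSV line by line so the whole file
        // never has to be loaded into memory at once.
        $handle = fopen(Storage::path($this->path), 'r');

        $batch = [];
        while (($row = fgetcsv($handle)) !== false) {
            // 'csv_rows', 'col_a' and 'col_b' are placeholder names.
            $batch[] = ['col_a' => $row[0], 'col_b' => $row[1]];

            // Insert in chunks of 1000 rows to keep each query a sane size.
            if (count($batch) === 1000) {
                DB::table('csv_rows')->insert($batch);
                $batch = [];
            }
        }

        if (!empty($batch)) {
            DB::table('csv_rows')->insert($batch);
        }

        fclose($handle);
    }
}

The controller would then dispatch it with ProcessCsvFile::dispatch($path) right after the store() call (store() returns the stored path). But if I understand correctly, that still does nothing to speed up the upload itself, which is really my problem.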

Really, any information is appreciated. I may be way off in my thinking; I am just trying to come up with the most efficient way of sending a large volume of data to my backend.

Thanks



