Tuesday, June 12, 2018

What's the proper way to import a 1.6M-line file?

I have a Laravel 5.3 project. I need to import and parse a pretty large (1.6M-line) text file.

I am running into memory issues. I think at some point I need to use chunk(), but I'm having trouble even getting the file loaded in order to do so.

Here is what I am trying:

    if(Input::hasFile('file')){
        $path = Input::file('file')->getRealPath(); //assign file from input
        $data = file($path); //load the file
        $data->chunk(100, function ($content) { //parse it 100 lines at a time
            foreach ($content as $line) {
                //use $line
            }
        });
    }

I understand that file() will return an array vs. File::get() which will return a string.
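
As a side note, I realize that since file() gives back a plain array, the $data->chunk(...) call above wouldn't work even if the file did load, because arrays have no chunk() method; I'd presumably need to wrap the result in a collection first. Something like this is what I had in mind (an untested sketch, and it still loads the entire file up front, so it doesn't address the memory problem):

    $data = collect(file($path)); //wrap the array from file() in a Laravel collection
    //Collection::chunk() only takes a size and returns the chunks;
    //it does not take a callback the way the query builder's chunk() does
    foreach ($data->chunk(100) as $chunk) {
        foreach ($chunk as $line) {
            //use $line
        }
    }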

I have increased my php.ini upload and memory limits to be able to handle the file size, but am running into this error:

Allowed memory size of 524288000 bytes exhausted (tried to allocate 4096 bytes)

This is occurring at this line:

    $data = file($path);
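
For reference, that limit is 500 MB (524288000 bytes), and file() reads all 1.6M lines into a PHP array at once, so the per-element array overhead alone can blow past it. One direction I've been looking at is reading the file line by line and batching it myself instead of loading everything up front; a rough, untested sketch using plain SplFileObject (nothing Laravel-specific) would be:

    $file = new \SplFileObject($path);
    $file->setFlags(\SplFileObject::DROP_NEW_LINE | \SplFileObject::SKIP_EMPTY);

    $batch = [];
    foreach ($file as $line) { //iterates the file one line at a time
        $batch[] = $line;

        if (count($batch) === 100) {
            //process these 100 lines (parse, insert, etc.), then drop them
            $batch = [];
        }
    }

    if (!empty($batch)) {
        //process whatever is left over in the final partial batch
    }

That way only one batch of 100 lines is ever held in memory at a time, so the 500 MB limit shouldn't come into play.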

What am I missing? And is this even the right way to do this?

Thanks!



via Chebli Mohamed
