Has anyone here inserted multiple records at once into related tables? Here's the scenario: I have a Drug table and a DrugMovement table, and I want to insert records into both using an xls file as the source. The file contains thousands of records, and as soon as the user uploads it, all of its contents should be inserted into the tables. I'm thinking of a batch upload, but I have no idea what the best approach would be.
Below is the schema:
======== Drug Table ==========
class Drug extends Model
{
    public function drugMovements()
    {
        return $this->hasMany('App\DrugMovement');
    }
}
======== Drug Movement ===========
class DrugMovement extends Model
{
    public function drug()
    {
        return $this->belongsTo('App\Drug');
    }
}
Now I want to save records to both of these tables, inserting thousands of records at once. How can I achieve this? If I do something like the following, it wastes resources, because I have to loop over every record and run thousands of inserts:
foreach ($datas as $data) {
    // One INSERT per drug...
    $drug = Drug::create([
        "pharmacy_id" => 1,
        "name"        => $data->drug_name,
        "strength"    => $data->strength,
    ]);

    // ...and one more INSERT per movement.
    $drug->drugMovements()->save(new DrugMovement([
        "quantity"      => $data->quantity,
        "pharmacist_id" => 1,
    ]));
}
As you can see, if the data has thousands of records, this runs thousands of inserts. How can I optimize this?
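One common approach is to replace the per-row Eloquent calls with multi-row inserts inside a transaction: chunk the rows, bulk-insert the drugs, look their ids back up, then bulk-insert the matching movements. The sketch below is not a definitive answer, just one way to do it under two assumptions it makes explicit: that `$datas` is the already-parsed list of xls rows, and that a drug `name` is unique within a pharmacy (so it can be used to map inserted rows back to their ids — adjust the lookup if that does not hold for your schema). The chunk size of 500 is arbitrary.

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($datas) {
    foreach (array_chunk($datas, 500) as $chunk) {
        // One multi-row INSERT for all drugs in this chunk.
        Drug::insert(array_map(function ($data) {
            return [
                "pharmacy_id" => 1,
                "name"        => $data->drug_name,
                "strength"    => $data->strength,
            ];
        }, $chunk));

        // Map the freshly inserted drugs back to their ids.
        // Assumes drug names are unique within the pharmacy.
        $ids = Drug::where("pharmacy_id", 1)
            ->whereIn("name", array_column($chunk, "drug_name"))
            ->pluck("id", "name");

        // One multi-row INSERT for the matching movements.
        DrugMovement::insert(array_map(function ($data) use ($ids) {
            return [
                "drug_id"       => $ids[$data->drug_name],
                "quantity"      => $data->quantity,
                "pharmacist_id" => 1,
            ];
        }, $chunk));
    }
});
```

Note that `Model::insert()` goes through the query builder, so it skips Eloquent events and does not fill `created_at`/`updated_at`; add those columns to the row arrays yourself if your tables use timestamps. This reduces thousands of queries to a handful per chunk, and the transaction keeps the two tables consistent if anything fails partway through.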
via Chebli Mohamed