Efficiently process large-scale data with batch imports
So far, when importing our Excel sheets into the database, we have created an import that handles each row individually: an actual insert query runs for every row. What if we had 1,000 rows? That's right: 1,000 insert queries. Laravel supports batch inserts, and Laravel-Excel accommodates that form of inserting. With a batch size of 100, only 10 queries occur instead of 1,000.
Batch inserting limits the number of queries and increases speed. The right number of rows per batch is case-by-case dependent. Start experimenting with the batch size and see what works; 1,000 rows per batch might not be the ultimate solution. It really depends on multiple factors, such as the amount of data being processed per row.
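The arithmetic is easy to sanity-check in plain PHP: chunking the rows the way a batch insert would shows how the query count falls. This is a standalone sketch with stand-in data, no Laravel involved:

```php
<?php
// Stand-in for 1,000 spreadsheet rows.
$rows = range(1, 1000);
$batchSize = 100;

// Each chunk would become a single multi-row INSERT statement.
$batches = array_chunk($rows, $batchSize);

echo count($batches) . " queries instead of " . count($rows) . "\n";
// → 10 queries instead of 1000
```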
Implementing Batch Insert
To implement batch inserts, Laravel-Excel just requires the WithBatchInserts concern and its required method, batchSize().
<?php

namespace App\Imports;

use App\Models\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class BatchImport implements ToModel, WithHeadingRow, WithBatchInserts
{
    /**
     * @param array $row
     *
     * @return \Illuminate\Database\Eloquent\Model|null
     */
    public function model(array $row)
    {
        return new User([
            'name'     => $row['first_name'] . " " . $row['last_name'],
            'email'    => $row['email'],
            'password' => Hash::make($row['password']),
        ]);
    }

    public function batchSize(): int
    {
        return 100;
    }
}
That's it. Laravel-Excel makes it super simple. Let's create the method in our UserController, along with a route, so that we can test. I used a CSV of mock user data for the import.
<?php

namespace App\Http\Controllers;

//...
use App\Imports\BatchImport;
use App\Models\User;
use Illuminate\Http\Request;
use Maatwebsite\Excel\Facades\Excel;

class UserController extends Controller
{
    //...

    public function import_with_batch()
    {
        Excel::import(
            new BatchImport,
            'mock_data/MOCK_DATA_6.csv'
        );

        return redirect('/')->with('success', 'Users Imported Successfully!');
    }
}
<?php

use Illuminate\Support\Facades\Route;
use App\Http\Controllers\UserController;
use Illuminate\Support\Facades\Artisan;

//...

Route::get(
    '/import-with-batch',
    [UserController::class, 'import_with_batch']
);
Running the import yields the result that we wanted.
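One way to confirm the batching actually happened is to count the queries Laravel ran during the import. The sketch below assumes you run it in php artisan tinker or a test, with the same mock CSV; with 1,000 rows and a batchSize() of 100, you would expect roughly 10 insert statements:

```php
use App\Imports\BatchImport;
use Illuminate\Support\Facades\DB;
use Maatwebsite\Excel\Facades\Excel;

// Record every query executed during the import.
DB::enableQueryLog();
Excel::import(new BatchImport, 'mock_data/MOCK_DATA_6.csv');

// Keep only the insert statements and count them.
$inserts = collect(DB::getQueryLog())
    ->filter(fn ($q) => str_starts_with($q['query'], 'insert'));

dump($inserts->count());
```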
Inserting New and Updating Existing
We can also implement functionality that inserts new items but updates existing ones. For example, the email field is a unique field in the users table. If the email already exists, update the content for that entry; otherwise, insert it.
To get this functionality, we implement the WithUpserts concern and specify which field is unique in the uniqueBy method.
<?php

namespace App\Imports;

use App\Models\User;
use Illuminate\Support\Facades\Hash;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithUpserts;

class BatchImport implements ToModel, WithHeadingRow, WithBatchInserts, WithUpserts
{
    /**
     * @param array $row
     *
     * @return \Illuminate\Database\Eloquent\Model|null
     */
    public function model(array $row)
    {
        return new User([
            'name'     => $row['first_name'] . " " . $row['last_name'],
            'email'    => $row['email'],
            'password' => Hash::make($row['password']),
        ]);
    }

    public function batchSize(): int
    {
        return 100;
    }

    public function uniqueBy(): string
    {
        return 'email';
    }
}
Without clearing your users table, run the import again and see that it doesn't break on the duplicate emails. If any of the fields changed in the spreadsheet, like first name or last name, those fields will be updated.
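Each batch here maps onto Eloquent's upsert() method. The call below is a sketch of the rough equivalent, not the package's exact internals, and the sample values are made up; the 'email' key comes from uniqueBy():

```php
use App\Models\User;

User::upsert(
    [
        ['name' => 'Jane Doe', 'email' => 'jane@example.com', 'password' => $hash1],
        ['name' => 'John Doe', 'email' => 'john@example.com', 'password' => $hash2],
    ],
    ['email'],            // unique-by column(s): a match decides update vs. insert
    ['name', 'password']  // columns to overwrite when a match is found
);
```

On MySQL this compiles down to INSERT ... ON DUPLICATE KEY UPDATE, which is why the email column needs an actual unique index in the database for the matching to work.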
Laravel Series
Continue your Laravel learning.
- Laravel-Excel — P5: Importing Sheets With Formulas. In the fifth installment of our Laravel-Excel series, dive into importing spreadsheets containing formulas. Discover how to parse and evaluate formula-based data, ensuring accurate calculations, robust data handling, and streamlined application workflows.
- Laravel-Excel — P6: Batch Imports (this article). In the sixth installment of our Laravel-Excel series, discover how to perform large-scale data processing with batch imports.
- Laravel-Excel — P7: Chunk Reading. In the seventh installment of our Laravel-Excel series, discover how to handle large datasets by reading them in manageable chunks. Learn techniques to optimize memory usage, boost performance, and maintain data integrity for scalable, high-performing applications.