
Processing a csv file in Laravel

Original – by Freek Van der Herten – 2 minute read

From time to time I need to process a csv file. PHP provides an fgetcsv function to help with that task. Unfortunately it's a very basic function. It will not, for instance, recognize a header row as such. In this quick post I'll show how using the excellent laravel-excel package (which can process csv's too) can help make your code much more readable.

Imagine you've got this csv that you want to import in your database:

last_name,first_name,job
Clegane,Sandor,Axe Wielder
Bolton,Ramsay,Hound Master
Snow,Jon,King Of The North
...

Without using laravel-excel you could do this:

$handle = fopen('characters.csv', 'r');
$header = true;

while (($csvLine = fgetcsv($handle, 1000, ',')) !== false) {

    if ($header) {
        // Skip the header row.
        $header = false;
    } else {
        Character::create([
            'name' => $csvLine[1] . ' ' . $csvLine[0],
            'job' => $csvLine[2],
        ]);
    }
}

fclose($handle);

Let's see what we have on our hands: a file handle, a boolean to help skip the headers, numeric indexes on the $csvLine variable, an ugly loop, and an else branch. What a mess.

Let's improve this immensely by using laravel-excel:

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Facades\Excel;

Excel::load('characters.csv')->each(function (Collection $csvLine) {

    Character::create([
        'name' => "{$csvLine->get('first_name')} {$csvLine->get('last_name')}",
        'job' => $csvLine->get('job'),
    ]);

});

To my eyes this is much cleaner. We don't need to manage any of the cruft mentioned above anymore. The load function takes care of reading the file and processing the header row, and every csv line gets passed to the callback as the Collection instance you know and love.

The only advantage of the first example is that it uses less memory. The csv file never gets pulled into memory entirely, but is processed line by line. In the second example the whole csv file gets loaded into memory at once. But laravel-excel has you covered. When dealing with big files it can process the data in chunks, so only a few records are in memory at any given time. Read the docs on importing to learn how to do that.
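To give you an idea, here's a minimal sketch of a chunked import, assuming the chunk filter from laravel-excel 2.x as described in its import docs (the chunk size of 250 is arbitrary; tune it to your own memory budget):

// Process the csv in chunks of 250 rows, so only those rows
// are in memory at any given time.
Excel::filter('chunk')->load('characters.csv')->chunk(250, function ($results) {

    foreach ($results as $csvLine) {
        Character::create([
            'name' => "{$csvLine->get('first_name')} {$csvLine->get('last_name')}",
            'job' => $csvLine->get('job'),
        ]);
    }

});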

You might argue that it's overkill to pull in a package just to read csv's. I think the added readability (and therefore improved maintainability) of the code is well worth the price of adding a package to a project.

EDIT: another library you might use to import csv's:
