    Push Chunks of Large MySQL Data to an Algolia Index with Lodash


    Using the Lodash library, we’ll see how to split large amounts of data from a local MySQL database into smaller chunks and push them to an Algolia index.

    Instructor: When dealing with a large number of records, it is recommended to split them into batches of 1,000 to 10,000, depending on the record size. Let's switch tables in our database from actors_sml to actors_big.

    The records are similar, except this time we have 50,000 rows to push instead of 500. To push the data in batches, we are going to use the Lodash library. Let's install the package and import it.

    Let's edit the query to target the right table and remove the previous addObjects call. Using Lodash's chunk method, we will create an array holding smaller arrays of 1,000 records each.
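    As a rough sketch of what Lodash's chunk produces, the records array below is a stand-in for the rows returned by the MySQL query, not the lesson's actual data:

    ```javascript
    // Assumes lodash is installed (npm install lodash).
    const _ = require('lodash');

    // Stand-in for the 50,000 rows returned by the MySQL query.
    const records = Array.from({ length: 50000 }, (v, i) => ({
      objectID: i,
      name: `Actor ${i}`,
    }));

    // _.chunk(array, size) splits the array into sub-arrays
    // of at most `size` items each.
    const chunks = _.chunk(records, 1000);

    console.log(chunks.length);    // 50 batches
    console.log(chunks[0].length); // 1000 records in the first batch
    ```

    With 50,000 rows and a batch size of 1,000, this yields 50 evenly sized batches, each small enough to send in a single indexing request.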

    We then iterate over this array of arrays and call the addObjects method each time, sending chunks of 1,000 records. We run the script, then go back to the Algolia dashboard. I can now refresh the index and check that all 50,000 actors are there.
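    Putting the pieces together, the chunk-and-push loop might look like the sketch below. The index and rows names are assumptions: in the real script, index would be the Algolia index client (whose v3 batch method is addObjects) and rows the result of the MySQL query. The stub at the bottom only illustrates the call pattern without real credentials:

    ```javascript
    const _ = require('lodash');

    // Split the rows into batches and push each batch with addObjects.
    // Returns the number of batches sent.
    function pushInChunks(index, rows, batchSize = 1000) {
      const batches = _.chunk(rows, batchSize);
      batches.forEach((batch) => index.addObjects(batch));
      return batches.length;
    }

    // Illustration with a stub index in place of a real Algolia client:
    const pushed = [];
    const stubIndex = { addObjects: (objects) => pushed.push(objects.length) };
    const rows = Array.from({ length: 2500 }, (v, i) => ({ objectID: i }));
    const batchCount = pushInChunks(stubIndex, rows);

    console.log(batchCount); // 3
    console.log(pushed);     // [ 1000, 1000, 500 ]
    ```

    Note that the last batch simply holds the remainder, so no padding or special casing is needed for row counts that are not an exact multiple of the batch size.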