Push Chunks of Large MySQL Data to an Algolia Index with Lodash

Raphael Terrier

Published 4 years ago
Updated 3 years ago

Using the Lodash library, we’ll see how to split large amounts of data from a local MySQL database into smaller chunks and push them to an Algolia index.

Instructor: [00:00] When dealing with a large number of records, it is recommended to split them into batches of 1,000 to 10,000, depending on the record size. Let's switch tables in our database from actors_sml to actors_big.

[00:14] The records are similar, except this time we don't have 500 but 50,000 rows to push. To push the data in batches, we are going to use the Lodash library. Let's install the package and import it.

[00:30] Let's edit the query to target the new table and remove the previous addObjects call. Using Lodash's chunk method, we will create an array holding smaller arrays of 1,000 records each.
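Lodash's _.chunk(array, size) returns an array of sub-arrays of at most size elements each. A minimal plain-JavaScript stand-in (not the library's actual implementation) illustrates the behavior:

```javascript
// Stand-in for Lodash's _.chunk: split an array into groups of `size`.
function chunk(array, size) {
  const result = [];
  for (let i = 0; i < array.length; i += size) {
    result.push(array.slice(i, i + size));
  }
  return result;
}

// 50,000 records split into batches of 1,000 gives 50 batches.
const records = Array.from({ length: 50000 }, (_, i) => ({ objectID: i }));
const batches = chunk(records, 1000);
console.log(batches.length);    // → 50
console.log(batches[0].length); // → 1000
```

With the real library, the equivalent call is simply _.chunk(records, 1000).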

[00:44] We then iterate over this array of arrays, calling the addObjects method each time to send a chunk of 1,000 records. We run the script, then head back to the Algolia dashboard. Refreshing the index confirms that all 50,000 actors are there.
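Putting the steps together, the whole script might look like the sketch below. The credentials, database name, and connection settings are placeholders; the Algolia calls use the v3 JavaScript client API (initIndex, addObjects), and the table name actors_big comes from the lesson:

```javascript
const mysql = require('mysql');
const algoliasearch = require('algoliasearch');
const _ = require('lodash');

// Placeholder credentials and index name -- substitute your own.
const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');
const index = client.initIndex('actors');

// Assumption: local development database; adjust settings as needed.
const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: '',
  database: 'your_database',
});

connection.query('SELECT * FROM actors_big', (err, results) => {
  if (err) throw err;

  // Split the 50,000 rows into batches of 1,000 records each.
  const chunks = _.chunk(results, 1000);

  // Push each batch to the Algolia index.
  chunks.forEach((batch) => {
    index.addObjects(batch, (err) => {
      if (err) throw err;
    });
  });

  connection.end();
});
```

Running this against a live database and Algolia application requires valid credentials, so it is a sketch of the flow rather than a drop-in script.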