batch processing

Converting a batch of Dashcam videos into a timelapse

I recently took a family vacation from St. Louis, MO to Branson, MO, and since it was the first time driving with my new Mobius Action Cam Mini dashcam installed in our Toyota Sienna (see a full writeup and review here), I wanted to see if I could quickly whip up a time-lapse video of the entire drive.

[Animated GIF: Driving in St. Louis - dashcam loop]
A tiny snippet of the final time-lapse video of my STL to Branson drive.

Use delegation, threading, and queues to speed up operations

I posted this to my personal site, but I wanted to mention it on this blog, as it's a performance optimization that I use quite often when programming for the web or for native applications: Don't Wait, Delegate! Proper use of threading and queueing.

There are hundreds of ways to improve your app or website's performance, but few have as much potential to improve its responsiveness as queueing work or handing it off to background processes.
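The linked post has the full explanation, but since the other posts here deal with Drupal, here's a minimal sketch of what "don't wait, delegate" can look like using Drupal 7's Queue API; the module name, queue name, and mail parameters below are hypothetical placeholders, not code from the post.

<?php
// Tell Drupal which function works off the queue during cron runs.
function mymodule_cron_queue_info() {
  $queues['mymodule_notifications'] = array(
    'worker callback' => 'mymodule_notifications_worker',
    // Maximum number of seconds to spend on this queue per cron run.
    'time' => 60,
  );
  return $queues;
}

// In the page request, just enqueue the work and return immediately,
// instead of making the user wait while the email is sent.
function mymodule_notify_subscriber($email, $message) {
  $queue = DrupalQueue::get('mymodule_notifications');
  $queue->createItem(array('email' => $email, 'message' => $message));
}

// The worker runs later (on cron), outside the user-facing request.
function mymodule_notifications_worker($item) {
  drupal_mail('mymodule', 'notification', $item['email'], language_default(),
    array('message' => $item['message']));
}

The page request only pays the cost of writing one queue row; the slow part (sending mail, talking to an external API, etc.) happens in the background worker.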

Using Batch API to build huge CSV files for custom exports

Flocknote is a large web application that lets churches easily manage communications with their members via email, text message, and phone calls. It implements many of the core features of email marketing services like MailChimp and Constant Contact, such as list management and mass emailing, and adds many features that are unique to flocknote, like shared list/member information management and text messaging.

Until recently, few groups using flocknote had subscription lists big enough to hit our relatively high PHP max_execution_time setting when importing and exporting subscriber data. Now that we're getting bigger, though, I've started implementing Batch API all over the place, so user-facing bulk operations can not only complete without resulting in a half-finished operation, but can also show the end user exactly how much has been done, and how much is left:

[Screenshot: Exporting List Subscribers - Batch API CSV Export]
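To give a sense of how a progress display like that gets wired up, here's a rough sketch of defining and kicking off a batch in Drupal 7; the function names, redirect path, and $list_id argument are placeholders rather than flocknote's actual code.

<?php
function mymodule_subscriber_export_start($list_id) {
  $batch = array(
    'title' => t('Exporting list subscribers...'),
    'operations' => array(
      // Each operation is a callback plus its arguments; Batch API calls the
      // callback repeatedly until it reports that it's finished.
      array('mymodule_subscriber_export_process', array($list_id)),
    ),
    'finished' => 'mymodule_subscriber_export_finished',
  );
  batch_set($batch);
  // Run the batch, then send the user to a download page when it completes.
  batch_process('lists/' . $list_id . '/export/download');
}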

I've seen many tutorials, blog posts, and examples for using Drupal's Batch API to import tons of data, but very few (actually, none) for exporting tons of data; specifically, in my case, for building a CSV file with tons of data for download. The closest thing I've seen is a feature request in the Webform issue queue: Use BatchAPI to Export very large data sets to CSV/Excel.
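For anyone in the same spot, the general shape of a CSV-building batch operation looks something like the sketch below; the {subscribers} table, its columns, and the 500-row slice size are illustrative assumptions, not flocknote's schema.

<?php
function mymodule_subscriber_export_process($list_id, &$context) {
  if (!isset($context['sandbox']['progress'])) {
    // First pass: set up progress tracking and write the CSV header row.
    $context['sandbox']['progress'] = 0;
    $context['sandbox']['max'] = db_query(
      'SELECT COUNT(*) FROM {subscribers} WHERE list_id = :id',
      array(':id' => $list_id))->fetchField();
    $context['sandbox']['file'] = file_directory_temp() . '/subscribers_' . $list_id . '.csv';
    $handle = fopen($context['sandbox']['file'], 'w');
    fputcsv($handle, array('Name', 'Email'));
    fclose($handle);
  }

  // Append the next slice of rows to the file.
  $limit = 500;
  $result = db_query_range(
    'SELECT name, mail FROM {subscribers} WHERE list_id = :id ORDER BY sid',
    $context['sandbox']['progress'], $limit, array(':id' => $list_id));
  $handle = fopen($context['sandbox']['file'], 'a');
  foreach ($result as $row) {
    fputcsv($handle, array($row->name, $row->mail));
    $context['sandbox']['progress']++;
  }
  fclose($handle);

  // Tell Batch API how far along we are; the progress bar and message on the
  // batch page come straight from these values.
  $context['message'] = t('Exported @done of @total subscribers.', array(
    '@done' => $context['sandbox']['progress'],
    '@total' => $context['sandbox']['max'],
  ));
  $context['finished'] = empty($context['sandbox']['max'])
    ? 1
    : $context['sandbox']['progress'] / $context['sandbox']['max'];
  if ($context['finished'] >= 1) {
    // Hand the file path to the 'finished' callback.
    $context['results']['file'] = $context['sandbox']['file'];
  }
}

Because each pass only touches a few hundred rows, no single request comes anywhere near the PHP time limit, no matter how big the list is.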

Before I get started, I want to mention that, for many people, something like Views Data Export (for getting a ton of data out of a View) or Node Export (specifically for exporting nodes) might be exactly what you need, and save you a few hours' time working with Batch API. However, since my particular circumstance ruled out Views, and since I was exporting a bit more customized data than just nodes or users, I needed to write my own batch export functionality.
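To round out the sketch above, here's one way the batch's 'finished' callback and the download page might hand the file back to the user; the session hand-off and the menu callback are assumptions about a reasonable approach, not a description of flocknote's implementation.

<?php
function mymodule_subscriber_export_finished($success, $results, $operations) {
  if ($success && isset($results['file'])) {
    // Remember where the finished CSV lives so the download page can find it.
    $_SESSION['mymodule_export_file'] = $results['file'];
    drupal_set_message(t('Your export is ready for download.'));
  }
  else {
    drupal_set_message(t('The export did not finish. Please try again.'), 'error');
  }
}

// Menu callback for the batch_process() redirect: stream the CSV to the browser.
function mymodule_subscriber_export_download($list_id) {
  $file = isset($_SESSION['mymodule_export_file']) ? $_SESSION['mymodule_export_file'] : NULL;
  if ($file && file_exists($file)) {
    drupal_add_http_header('Content-Type', 'text/csv');
    drupal_add_http_header('Content-Disposition', 'attachment; filename="subscribers.csv"');
    readfile($file);
    drupal_exit();
  }
  return MENU_NOT_FOUND;
}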