Blog

How to join multiple MP4 files from a GoPro with ffmpeg

I recently shot some footage with a GoPro, and realized after the fact that the GoPro 'chapters' the footage into files of about 4 GB each, so I ended up with a number of 4 GB files instead of one larger file. There are various reasons for this, but in the end, I really wanted one long file so it would be easier to synchronize with footage from another camera and my audio recorder.

So I found this answer on Stack Overflow, which had exactly the commands I needed:

ffmpeg -i 1.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate1.ts
ffmpeg -i 2.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate2.ts
ffmpeg -i 3.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate3.ts
ffmpeg -i 4.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate4.ts
ffmpeg -i "concat:intermediate1.ts|intermediate2.ts|intermediate3.ts|intermediate4.ts" -c copy -bsf:a aac_adtstoasc output.mp4

Note: If you use the 'High Efficiency' (HEVC) encoder for your GoPro videos, change h264_mp4toannexb to hevc_mp4toannexb in the above commands.
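
If you have more than a few chapter files, a small shell loop saves some typing. This is only a sketch: it assumes the chapters sort into the correct order by filename and are all H.264 (swap in hevc_mp4toannexb for HEVC footage):

# Convert each chapter to an MPEG-TS intermediate...
for f in *.mp4; do
  ffmpeg -i "$f" -c copy -bsf:v h264_mp4toannexb -f mpegts "${f%.mp4}.ts"
done
# ...then join all the intermediates back into a single MP4.
inputs=$(printf '%s|' *.ts); inputs=${inputs%|}
ffmpeg -i "concat:$inputs" -c copy -bsf:a aac_adtstoasc output.mp4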

Ignore noisy logs with fluentd in EKS or other Kubernetes clusters

Recently, I decided to use the fluentd-kubernetes-daemonset project to easily ship all logs from an EKS Kubernetes cluster in Amazon to an Elasticsearch cluster operating elsewhere.

The initial configuration worked great out of the box: just fill in details like the FLUENT_ELASTICSEARCH_HOST and any authentication info, then deploy the RBAC rules and DaemonSet into your cluster, and you're off to the races (assuming your Elasticsearch instance is configured to allow access from the cluster!).

But once I did that, I noticed the brand new EKS cluster was sending over 16,000 log messages per second to Elasticsearch. Doing a tiny bit of analysis (not much was required, honestly), I found that over 98% of the logs were coming from two noisy EKS-specific containers, efs-csi-node and ebs-snapshot-controller.
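
The obvious fix is to drop those logs at the source, before fluentd ships them anywhere. Here's a minimal sketch of a grep filter that does this; it assumes the kubernetes metadata filter has already added each record's container name (as the fluentd-kubernetes-daemonset images do):

<filter kubernetes.**>
  @type grep
  <exclude>
    # Drop any record whose container name is one of the noisy offenders.
    key $.kubernetes.container_name
    pattern /^(efs-csi-node|ebs-snapshot-controller)$/
  </exclude>
</filter>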

Pi Day 2021 - Livestream of 16 drives on a Raspberry Pi (2nd attempt)

For Pi Day, I'm going to livestream my second attempt at getting 16 hard drives (well, 12 hard drives and 4 SSDs) recognized by a Raspberry Pi.

The first attempt went decently well... but I wound up running into power supply issues.

This time around, I will hopefully have those issues solved, and we may also have a little fun building software RAID on top of hardware RAID (depending on how crazy we want to get). It probably won't work like I expect, but that's what makes it fun!
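
If you're wondering what software RAID on top of hardware RAID even means: each hardware RAID card presents its whole array to Linux as a single block device, and you can then stripe a Linux md array across those devices. A rough sketch, with hypothetical device names:

# Stripe an md RAID 0 across the devices the hardware RAID cards expose
# (the device names here are hypothetical).
sudo mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/sda /dev/sdb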

M.2 on a Raspberry Pi - the TOFU Compute Module 4 Carrier Board

Ever since the Pi 2 model B went to a 4-core processor, disk IO has often been the primary bottleneck for my Pi projects.

You can use microSD cards, which aren't horrible, but... well, never mind, they're pretty bad as a primary disk. Or you can plug in a USB 3.0 SSD and get decent speed, but you end up with a cabling mess, and you sacrifice bandwidth and add latency going through a USB-to-SATA or USB-to-NVMe adapter.

The Pi 4 actually has a PCI Express gen 2.0 x1 lane, but on the model B that bus is occupied by the USB 3.0 controller chip. The Compute Module 4, however, doesn't presume anything; it exposes the PCIe lane directly to whatever board it plugs into.

TOFU board by Oratek - Raspberry Pi Compute Module 4 Carrier with M.2 slot

And in the case of Oratek's TOFU, it's exposed through an M.2 slot, making this board the first one I've used that can accept native NVMe storage, directly under the Pi.
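
Once the CM4 boots with an NVMe drive in that M.2 slot, a couple of quick commands confirm the drive is on the PCIe bus and visible to the kernel (assuming it enumerates as /dev/nvme0n1, the usual name for the first NVMe device):

# The NVMe controller should show up as a PCIe device...
lspci | grep -i nvme
# ...and the kernel should expose it as a block device.
lsblk /dev/nvme0n1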

Launched: Red Shirt Jeff merch store

It makes me throw up a little in my mouth to say this, because it's such a YouTuber thing to do... but I now have an official merch store to go along with my YouTube channel:

Red Shirt Jeff store - launch products

The Red Shirt Jeff store has original designs that I find interesting or funny, with three shirt designs available at launch.

Hardware RAID on the Raspberry Pi CM4

A few months ago, I posted a video titled Enterprise SAS RAID on the Raspberry Pi... but I never actually showed a SAS drive in it. And soon after, I posted another video, The Fastest SATA RAID on a Raspberry Pi.

Broadcom MegaRAID SAS storage controller HBA with HP 10K drives and Raspberry Pi Compute Module 4

Well now I have actual enterprise SAS drives running on a hardware RAID controller on a Raspberry Pi, and it's faster than the 'fastest' SATA RAID array I set up in that other video.

A Broadcom engineer named Josh watched my earlier videos and realized the ancient LSI card I was testing would likely not work with the ARM processor in the Pi, so he sent two pieces of kit my way.