Extension:TimedMediaHandler/Test media

See copy on testwiki at https://test.wikipedia.org/wiki/TimedMediaHandler/Test_media

I've made a couple of high-resolution sample video clips of a bubbling fountain, which may be freely used for testing TimedMediaHandler's ingest, transcoding, and playback. The scripts for making the various formats and sizes from the source clips will be checked into git shortly; because the media files themselves are large, they're being placed on-wiki here to avoid cluttering up the git repos. --Brooke Vibber (WMF) (talk) 18:22, 17 August 2023 (UTC)

Source clips

Two 20-second video clips, one at 1920x1080 59.94 fps HDR and the other at 3840x2160 29.97 fps HDR. (These are the highest-resolution options available on my camera at each frame rate.)

Currently our transcoding doesn't pass HDR settings through to the output, so it ends up in the wrong colorspace with no mapping applied. This is fixable in TMH but needs some fancy ffmpeg calls, and isn't finished yet.
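
For illustration only, one common way to handle the mapping in ffmpeg is a zscale + tonemap filter chain that converts an HDR10 (BT.2020 / PQ) source to SDR BT.709 before encoding. This is a sketch, not TMH's actual pipeline; the filenames, bitrate, and tone-mapping curve are placeholders.

  # Sketch: tone-map an HDR10 (BT.2020 / PQ) clip to SDR BT.709 before VP9 encoding.
  # Filenames, bitrate, and the tone-mapping curve are illustrative placeholders.
  ffmpeg -i fountain-2160p30-hdr.mp4 \
    -vf "zscale=transfer=linear:npl=100,tonemap=hable:desat=0,zscale=primaries=bt709:transfer=bt709:matrix=bt709,format=yuv420p" \
    -c:v libvpx-vp9 -b:v 5M -row-mt 1 \
    -c:a libopus \
    fountain-2160p30-sdr.webm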

Derived clips

Automation

Given the two source files, all of the derived files can be regenerated with a Makefile that wraps an ffmpeg command for each output file. This will be checked into source control soon.
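
As a rough sketch of what such a Makefile might look like (the filenames, target list, and encoder settings below are placeholders, not the real scripts):

  # Hypothetical sketch of a Makefile that regenerates derived clips from the sources.
  # Real target names and ffmpeg settings will live in the checked-in scripts.
  # Note: make recipe lines must be indented with a tab.
  # (HDR handling is omitted here; see the tone-mapping sketch above.)
  DERIVED := fountain-720p60.vp9.webm fountain-480p30.vp9.webm

  all: $(DERIVED)

  fountain-720p60.vp9.webm: fountain-1080p60-hdr.mp4
  	ffmpeg -y -i $< -vf scale=-2:720 -c:v libvpx-vp9 -b:v 1500k -c:a libopus $@

  fountain-480p30.vp9.webm: fountain-2160p30-hdr.mp4
  	ffmpeg -y -i $< -vf scale=-2:480 -c:v libvpx-vp9 -b:v 700k -c:a libopus $@

  clean:
  	rm -f $(DERIVED)

  .PHONY: all clean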

Currently there's no automation for the upload process or for fetching the files for local testing. This may be added, but preserving the artifacts and their creation scripts is the most important thing!

Production testing

The files here on mediawiki.org can serve as canonical examples of the test files in the production environment. They will also be mirrored to test.wikipedia.org, which sometimes runs an "advanced configuration" with features not yet enabled live on the other wikis; we'll be using this for some more serious checking for regressions and edge cases in the iOS-compatible HLS media generation.

See User:Brion Vibber (WMF)/Mobile video playback 2023

Things to check with the new transcode settings

General updates will go live everywhere once the patch merges and deploys, per the usual staging. They will land on test first, though, so check the mirrors over there.

  • interlaced MPEG-2 files should now output one frame per field, preserving temporal resolution: so 480i30 MPEG-2 input -> 480p60, 360p60, 240p60 VP9 output (see the deinterlacing sketch after this list)
  • bitrates should be more consistent and scaled according to frame rate: so 24 fps videos may use lower bandwidth than 30 fps or 60 fps videos
  • high-frame-rate media should convert reasonably
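
The field-rate deinterlacing in the first bullet corresponds roughly to ffmpeg's bwdif (or yadif) filter in send_field mode, which emits one progressive frame per input field. The command below is a sketch only; the filenames and encoder settings are placeholders, and TMH's actual settings may differ.

  # Sketch: deinterlace 480i30 (29.97 fps interlaced) MPEG-2 to 480p60 (59.94 fps)
  # by emitting one output frame per field, then encode to VP9.
  # Filenames and bitrate are illustrative placeholders.
  ffmpeg -i input-480i30.mpg \
    -vf "bwdif=mode=send_field" \
    -c:v libvpx-vp9 -b:v 1024k \
    -c:a libopus \
    output-480p60.vp9.webm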

Things to check on the new HLS output

(Expected to go live in October 2023!)

This will go live on test before it's enabled elsewhere, to shake out any remaining bugs before we throw people's random iPhones at it.

  • most newer iPhones (check version requirement) will see the full range of VP9 video resolutions, auto-selected by the HLS player (see the playlist sketch after this list)
  • some older iPhones back to iOS 12 (double-check this) will see a low-res Motion-JPEG video instead, because they don't grok VP9 and we haven't changed our policy on H.264 yet
  • audio tracks should "just work", but iOS 17 will use Opus while iOS 16 and earlier will use MP3
    • confirm nothing breaks if you attempt to switch audio tracks or anything like that...
  • there should be no surprises like double-playback
  • Picture-in-Picture and AirPlay should work as expected, with both VP9 and Motion-JPEG (though the latter will be very blurry on AirPlay!)
    • note AirPlay is not enabled yet, but we can turn it on later (it needs room in the UI!)
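
For reference, the resolution auto-selection in the first bullet is driven by the HLS master playlist. The sketch below is a hypothetical illustration of how such a playlist can list VP9 variants alongside an audio rendition group and a low-resolution fallback; it is not TMH's actual output, and the filenames, bandwidth figures, and codec strings are placeholders.

  #EXTM3U
  #EXT-X-VERSION:7

  # Hypothetical audio rendition group; a real playlist would offer Opus or MP3
  # renditions depending on what the client supports.
  #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="Audio",DEFAULT=YES,AUTOSELECT=YES,URI="audio.m3u8"

  # VP9 video variants; the player auto-selects among these by bandwidth and resolution.
  # Audio codec strings are omitted from CODECS for brevity.
  #EXT-X-STREAM-INF:BANDWIDTH=4000000,RESOLUTION=1920x1080,CODECS="vp09.00.41.08",AUDIO="audio"
  1080p.vp9.m3u8
  #EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720,CODECS="vp09.00.31.08",AUDIO="audio"
  720p.vp9.m3u8
  #EXT-X-STREAM-INF:BANDWIDTH=500000,RESOLUTION=640x360,CODECS="vp09.00.30.08",AUDIO="audio"
  360p.vp9.m3u8

  # Placeholder low-resolution fallback variant for older devices (e.g. the
  # Motion-JPEG track mentioned above); CODECS omitted since it is a placeholder.
  #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=426x240,AUDIO="audio"
  240p.fallback.m3u8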