In my last blog post about live video controls, I posted some sketches of a prototype that could store some live video in a buffer for playback. Thanks to everyone for the feedback, all of which was useful. I did want to draw attention to one point that James Heaver touched on: different uses of live video benefit from different ways of displaying the current time. James uses the example of watching a live football game, where knowing what quarter the game is in is more useful than knowing the actual time. In other cases, knowing the actual time (“4:30pm”) rather than the relative time (“you’ve watched for 4 minutes”) is more useful. Here are some examples of uses for live video that could require different time displays:
As we develop the video controls, allowing developers the flexibility to decide which time display and/or labels suit their content will be important. Some video players today allow toggling between relative and absolute time by clicking the timestamp: certainly an easy way to support both, if not a very discoverable one. We may find there are other ways to improve usability for high-traffic events, such as sports games or shuttle launches, by storing buffered video remotely rather than having each user buffer it individually. Gerv points out that dynamically degrading video over time can allow more content to be buffered, and Faaborg notes that there are instances where the user may want to save as much video as possible: two excellent points, which stress that making the video tag open and adaptable to the many kinds of content it can display is a primary objective.
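To make the click-to-toggle idea concrete, here is a minimal sketch of what such a control might do under the hood. All names here are illustrative, not part of any real player's API: it simply formats the same moment either as absolute clock time or as relative elapsed time, and flips between the two modes on each click of the timestamp.

```typescript
// Hypothetical sketch of a timestamp that toggles between two displays.
type TimeDisplay = "absolute" | "relative";

function formatTimestamp(
  startedAt: Date, // when the viewer began watching
  now: Date,       // the current moment
  mode: TimeDisplay
): string {
  if (mode === "absolute") {
    // Absolute clock time, e.g. "4:30pm"
    const h = now.getHours();
    const m = now.getMinutes().toString().padStart(2, "0");
    const suffix = h >= 12 ? "pm" : "am";
    const hour12 = h % 12 === 0 ? 12 : h % 12;
    return `${hour12}:${m}${suffix}`;
  }
  // Relative elapsed time, e.g. "you've watched for 4 minutes"
  const minutes = Math.floor((now.getTime() - startedAt.getTime()) / 60000);
  return `you've watched for ${minutes} minute${minutes === 1 ? "" : "s"}`;
}

// A click handler on the timestamp would simply flip the mode:
function toggleDisplay(mode: TimeDisplay): TimeDisplay {
  return mode === "absolute" ? "relative" : "absolute";
}
```

A real control would also need to handle labels chosen by the content author (“2nd quarter”, “T+00:04:00”), which is exactly why leaving the display format open to developers matters.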