Wikipedia video gets boost with $100,000 Mozilla grant

Development to be integrated into open source MediaWiki package
The proliferation of standards-based video sharing and collaboration is set to take off with a $US100,000 grant from the Mozilla Foundation to fund the development of the Ogg Theora video codec and server-side streaming software.

Wikimedia developer Michael Dale announced the sponsorship during a presentation on Wikipedia's video content initiatives at this year's conference in Hobart.

The $100,000 grant funds a six-month project covering Ogg Theora encoder enhancements, improvements to network seeking, and client and server libraries that will end up in Firefox and MediaWiki.

"Once HTML5 was developed we noticed more work could be done to harden the backend libraries," Dale said. "The Xiph libraries for Firefox video playback are largely a volunteer effort. So we got people working on it full-time and are doing enhancements to the encoder to bring Theora in line with contemporary codecs."

Dale said Wikipedia will dedicate staff resources to integrating the components, which will end up as a module for the open source MediaWiki wiki application.

These enhancements to MediaWiki should be available some time around May this year and will allow organizations to run their own collaborative video server.

Aussie open source software to enable temporal media

Annodex, the software being used to power Wikipedia's collaborative video sharing, has its origins at the CSIRO.

Australian Annodex developer Conrad Parker will spend one day a week working on the server-side seeking support to improve the speed of doing network seeking as a result of the Mozilla funding.

"I'll be improving network seeking in general," Parker said, adding that he will collaborate with the W3C Media Fragments working group to help develop the open standard.

"I’ll be implementing the server-side stuff of the Annodex spec we started a few years ago. The text annotation work is being worked on separately."

Annodex -- which allows video to be "tagged" with time-based metadata -- is network transparent when served, so a URL can define a point in time within a particular video and allow people to "surf video" through a Web browser in the same way they would surf the text Web.

"There has been a lot of collaboration in the free software community to get this working," Parker said. "You can add a query parameter on the end of a URL to play a specific time segment. You can put the same URL into Xine and can stream it to a handset or anything."
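The query-parameter convention Parker describes can be sketched roughly as follows. This is an illustrative assumption, not the Annodex implementation: the `t=start/end` parameter shape here is modelled on Annodex-style temporal URIs, and the later W3C Media Fragments draft settled on a `#t=start,end` fragment instead.

```python
# Illustrative sketch of URL-based time segments (assumed syntax, not
# the actual Annodex code): a "t" query parameter naming a start/end
# range in seconds, which any cooperating client or server can honour.
from urllib.parse import urlparse, parse_qs

def time_segment_url(base, start, end):
    """Append a start/end time segment to a video URL."""
    return f"{base}?t={start}/{end}"

def parse_time_segment(url):
    """Recover the (start, end) seconds from a temporal URL."""
    qs = parse_qs(urlparse(url).query)
    start, end = qs["t"][0].split("/")
    return float(start), float(end)

url = time_segment_url("http://example.org/match.ogv", 10, 20)
# The same URL can then be handed to a browser, to a player such as
# Xine, or to a server that streams only the requested segment.
```

Because the segment lives in the URL itself, the addressing scheme is client-agnostic, which is what makes "surfing video" with ordinary hyperlinks possible.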

Additionally, videos can be tagged with links and other information that ties into Wikipedia's massive content repository. Have a video of a tennis match? It could be tagged with information and statistics about tennis in the same Web page.

Parker said the user-interface service is a Web application and can be completely separate from the video service.

Dale said open video is important because it will make video as easy to view as text and images are today.

"So why do it? Free software needs to be on a par with proprietary systems. In the latest versions of Ubuntu it is one click away from importing software to play proprietary formats."

While Dale concedes the big problem is adoption, with more than 100,000 videos being transcoded to the Ogg format and Wikimedia also supporting open formats, the game could change quickly.

"A cool tool is Firefogg, which does in-browser transcoding and uploading of video, so we can avoid videos looking crappy," Dale said.

Wikipedia is also working on a Web-based GUI developed in JavaScript for video editing that will make video content sharing much easier.

"The platform is the Web so we will see in-browser video editors dominate casual video content distribution. And it will be easy for open source desktop apps that integrate into Web applications."

Wikipedia's collaborative video sequencer application will bring collaborative video to wikis and aims to support basic editing, effects and transitions, multi-track audio and video controls, wiki-driven templates and overlays, sequence transclusion, and APIs for desktop video editors to participate in collaborative video.

"It brings a semantic query system into temporal media," Dale said, adding such queries could be RSS feeds, for example.