We all know from our daily online experience that media content is on the rise. Digital photography and DIY videos are part of our information diet: more than 71% of online adults use video-sharing sites on a regular basis, and 46% of Internet users post original photos and videos.
While information grows, we’re gearing up with our sister company Insideout10 to set up a new framework for the semantic analysis of media content such as photos, videos and audio files. Read more on the Insideout10 Blog.
It always feels great when the testing team of A1 (Austria’s #1 telecom operator) completes the weekly session and we can bring our latest software release of the Helix Cloud platform into production.
This time we added support for Simulated Live Broadcasting, which can now be configured using an easy-to-use Web UI or via RESTful APIs.
We also started testing the latest release of Helix Server v. 15, which adds support for MPEG-DASH.
Thanks to @ziodave for his amazing work on this one and to the A1 Web Streaming Team (Ludwig, Arnold and Michael in the photo)!
It has been a long time since our team (working back then for the third largest ISP in Italy, in collaboration with a theatrical production) sent the first multimedia packets live over the Internet, bundling a live audio feed (using RealAudio 1.0) with a sequence of images pushed every second via Meta refresh (a legacy method that instructs the browser to automatically reload the current web page after a given time interval). And I’m not here to reminisce about the beauty of those times: yes, we were all very excited that it was possible to build a TV on the web, but there was no audience for most of our performances.
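For anyone who never met the Meta refresh trick, here is a minimal sketch of how such an image slideshow could be served – written in Python purely for illustration, obviously not the original setup, and with hypothetical file names:

```python
# Illustrative sketch of the Meta-refresh slideshow: serve an HTML page that
# reloads itself every second and always shows the most recent JPEG snapshot.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<html>
  <head><meta http-equiv="refresh" content="1"></head>
  <body><img src="/latest.jpg"></body>
</html>"""

class SlideHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/latest.jpg":
            # In a real setup a fresh snapshot would be written to disk every second.
            try:
                with open("latest.jpg", "rb") as f:
                    data = f.read()
            except FileNotFoundError:
                self.send_error(404)
                return
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.end_headers()
            self.wfile.write(data)
        else:
            # Every other request gets the same tiny page; the meta tag makes the
            # browser fetch it again (and thus a fresh image) each second.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("", 8080), SlideHandler).serve_forever()
```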
In the 16 years since the RagDoll experiment (RagDoll was the name of the theatrical production) we’ve seen a wave of technological enhancements in Streaming Media, driven primarily by the idea of a world of purely Internet-based, on-demand and live TV productions that would completely replace traditional broadcasting. It didn’t happen: TV broadcasting also went digital, and Streaming Media remained focused on delivering audio and video to the PC (using Flash Video – people still kept asking “do you think people really watch video over the Internet?”), eventually moved to the mobile space (using RealVideo and 3GPP – and I still remember how great it was to help Pakistani people watch cricket games on their phones when the TV was off because of continuous power outages), and finally arrived on smartphones and tablets of all kinds (along with a crucial question: “is an iPad closer to a large phone or to a TV?”).
Get familiar with MPEG-DASH and be ready for the transition – keep this standard in mind when planning your new publishing/encoding projects.
The rise of the App World: Mobile Video goes mainstream
Enabling video on touch devices was a big hit (and still is – 66% of mobile data in 2014 is expected to be video, according to QuantCast). It has had an impact on the telco business, the broadcasting industry and, last but not least, the TV in our living room (better, in yours: since the RagDoll experiment I have only used a PC and a projector). From “Connected” the TV became “Smart”, and now apps are flowing in just like on all other screens.
Fragmentation vs. Standards
From a technology point of view, the Streaming Media world has long been characterised by strong fragmentation of protocols, video codecs and, of course, devices – but things did change with the introduction of technologies such as HTML5, H.264 and HTTP Live Streaming (HLS). A wave of standardisation is on the way, and this is where MPEG-DASH (the Dynamic Adaptive Streaming over HTTP standard) comes into play.
MPEG-DASH and the competing HTTP streaming technologies
The pace of industry change seems to be increasing. Over the next few years we will see dramatic shifts in how viewers find and watch movies and TV shows, and in how video publishers distribute their programming. Eventually Streaming Media and Digital Broadcasting will blend completely, but this can only happen if standardisation moves forward. As of today there are three competing HTTP adaptive bitrate streaming technologies (they all share the same basic idea, sketched right after the list):
Apple’s HTTP Live Streaming (HLS)
Adobe’s HTTP Dynamic Streaming (HDS)
Microsoft’s HTTP Smooth Streaming (HSS)
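All three – and MPEG-DASH itself – rely on the same pattern: the content is encoded at several bitrates, cut into short HTTP-delivered segments, and the client keeps choosing the highest quality its measured throughput can sustain. Here is a minimal, purely illustrative sketch of that selection logic in Python (the bitrate ladder and the safety margin are made-up values, not taken from any of these products):

```python
# Illustrative sketch of the core idea behind HTTP adaptive bitrate streaming:
# for each segment, pick the highest-bitrate rendition the measured
# throughput can sustain.

RENDITIONS_BPS = [400_000, 800_000, 1_500_000, 3_000_000, 6_000_000]
SAFETY_MARGIN = 0.8  # only budget 80% of the measured throughput

def pick_rendition(measured_throughput_bps: float) -> int:
    """Return the highest rendition bitrate that fits under the throughput budget."""
    budget = measured_throughput_bps * SAFETY_MARGIN
    affordable = [b for b in RENDITIONS_BPS if b <= budget]
    return max(affordable) if affordable else min(RENDITIONS_BPS)

# Example: a client that just measured ~2.2 Mbit/s downloads the 1.5 Mbit/s
# rendition for the next segment; if the network improves, it switches up.
print(pick_rendition(2_200_000))  # -> 1500000
```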
RealCloud – our Video Cloud solution for MPEG-DASH presented at IBC2012
Why MPEG-DASH is so important
MPEG-DASH represents a brilliant attempt at a unified standard that can move the whole industry a step forward: imagine if connected televisions, TV set-top boxes, Smart TVs, desktop computers, smartphones and tablets could all share the same delivery (and content protection) infrastructure; what if we could use the same standard to quickly enable premium, pay-based content and multi-screen experiences? A clear pattern is emerging, consumer expectations are high and fragmentation is a huge burden. As of today the founding members of DASH-IF include, among others, Akamai, Ericsson, Microsoft, Netflix, Qualcomm, Samsung, Adobe Systems, Cisco, Dolby Labs, DTS, Envivio, Espial Group, European Broadcast Union, Fraunhofer IIS, Harmonic, Huawei Technologies, Intel Corp, Irdeto, Nagravision, RealNetworks, Verimatrix and Wowza Media Systems… one notable missing name remains Apple, determined to keep pushing for its own HLS.
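To give a concrete flavour of what sharing the same infrastructure means: every DASH deployment is described by a single XML manifest, the MPD (Media Presentation Description), that any compliant player can read. Below is a minimal, purely illustrative sketch that lists the renditions advertised in an MPD; the file name is hypothetical:

```python
# Illustrative only: list the renditions advertised in a DASH MPD
# (Media Presentation Description). "example.mpd" is a hypothetical file.
import xml.etree.ElementTree as ET

NS = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}  # standard MPD namespace

def list_renditions(mpd_path: str) -> list[tuple[str, int]]:
    """Return (codecs, bandwidth) pairs for every Representation in the MPD."""
    root = ET.parse(mpd_path).getroot()
    renditions = []
    for rep in root.findall(".//mpd:Representation", NS):
        codecs = rep.get("codecs", "unknown")   # codecs may also sit on the AdaptationSet
        bandwidth = int(rep.get("bandwidth", "0"))
        renditions.append((codecs, bandwidth))
    return renditions

if __name__ == "__main__":
    for codecs, bandwidth in list_renditions("example.mpd"):
        print(f"{codecs}: {bandwidth / 1000:.0f} kbit/s")
```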
How we can help you deliver your content using MPEG-DASH
If you’re interested in understanding how Helix Universal Media Server and the middleware we presented at IBC this year can help you introduce MPEG-DASH into your delivery infrastructure, send us an email at sales@interactegypt.me or get in touch with our Cairo office.
This year at IBC in Amsterdam we’ve been looking at “Second Screen” technologies – “Second Screen” is the use of an additional device (e.g. a tablet or smartphone) while watching television. It allows the audience to interact with what they’re consuming, whether it’s a TV show, video game or movie.