
Syncing Audio, Video, and Animations, Part 1

Sync storyboards and animations via markers to create dynamic content

Media is becoming more and more pervasive, with websites such as YouTube and Vimeo making it possible to see events before the news channels even air them. It’s pretty amazing if you think about it, especially if you consider where online audio and video were just 10 years ago. With the increase in media distribution through the web, there’s a huge opportunity to develop applications that capture and display media using Silverlight. Silverlight’s great for line of business (LOB) applications as well, and I spend the majority of my time building these types of applications for customers. Silverlight also provides a nice set of rich media features and can even display HD video, as demonstrated by NBC’s Sunday Night Football and the 2010 Winter Olympics.

At this point, some of you may be thinking that you really don’t need media in your applications—that may certainly be true. However, I’d argue that most applications can benefit from integrated video tutorials (at a minimum) that show users how to use specific features and hopefully reduce Help desk calls. Sprinkle in a few animations that demo a particular feature or direct the user toward a menu and you can take your application to the next level.

Adding Media into an Application
Silverlight applications requiring audio or video integration can place a MediaElement in a XAML file, point the MediaElement’s Source to the media file’s location, and then call Play() on the MediaElement to play it at a specific time. Here’s an example of a basic MediaElement pointing to a video named ImportDemo.wmv:

<MediaElement x:Name="ImportDemoVideo" Source="ImportDemo.wmv" />
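
Starting playback at a particular moment is then just a matter of calling Play() from code-behind. The handler below is a minimal sketch: the button it’s wired to is hypothetical, and it assumes AutoPlay has been set to False on the MediaElement so the video doesn’t start on its own.

private void PlayVideoButton_Click(object sender, RoutedEventArgs e)
{
    // Start the video whenever the application decides it's time (here, in
    // response to a hypothetical button click). Assumes AutoPlay="False" on
    // ImportDemoVideo so playback only begins when Play() is called.
    ImportDemoVideo.Play();
}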

While displaying media is fairly straightforward, things get a little tricky if you want to sync audio or video with storyboard animations, since media has to be started by calling the MediaElement’s Play() method. If you need to start media when a specific animation starts or ends, you’ll quickly discover that the job is more challenging than it first appears.

My company ran into this while developing a Silverlight marketing application for Intel that had hundreds of animations along with numerous audio and video files that had to start at specific times. At first glance it seemed like the requested features would be fairly simple to implement. However, after gathering more requirements we found that syncing all of the media assets was more challenging than we anticipated.

In this article, I’ll walk through some of the fundamentals of storyboards and animations and describe a key technique for syncing audio, video, and animations using something called markers. I’ll also demonstrate how Expression Encoder can be used to create marker XML files quite easily. Part two of this series will cover more advanced features, such as syncing through attached properties and behaviors. My talk at MIX10 titled “Syncing Audio, Video and Animations” should be available online soon, so be sure to check back.

Storyboards and Animations
Silverlight provides several different types of animations that can be placed inside of a storyboard to animate opacity, position, color, transforms, and more. You can use DoubleAnimation, ColorAnimation, PointAnimation, and different key frame animations. Although I won’t provide a complete discussion of storyboards and animations here, Figure 1 shows an example of using them inside a storyboard named SyncStoryboard.

This code changes the opacity of several border elements at specific times by using LinearDoubleKeyFrames, each defining a KeyTime. Each border element is filled with a VideoBrush to display media (a border is used to add nice rounded corners around the video). The VideoBrush’s SourceName property points to a MediaElement that handles loading the media, as Figure 2 shows.
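
Although Figure 1 isn’t reproduced here, the same kind of key-frame animation can be sketched in code. In the following sketch, VideoBorder is assumed to be a Border defined in XAML whose Background is a VideoBrush with SourceName pointing at a MediaElement (the Figure 2 setup), and the key time and opacity values are placeholders rather than the actual values from Figure 1.

// Fade in a Border (and the video it displays through its VideoBrush
// background) by animating Opacity with a linear key frame.
DoubleAnimationUsingKeyFrames fadeIn = new DoubleAnimationUsingKeyFrames();
fadeIn.KeyFrames.Add(new LinearDoubleKeyFrame
{
    KeyTime = KeyTime.FromTimeSpan(TimeSpan.FromSeconds(2)),  // placeholder key time
    Value = 1.0                                               // fully opaque
});
Storyboard.SetTarget(fadeIn, VideoBorder);
Storyboard.SetTargetProperty(fadeIn, new PropertyPath("Opacity"));

Storyboard syncStoryboard = new Storyboard();
syncStoryboard.Children.Add(fadeIn);
syncStoryboard.Begin();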

As each storyboard animation fires, it’d be nice to handle an event that could be used to call Play() on the associated MediaElement. Unfortunately, the Storyboard class exposes only a Completed event, so there’s no direct way to know when a particular border’s opacity is being changed. Although the Storyboard class doesn’t provide an AnimationStarted event, there are different workarounds that can be used, as you’ll see later in this article and in part two of this series.

In addition to defining storyboards in XAML, you can also create them programmatically, which is useful in cases where animations may be defined in an external file, database, or created by an end user at runtime. Figure 3 shows an example of creating dynamic storyboard animations. The code animates the height and width properties of a media container control named LightBoxControl. After the DoubleAnimation objects are created and added to the storyboard’s Children collection, the storyboard object’s Begin() method is called to start things in motion.
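
Figure 3 isn’t reproduced here, but the general shape of that code looks something like the sketch below. The target sizes and duration are placeholder values, and LightBoxControl is assumed to be a named element in the page’s XAML.

// Build two DoubleAnimations that resize LightBoxControl, add them to a
// storyboard, and start the storyboard.
DoubleAnimation widthAnimation = new DoubleAnimation
{
    From = 0,
    To = 640,   // placeholder target width
    Duration = new Duration(TimeSpan.FromSeconds(1))
};
Storyboard.SetTarget(widthAnimation, LightBoxControl);
Storyboard.SetTargetProperty(widthAnimation, new PropertyPath("Width"));

DoubleAnimation heightAnimation = new DoubleAnimation
{
    From = 0,
    To = 480,   // placeholder target height
    Duration = new Duration(TimeSpan.FromSeconds(1))
};
Storyboard.SetTarget(heightAnimation, LightBoxControl);
Storyboard.SetTargetProperty(heightAnimation, new PropertyPath("Height"));

Storyboard lightBoxStoryboard = new Storyboard();
lightBoxStoryboard.Children.Add(widthAnimation);
lightBoxStoryboard.Children.Add(heightAnimation);
lightBoxStoryboard.Begin();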

Now that you’ve seen some of the fundamentals of working with storyboards in XAML and in code, let’s examine different options for syncing media and animations.


Media Syncing Options
There are several different ways to track when animations start in order to play an audio or video file at a specific time. First, you can define an empty storyboard with no duration and handle its Completed event. As the storyboard’s Begin() method is called, its Completed event will fire immediately, giving you a frame-by-frame timer of sorts. Once you handle the Completed event, you can call Begin() again and the cycle repeats itself. This technique was used with older versions of Silverlight, where timers such as DispatcherTimer weren’t available, so I won’t go into more details here.
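
For reference, a bare-bones sketch of that empty-storyboard timer looks something like this; the bookkeeping you’d do inside the handler to decide when to call Play() is omitted.

// A zero-duration storyboard whose Completed event acts as a repeating tick.
Storyboard frameTimer = new Storyboard
{
    Duration = new Duration(TimeSpan.Zero)
};
frameTimer.Completed += (sender, e) =>
{
    // Check how much time has elapsed and call Play() on a MediaElement
    // when the desired point is reached, then restart the storyboard to
    // get the next tick.
    frameTimer.Begin();
};
frameTimer.Begin();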

Another technique that can be used to sync media with animations is to use attached properties. By creating an attached property and attaching it to an element in your XAML, you can hook an animation to that property and track it. As the storyboard begins, the animation can change the value of the custom attached property, which can then raise an event that can be used to trigger other media to play. You can also take advantage of behaviors, which were introduced with Expression Blend 3, and create a sync framework that can attach media start and stop times directly to a storyboard. I’ll provide more details about attached properties and behaviors in part two of this series.
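
As a preview of part two, the skeleton of such an attached property might look something like the following sketch. The class and property names are made up for illustration and aren’t the framework covered in part two.

// A hypothetical attached property that a storyboard could animate (using a
// property path such as "(local:MediaSync.SyncValue)"). The change callback
// is where an event could be raised or Play() called on a MediaElement.
public static class MediaSync
{
    public static readonly DependencyProperty SyncValueProperty =
        DependencyProperty.RegisterAttached("SyncValue", typeof(double),
            typeof(MediaSync), new PropertyMetadata(0d, OnSyncValueChanged));

    public static double GetSyncValue(DependencyObject obj)
    {
        return (double)obj.GetValue(SyncValueProperty);
    }

    public static void SetSyncValue(DependencyObject obj, double value)
    {
        obj.SetValue(SyncValueProperty, value);
    }

    private static void OnSyncValueChanged(DependencyObject d,
        DependencyPropertyChangedEventArgs e)
    {
        // Fires as the animation changes SyncValue; raise an event here or
        // start the appropriate MediaElement.
    }
}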

The easiest option for syncing audio, video, and animations is to take advantage of markers and the media timeline provided by Expression Encoder. Markers provide a way to define sync points as an audio or video file plays. You can create markers programmatically when you’re building dynamic applications, or store them statically. For applications that have known media sync points, your best bet is to use Expression Encoder, since it can create an XML representation of the sync points you need quite easily.

Markers are most effective when you have an audio or video file that is playing the entire time or when you have multiple audio/video files all with associated animations. If your application is driven by animations (as opposed to media) and audio and video files need to be started or stopped based on the animation key frames, this technique isn’t quite as useful. It works great when used in the appropriate scenario, however, and my company has used the technique successfully in media-centric applications. Figure 4 shows an example of Expression Encoder in action with the Markers window open.

Before creating markers, you’ll first need to load an audio or video file into Expression Encoder. You can do this by clicking the Import button in the lower-left corner of the interface (see Figure 4). Once the media is loaded, you’ll see a timeline in the middle of the interface. Markers are added by right-clicking on the timeline and selecting Add Marker from the menu. Doing this will add a marker to the Markers window, as Figure 4 shows. The Markers window is part of the Metadata tab.

Each marker can have a time and value associated with it. The value is useful for knowing what associated animation may need to be triggered at the time defined by the marker. Once all of the markers have been added on the timeline, you can select the “gear” icon at the bottom of the Markers window and select Export from the menu, as Figure 5 shows.

Once the markers are exported, an XML file will be created that can be used in your Silverlight application. Figure 6 shows an example of a marker file created by Expression Encoder.

The marker XML file can be loaded programmatically in a Silverlight application, triggering an animation to start when an audio or video file plays. To load the XML file, you’ll first need to parse it and create a special type of object called a TimelineMarker. This object can store the text for the marker (Value in the XML file shown in Figure 6) and the time the marker should be triggered. Once the TimelineMarker object is created, it can be added into the MediaElement’s Markers collection. Finally, the MediaElement’s MarkerReached event can be wired to an event handler that handles playing another media file or storyboard as appropriate. Figure 7 shows how to put all of these steps together.
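
Figure 7 isn’t shown here, but the overall shape of that code is sketched below. The marker file name and the XML element and attribute names (Marker, Time, and Value) are assumptions, so adjust them to match the file Expression Encoder actually exports. The sketch also assumes the marker XML is included in the project with a Build Action of Content so it ships inside the XAP, and that AutoPlay is set to False on the MediaElement.

// Requires a reference to System.Xml.Linq. Namespaces used: System,
// System.Windows, System.Windows.Media, System.Windows.Resources, and
// System.Xml.Linq. Many samples call a method like this from the
// MediaElement's MediaOpened handler so the markers aren't lost if the
// Markers collection is reset when the media opens.
private void LoadMarkers()
{
    StreamResourceInfo resource = Application.GetResourceStream(
        new Uri("ImportDemo_Markers.xml", UriKind.Relative));  // assumed file name
    XDocument doc = XDocument.Load(resource.Stream);

    foreach (XElement markerElement in doc.Descendants("Marker"))
    {
        TimelineMarker marker = new TimelineMarker
        {
            Time = TimeSpan.Parse((string)markerElement.Attribute("Time")),
            Text = (string)markerElement.Attribute("Value")
        };
        MyMedia.Markers.Add(marker);
    }

    MyMedia.MarkerReached += MyMedia_MarkerReached;
    MyMedia.Play();  // assumes AutoPlay="False" in the XAML
}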

Looking at the MyMedia_MarkerReached event handler, you can see that the value of the marker (as defined in Expression Encoder) can be accessed through the Marker.Text property of the TimelineMarkerRoutedEventArgs object. Different storyboards or MediaElements can be started or stopped based upon the value found in the Text property.
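
A minimal handler along those lines might look like the following sketch. The marker values are illustrative, and SyncStoryboard and ImportDemoVideo are simply the storyboard and MediaElement names used earlier in the article.

private void MyMedia_MarkerReached(object sender, TimelineMarkerRoutedEventArgs e)
{
    switch (e.Marker.Text)
    {
        case "StartIntroAnimation":
            SyncStoryboard.Begin();    // kick off a storyboard defined in XAML
            break;
        case "PlayDemoVideo":
            ImportDemoVideo.Play();    // start another MediaElement
            break;
    }
}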

More on Storyboards Soon
There’s much more that can be covered on the topic of syncing audio, video, and animations. In this article you’ve been introduced to some basic storyboard and animation concepts and seen how Expression Encoder can be used to create a marker XML file. You’ve also seen how the marker file can be parsed and loaded so that a given MediaElement object has a defined set of TimelineMarker objects that can fire events. In part two, I’ll discuss additional strategies that can be used to sync media and animations, including attached properties and custom behaviors.

Dan Wahlin ([email protected]), a Microsoft MVP for ASP.NET and XML web services, founded the XML for ASP.NET Developers website and The Wahlin Group (www.TheWahlinGroup.com), which specializes in .NET, Silverlight, and SharePoint consulting and training solutions. 
