Silverlight offers a tremendous amount of potential when it comes to presenting data to users in compelling and unique ways. The animation engine allows different types of animations to be performed with objects, and fluid UI enhancements available in Silverlight 4 allow items being added into items controls—such as the ListBox—to be animated as well. If you have a creative idea in mind, chances are you can implement it in Silverlight.
While the animation functionality available in Silverlight is quite robust, things can become more difficult when animations need to be synced with audio or video. The storyboard in Expression Blend doesn’t allow media objects to be started, stopped, or paused directly, and the Storyboard object doesn’t provide a way to add MediaElement objects and associated start or stop times into it. You can certainly work around these limitations, but some code and knowledge of the framework is needed.
At this point, some of you who don’t deal with animations much may be saying, “I don’t need to know this stuff.” While I’ll focus on media and animations, the concepts covered in this article can be applied to several different situations. I recommend that you read on, especially if you’re interested in learning more about creating attached properties and how they can be used in Silverlight applications.
Using Markers to Sync Audio, Video, and Animations
In part 1 of this article series, I discussed one potential way to synchronize media with storyboard animations using Expression Encoder and markers. Although I won’t repeat everything from that article here, markers remain one of the easiest ways to synchronize media files with animations, so I’ll provide a quick review before showing some new ideas. If you’re already familiar with markers, feel free to skip to the next section.
Markers are points on a timeline that can be used to fire events at specific times. They can be created programmatically or loaded from XML data. Expression Encoder allows markers to be created visually on a media timeline, which simplifies the process. (See my previous article for more details.) Figure 1 shows an example of a marker file created using Expression Encoder.
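To give a sense of what such a file contains, here is a hypothetical marker file; the exact schema Expression Encoder produces may differ from this sketch, and the times and values shown are made up for illustration:

```xml
<!-- Hypothetical marker file; see Figure 1 for the actual output -->
<Markers>
  <Marker Time="00:00:05" Value="StartIntroStoryboard" />
  <Marker Time="00:00:20" Value="StopIntroAudio" />
</Markers>
```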
Figure 2 shows an example of parsing the XML file and adding each marker into the Markers collection of a MediaElement. Looking through the code, you’ll see that a callback event handler named MyMedia_MarkerReached is defined to handle marker events as they fire. Once the event fires, the marker value and time can be accessed and used to perform actions, such as starting a storyboard or MediaElement object.
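The general shape of that code looks something like the following sketch. The MediaElement name (MyMedia) and handler name come from the article; the file name, XML shape, and marker values are assumptions made for illustration:

```csharp
using System;
using System.Windows.Media;
using System.Xml.Linq;

// Load markers from an XML file into the MediaElement's Markers collection
void LoadMarkers()
{
    XDocument doc = XDocument.Load("Markers.xml"); // File name assumed
    foreach (XElement markerElem in doc.Descendants("Marker"))
    {
        MyMedia.Markers.Add(new TimelineMarker
        {
            Time = TimeSpan.Parse(markerElem.Attribute("Time").Value),
            Text = markerElem.Attribute("Value").Value
        });
    }
    MyMedia.MarkerReached += MyMedia_MarkerReached;
}

void MyMedia_MarkerReached(object sender, TimelineMarkerRoutedEventArgs e)
{
    // e.Marker.Text and e.Marker.Time identify the marker that fired;
    // use them to start a storyboard, a MediaElement, or another action
    if (e.Marker.Text == "StartIntroStoryboard") // Marker value assumed
    {
        IntroStoryboard.Begin();
    }
}
```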
While markers work well when audio or video is playing the entire time, you may need to start one or more MediaElements at different times while a storyboard plays. Let’s examine another technique that handles that situation by taking advantage of an attached property.
Creating an Attached Property
Attached properties are a feature available in Silverlight that allow additional information to be “attached” to an XAML element without creating a new subclass of the target object. You’ve probably used several different types of attached properties before whether you realized it or not. For example, if you’ve arranged child elements within a Canvas, then you used the Canvas.Left and Canvas.Top attached properties. These properties allow top and left positions to be “attached” to children of a Canvas so that they are arranged properly within the parent.
Attached properties are a special type of dependency property created using the DependencyProperty.RegisterAttached() method. As a result, attached properties can be manipulated using different types of animations, can participate in data binding operations, and more. This allows for several interesting uses of attached properties, especially when it comes to syncing media with animations. For example, an attached property can be created that a storyboard animates to change its value over a period of time. As the attached property changes, an event can be raised at specific intervals. The event handler can then check to see if any MediaElement objects should be started, stopped, or paused at that time. In other words, the attached property can act as a built-in storyboard timer that can raise events at specific intervals and track how far along a storyboard is.
Figure 3 shows an example of an attached property named Milliseconds contained in a StoryboardTimer class. The code uses the RegisterAttached() method to define the name of the attached property, its data type, its owner type, and a PropertyMetadata object that defines the property’s default value as well as the callback that should be invoked when the value changes.
As the Milliseconds attached property value changes, the OnMillisecondsChanged() method will be called, which in turn invokes the OnMediaKeyFrameTriggered() method. OnMediaKeyFrameTriggered() ensures that listeners are attached to the StoryboardTimer class’s MediaKeyFrameTriggered event and raises the event as appropriate. Now that you’ve seen how to create an attached property, let’s see how it can be used in XAML to help sync audio, video, and animations.
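The class and member names below come from the article; the implementation details are a minimal sketch of how such a class could be put together, not the exact code from Figure 3:

```csharp
using System;
using System.Windows;

public class MediaKeyFrameTriggeredEventArgs : EventArgs
{
    public double Milliseconds { get; set; }
}

public class StoryboardTimer
{
    // Raised as the attached property's value changes
    public static event EventHandler<MediaKeyFrameTriggeredEventArgs>
        MediaKeyFrameTriggered;

    public static readonly DependencyProperty MillisecondsProperty =
        DependencyProperty.RegisterAttached(
            "Milliseconds",              // Property name
            typeof(double),              // Property type
            typeof(StoryboardTimer),     // Owner type
            new PropertyMetadata(0.0, OnMillisecondsChanged));

    public static double GetMilliseconds(DependencyObject obj)
    {
        return (double)obj.GetValue(MillisecondsProperty);
    }

    public static void SetMilliseconds(DependencyObject obj, double value)
    {
        obj.SetValue(MillisecondsProperty, value);
    }

    private static void OnMillisecondsChanged(DependencyObject d,
        DependencyPropertyChangedEventArgs e)
    {
        OnMediaKeyFrameTriggered((double)e.NewValue);
    }

    private static void OnMediaKeyFrameTriggered(double milliseconds)
    {
        // Raise the event only if listeners are attached
        var handler = MediaKeyFrameTriggered;
        if (handler != null)
        {
            handler(null, new MediaKeyFrameTriggeredEventArgs
            {
                Milliseconds = milliseconds
            });
        }
    }
}
```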
Using the Milliseconds Attached Property
Adding the Milliseconds attached property into an XAML file is quite straightforward. First, the namespace and assembly where the StoryboardTimer class lives must be defined using an XML namespace that is associated with a prefix:
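A namespace declaration along these lines does the job; the MediaSync project namespace and MainPage class name are assumptions made for illustration:

```xml
<!-- "MediaSync" is an assumed project namespace -->
<UserControl x:Class="MediaSync.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:local="clr-namespace:MediaSync">
    <!-- Content goes here -->
</UserControl>
```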
Once the namespace prefix is defined, the Milliseconds attached property can be “attached” to any XAML element, such as the LayoutRoot Grid, with child elements placed inside as usual.
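The markup for attaching the property looks something like this sketch (the local prefix is assumed to map to the namespace where StoryboardTimer lives):

```xml
<Grid x:Name="LayoutRoot" local:StoryboardTimer.Milliseconds="0">
    <!-- Child elements go here -->
</Grid>
```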
With the Milliseconds property attached to an XAML element, a DoubleAnimation can be created and added into a storyboard to animate the property, as Figure 4 shows. The DoubleAnimation sets the duration of the animation to 1 minute and 5 seconds and changes the Milliseconds property from 0 to 65000. A By value of 50 is also set with the intent of stepping the property value in 50-millisecond increments (in practice, the animation engine updates the value on each frame tick, so treat 50 milliseconds as the target granularity rather than a guarantee). Looking through the XAML code for the DoubleAnimation, you’ll quickly notice that the Storyboard.TargetProperty attribute targets Canvas.Top instead of the custom Milliseconds property. In this case that’s simply a placeholder that will be updated at runtime, as you’ll see next.
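A sketch of the storyboard markup along the lines of Figure 4 follows; the storyboard and animation names are assumptions:

```xml
<Storyboard x:Name="MainStoryboard">
    <!-- Targets Canvas.Top as a placeholder; it's retargeted to the
         Milliseconds attached property at runtime -->
    <DoubleAnimation x:Name="MillisecondsAnimation"
        Storyboard.TargetName="LayoutRoot"
        Storyboard.TargetProperty="(Canvas.Top)"
        From="0" To="65000" By="50"
        Duration="0:1:5" />
</Storyboard>
```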
When the user control loads, the Storyboard.TargetProperty defined in the DoubleAnimation needs to be updated to target the Milliseconds property rather than Canvas.Top. An event handler also needs to be wired-up to the StoryboardTimer class’s MediaKeyFrameTriggered event so that we know when the Milliseconds property value changes. Both of these steps are shown next:
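Sketched in code, those two steps could look like this (the handler and animation names are carried over from the sketches above, not the article's exact code):

```csharp
using System.Windows;
using System.Windows.Media.Animation;

void UserControl_Loaded(object sender, RoutedEventArgs e)
{
    // Retarget the DoubleAnimation from the Canvas.Top placeholder
    // to the custom Milliseconds attached property
    Storyboard.SetTargetProperty(MillisecondsAnimation,
        new PropertyPath(StoryboardTimer.MillisecondsProperty));

    // Listen for value changes raised by the attached property
    StoryboardTimer.MediaKeyFrameTriggered +=
        StoryboardTimer_MediaKeyFrameTriggered;
}
```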
Now that the Milliseconds property is defined in XAML and targeted by the DoubleAnimation, a technique for tracking when different MediaElement objects should play must be created. There are several ways to handle this, including storing media start and stop times in an external XML file that is parsed at runtime, or defining everything in code. The sample code (URL provided at the end of this article) includes examples of both approaches.
Regardless of how you store the MediaElement start and stop times that should be synced with the storyboard animations, you’ll need a way to read these values as the StoryboardTimer object’s MediaKeyFrameTriggered event fires. I handled this by defining a class named MediaKeyFrame that defines a property to track the time the media should be triggered as well as what action should be taken at that time (start, stop, pause). Figure 5 shows the MediaKeyFrame class.
I won’t cover each property defined in MediaKeyFrame in this article, since some of them exist for more advanced scenarios, but the properties we’re interested in for this discussion are KeyTime, TargetName, ActionTriggered, and MediaAction. KeyTime tracks the exact time that a media action should occur, TargetName holds the name of the MediaElement that should be triggered, ActionTriggered records whether a MediaElement has already been triggered, and MediaAction defines what action should occur at the time defined by KeyTime (start, stop, or pause). Figure 6 shows an example of using the MediaKeyFrame class to create a List&lt;MediaKeyFrame&gt;.
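A reduced sketch of the class covering just those four members might look like this (the property names come from the article; the enum and types are assumptions):

```csharp
using System;

// Assumed enum; the article may model the action differently
public enum MediaAction
{
    Start,
    Stop,
    Pause
}

public class MediaKeyFrame
{
    public TimeSpan KeyTime { get; set; }        // When the action should occur
    public string TargetName { get; set; }       // Name of the target MediaElement
    public bool ActionTriggered { get; set; }    // Has the action already fired?
    public MediaAction MediaAction { get; set; } // Start, stop, or pause
}
```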
Once the List<MediaKeyFrame> shown in Figure 6 is created, the storyboard can be started by calling its Begin() method. Several different animations can be included in the storyboard to animate text, change opacity of objects, and more. As these animations play, the Milliseconds attached property will be used to monitor the progress of the storyboard and fire the MediaKeyFrameTriggered event at appropriate times (every 50 milliseconds in this case). As the event fires, a LINQ query can be used to locate any MediaKeyFrame objects from Figure 6 that should fire at that specific time (in milliseconds).
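A hypothetical List&lt;MediaKeyFrame&gt; along the lines of Figure 6 might be built like this; the element name and times are made up for illustration:

```csharp
using System;
using System.Collections.Generic;

var mediaKeyFrames = new List<MediaKeyFrame>
{
    // Start an audio clip 5 seconds into the storyboard...
    new MediaKeyFrame
    {
        KeyTime = TimeSpan.FromSeconds(5),
        TargetName = "IntroAudio",
        MediaAction = MediaAction.Start
    },
    // ...and stop it at the 20-second mark
    new MediaKeyFrame
    {
        KeyTime = TimeSpan.FromSeconds(20),
        TargetName = "IntroAudio",
        MediaAction = MediaAction.Stop
    }
};
```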
Figure 7 shows an example of handling the MediaKeyFrameTriggered event and querying the List<MediaKeyFrame> collection to see if an action should be performed on a MediaElement.
Looking through the code in Figure 7, you can see that the current value of the Milliseconds attached property is accessed through the MediaKeyFrameTriggeredEventArgs object. From there, a TimeSpan object is created and compared to the KeyTime values defined in the List&lt;MediaKeyFrame&gt;. If a KeyTime is found that should be triggered at that specific time (and the MediaElement hasn’t already been triggered), the code checks MediaKeyFrame’s MediaAction property and starts, stops, or pauses the media as appropriate. The end result is that audio or video files can be started at specific times as a storyboard plays other animations, allowing everything to stay in sync.
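The overall shape of that handler can be sketched as follows; the lookup, rounding, and FindName details are assumptions rather than the article's exact Figure 7 code:

```csharp
using System;
using System.Linq;
using System.Windows.Controls;

void StoryboardTimer_MediaKeyFrameTriggered(object sender,
    MediaKeyFrameTriggeredEventArgs e)
{
    // Convert the current Milliseconds value to a TimeSpan for comparison
    var currentTime = TimeSpan.FromMilliseconds(e.Milliseconds);

    // Find key frames due at (or before) this time that haven't fired yet
    var dueKeyFrames = mediaKeyFrames
        .Where(kf => !kf.ActionTriggered && kf.KeyTime <= currentTime)
        .ToList();

    foreach (var kf in dueKeyFrames)
    {
        kf.ActionTriggered = true; // Don't fire the same action twice

        var media = FindName(kf.TargetName) as MediaElement;
        if (media == null) continue;

        switch (kf.MediaAction)
        {
            case MediaAction.Start: media.Play();  break;
            case MediaAction.Stop:  media.Stop();  break;
            case MediaAction.Pause: media.Pause(); break;
        }
    }
}
```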
Syncing Multimedia Is Useful and Doable
Syncing audio, video, and animations isn’t something that’s built into Silverlight 4 or Expression Blend, but there are several techniques that can be used to accomplish the task. In this article you’ve seen how an attached property can be created and attached to an XAML element. From there, the attached property can be animated using a DoubleAnimation that causes an event to be raised as its value changes. The event handler can scan a collection holding MediaElement names and determine whether they should be played, stopped, or paused.