Author/Editor: Patrick L. Schmitz cogit@ludicrum.org
Initial releases of the Batik viewer for SVG 1.0 documents did not include support for animation. This document describes the general models proposed to provide this support in an upcoming release of Batik. This document is contributed to the Batik effort by the author, under the terms of The Apache Software Foundation Contributor License Agreement.
1. Overview
2. The Timing Model
2.1. Parsing Support
2.2. The Time Containment Tree
2.3. Time Conversion and Sampling Services
2.4. Time Reference and Time Dependency Maintenance
2.5. Unresolved Times and Time Dependency
2.6. Interval Computation and Maintenance
2.7. Timing Classes and Interfaces
3. The Animation Model
3.1. The Animation Compositor
3.2. Shared Animation Services
3.3. Animation Classes and Interfaces
4. References
SVG defines animation support based upon the SMIL Animation recommendation [SMIL-ANIMATION]. This includes support for a set of animation elements, an animation composition model, and a simplified timing model based upon aspects of the SMIL 2.0 Timing model [SMIL2-TIMING]. Although described in a single chapter in SVG 1.0, animation functionality is logically separated into the timing model and the animation model.
The timing model describes the support needed to interpret the timing attributes of the animation elements, to build and maintain the associated timegraph, to support sampling (evaluation) of the timegraph at any presentation time, and to support dynamism within the timegraph itself that arises through events and timing related DOM methods. The timegraph is sampled by directing the root to seek to a particular point in document time. The timegraph is traversed, converting the document time to the local time for each time-leaf. Additional state (and state transitions) described by the SMIL Timing semantics are maintained as part of this traversal (and its side-effects). Details are provided below in the section on The timing model.
The animation model describes the support needed to interpret the attributes of the animation elements, tools required to support conversion of unit spaces, interpolation of values, etc. The animation model also defines the model for composition of animations: how to combine multiple animations that target a given property, and how to account for the underlying property values when animation is defined to be additive. Details are provided below in the section on The animation model.
Each animation element is a client of the timing model services; when the timegraph is sampled, each active animation will be evaluated at the associated local simple time. Each animation element is also a client of the animation model services; when the animation is evaluated, it provides its individual result to the composition manager for the associated target property. When the timegraph sample is complete and all the active animation elements have been evaluated, each animation composition manager will compose the individual animation results for each target property and apply the resulting value to the SVGAnimatedXXX value for the associated property. This completes the document animation sample for a given document time.
The Timing Model describes an approach to implementing the semantics described in the SMIL specifications for timing. It describes a set of modules that provide this support, emphasizing modular, encapsulated design, robustness and efficiency. The design also includes flexibility to incorporate functionality that is under discussion for SVG 2.0, without adding undue complexity to the current implementation.
Much of the timing model design is dictated by — and to some extent even described in — the SMIL 2.0 Timing chapter [SMIL2-TIMING]. The specification describes a timegraph that is maintained dynamically, and which can be evaluated at any document time to calculate the local time at each graph node (where a node represents a timed element in the document). The timegraph includes several essentially orthogonal graphs that relate the graph nodes, of which two are of interest for SVG: the time containment tree, and the time dependency graph. These two structures and the associated functionality form the core functional areas of the timing model. Support for unresolved times (including event-based timing) and the calculation of intervals from lists of begin and end times are also key functional areas, along with local-time conversion services and of course parsing of the syntax. These functional areas are described below, followed by a description of the associated classes that will support the functionality.
A parser module will interpret the XML attribute values for timing, and construct the timegraph accordingly. An XML parser will already have constructed a DOM instance that includes the animation elements, and many properties (such as restart) need no further parsing (although in some cases, the attribute values may be represented more efficiently by some internal value).
The begin and end attributes require the most parsing, and have essentially no validation support from the XML parser. The parsing of the two attributes is identical, with the exception that a missing value has a different default for begin and end. The attribute values are tokenized into individual time-values (they have a semi-colon separator), and then each time value is parsed according to the algorithm described in [SMIL2-TIMING], in the section Parsing timing specifiers. The data structures for time values are described below.
The dur attribute is much simpler, allowing only positive (> 0) clock values and the special values "indefinite" and "media". The min and max attributes are the same, except that they do not support "indefinite". Since "media" is not really supported in SVG for any of these attributes (it should be ignored), this value can be treated the same as an empty string or missing attribute value.
The repeatDur attribute parses like dur, except with no "media" value allowed. The repeatCount attribute is just a simple floating point value, or the special value "indefinite".
The parsers for dur, min, max, and repeatDur can leverage portions of the same parser used for the begin and end attributes, with some constraints on the allowed token types. This implies that the parser for individual time values should be a separate method, and that it should take flags of some sort to indicate which time-value classes are legal for the target attribute value.
The restart and fill attributes could already have been validated by the XML parser (they are enumerations), and just map to some enumerated value in code.
All Clock values (which includes all offset values as well) could be parsed into a clock-value data structure that maintains the individual fields, but it is probably sufficient to maintain the original string (for debug and logging), and a converted canonical time value. Most correct would be a floating point count of seconds, and more efficient might be an integer count of milliseconds (finer than millisecond granularity is unlikely to be a requirement any time soon).
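As a rough illustration, a clock value could be reduced to canonical seconds along these lines (a minimal sketch, not the Batik parser; it assumes the float-seconds representation discussed above and ignores detailed error reporting):

    /* Convert a SMIL clock value ("1:30:05.5", "02:33", "5s", "10min", "250ms",
     * or a bare number of seconds) to canonical float seconds. */
    static float parseClockValue( String s ) {
        s = s.trim();
        if (s.indexOf(':') >= 0) {                     /* full or partial clock value */
            String f[] = s.split( ":" );
            float sec = Float.parseFloat( f[f.length - 1] );
            sec += 60 * Integer.parseInt( f[f.length - 2] );
            if (f.length == 3) sec += 3600 * Integer.parseInt( f[0] );
            return sec;
        }
        /* timecount value: optional metric suffix, default metric is seconds */
        if (s.endsWith( "ms" ))  return Float.parseFloat( s.substring( 0, s.length() - 2 ) ) / 1000f;
        if (s.endsWith( "min" )) return Float.parseFloat( s.substring( 0, s.length() - 3 ) ) * 60f;
        if (s.endsWith( "h" ))   return Float.parseFloat( s.substring( 0, s.length() - 1 ) ) * 3600f;
        if (s.endsWith( "s" ))   return Float.parseFloat( s.substring( 0, s.length() - 1 ) );
        return Float.parseFloat( s );                  /* bare number of seconds */
    }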
Reviewers: Is floating point arithmetic an issue for your target platforms? Should we bother dealing with the integer optimization? Given the amount of FP you will do for graphics, I would presume that this optimization may be a joke, but then every little bit counts...
The basic timed element (an animation element in SVG1) is a node in the time containment tree. This should be additional information that is associated with the existing DOM Node, rather than a completely orthogonal structure. It is important to be able to model timing as an extension of element behavior, rather than a completely separate model - this just makes it easier to code and manipulate. As such, the ideal implementation would have the timed element class subclass some generic element class, adding the information for timing.
In SVG1, only the document root is a time container, but there is little additional cost in planning ahead and supporting a model of time containment. A time container subclasses a basic time node to add a list of timed children, and an interface to define default begin timing. In SVG1, the latter can be stubbed, since the time root always has <par> semantics. Nevertheless, the ideal model will define the document root (the timing root) as deriving from a timed element, extended to include time containment.
Reviewers: Do you foresee problems with this inheritance model (in particular for the doc root)? Does it conflict with existing DOM class inheritance?
The list of timed children is an important extension to the DOM tree structure, as the timed children will not necessarily be immediate children of the document root (the time root). This is simply a list of element references.
In the general case of time containers, there is a means of defining an implicit duration based upon the child timing. This is specified with the endsync attribute, and also impacts the default value for the dur attribute. For now, we can add stubbed support for these, and leave real support to a future version (it will be easier to extend if we have placeholders).
SMIL 2 defines the notion of local time for a timed element, and distinguishes between simple time (the basic timespace in which an animation function is defined) and active time (which includes the effects of repeating and various constraints). The conversion from local active time to local simple time is straightforward, and is detailed in the Element simple time calculation section of [SMIL2-TIMING].
In a flat time model like SVG1, element begin and end times are all ultimately represented in document simple time. There are no "long" sync-arcs, and time conversion is relatively trivial. Conversion from document simple time to an element local active time is simply a matter of subtracting the current interval begin time for the element.
In the more general case of hierarchic timing, these two conversions are combined, and the resulting conversion can be applied normally, inverted, and used in a recursive manner to handle arbitrarily deep time containment nesting. This is detailed in Converting from event time to element time.
For this model, we will account for the full case in the design, but will simplify the implementation somewhat to reflect the flat timespace. Where times must be computed or converted, we will use methods that wrap this detail, allowing us to easily extend support for SVG2 (should time containers be added).
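For example, a wrapper of this kind keeps the flat-timespace conversion trivial while leaving room for the hierarchic case (an illustrative sketch only, using the TimedElement/Interval fields defined later in this document):

    /* In the flat SVG 1.0 timespace, the parent simple time is document time, so
     * conversion to local active time just subtracts the current interval begin.
     * Only meaningful while the element has a current interval with a resolved begin. */
    float parentSimpleTimeToActiveTime( float parentSimpleTime ) {
        return parentSimpleTime - currentInterval.begin.getSeconds();
    }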
Sampling the timegraph is really just a recursive/iterative conversion of document time to local simple time for each element in the timegraph. The sample is initiated by an external mechanism - the timegraph does not determine the document sample rate or document sample times.
The sample traversal is really just a call on the root, which calls each child (which can recurse in the general case). This can be supported with the "Traversable" design model, or it can be built into the classes themselves (I have coded it both ways in different implementations). The generalized part of the traversal is pretty small, and so it is not clear to me that it is worth generalizing/abstracting. In addition, the "Traversable" design paradigm makes the code less readable, IMO.
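A sketch of the non-abstracted version (method and field names follow the TimeContainer/TimedElement interfaces defined later; this is not committed design):

    /* The root is asked to sample at a document time; each container simply
     * converts to its local simple time and samples its timed children. */
    protected void sampleChildren( TimeVal localSimpleTime ) {
        for (int i = 0; i < timedChildren.size(); i++) {
            TimedElement child = (TimedElement) timedChildren.elementAt( i );
            child.sampleAt( localSimpleTime );   /* recurses for nested containers */
        }
    }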
This is supported for leaf nodes very simply. In the conversion of active to simple time, repeat functionality is accounted for by applying a variant on the mathematical MOD (remainder) functionality. For simple nodes with a fixed duration, it is just MOD. For anything that can vary (like media or time containers), it is slightly more complex. We will include comments (or conditional compilation markers) in the code to handle the general case, but need only a very simple mechanism for animation. See also Element simple time calculation.
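For the fixed-duration leaf case, the conversion amounts to the following sketch (simplified to floats rather than the TimeVal signature in the interfaces below, and assuming a resolved, positive simple duration; currRepeatIteration is the field described there):

    /* Active-to-simple conversion for a simple leaf node with fixed duration:
     * the repeat iteration is the quotient, the simple time is the remainder. */
    float activeTimeToSimpleTime( float activeTime, float simpleDur ) {
        currRepeatIteration = (int) Math.floor( activeTime / simpleDur );
        return activeTime - currRepeatIteration * simpleDur;   /* activeTime MOD simpleDur */
    }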
Min and max act as constraints on the active duration. They are applied at two points in the logic: when the active duration for an interval is first computed, and whenever that end is recalculated (for example, when a dependent end time changes), at which point the min/max constraints are applied again. The effects will be localized in a single method that calculates the active duration/end for an interval. Inputs to the method are the simple duration (implicit or explicit as dur), any repeat attributes, the best candidate end time (if any) from the list of end times, and the min and max attribute values.
Time manipulations are not part of SMIL Animation, and so are not required functionality for SVG 1.0. They are, however, easy to implement (especially without the issues of time-based media scheduling such as for video and audio), and particularly well suited to graphics animation. I propose that we include support as a means of demonstrating their utility and power as applied to SVG.
Time manipulations support introduces one additional timespace, segment time, to support the autoReverse functionality. Within the time model, the time manipulations basically just affect how active time is converted to simple time (and vice versa, although this is less of an issue for SVG). The details are described in the Converting document time to element time section of [SMIL2-TIME-MANIP].
Parsing the attribute values is trivial: accelerate and decelerate are floating point numbers, constrained to the unit range (0.0 to 1.0), and further constrained to add up to no more than 1.0. The autoReverse attribute is a simple Boolean, and should be handled by the XML parser. The speed attribute is also a simple floating point number, with the only constraint that it be non-zero.
In order to be proper with respect to the SVG 1.0 specification, we may have to mark these attributes with an extension namespace.
A time reference is involved whenever a sync-base value or a wallclock value is specified in the begin or end attribute value list. The element that is referred to is called the time-base element, and the referring time-value is called the dependent time value. The dependent time value registers with the time-base element, indicating dependency on the respective begin or end time. If the time values are created after the entire document has been parsed, then this is straightforward. If the timegraph can be constructed before the entire document is parsed, then the dependent element may have to defer resolution of forward references.
Once the dependency relationship is established, the dependent time can query for an initial value. The result may be a resolved time, or it may be unresolved (see also the next section for a discussion of unresolved times). In a flat timespace such as SVG 1.0, all the begin and end times are in document time, and so no conversion is required to use a sync-base time-value. Nevertheless, we will include a call to convert the time-base time to a time in parent simple time, so that we can easily add time containment later. This call can be commented out, stubbed, etc., although any cost of making a call to an empty method will be minimal.
For wallclock time values, there is an implicit time reference to the document root, since the conversion from wallclock time to document time depends upon the real-world begin time for the document. If the end-user can pause the document, the wallclock time may become unresolved after the document begins, and then become resolved (again) once the document presentation is resumed after pausing. This is handled as a time dependency on the document root.
Note that for each interval that the time-base element creates (i.e. if the time-base element has multiple begin/end times and so creates multiple intervals), there will be an associated dependent instance time value. The data structures that model this are described below.
As the [SMIL2-TIMING] specification describes, cyclic dependencies are legal and are detected at runtime. When an element time changes and triggers the notification of dependents, the current node and each dependent is marked as it is visited. Dependencies on dependents are notified recursively, synchronous to the original notification. When an element is visited that has already been marked, a cycle is detected and the recursion is halted. As the recursion unwinds, the flags are cleared.
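A minimal sketch of that guard (the method and flag names are illustrative; the dependent lists and dependentUpdate() correspond to the Interval and InstanceTime classes defined below):

    private boolean visited = false;

    /* Notify dependent instance times that this interval's begin has changed.
     * The visited flag marks the node during the (synchronous, recursive)
     * notification; a second visit means a cycle, and the recursion halts. */
    void notifyBeginChanged( TimeVal newBegin ) {
        if (visited) return;                 /* cycle detected - stop here */
        visited = true;
        try {
            for (int i = 0; i < beginDependents.size(); i++) {
                InstanceTime dep = (InstanceTime) beginDependents.elementAt( i );
                dep.dependentUpdate( newBegin );   /* may notify further dependents */
            }
        } finally {
            visited = false;                 /* flags are cleared as the recursion unwinds */
        }
    }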
There are inherently indeterminate time values like event times, and there may also be times in the model that are unresolved on a transitional basis (e.g. when the timegraph is initializing). In any case, if a time is unresolved, it can simply be thought of as a special value like the floating point "NaN" value. Once the time-base time (or the event-time) becomes resolved, dependents are notified and the new time is provided to them. Note that one element A may define a sync-base time value relative to a second element B. If the associated value for B is dependent upon an event, then the dependent time for element A will be unresolved until the event happens and resolves the time-base time for element B.
In some cases, a time can change from resolved to unresolved, and dependents are similarly notified.
Note that the special value "indefinite" is a resolved time value. It can be compared to the floating point "Infinity" value.
A variation on this notification arises when a new interval is created for a time-base element. In this case, each dependent time value is notified to create a new time instance. Less often, an existing interval may have to be pruned (removed) from the model; in this case as well, dependent time values are notified, and the associated time instances are removed.
An event time is handled much the same as a sync-base time, except that the dependency relationship is on a particular event on the event-base element. Unlike a sync-base element, an event-base element need not be a timed element. When (if) the event happens, the time of the event is converted to document time (basically by subtracting the document begin time), and then this time is passed to all time dependents as a new resolved event time. This in turn will create a new instance time. Note that once resolved, an event time instance will not change.
We model this dependency relationship by directly registering the dependent time value for the event. When the event is raised, the event-base node will convert the event time to document time, and then pass this time to each dependent. The dependent nodes will check for validity (there are constraints on event sensitivity) and then create a new time instance.
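Sketched in terms of the classes below (the eventDependents list is illustrative and not part of the proposed interfaces; in this sketch the conversion to document time and the sensitivity check are left to the dependent's newEvent(), as in the TimeValueSpecification interface):

    /* Raised-event handling on the event-base node: each registered dependent
     * TimeValueSpecification is notified; it checks event sensitivity, converts
     * the event time to document time, adds any offset, and creates a new
     * InstanceTime with clearOnReset == true. */
    void handleEvent( DOMEvent evt ) {
        for (int i = 0; i < eventDependents.size(); i++) {
            TimeValueSpecification dep =
                (TimeValueSpecification) eventDependents.elementAt( i );
            dep.newEvent( evt );
        }
    }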
A set of four methods is defined to begin and end elements programmatically. In our model, these just add resolved instance times to the begin or end instance times list, respectively. The only difference is that the instance times have no associated syntax specification, and will not be dependent on anything - they function like simple offset instance times. Like event times, these are cleared when the element resets (e.g., when it restarts).
If we ever add SMIL media elements to SVG, these can have an unresolved simple duration (the media duration may be unknown at authoring time or unpredictable due to streaming performance). This in turn can cause the end of each interval for the media element to be unresolved. The times become resolved when the media completes (or when it is fully downloaded, often in advance of complete playout).
Similarly, time containers in SMIL can have a simple duration defined as a function of the durations of their children. Since the children may have interactive end times (or can be media elements with unresolved durations), the end times for the time containers can also be unresolved. Otherwise, these cases behave the same as any other resolved values.
The computation of endsync (child-relative simple duration) for time containers introduces additional complexity. A child may be defined to end (or may end as a result of events or media completion) between samples of the timegraph. As a result, when a time container is sampled for a given simple time T, it will sample its children for the time T (converted appropriately), only to learn that some children have ended just before that time. But this in turn means that the time container should have ended just previously as well. This must actually be back-propagated through the time container logic, so that the correct time for the time container is reflected.
Two approaches to handling this are possible:
I know of one implementation that used approach 2, and it was very messy. Approach 1 is probably desirable, but non-trivial. It may not be feasible given threading constraints. In any event, it will not be an issue unless and until we add support for time containers in the general case, and so I recommend we do nothing at this point.
The [SMIL2-TIMING] specification provides a detailed description of how the interval model must work. The Batik design will follow this closely. The only non-obvious part is the logic behind the calcActiveEnd() method in the pseudo-code. Inputs to the method are the simple duration (implicit or explicit as dur), any repeat attributes, the best candidate end time (if any) from the list of end times, and the min and max attribute values. The rules for combining these values are detailed in the [SMIL2-TIMING] specification.
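A simplified, non-normative sketch of that combination (assuming all inputs are already resolved float seconds relative to the interval begin, unspecified repeatCount/repeatDur passed as NaN, an unspecified candidate end as positive infinity, and unspecified min/max as 0 and positive infinity; the normative rules are in [SMIL2-TIMING]):

    float calcActiveDur( float simpleDur, float repeatCount, float repeatDur,
                         float candidateEnd, float min, float max ) {
        /* intermediate active duration from the repeat attributes */
        float iad;
        if (Float.isNaN( repeatCount ) && Float.isNaN( repeatDur )) {
            iad = simpleDur;
        } else {
            iad = Float.POSITIVE_INFINITY;
            if (!Float.isNaN( repeatCount )) iad = repeatCount * simpleDur;
            if (!Float.isNaN( repeatDur ))   iad = Math.min( iad, repeatDur );
        }
        /* clip against the best candidate end time */
        float activeDur = Math.min( iad, candidateEnd );
        /* finally apply the min/max constraints */
        return Math.min( max, Math.max( min, activeDur ) );
    }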
The constraints and requirements described above lead to the following data structure design:
- Each timed element has two lists (one for begin and one for end) of time-value specifications that describe the type of time value, and create time instances when appropriate.
- These are created when the begin or end attribute is parsed; one time-value specification is created for each (valid) time-value in the semi-colon separated list.
- A sync-base or event-base time-value specification registers with the associated time-base or event-base element. For an accessKey event, the event-base is the element on which the accessKey events are raised (this is implementation dependent - is it the document root in Batik?).
- Each timed element also has two lists (one for begin and one for end) of instance time values. These lists are dynamic, changing as the animation presentation plays. Each time a time-base element generates a new Interval, and each time an event-base raises an event, a new instance time is added to the appropriate list. Each time the current element resets, some of the instance times (those associated with events and DOM method calls) are removed from the list(s).
- A DOM method call (e.g. beginElementAt()) creates an instance time, but not a time-value specification (above).

Important note: When a document is seeked backwards in time (e.g. due to hyperlink traversal), the list of instance time values is not cleared or reset. All "old" intervals that begin after the seek-to time will be discarded, and new intervals computed as the document plays forward (again).
A set of classes and interfaces associated with timing support are defined below. I have defined some pieces as interfaces to easily simulate multiple inheritance in Java, although this needs review. We can then define an implementation of the interfaces that supports delegation of the interface from the primary element class implementations. This will likely require a back pointer to the parent/host-object (I am not yet sure what interface this is - a DOMElement or something?).
This is somewhere between Java and pseudo-code. Forgive my rusty memory of Java syntax.
package SMIL.Timing;
/* question of encapsulation versus efficiency: should we just use a float with
 * special values, or should we use an object? */
class TimeVal implements Comparable {
    protected final float INDEFINITE = java.lang.Float.POSITIVE_INFINITY;
    protected final float Unresolved = java.lang.Float.NaN;

    public TimeVal();
    public void setINDEFINITE();
    public void setUnresolved();
    public void setSeconds( float seconds );  /* throws ??? if seconds not finite */
    public boolean isINDEFINITE();
    public boolean isResolved();
    public float getSeconds();                /* throws ??? if not finite */
    public int compareTo( TimeVal val2 );

    /* seconds from begin of parent time container, simple time */
    protected float seconds;
}

class InstanceTime {
    package TimeVal time;

    /* This is used to handle the logic for dependency Update */
    protected TimeValueSpecification creator;

    /* Marks this instance time as of the class that must be cleared when
     * the owning TimedElement is reset. TRUE for events, DOM calls. */
    package boolean clearOnReset;

    /* This is used only to remove Instances associated with a deleting
     * Interval. It is null for Instances that are not syncbase value
     * instances, and when set is the Interval upon which this depends. */
    package Interval timebase;

    /* Constructor:
     *   creator  null for DOM calls, else non-null.
     *   timebase non-null for syncbase values. */
    public InstanceTime( TimeValueSpecification creator, Interval timebase,
                         boolean clearOnReset );

    /* dependentUpdate is called by intervals that are changed.
     * newTime is always in parent simple time for the timebase element.
     * Implementation must get the timebase node from the creator, and do
     * appropriate time conversion (maintaining syncbase local time
     * facilitates optimizations in time conversion that are important
     * if we ever use syncBehavior). In SVG 1.0, all times are in
     * document simple time, and so no conversions are needed.
     * For wallclock timing, this will require some fancy handling,
     * using creator context. */
    void dependentUpdate( TimeVal newTime );
}

class Interval {
    protected TimeVal begin;
    protected TimeVal end;

    /* These are lists of InstanceTimes */
    protected Vector beginDependents;
    protected Vector endDependents;

    package Interval();

    /* This returns a boolean true if the Interval is still valid, and false
     * if the interval becomes invalid (negative duration). The owner
     * TimedElement will delete any invalid interval, and then notify
     * dependent TimeValueSpecifications. If this is the first interval,
     * prevInterval is null. This may result in an Interval that ends
     * immediately - it is up to the owner to handle this (creating a new
     * Interval when the old one actually becomes inactive). */
    package boolean recalc( Vector beginTimes, Vector endTimes, Interval prevInterval,
                            RESTART_MODE restart, boolean hasEnds,
                            TimeVal min, TimeVal max );

    package void addDependent( InstanceTime dependent, boolean forBegin );
    package void removeDependent( InstanceTime dependent, boolean forBegin );
}

/* Class TimeValueSpecification
 * Represents an individual begin or end time value. This is created
 * with the tokenized syntax for a single value, and uses common parsing
 * facilities to determine the time value type, and parse out element,
 * event, offset and other references. This then constructs OM references
 * to elements, registers dependencies, event listeners, etc. This also
 * constructs instance times as appropriate. */
class TimeValueSpecification {
    /* I forget how to do class enumerations in Java. */
    public final enum TYPE = { offset, syncbase, event, repeat,
                               accessKey, wallclock, indefinite };

    /* owner needed, e.g., to notify when owned instance is updated,
     * and TimedElement must re-eval current interval. */
    protected TimedElement owner;
    protected boolean isBegin;
    protected String syntaxSpec;
    protected TYPE type;

    /* This is a timed element that we depend upon for syncbase timing,
     * or the doc root for wallclock timing, and null otherwise.
     * Note: we may have to handle beginEvent and endEvent timing specially */
    protected TimedElement timebase;

    /* This is any element that we depend upon for eventbase timing.
     * Need to get the type right for Batik */
    protected DOMElement eventbase;

    /* This is any offset we add to the timebase time */
    protected float offset;

    /* Specifies the repeat instance value for repeat timing. */
    protected int repeatInstance;

    /* Specifies the access key value for accessKey timing. */
    protected char accessKey;

    /* Specifies the adjusted value for wallclock timing.
     * Not clear if DateFormat can be made to support ISO8601 or if
     * we have to implement a custom parser. Ugh. */
    protected Date wallclockValue;

    /* Constructor */
    package TimeValueSpecification( TimedElement owner, boolean isBegin,
                                    String syntaxSpec );

    /* called by the timebase node when a new interval is created */
    package void newInterval( Interval interval );

    /* called by the timebase node when an existing interval is deleted.
     * This must notify the owner TimedElement, which in turn will remove
     * all instance times associated with this interval. */
    package void removeInterval( Interval interval );

    /* called by the eventbase node when a new event is raised */
    /* STUB - needs to be made proper */
    package void newEvent( DOMEvent evt );

    /* Called by an owned InstanceTime when it has been updated.
     * The interface may not need the context... */
    package void handleTimebaseUpdate( InstanceTime instance, TimeVal newTime );
}
interface TimeClient {
    /* Defines the interface for a client of time services (e.g. animation).
     * Later, maybe we could expose this for extension behaviors. */

    /* Called when a timed element is sampled. Note that the client sees
     * only local simple time, simpleDur, and the repeat iteration.
     * The simpleDur should not vary, but not providing it here
     * would require that the client ask for it each sample, or cache
     * it and handle changes.
     * Client does not know about active time,
     * the details of repeat, min, max, time manipulations, etc. */
    public void sampleAt( float simpleTime, float simpleDur, int repeatIteration );

    /* Called when a frozen timed element is sampled, and the active
     * duration is an even multiple of the simple duration, requiring
     * the client to compute a value that theoretically cannot be
     * sampled: the last value. Because of the effects of cumulative
     * animation, we must include the repeat. */
    public void sampleLastValue( int repeatIteration );

    /* The following 3 methods allow the time server to communicate some
     * state to the client. This should not be used in any sampling code,
     * and is only for handling ancillary state, e.g. for animation composition. */

    /* Called when a timed element changes from inactive to active.
     * The beginTime value is in documentTime,
     * and is provided for animation clients that sort on begin time.
     * Called BEFORE first sample for current interval. */
    public void toActive( float beginTime );

    /* Called when a timed element changes from active to inactive.
     * The isFrozen boolean indicates whether the element continues to fill.
     * Called after last active sample for current interval. If
     * isFrozen is true, will continue to sample, but time will not advance.
     * If isFrozen is false, client must notify the compositor that it is done. */
    public void toInactive( boolean isFrozen );

    /* Called when an (inactive) timed element changes from frozen to
     * inactive, not frozen. This is used when fill=freeze and the
     * parent time container is seq or excl.
     * This is also called when a frozen element restarts. */
    public void removeFill();
}
interface TimedElement {
    /* Since the attribute values will not change often, we use a fixed
     * array for the list of TimeValue specifications.
     * Each entry will register with any timebase or event base as needed. */
    protected TimeValueSpecification beginSpecs[];
    protected TimeValueSpecification endSpecs[];

    /* dur value in SVG has an implicit value of indefinite. This
     * value is the resulting value after default semantics have
     * been evaluated. */
    protected TimeValue simpleDur;
#ifdef SUPPORT_TIME_MANIPULATIONS
    protected TimeValue segmentDur;
#endif SUPPORT_TIME_MANIPULATIONS

    /* repeat values are Unresolved if not specified. At most
     * one value will be resolved. */
    protected TimeValue repeatCount;
    protected TimeValue repeatDur;

    /* Indicates which repeat iteration the node is executing.
     * Only valid when element is active. */
    protected int currRepeatIteration;

    /* Used for Nodes with variable duration, especially for seeking,
     * and time conversion. Seconds measured in local active time.
     * Only valid when element is active. */
    protected TimeValue lastRepeatTime;

    /* I forget how to do enumerations in Java.
     * Note that in SVG 1.0, fill=hold is not legal. */
    public final enum FILL_MODE = { remove, freeze };
    protected FILL_MODE fill;

    /* I forget how to do enumerations in Java. */
    public final enum RESTART_MODE = { always, never, whenNotActive };
    protected RESTART_MODE restart;

    /* min/max values are Unresolved if not defined */
    protected TimeValue min;
    protected TimeValue max;

    /* Maintains some minimal state between samples. Used also to
     * handle logic of time dependency update, event sensitivity,
     * and rules for changes to Interval times (cannot change a
     * time in the past, cannot change a begin when active,
     * cannot respond to end events when not active, etc.)
     * Seconds measured in local active time */
    public boolean isActive;
    public boolean isFrozen;
    protected TimeVal lastSampleTime;

    /* Returns last sampled time in respective timespace.
     * Throw exception if element is not active. */
    public float getActiveTime();
#ifdef SUPPORT_TIME_MANIPULATIONS
    public float getSegmentTime();
#endif SUPPORT_TIME_MANIPULATIONS
    public float getSimpleTime();

    /* These should be strongly typed vectors of InstanceTimes */
    protected Vector beginInstanceTimes;
    protected Vector endInstanceTimes;

    /* Called by owned TimeValueSpecifications,
     * when a new instance time is created. */
    package void addInstanceTime( InstanceTime time, boolean isBegin );

    protected Interval currentInterval;
    protected Vector oldIntervals;

    /* These are lists of TimeValueSpecifications in other TimedElements
     * that are defined relative to this TimedElement. When we create a
     * new Interval, we have to notify the dependents to create new instances.
     * Also, when the current interval is deleted, we notify dependents of this. */
    protected Vector beginDependents;
    protected Vector endDependents;

    package void addDependent( TimeValueSpecification dependent, boolean forBegin );
    package void removeDependent( TimeValueSpecification dependent, boolean forBegin );

    protected TimeContainer parent;

#ifdef SUPPORT_TIME_MANIPULATIONS
    protected float speed;
    protected float accelerate;
    protected float decelerate;
    protected boolean autoReverse;
#endif SUPPORT_TIME_MANIPULATIONS

    public TimedElement( TimeContainer parent );

    /* used to register clients for sampling */
    protected Vector timeClients;
    public void addTimeClient( TimeClient client );

    /* these return computed values combining implicit and explicit values */
    public TimeVal getSimpleDur();
#ifdef SUPPORT_TIME_MANIPULATIONS
    public TimeVal getSegmentDur();
#endif SUPPORT_TIME_MANIPULATIONS
    public TimeVal getActiveDur();

    /* Calculates the local simple time, and calls all registered clients */
    package void sampleAt( TimeVal parentSimpleTime );

    /* Called when time container repeats or restarts,
     * or when DOM changes require re-init. */
    package void reset();

    /* Time conversion methods - used mostly for syncbase resolution,
     * and for sampling. May need to add the global to local variants
     * here, if not in a utils package. */
    package TimeVal activeTimeToSimpleTime( TimeVal activeTime );
    package TimeVal simpleTimeToActiveTime( TimeVal simpleTime, boolean useLastRepeat );
}
interface TimeContainer extends TimedElement {
    Vector timedChildren;

    /* Each timecontainer class defines the default begin values for
     * children. Since they can be logical, and can depend upon the
     * position of the child in the list of timed children, must
     * pass the child as context.
     * Throws Exception if child is not in timedChildren.
     * In SVG 1.0, this always returns a simple offset value of 0. */
    package TimeValueSpecification getDefaultBegin( TimedElement child );

    /* called from within sample */
    protected void sampleChildren( TimeVal localSimpleTime );

    /* called from within reset, and on restart or repeat */
    protected void resetChildren();
}

interface TimedDocumentRoot extends TimeContainer {
    protected Date documentBegin;

    /* This is used when the document has been paused.
     * We accumulate an offset in paused time, and then use this to adjust
     * the sync calculations for wallclock time.
     * If we allow individual elements to be paused, we would move this
     * (and associated logic) to Interval. */
    protected float accumulatedOffset;

    public void pause();
    public void resume();

    /* This deals with the issues of converting a wallclock time to a
     * document time, dealing with pause-effects, etc. */
    package TimeVal wallclockToDocumentTime( TimeValueSpecification time );

    public void seekToTime( float seekTo );
}
@@ note that an external mechanism determines when to sample the timegraph, and what the document time should be. This allows the timegraph to be completely abstract and independent of application (runtime, editing tool, etc.). The timegraph implementation must be stateless with respect to the sampling frequency and the time delta between samples. If a media scheduler is added (e.g. to help schedule audio media), this will have to maintain some state to predictively schedule media. Similarly, if a media manager is added that supports download of media (such as images and stylesheets), and that incorporates timing as a factor in scheduling media download, this will also have to incorporate some state or assumptions about play rate, etc. However, the timegraph implementation itself should not make assumptions of this sort.
@@Need to resolve whether events flow through the interval logic in the event thread, or at sample time. Simpler if elements only end at sample time (but this will complicate endsync when we handle that later). Could well do the interval side-effects in event-thread, but leave all active/inactive state transitions (and side-effects/propagations) to the sample. This means that when graph is stopped and events happen, need to force a resample to get event side-effects. Could optimize by noting when current interval changes, and marking timegraph dirty. Need to consider how active/inactive transitions propagate, to ensure that dependencies that chain backwards relative to the document order, are correctly updated in a single traversal (otherwise, we get stale state all over the graph). Need to handle beginEvent, endEvent and repeatEvent like syncbase times, w.r.t. propagation. This requires a separate mechanism from the simple event registration mechanism - it is really just a variant on syncbase dependency. If event delivery is fast enough in practice, then it may not even be required.
@@Current design has nothing in it to accommodate externalResourcesRequired. This would be an additional constraint on the begin condition, basically. We would probably fit it into the work to calculate the first Interval, or in the TimedElement sample logic - the semantics still must be resolved by the WG.
The animation engine is really divided into two pieces or services areas, with individual animations as clients of these services. The core service is that of the compositor, which is responsible for combining animations with underlying values, and with other animations that are actively targeting a given property/attribute. For individual animation elements, there is also a set of shared animation services related to the animation function logic, including generalized value interpolation services. Naturally, animation elements are also clients of the timing model services, but the design abstracts the two in such a way as to make this nearly transparent.
For each property/attribute that is targeted by at least one animation element, there is an instance of the compositor. All the animations that target the associated property/attribute register as clients of this compositor. The animation logic is also a client of the timing services for the animation element. When an element becomes active (as determined by the timing engine), the animation logic will notify the compositor that it is active, and will provide the begin time of the current interval (this is used to determine the order in the sandwich)1. When an animation becomes inactive, it may be frozen or not (depending upon the fill attribute). If it is frozen, it continues to participate in the sandwich. If it is not frozen, it will notify the compositor and is removed from the sandwich.
[Footnote 1: We also have to consider the position in the tree, if the begin times of two animations are the same. If we assume that concurrent begin times for multiple animations to the same target property will be rare, and that computation of the relative order is not expensive, we can just compute it each time two concurrent animations begin. This would allow for a variety of solutions, such as comparing the child positions under the closest common ancestor. If we find this is expensive or does not scale for some reason, another approach would be to define a cardinal node value, computed from a simple DOM tree traversal, that reflects the element order in the document. This value can be cached, and a fast comparison is ensured. We would have to provide additional code to recompute cached values when the document element structure is changed (i.e. when the list of element children for any element changes). This is probably relatively easy and reasonable as well. Reviewers - comments?]
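A sketch of the compute-on-demand comparison suggested in the footnote (using the standard org.w3c.dom.Node interface; the helper names are illustrative, not Batik API):

    /* Returns a negative value if a precedes b in document order, positive if b
     * precedes a. Walks up to the closest common ancestor and compares the child
     * positions of the two diverging branches. */
    static int compareDocumentOrder( org.w3c.dom.Node a, org.w3c.dom.Node b ) {
        java.util.Vector pathA = pathFromRoot( a );
        java.util.Vector pathB = pathFromRoot( b );
        int i = 0;
        while (i < pathA.size() && i < pathB.size()
               && pathA.elementAt( i ) == pathB.elementAt( i )) i++;
        if (i == pathA.size()) return -1;    /* a is an ancestor of b: a comes first */
        if (i == pathB.size()) return  1;    /* b is an ancestor of a: b comes first */
        org.w3c.dom.Node parent = (org.w3c.dom.Node) pathA.elementAt( i - 1 );
        return childIndex( parent, (org.w3c.dom.Node) pathA.elementAt( i ) )
             - childIndex( parent, (org.w3c.dom.Node) pathB.elementAt( i ) );
    }

    /* root-first chain of ancestors, including the node itself */
    static java.util.Vector pathFromRoot( org.w3c.dom.Node n ) {
        java.util.Vector path = new java.util.Vector();
        for ( ; n != null; n = n.getParentNode()) path.insertElementAt( n, 0 );
        return path;
    }

    static int childIndex( org.w3c.dom.Node parent, org.w3c.dom.Node child ) {
        int i = 0;
        for (org.w3c.dom.Node c = parent.getFirstChild(); c != null; c = c.getNextSibling(), i++)
            if (c == child) return i;
        return -1;
    }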
With each sample of the document presentation, the timing engine will call active animation clients with the current local simple time, and a repeat count (to handle cumulative animation). The animation logic computes the associated animation result, accounting for cumulative, but not additive, behavior. It provides this result to the compositor. Once all the registered, active clients have provided results, the compositor combines the results according to the sandwich model, and applies these results to the property/attribute, setting the associated SVGAnimatedValue.
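The per-property composition pass might then look like this sketch (activeAnimations, newResultHolder() and applyToTarget() are illustrative stand-ins; calcResult() is the AnimComposable interface defined with the animation classes below):

    /* Compose the sandwich from lowest to highest priority and apply the result.
     * activeAnimations is assumed to be sorted by current interval begin time. */
    void composeAndApply( AnimValue underlyingValue ) {
        AnimValue result = underlyingValue;
        for (int i = 0; i < activeAnimations.size(); i++) {
            AnimComposable anim = (AnimComposable) activeAnimations.elementAt( i );
            AnimValue next = newResultHolder();
            anim.calcResult( result, next );   /* additive functions combine with 'result',
                                                * non-additive functions simply replace it */
            result = next;
        }
        applyToTarget( result );               /* set the associated SVGAnimatedXXX value */
    }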
There is one exception to this simple model, that requires a slightly more complex interaction between an animation function and the compositor. When the syntax describes a "TO" animation, the interpolation is defined in terms of the composed underlying value for the target property/attribute. In this case, when the animation logic is sampled by the timing engine, it notifies the compositor that it is ready to compute a result, and provides a callback function for this purpose. When the compositor is ready to complete the sandwich composition, it computes the sandwich from the bottom up (in terms of priority), and calls the callback for each "TO" animation with the composed underlying value. This requires that the animation logic maintain the time associated with the current (i.e. most recent) sample.
We also leverage the animation function callback extension to handle dependency among the underlying values. In addition to "TO" animations, when the specified values are defined using relative units (e.g. "2em", "10%", etc.), the animation function sets a flag when it provides its result to the compositor. The compositor itself registers with the underlying property for notification when it must recompute relative values (i.e. when the property it depends upon changes for any reason). When so notified, the compositor will recompute the animation sandwich using the modified underlying value. The compositor must call back individual animation functions that are marked as relative, in addition to any "TO" animation functions.
The compositor can be optimized to only recompute when there are additive or relative animation functions. More precisely, the compositor must only recompute under the following conditions:
Note that marking an animation function as relative may only apply to certain portions of the animation function duration. Some values may be absolute and some may be relative. The function only marks itself as relative if one of the values used in the current calculation is defined to be relative.
Reviewers - Please consider this carefully - I suspect it has implications for the CSS model, and perhaps other pieces.
It is still under discussion which values to use in computing underlying values, and which to use when computing relative values. There are two cases in which we must decide which values to use:
- When a child element inherits a property that is the target of animation, should the animated SVGAnimatedValues be used in the calculations? If we do not, then animating a given style property will not affect any children that are defined to "inherit" the style property value. This seems problematic to me, and so (for me) indicates that computations of underlying values must use animated values to calculate relative values.
- When a relative value is defined in terms of a property that is the target of animation, should the animated SVGAnimatedValues be used in the calculations? I think that the answer must again be "yes". If not, then when an animation is applied to a base value, animations defined on dependent properties will not remain in visual synchronization. If authors do not want this effect, they can specify the dependent animation using only absolute values, and so have a work-around. If we decide the answer to this basic question is "no", and authors wish to get the more dynamic, chained behavior, there is no workaround.
Need to talk about how we handle compound properties like position, that have multiple underlying values, can interact with separate property animations, etc. Or does SVG really treat motion and transform animation as affecting a single transform matrix property? This would simplify things.
Reviewers - Do we need extra intelligence to handle the interaction of animateMotion with underlying CSS position values, or do we just animate the CTM property and be done with it? How does additive=replace work with the CTM and underlying transforms? How can we make motion override all other positioning?
Reviewers - do not worry if you cannot make any sense of the comments below - they are mostly for my own use ;-)
@@Need to handle case of an animation that is sampled, and then (later within the same traversal) is seen to have already ended. The animation sample should be withdrawn from the composition manager. Since composition happens later, this should be fine. It does however preclude any optimizations like clearing out lower prio animations when the anim is not additive.
[later] - Not clear how this would obtain. Since SMIL Anim/SVG 1.1 has no time containment, and since end conditions cannot be retroactively applied, this should not obtain. When endsync semantics are added, then will need to address this, but that should be done with a more complex time-sampling scheme (such as above). Ahh - forward reference to a node that will end at current sample time. Can we avoid this by predicting the document time of end? How are events handled? Orthogonal to time sample, so can propagate time deps. Unless depends upon event, times are resolved. Resolved times ripple and so should not obtain. This arises mostly with media with unresolved end, and with event-based durations. Move this up to timing, and only mention here that it may arise in future, and should be accommodated. Since have to accommodate state to handle data dependency propagation, this should not be an issue.
[even later again] It is not just an issue with media and endsync, etc.: If an element transitions to inactive, then end events are fired, and are supposed to propagate immediately (including backwards in the document to timed elements that have already been sampled). Restart on repeat events is a similar case that we will likely not be able to predict (or will we?). The issue is whether we complete the current sample and compose animations, before we handle timing events. If we do, then we may just get flashing on the animated values. If we try to be really smart, we have to propagate the timing changes synchronously, which will cause the element to recompute (either restarting or simply ending and being removed from the sandwich), which in turn will require the compositor to change and/or recompute the sandwich. Not a big change to the compositor in any case, given other requirements on dynamism. Requires that an animation change from active to inactive, even after the compositor thinks it is done for the current sample. Requires that an animation may restart, resetting the begin time and potentially changing the composition priorities. No big deal - just allow for this in any case, and force a sandwich recompute as needed.
Given the services of the timing model, the animation function becomes very simple. We have to determine which values are involved in the current interpolation, and then we have to compute the associated interpolation (discrete animation only requires the first step). The arithmetic involved is common to all animation functions that follow the framework model using standard attributes: from, to, by, values, keyTimes, keySplines and calcMode. Many cases involve interpreting scalar values, and so this code can also be generalized. Some specialized logic (such as evaluating the path spline and computing the tangent rotation for animateMotion) must be performed for certain animation elements, but the goal is to abstract and share as much code as possible.
Simple functions to compare counts of arguments, conflicting use of arguments, etc. In addition, this verifies that the target element is legal, and supports animation of the target property/attribute. It then verifies that the values provided in from/to/by/values attributes are legal formats (units) for the property/attribute. This is extended in individual element subclasses for special values or compound/abstract values such as supported by animateMotion.
This phase can also create internal data structures to represent the values, handling defaults, etc. It will convert from/to/by values to a list of two values, and will compute keyTimes for values lists with default or paced interval timing. For paced animation, a set of key times are calculated from the distance between the values. This is only calculated once for the animation when it begins (although we should perhaps recalculate for each active interval instance). If the values specify relative units, these will be resolved once for the purposes of pacing, and will not be recalculated dynamically. This may produce incorrectly paced animations in some dynamic situations, but this is the best we can do. If the animated value does not support units that have some reasonable definition of distance, then the calcMode=paced setting will be ignored, and the default value will be used.
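As an illustration, the paced key times could be derived roughly as follows (distanceBetween() stands in for the per-type ComputeDistance functions sketched in the classes below; the values list is assumed to hold at least two entries):

    /* keyTimes for calcMode="paced": proportional to cumulative distance along
     * the values list, normalized to the range [0, 1]. */
    float[] computePacedKeyTimes( AnimValue values[] ) {
        float keyTimes[] = new float[values.length];
        float cumulative[] = new float[values.length];   /* cumulative[0] == 0 */
        float total = 0;
        for (int i = 1; i < values.length; i++) {
            total += distanceBetween( values[i - 1], values[i] );
            cumulative[i] = total;
        }
        for (int i = 0; i < values.length; i++) {
            keyTimes[i] = (total > 0) ? cumulative[i] / total
                                      : (float) i / (values.length - 1);   /* degenerate case */
        }
        return keyTimes;
    }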
This piece takes a unit-normalized simple time (i.e., a sample time), the element duration, any specified keyTimes, and figures out the canonical unit progress through a given interval. The result is an interval index and a proportion through that interval. If only two values have been specified (or are implicit as in simple "BY" and "TO" animation), there is only one interval. The canonical time result is an index into the set of defined intervals (0 in the simple case), and a floating point value in the range (0.0 ≤ p < 1.0), indicating progress through the given interval. Note that it is illegal to pass in a sample time greater than or equal to the simple duration. When an element is frozen at the end of the simple duration (or an even multiple thereof), the time services will query a different interface to sample the last defined value.
Since the spline interpolation is a function of time and not of the values, this piece also encapsulates the interpretation of the spline math.
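A sketch of this calculation, following the calcCanonicalTime() signature in the classes below (output parameters are modeled as one-element arrays here; spline correction of the progress value via keySplines is omitted):

    /* Map a simple time in [0, simpleDur) to an interval index and a progress
     * fraction, using the keyTimes array (which always holds at least two values). */
    protected void calcCanonicalTime( float simpleTime, float simpleDur,
                                      int intervalOut[], float progressOut[] ) {
        float t = simpleTime / simpleDur;                 /* unit-normalized, 0 <= t < 1 */
        int i = 0;
        while (i + 2 < keyTimes.length && t >= keyTimes[i + 1]) i++;
        float span = keyTimes[i + 1] - keyTimes[i];
        intervalOut[0] = i;
        progressOut[0] = (span > 0) ? (t - keyTimes[i]) / span : 0;
    }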
The work to interpolate between common scalar types (especially the various CSS length units) is logically separate from generalized animation, and is provided as a centralized set of functions. This makes it much easier to extend the framework with other animation functions. This piece hides all the work to convert between units, etc. These are class static functions for a framework utility class, or in a base animation class implementation.
The units conversion methods include the target element as a parameter, to use for calculating relative units.
ConvertUnitsToUnits( DOMElement target, value with units, result units, &result);
Interpolate( DOMElement target, value1_with_units, value2_with_units, result units, unit distance, &result );
A set of classes and interfaces associated with animation support are defined below. I have defined some pieces as interfaces to easily simulate multiple inheritance in Java, although this needs review. We can then define an implementation of the interfaces that supports delegation of the interface from the primary element class implementations. This will likely require a back pointer to the parent/host-object (I am not yet sure what interface this is - a DOMElement or something?).
These are still incomplete - the compositor classes need attention. Someone with intimate knowledge of the Batik CSS engine and related support as well as the DOM animation layer needs to help fill in a lot of blanks.
This is somewhere between Java and pseudo-code. Forgive my rusty memory of Java syntax.
package SMIL.Animation;
/* AnimValue and subclasses
 * Represents an individual from/to/by or "values" value. This is
 * created with the tokenized syntax for a single value, and uses
 * common parsing facilities to determine the value type (units).
 * TODO: figure out where the parsing methods go. */
class AnimValue {
    /* owner needed, e.g., to handle recalc for relative units, etc. */
    protected AnimElement owner;
    protected String syntaxSpec;
}

class AnimCSSLengthValue extends AnimValue {
    /* Is there another centralized CSS service and associated types
     * for dealing with CSS length units, etc.? */
    public final enum TYPES = { em, ex, px, pct, mm, cm, in, ??? };
    protected TYPES type;
    protected float value;

    /* This holds the value in canonical units. If the type is one of
     * the relative forms, then it will be updated as needed. */
    protected float canonicalValue;

    /* shared methods for conversion, etc. */
    package static ConvertUnitsToUnits( DOMElement target, AnimCSSLengthValue origValue,
                                        TYPES resultUnits, AnimCSSLengthValue &resultValue );

    /* Used to compute paced timing */
    package static ComputeDistance( AnimCSSLengthValue from, AnimCSSLengthValue to,
                                    float &distance );

    package static Interpolate( DOMElement target, float unitDistance,
                                AnimCSSLengthValue startValue, AnimCSSLengthValue endValue,
                                TYPES resultUnits, AnimCSSLengthValue &resultValue );
}

class AnimCSSColorValue extends AnimValue {
    /* However colors are represented in SVG, we hold that here. */
    protected RGB(?) color;

    /* Used to compute paced timing */
    package static ComputeDistance( RGB from, RGB to, float &distance );

    package static Interpolate( float unitDistance, RGB startValue, RGB endValue,
                                RGB &resultValue );
}

class AnimMotionValue extends AnimValue {
    /* However positions are represented in SVG, we hold that here.
     * I.e. convert the path specification to points, for simpler
     * interpolation and manipulation. Assume that points are in the
     * correct coordinate space units for the target element. */
    protected SVGPoint(?) point;

    /* Used to compute paced timing */
    package static ComputeDistance( SVGPoint from, SVGPoint to, float &distance );

    package static Interpolate( float unitDistance, SVGPoint startValue,
                                SVGPoint endValue, SVGPoint &resultValue );
}

class AnimSVGTransformValue extends AnimValue {
    /* translate, scale, rotate all have x and y properties.
     * If y is NaN, then Y was omitted for translate and scale.
     * If x and y are NaN, then they were omitted for rotate.
     * rotate also has an angle.
     * skewX and skewY only use the angle. */
    protected float x;
    protected float y;
    protected float angle;

    /* Used to compute paced timing */
    package static ComputeDistance( AnimSVGTransformValue from, AnimSVGTransformValue to,
                                    float &distance );

    package static Interpolate( float unitDistance, AnimSVGTransformValue start,
                                AnimSVGTransformValue end, AnimSVGTransformValue &result );
}
The AnimComposable interface is for all clients of the compositor.
interface AnimComposable {
    package void calcResult( AnimValue base, AnimValue &result );
}
How do I declare an interface to be abstract - only intended as a base class?
abstract interface SimpleAnimElement extends AnimComposable {
    /*** converted ATTRIBUTE values - these are useful values, from validated strings */

    /* Not sure how best to declare a generic element reference */
    protected DOMElement targetElement;

    /* For right now, must either be an attribute or a CSS property */
    protected boolean targetIsCSSProp;

    /* This should really be the animated layer of the DOM -
     * are there SVGAnimatedAttr and SVGAnimatedCSSProp Base
     * classes or interfaces? */
    protected DOMAttribute targetAttr;
    protected CSSProp targetProp;
}
abstract interface InterpolatingAnimElement extends SimpleAnimElement {
    /* I forget how to define enumerations ... */
    package enum CALC_MODE { discrete, linear, paced, spline };

    /* This is the validated result, correcting for value type
     * conflicts, etc. */
    protected CALC_MODE calcMode;

    /* These might change to enums someday - we'll assume
     * boolean for now */
    protected boolean additive;
    protected boolean cumulative;

    /* The length changes only if the attribute is modified,
     * so we use an array. If syntax specified as:
     *   from/to,   we set 2 values: from, to
     *   from/by,   we set 2 values: from, from+by
     *   by (only), we force additive and set 2 values: 0, by
     *   to (only), we set toAnim, and set 1 value: to
     * Note that if values.length==1, we have a "TO" animation,
     * and ignore additive. */
    protected AnimValue values[];

    /* The length changes only if the attribute is modified,
     * so we use an array. If valid keyTimes were set, this
     * contains those values. Otherwise, this will have values
     * computed according to the default semantics for pacing
     * (depending on calcMode).
     * Thus this will have at least 2 values (1 interval),
     * even for the simplest case. */
    protected float keyTimes[];

    /* We store 4 floats per spline point.
     * If no keySplines were set (or syntax was bad), this
     * will be null/empty. */
    protected float keySplines[][4];

    /* This is a generic interface to figure out which
     * interval we are in, and how far into it. */
    protected calcCanonicalTime( float simpleTime, float simpleDur,
                                 int &interval, float &progressWithinInterval );
}
Note: Whatever implements these SVGXXXElement interfaces must also implement TimeClient.
interface SVGSetElement extends SimpleAnimElement {
    /* The sample ALWAYS returns the to value */
    protected AnimValue value;
}

interface SVGAnimateElement extends InterpolatingAnimElement {
    /* Most of what we need is in superclass, but this will force
     * single (scalar?) types in values, etc. */
}

interface SVGAnimateMotionElement extends InterpolatingAnimElement {
    /* Need to add special parsing for values, and to handle path
     * argument. Not clear if path is parsed into AnimMotionValues,
     * or handled separately - up to implementation.
     * Will need special handling of the pacing - must override
     * the initialization of intervals for paced animation to set
     * keyTimes based upon estimated arc-length of each path-segment.
     * This is a client of the CTM compositor (and...?) */
}

interface SVGAnimateColorElement extends InterpolatingAnimElement {
    /* Need to add parsing for RGB values. Otherwise pretty standard. */
}

interface SVGAnimateTransformElement extends InterpolatingAnimElement {
    public final enum TYPE = { translate, scale, rotate, skewX, skewY };
    protected TYPE type;

    /* Need to add parsing for AnimSVGTransformValue values.
     * Need to consider how compositor works, since the spec says it
     * acts like the resulting transform matrix is a single property. */
}