So far we have been primarily concerned with each media type or format individually. We have noted that certain media are based on spatial and/or temporal representations, while others may be static.
Once we start to integrate media, spatial and temporal implications become even more critical. For example, static text may need to index or label a portion of video at a given instant or over a segment of time; the integration is then temporal, and also spatial if the label is placed at a given location (or at locations that move over time).
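The idea of a label that is anchored both in time and in space can be sketched as a small data structure. This is only an illustration; the class and field names below are invented for the example and do not come from any particular multimedia format.

```python
from dataclasses import dataclass


@dataclass
class TimedLabel:
    """A text label anchored to a video both temporally and spatially.

    All names here are illustrative, not drawn from a real standard.
    """
    text: str
    start_s: float   # time the label appears (seconds)
    end_s: float     # time the label disappears (seconds)
    x: int           # horizontal position in the frame (pixels)
    y: int           # vertical position in the frame (pixels)

    def active_at(self, t: float) -> bool:
        """True if the label should be rendered at playback time t."""
        return self.start_s <= t < self.end_s


label = TimedLabel("Goal!", start_s=12.0, end_s=15.0, x=40, y=400)
print(label.active_at(13.5))  # True: 13.5 s lies inside [12.0, 15.0)
print(label.active_at(16.0))  # False: after the label's time segment
```

A label whose position moves over time would simply replace the fixed `x` and `y` with a function of `t`, which is where the spatial and temporal aspects of the integration meet.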
Clearly, it is important to know the tolerances and limits of each medium, as integration requires this knowledge for synchronisation and indeed creates further limits (e.g. the combined bandwidth of two media types increases, and if audio is encoded at a 48 kHz sampling rate and needs to accompany video being streamed out at 60 frames per second, then inter-stream synchronisation is not necessarily straightforward).
It is common (indeed obvious) that media types are bundled together for ease of delivery, storage and so on. It is therefore not surprising that formats have been developed to support, store and deliver media in an integrated form.
The need for interchange between different multimedia applications, probably running on different platforms, has led to the evolution of common interchange file formats. Many of these formats build on underlying individual media formats (MPEG, JPEG etc.); however, further relationships become necessary when the media are truly integrated into multimedia. Spatial, temporal, structural and procedural constraints will exist between the media. This is especially true now that interaction is a common feature of multimedia.