Commit fe37c5aa authored by Wim Taymans's avatar Wim Taymans

docs/design/draft-klass.txt: Proposal for klass field values.

Original commit message from CVS:
* docs/design/draft-klass.txt:
Proposal for klass field values.

* docs/design/part-streams.txt:
Start of a doc describing stream anatomy.
Element Klass definition
Applications should be able to retrieve elements from the registry of existing
elements based on specific capabilities or features of the element.
A playback application might want to retrieve all the elements that can be
used for visualisation, for example, or a video editor might want to select all
video effect filters.
The definition of klass values for elements should be driven by such use cases.
The GstElementDetails contains a field named klass that is a pointer to a
string describing the element type.
In this document we describe the format and contents of the string. Elements
should adhere to this specification, although adherence is not enforced, in
order to allow for wild (application-specific) customisation.
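For orientation, the sketch below mirrors the shape of GstElementDetails to show where the klass string sits next to the other element metadata. Note this is an illustrative plain-C struct with an illustrative accessor, not the actual GStreamer header, and the field values are example strings:

```c
#include <string.h>

/* Simplified mirror of GstElementDetails; the real structure lives in
 * gstelement.h.  The klass field holds the string whose format is
 * described in the rest of this document. */
typedef struct {
  const char *longname;    /* long English name, e.g. "Vorbis audio decoder" */
  const char *klass;       /* the klass string, e.g. "Decoder/Audio" */
  const char *description; /* what the element does */
  const char *author;      /* who wrote it */
} ElementDetails;

/* Illustrative accessor (not a GStreamer API). */
const char *
element_details_get_klass (const ElementDetails *details)
{
  return details->klass;
}
```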
1) string format
The string consists of an unordered list of keywords separated with a '/'
character. While the / suggests a hierarchy, this is not the case.
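Because the klass value is an unordered '/'-separated keyword list rather than a hierarchy, applications should match whole keywords instead of raw substrings; a plain substring search would also hit partial tokens (for instance the query "Net" inside "Network"). A minimal sketch of exact-token matching in plain C (klass_contains is an illustrative helper, not a GStreamer API):

```c
#include <stdbool.h>
#include <string.h>

/* Return true if the '/'-separated klass string contains the exact
 * keyword.  A bare strstr() would be wrong: it would report a match
 * for "Net" inside "Network".  We therefore require the hit to be
 * bounded by the start/end of the string or by a '/' separator. */
bool
klass_contains (const char *klass, const char *keyword)
{
  size_t klen = strlen (keyword);
  const char *p = klass;

  while ((p = strstr (p, keyword)) != NULL) {
    bool starts_ok = (p == klass || p[-1] == '/');
    bool ends_ok = (p[klen] == '\0' || p[klen] == '/');
    if (starts_ok && ends_ok)
      return true;
    p += 1;                     /* skip past this partial hit */
  }
  return false;
}
```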
2) keyword categories
- functional
Categories are based on the _intended usage_ of the element. Some elements
might have other side effects (especially filters/effects). The purpose
is to list enough keywords so that applications can do meaningful filtering,
not to completely describe the functionality; that is expressed in caps etc.
* Source : produces data
* Sink : consumes data
* Filter : transforms data, no modification on the data is
intended (although it might be unavoidable)
* Effect : applies an effect to some data; modification of the data is intended
* Demuxer : splits audio, video, ... from a stream
* Muxer : combines audio, video, ... into one stream
* Decoder : decodes encoded data into a raw format
* Encoder : encodes raw data into an encoded format
* Visualisation : transforms audio into video
* Analyzer : reports about the stream contents.
* Debug : tee, identity, fakesrc, navseek, ...
* Control : controls some aspect of a hardware device
* Extracter : extracts tags/headers from a stream
* Formatter : adds tags/headers to a stream
* ...
- Based on media type
The purpose is to make it possible to select elements operating on a
specific type of media. An audio application, for example, must be able
to filter out the elements operating on audio.
* Audio : operates on audio data
* Video : operates on video data
* Text : operates on text data
* Metadata : operates on metadata
* ...
- Extra features
The purpose is to further specialize the element, mostly for
application specific needs.
* Network : element is used in networked situations
* Protocol : implements some protocol (RTSP, HTTP, ...)
* Payloader : encapsulates data as a payload (RTP, RDT, ...)
* Depayloader : strips a payload (RTP, RDT, ...)
* RTP : intended to be used in RTP applications
3) suggested order:
<functional>/<media type>/<extra...>
4) examples:
apedemux : Extracter/Metadata
autoaudiosink : Sink/Audio
cairotimeoverlay : Effect/Muxer/Video/Text
dvdec : Decoder/Video
dvdemux : Demuxer
goom : Visualisation
id3demux : Extracter/Metadata
udpsrc : Source/Network/Protocol
videomixer : Effect/Muxer/Video
ffmpegcolorspace : Filter/Video (intended use to convert video)
vertigotv : Effect/Video (intended use is to change the video)
volume : Effect/Audio (intended use is to change the audio)
vorbisdec : Decoder/Audio
vorbisenc : Encoder/Audio
oggmux : Muxer
adder : Effect/Muxer/Audio
videobox : Effect/Video
alsamixer : Control/Audio
audioconvert : Filter/Audio
audioresample : Filter/Audio
xvimagesink : Sink/Video
navseek : Debug
decodebin : Decoder/Demuxer
level : Filter/Analyzer/Audio
Use cases:
- get a list of all elements implementing a video effect (pitivi):
klass.contains (Effect & Video)
- get list of muxers (pitivi):
klass.contains (Muxer & !Effect)
- get list of encoders (pitivi):
klass.contains (Encoder)
- Get a list of all visualisations (totem):
klass.contains (Visualisation)
- Get a list of all decoders/demuxer/metadata parsers/vis (playbin):
klass.contains (Visualisation | Demuxer | Decoder | (Extracter & Metadata))
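The use cases above combine keyword tests with boolean operators. A minimal sketch of one of them, the pitivi query "Muxer & !Effect", over a small hypothetical registry snapshot taken from the example list (the Entry table, klass_contains and count_pure_muxers helpers are illustrative, not GStreamer API):

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical registry snapshot: element name -> klass string,
 * values taken from the examples above. */
typedef struct { const char *name; const char *klass; } Entry;

static const Entry registry[] = {
  { "oggmux",     "Muxer" },
  { "videomixer", "Effect/Muxer/Video" },
  { "vorbisenc",  "Encoder/Audio" },
  { "vertigotv",  "Effect/Video" },
  { "goom",       "Visualisation" },
};

/* Exact-token match on the '/'-separated klass string. */
bool
klass_contains (const char *klass, const char *kw)
{
  size_t n = strlen (kw);
  for (const char *p = klass; (p = strstr (p, kw)) != NULL; p++)
    if ((p == klass || p[-1] == '/') && (p[n] == '\0' || p[n] == '/'))
      return true;
  return false;
}

/* "get list of muxers (pitivi): klass.contains (Muxer & !Effect)" --
 * this excludes videomixer, whose klass is Effect/Muxer/Video. */
int
count_pure_muxers (void)
{
  int n = 0;
  for (size_t i = 0; i < sizeof registry / sizeof registry[0]; i++)
    if (klass_contains (registry[i].klass, "Muxer") &&
        !klass_contains (registry[i].klass, "Effect")) {
      printf ("muxer: %s\n", registry[i].name);
      n++;
    }
  return n;
}
```

With the snapshot above, only oggmux satisfies the query; videomixer carries Muxer but is filtered out by the !Effect clause.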
Stream anatomy

This document describes the objects that are passed from element to
element in the streaming thread.
Stream objects
The following objects are to be expected in the streaming thread:
- events
  - NEWSEGMENT (NS)
  - EOS (EOS) *
  - TAG (T)
- buffers (B) *
Objects marked with * need to be synchronised to the clock in sinks
and live sources.
Typical stream
A typical stream starts with a NEWSEGMENT event that marks the
buffer timestamp range. After that, buffers are sent one after the
other. After the last buffer, an EOS event marks the end of the stream.
No more buffers are to be processed after the EOS event.
+--+ +-++-+ +-+ +---+
|NS| |B||B| ... |B| |EOS|
+--+ +-++-+ +-+ +---+
1) NEWSEGMENT, rate, start/stop, time
- marks valid buffer timestamp range
- marks stream_time of buffers in the NEWSEGMENT
- marks playback rate
2) N buffers
- displayable buffers are between start/stop of the NEW_SEGMENT
- display_time: (B.timestamp - NS.start) * NS.abs_rate
- stream_time: display_time + NS.time
- sync_time: display_time + base_time
3) EOS
- marks the end of data, nothing is to be expected after EOS
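The timing formulas in step 2 can be sketched as plain arithmetic over the NEWSEGMENT fields. This is a minimal illustration of the formulas exactly as the document states them; the NewSegment struct and function names are ours, not GStreamer API, and ClockTime stands in for nanosecond GstClockTime values:

```c
#include <stdint.h>

typedef uint64_t ClockTime; /* nanoseconds, like GstClockTime */

/* The NEWSEGMENT fields used by the formulas (names are ours). */
typedef struct {
  ClockTime start;    /* start of the valid buffer timestamp range */
  ClockTime time;     /* stream_time corresponding to NS.start */
  double    abs_rate; /* absolute playback rate */
} NewSegment;

/* display_time: (B.timestamp - NS.start) * NS.abs_rate */
ClockTime
display_time (const NewSegment *ns, ClockTime timestamp)
{
  return (ClockTime) ((timestamp - ns->start) * ns->abs_rate);
}

/* stream_time: display_time + NS.time */
ClockTime
stream_time (const NewSegment *ns, ClockTime timestamp)
{
  return display_time (ns, timestamp) + ns->time;
}

/* sync_time: display_time + base_time */
ClockTime
sync_time (const NewSegment *ns, ClockTime timestamp, ClockTime base_time)
{
  return display_time (ns, timestamp) + base_time;
}
```

For example, with NS.start = 1000, NS.time = 500, abs_rate = 1.0 and a buffer timestamp of 3000, the display_time is 2000 and the stream_time is 2500.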