Commit 42adb02a authored by Mathieu Duponchelle

docstrings: port ulinks to markdown links

parent c5738c61
Pipeline #58474 passed with stages in 47 minutes and 19 seconds
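The mechanical rewrite this commit applies across all the hunks below can be sketched as a small script. This is a hypothetical helper for illustration, not part of the commit: it turns DocBook `<ulink url="X">label</ulink>` tags into markdown `[label](X)` links, and collapses the case where the label merely repeats the URL into an autolink `<X>` (the form used in the WebRTC headers).

```python
import re

# Matches <ulink url="...">label</ulink>; DOTALL so labels split
# across comment lines are still captured.
ULINK_RE = re.compile(r'<ulink url="([^"]+)">(.*?)</ulink>', re.S)

def port_ulink(match: re.Match) -> str:
    url, label = match.group(1), match.group(2)
    if label.strip() == url:
        # Label is just the URL: use a markdown autolink.
        return "<%s>" % url
    return "[%s](%s)" % (label, url)

def port_ulinks(text: str) -> str:
    """Convert all DocBook ulink tags in text to markdown links."""
    return ULINK_RE.sub(port_ulink, text)
```

Note that a real conversion also has to rewrap the surrounding comment lines to stay under the line-length limit, which is why most hunks below touch neighbouring lines too; this sketch only handles the link syntax itself.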
@@ -27,8 +27,8 @@
  *
  * The chromaprint element calculates an acoustic fingerprint for an
  * audio stream which can be used to identify a song and look up
- * further metadata from the <ulink url="http://acoustid.org/">Acoustid</ulink>
- * and Musicbrainz databases.
+ * further metadata from the [Acoustid](http://acoustid.org/) and Musicbrainz
+ * databases.
  *
  * ## Example launch line
  * |[
...
@@ -23,19 +23,21 @@
  * @title: dfbvideosink
  *
  * DfbVideoSink renders video frames using the
- * <ulink url="http://www.directfb.org/">DirectFB</ulink> library.
+ * [DirectFB](http://www.directfb.org/) library.
  * Rendering can happen in two different modes :
  *
  * * Standalone: this mode will take complete control of the monitor forcing
- * <ulink url="http://www.directfb.org/">DirectFB</ulink> to fullscreen layout.
+ * DirectFB to fullscreen layout.
+ *
  * This is convenient to test using the gst-launch-1.0 command line tool or
  * other simple applications. It is possible to interrupt playback while
  * being in this mode by pressing the Escape key.
  * This mode handles navigation events for every input device supported by
- * the <ulink url="http://www.directfb.org/">DirectFB</ulink> library, it will
- * look for available video modes in the fb.modes file and try to switch
- * the framebuffer video mode to the most suitable one. Depending on
- * hardware acceleration capabilities the element will handle scaling or not.
+ * the DirectFB library, it will look for available video modes in the fb.modes
+ * file and try to switch the framebuffer video mode to the most suitable one.
+ * Depending on hardware acceleration capabilities the element will handle
+ * scaling or not.
+ *
  * If no acceleration is available it will do clipping or centering of the
  * video frames respecting the original aspect ratio.
  *
@@ -43,7 +45,8 @@
  * #GstDfbVideoSink:surface provided by the
  * application developer. This is a more advanced usage of the element and
  * it is required to integrate video playback in existing
- * <ulink url="http://www.directfb.org/">DirectFB</ulink> applications.
+ * DirectFB applications.
+ *
  * When using this mode the element just renders to the
  * #GstDfbVideoSink:surface provided by the
  * application, that means it won't handle navigation events and won't resize
...
@@ -25,7 +25,7 @@
  * @see_also: timidity, wildmidi
  *
  * This element renders midi-events as audio streams using
- * <ulink url="http://fluidsynth.sourceforge.net//">Fluidsynth</ulink>.
+ * [Fluidsynth](http://fluidsynth.sourceforge.net/).
  * It offers better sound quality compared to the timidity or wildmidi element.
  *
  * ## Example pipeline
...
@@ -48,8 +48,9 @@
  * @title: katedec
  * @see_also: oggdemux
  *
- * This element decodes Kate streams
- * <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
+ * This element decodes Kate streams.
+ *
+ * [Kate](http://libkate.googlecode.com/) is a free codec
  * for text based data, such as subtitles. Any number of kate streams can be
  * embedded in an Ogg stream.
  *
...
@@ -49,10 +49,11 @@
  * @title: kateenc
  * @see_also: oggmux
  *
- * This element encodes Kate streams
- * <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
- * for text based data, such as subtitles. Any number of kate streams can be
- * embedded in an Ogg stream.
+ * This element encodes Kate streams.
+ *
+ * [Kate](http://libkate.googlecode.com/) is a free codec for text based data,
+ * such as subtitles. Any number of kate streams can be embedded in an Ogg
+ * stream.
  *
  * libkate (see above url) is needed to build this plugin.
  *
...
@@ -48,12 +48,12 @@
  * @title: tiger
  * @see_also: katedec
  *
- * This element decodes and renders Kate streams
- * <ulink url="http://libkate.googlecode.com/">Kate</ulink> is a free codec
- * for text based data, such as subtitles. Any number of kate streams can be
- * embedded in an Ogg stream.
+ * This element decodes and renders Kate streams.
+ * [Kate](http://libkate.googlecode.com/) is a free codec for text based data,
+ * such as subtitles. Any number of kate streams can be embedded in an Ogg
+ * stream.
  *
- * libkate (see above url) and <ulink url="http://libtiger.googlecode.com/">libtiger</ulink>
+ * libkate (see above url) and [libtiger](http://libtiger.googlecode.com/)
  * are needed to build this element.
  *
  * ## Example pipeline
...
@@ -27,7 +27,8 @@
  * @see_also: #GstAudioConvert #GstAudioResample, #GstAudioTestSrc, #GstAutoAudioSink
  *
  * The LADSPA (Linux Audio Developer's Simple Plugin API) element is a bridge
- * for plugins using the <ulink url="http://www.ladspa.org/">LADSPA</ulink> API.
+ * for plugins using the [LADSPA](http://www.ladspa.org/) API.
+ *
  * It scans all installed LADSPA plugins and registers them as gstreamer
  * elements. If available it can also parse LRDF files and use the metadata for
  * element classification. The functionality you get depends on the LADSPA plugins
...
@@ -30,8 +30,8 @@
  * a successor to LADSPA (Linux Audio Developer's Simple Plugin API).
  *
  * The LV2 element is a bridge for plugins using the
- * <ulink url="http://www.lv2plug.in/">LV2</ulink> API. It scans all
- * installed LV2 plugins and registers them as gstreamer elements.
+ * [LV2](http://www.lv2plug.in/) API. It scans all installed LV2 plugins and
+ * registers them as gstreamer elements.
  */
 #ifdef HAVE_CONFIG_H
...
@@ -28,8 +28,8 @@
 /**
  * SECTION:element-modplug
  *
- * Modplug uses the <ulink url="http://modplug-xmms.sourceforge.net/">modplug</ulink>
- * library to decode tracked music in the MOD/S3M/XM/IT and related formats.
+ * Modplug uses the [modplug](http://modplug-xmms.sourceforge.net/) library to
+ * decode tracked music in the MOD/S3M/XM/IT and related formats.
  *
  * ## Example pipeline
  *
...
@@ -25,9 +25,10 @@
  * @see_also: mpeg2dec
  *
  * This element encodes raw video into an MPEG-1/2 elementary stream using the
- * <ulink url="http://mjpeg.sourceforge.net/">mjpegtools</ulink> library.
+ * [mjpegtools](http://mjpeg.sourceforge.net/) library.
+ *
  * Documentation on MPEG encoding in general can be found in the
- * <ulink url="https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776">MJPEG Howto</ulink>
+ * [MJPEG Howto](https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776)
  * and on the various available parameters in the documentation
  * of the mpeg2enc tool in particular, which shares options with this element.
  *
...
@@ -26,9 +26,9 @@
  *
  * This element is an audio/video multiplexer for MPEG-1/2 video streams
  * and (un)compressed audio streams such as AC3, MPEG layer I/II/III.
- * It is based on the <ulink url="http://mjpeg.sourceforge.net/">mjpegtools</ulink> library.
+ * It is based on the [mjpegtools](http://mjpeg.sourceforge.net/) library.
  * Documentation on creating MPEG videos in general can be found in the
- * <ulink url="https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776">MJPEG Howto</ulink>
+ * [MJPEG Howto](https://sourceforge.net/docman/display_doc.php?docid=3456&group_id=5776)
  * and the man-page of the mplex tool documents the properties of this element,
  * which are shared with the mplex tool.
  *
...
@@ -23,8 +23,8 @@
  * @see_also: #GstOpenMptDec
  *
  * openmpdec decodes module music formats, such as S3M, MOD, XM, IT.
- * It uses the <ulink url="https://lib.openmpt.org">OpenMPT library</ulink>
- * for this purpose. It can be autoplugged and therefore works with decodebin.
+ * It uses the [OpenMPT library](https://lib.openmpt.org) for this purpose.
+ * It can be autoplugged and therefore works with decodebin.
  *
  * ## Example launch line
  *
...
@@ -23,7 +23,7 @@
  * SECTION:element-srtsink
  * @title: srtsink
  *
- * srtsink is a network sink that sends <ulink url="http://www.srtalliance.org/">SRT</ulink>
+ * srtsink is a network sink that sends [SRT](http://www.srtalliance.org/)
  * packets to the network.
  *
  * ## Examples</title>
...
@@ -23,7 +23,7 @@
  * SECTION:element-srtsrc
  * @title: srtsrc
  *
- * srtsrc is a network source that reads <ulink url="http://www.srtalliance.org/">SRT</ulink>
+ * srtsrc is a network source that reads [SRT](http://www.srtalliance.org/)
  * packets from the network.
  *
  * ## Examples
...
@@ -21,8 +21,9 @@
  * SECTION:element-voaacenc
  * @title: voaacenc
  *
- * AAC audio encoder based on vo-aacenc library
- * <ulink url="http://sourceforge.net/projects/opencore-amr/files/vo-aacenc/">vo-aacenc library source file</ulink>.
+ * AAC audio encoder based on vo-aacenc library.
+ *
+ * [vo-aacenc library source file](http://sourceforge.net/projects/opencore-amr/files/vo-aacenc/)
  *
  * ## Example launch line
  * |[
...
@@ -23,7 +23,7 @@
  * @see_also: #GstAmrWbDec, #GstAmrWbParse
  *
  * AMR wideband encoder based on the
- * <ulink url="http://www.penguin.cz/~utx/amr">reference codec implementation</ulink>.
+ * [reference codec implementation](http://www.penguin.cz/~utx/amr).
  *
  * ## Example launch line
  * |[
...
@@ -27,8 +27,9 @@
  *
  * The waylandsink is creating its own window and render the decoded video frames to that.
  * Setup the Wayland environment as described in
- * <ulink url="http://wayland.freedesktop.org/building.html">Wayland</ulink> home page.
- * The current implementaion is based on weston compositor.
+ * [Wayland](http://wayland.freedesktop.org/building.html) home page.
+ *
+ * The current implementation is based on weston compositor.
  *
  * ## Example pipelines
  * |[
...
@@ -23,7 +23,7 @@
  * @title: GstWebRTCDataChannel
  * @see_also: #GstWebRTCRTPTransceiver
  *
- * <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport">http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport</ulink>
+ * <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransport>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -23,8 +23,9 @@
  * @see_also: #GstWildmidiDec
  *
  * wildmididec decodes MIDI files.
- * It uses <ulink url="https://www.mindwerks.net/projects/wildmidi/">WildMidi</ulink>
- * for this purpose. It can be autoplugged and therefore works with decodebin.
+ *
+ * It uses [WildMidi](https://www.mindwerks.net/projects/wildmidi/) for this
+ * purpose. It can be autoplugged and therefore works with decodebin.
  *
  * ## Example launch line
  *
...
@@ -23,7 +23,7 @@
  * @title: GstWebRTCDTLSTransport
  * @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver, #GstWebRTCICETransport
  *
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcdtlstransport">https://www.w3.org/TR/webrtc/#rtcdtlstransport</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcdtlstransport>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -23,7 +23,7 @@
  * @title: GstWebRTCICETransport
  * @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver, #GstWebRTCDTLSTransport
  *
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcicetransport">https://www.w3.org/TR/webrtc/#rtcicetransport</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcicetransport>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -22,7 +22,7 @@
  * @short_description: RTCSessionDescription object
  * @title: GstWebRTCSessionDescription
  *
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcsessiondescription-class">https://www.w3.org/TR/webrtc/#rtcsessiondescription-class</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcsessiondescription-class>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -38,7 +38,7 @@ GType gst_webrtc_session_description_get_type (void);
  * @type: the #GstWebRTCSDPType of the description
  * @sdp: the #GstSDPMessage of the description
  *
- * See <ulink url="https://www.w3.org/TR/webrtc/#rtcsessiondescription-class">https://www.w3.org/TR/webrtc/#rtcsessiondescription-class</ulink>
+ * See <https://www.w3.org/TR/webrtc/#rtcsessiondescription-class>
  */
 struct _GstWebRTCSessionDescription
 {
...
@@ -23,7 +23,7 @@
  * @title: GstWebRTCRTPReceiver
  * @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPTransceiver
  *
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface">https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcrtpreceiver-interface>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -23,7 +23,7 @@
  * @title: GstWebRTCRTPSender
  * @see_also: #GstWebRTCRTPReceiver, #GstWebRTCRTPTransceiver
  *
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcrtpsender-interface">https://www.w3.org/TR/webrtc/#rtcrtpsender-interface</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcrtpsender-interface>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -23,7 +23,7 @@
  * @title: GstWebRTCRTPTransceiver
  * @see_also: #GstWebRTCRTPSender, #GstWebRTCRTPReceiver
  *
- * <ulink url="https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface">https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface</ulink>
+ * <https://www.w3.org/TR/webrtc/#rtcrtptransceiver-interface>
  */
 #ifdef HAVE_CONFIG_H
...
@@ -82,7 +82,7 @@ typedef enum /*< underscore_name=gst_webrtc_dtls_transport_state >*/
  * @GST_WEBRTC_ICE_GATHERING_STATE_GATHERING: gathering
  * @GST_WEBRTC_ICE_GATHERING_STATE_COMPLETE: complete
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate">http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcicegatheringstate>
  */
 typedef enum /*< underscore_name=gst_webrtc_ice_gathering_state >*/
 {
@@ -101,7 +101,7 @@ typedef enum /*< underscore_name=gst_webrtc_ice_gathering_state >*/
  * @GST_WEBRTC_ICE_CONNECTION_STATE_DISCONNECTED: disconnected
  * @GST_WEBRTC_ICE_CONNECTION_STATE_CLOSED: closed
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate">http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtciceconnectionstate>
  */
 typedef enum /*< underscore_name=gst_webrtc_ice_connection_state >*/
 {
@@ -123,7 +123,7 @@ typedef enum /*< underscore_name=gst_webrtc_ice_connection_state >*/
  * @GST_WEBRTC_SIGNALING_STATE_HAVE_LOCAL_PRANSWER: have-local-pranswer
  * @GST_WEBRTC_SIGNALING_STATE_HAVE_REMOTE_PRANSWER: have-remote-pranswer
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate">http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcsignalingstate>
  */
 typedef enum /*< underscore_name=gst_webrtc_signaling_state >*/
 {
@@ -144,7 +144,7 @@ typedef enum /*< underscore_name=gst_webrtc_signaling_state >*/
  * @GST_WEBRTC_PEER_CONNECTION_STATE_FAILED: failed
  * @GST_WEBRTC_PEER_CONNECTION_STATE_CLOSED: closed
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate">http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcpeerconnectionstate>
  */
 typedef enum /*< underscore_name=gst_webrtc_peer_connection_state >*/
 {
@@ -185,7 +185,7 @@ typedef enum /*< underscore_name=gst_webrtc_ice_component >*/
  * @GST_WEBRTC_SDP_TYPE_ANSWER: answer
  * @GST_WEBRTC_SDP_TYPE_ROLLBACK: rollback
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#rtcsdptype">http://w3c.github.io/webrtc-pc/#rtcsdptype</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#rtcsdptype>
  */
 typedef enum /*< underscore_name=gst_webrtc_sdp_type >*/
 {
@@ -282,7 +282,7 @@ typedef enum /*< underscore_name=gst_webrtc_fec_type >*/
  * GST_WEBRTC_SCTP_TRANSPORT_STATE_CONNECTED: connected
  * GST_WEBRTC_SCTP_TRANSPORT_STATE_CLOSED: closed
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate">http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcsctptransportstate>
  *
  * Since: 1.16
  */
@@ -301,7 +301,7 @@ typedef enum /*< underscore_name=gst_webrtc_sctp_transport_state >*/
  * GST_WEBRTC_PRIORITY_TYPE_MEDIUM: medium
  * GST_WEBRTC_PRIORITY_TYPE_HIGH: high
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype">http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcprioritytype>
  *
  * Since: 1.16
  */
@@ -321,7 +321,7 @@ typedef enum /*< underscore_name=gst_webrtc_priority_type >*/
  * GST_WEBRTC_DATA_CHANNEL_STATE_CLOSING: closing
  * GST_WEBRTC_DATA_CHANNEL_STATE_CLOSED: closed
  *
- * See <ulink url="http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate">http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate</ulink>
+ * See <http://w3c.github.io/webrtc-pc/#dom-rtcdatachannelstate>
  *
  * Since: 1.16
  */
...
@@ -38,8 +38,8 @@
  *
  * The accurip element calculates a CRC for an audio stream which can be used
  * to match the audio stream to a database hosted on
- * <ulink url="http://accuraterip.com/">AccurateRip</ulink>. This database
- * is used to check for a CD rip accuracy.
+ * [AccurateRip](http://accuraterip.com/). This database is used to check for a
+ * CD rip accuracy.
  *
  * ## Example launch line
  * |[
...
@@ -65,9 +65,9 @@
  * @title: festival
  *
  * This element connects to a
- * <ulink url="http://www.festvox.org/festival/index.html">festival</ulink>
- * server process and uses it to synthesize speech. Festival need to run already
- * in server mode, started as `festival --server`
+ * [festival](http://www.festvox.org/festival/index.html) server process and
+ * uses it to synthesize speech. Festival need to run already in server mode,
+ * started as `festival --server`
  *
  * ## Example pipeline
  * |[
...
@@ -26,9 +26,8 @@
  * #GstPcapParse:src-port and #GstPcapParse:dst-port to restrict which packets
  * should be included.
  *
- * The supported data format is the classical <ulink
- * url="https://wiki.wireshark.org/Development/LibpcapFileFormat">libpcap file
- * format</ulink>.
+ * The supported data format is the classical
+ * [libpcap file format](https://wiki.wireshark.org/Development/LibpcapFileFormat)
  *
  * ## Example pipelines
  * |[
...