GStreamer Application Development Manual
Basic tutorial 3: Dynamic pipelines
## Goal

This tutorial shows the rest of the basic concepts required to use GStreamer, which allow building the pipeline "on the fly", as information becomes available, instead of having a monolithic pipeline defined at the beginning of your application.

After this tutorial, you will have the necessary knowledge to start the Playback tutorials. The points reviewed here will be:

- How to attain finer control when linking elements.
- How to be notified of interesting events so you can react in time.
- The various states in which an element can be.

## Introduction

As you are about to see, the pipeline in this tutorial is not completely built before it is set to the playing state. This is OK. If we did not take further action, data would reach the end of the pipeline and the pipeline would produce an error message and stop. But we are going to take further action...

In this example we are opening a file which is multiplexed (or muxed), this is, audio and video are stored together inside a container file. The elements responsible for opening such containers are called demuxers, and some examples of container formats are Matroska (MKV), Quick Time (QT, MOV), Ogg, or Advanced Systems Format (ASF, WMV, WMA).

If a container embeds multiple streams (one video and two audio tracks, for example), the demuxer will separate them and expose them through different output ports. In this way, different branches can be created in the pipeline, dealing with different types of data.
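As an aside, this kind of branching can be sketched on the command line with the gst-launch-1.0 tool (assuming the GStreamer 1.x command-line tools are installed; the sample URI is the one used later in this tutorial). gst-launch-1.0 performs the dynamic pad linking for you:

```shell
# A demuxing pipeline with two branches: name the uridecodebin "d",
# then request pads from it ("d.") for an audio and a video branch.
# The queues decouple the branches so neither blocks the other.
gst-launch-1.0 uridecodebin uri=https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm name=d \
  d. ! queue ! audioconvert ! audioresample ! autoaudiosink \
  d. ! queue ! videoconvert ! autovideosink
```

This is only a sketch of the concept; the rest of the tutorial shows how to do the equivalent linking yourself in C.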
The ports through which GStreamer elements communicate with each other are called pads (GstPad). There exist sink pads, through which data enters an element, and source pads, through which data exits an element. It follows naturally that source elements only contain source pads, sink elements only contain sink pads, and filter elements contain both.

![](/media/202404/1_1713887428.png) ![](/media/202404/2_1713887440.png) ![](/media/202404/3_1713887452.png)

Figure 1. GStreamer elements with their pads.

A demuxer contains one sink pad, through which the muxed data arrives, and multiple source pads, one for each stream found in the container:

![](/media/202404/4_1713887403.png)

Figure 2. A demuxer with two source pads.

For completeness, here you have a simplified pipeline containing a demuxer and two branches, one for audio and one for video. This is NOT the pipeline that will be built in this example:

![](/media/202404/5_1713887383.png)

Figure 3. Example pipeline with two branches.

The main complexity when dealing with demuxers is that they cannot produce any information until they have received some data and have had a chance to look at the container to see what is inside. This is, demuxers start with no source pads to which other elements can link, and thus the pipeline must necessarily terminate at them.

The solution is to build the pipeline from the source down to the demuxer, and set it to run (play).
When the demuxer has received enough information to know about the number and kind of streams in the container, it will start creating source pads. This is the right time for us to finish building the pipeline and attach it to the newly added demuxer pads.

For simplicity, in this example, we will only link to the audio pad and ignore the video.

## Dynamic Hello World

Copy this code into a text file named basic-tutorial-3.c (or find it in your GStreamer installation).

basic-tutorial-3.c

```c
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *resample;
  GstElement *sink;
} CustomData;

/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);

int main(int argc, char *argv[]) {
  CustomData data;
  GstBus *bus;
  GstMessage *msg;
  GstStateChangeReturn ret;
  gboolean terminate = FALSE;

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the elements */
  data.source = gst_element_factory_make ("uridecodebin", "source");
  data.convert = gst_element_factory_make ("audioconvert", "convert");
  data.resample = gst_element_factory_make ("audioresample", "resample");
  data.sink = gst_element_factory_make ("autoaudiosink", "sink");

  /* Create the empty pipeline */
  data.pipeline = gst_pipeline_new ("test-pipeline");

  if (!data.pipeline || !data.source || !data.convert || !data.resample || !data.sink) {
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Build the pipeline. Note that we are NOT linking the source at this
   * point. We will do it later. */
  gst_bin_add_many (GST_BIN (data.pipeline), data.source, data.convert,
      data.resample, data.sink, NULL);
  if (!gst_element_link_many (data.convert, data.resample, data.sink, NULL)) {
    g_printerr ("Elements could not be linked.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.source, "uri",
      "https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm", NULL);

  /* Connect to the pad-added signal */
  g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);

  /* Start playing */
  ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.pipeline);
    return -1;
  }

  /* Listen to the bus */
  bus = gst_element_get_bus (data.pipeline);
  do {
    msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
        GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

    /* Parse message */
    if (msg != NULL) {
      GError *err;
      gchar *debug_info;

      switch (GST_MESSAGE_TYPE (msg)) {
        case GST_MESSAGE_ERROR:
          gst_message_parse_error (msg, &err, &debug_info);
          g_printerr ("Error received from element %s: %s\n",
              GST_OBJECT_NAME (msg->src), err->message);
          g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
          g_clear_error (&err);
          g_free (debug_info);
          terminate = TRUE;
          break;
        case GST_MESSAGE_EOS:
          g_print ("End-Of-Stream reached.\n");
          terminate = TRUE;
          break;
        case GST_MESSAGE_STATE_CHANGED:
          /* We are only interested in state-changed messages from the pipeline */
          if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
            GstState old_state, new_state, pending_state;
            gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
            g_print ("Pipeline state changed from %s to %s:\n",
                gst_element_state_get_name (old_state),
                gst_element_state_get_name (new_state));
          }
          break;
        default:
          /* We should not reach here */
          g_printerr ("Unexpected message received.\n");
          break;
      }
      gst_message_unref (msg);
    }
  } while (!terminate);

  /* Free resources */
  gst_object_unref (bus);
  gst_element_set_state (data.pipeline, GST_STATE_NULL);
  gst_object_unref (data.pipeline);
  return 0;
}

/* This function will be called by the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
  GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
  GstPadLinkReturn ret;
  GstCaps *new_pad_caps = NULL;
  GstStructure *new_pad_struct = NULL;
  const gchar *new_pad_type = NULL;

  g_print ("Received new pad '%s' from '%s':\n",
      GST_PAD_NAME (new_pad), GST_ELEMENT_NAME (src));

  /* If our converter is already linked, we have nothing to do here */
  if (gst_pad_is_linked (sink_pad)) {
    g_print ("We are already linked. Ignoring.\n");
    goto exit;
  }

  /* Check the new pad's type */
  new_pad_caps = gst_pad_get_current_caps (new_pad);
  new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
  new_pad_type = gst_structure_get_name (new_pad_struct);
  if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
    g_print ("It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
    goto exit;
  }

  /* Attempt the link */
  ret = gst_pad_link (new_pad, sink_pad);
  if (GST_PAD_LINK_FAILED (ret)) {
    g_print ("Type is '%s' but link failed.\n", new_pad_type);
  } else {
    g_print ("Link succeeded (type '%s').\n", new_pad_type);
  }

exit:
  /* Unreference the new pad's caps, if we got them */
  if (new_pad_caps != NULL)
    gst_caps_unref (new_pad_caps);

  /* Unreference the sink pad */
  gst_object_unref (sink_pad);
}
```

***Information*** Need help?

*If you need help to compile this code, refer to the Building the tutorials section for your platform: Linux, Mac OS X or Windows, or use this specific command on Linux: gcc basic-tutorial-3.c -o basic-tutorial-3 `pkg-config --cflags --libs gstreamer-1.0`*

*If you need help to run this code, refer to the Running the tutorials section for your platform: Linux, Mac OS X or Windows.*

*This tutorial only plays audio. The media is fetched from the Internet, so it might take a few seconds to start, depending on your connection speed.*

*Required libraries: gstreamer-1.0*

## Walkthrough

```c
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
  GstElement *pipeline;
  GstElement *source;
  GstElement *convert;
  GstElement *resample;
  GstElement *sink;
} CustomData;
```

So far we have kept all the information we needed (pointers to GstElements, basically) as local variables. Since this tutorial (and most real applications) involves callbacks, we will group all our data in a structure for easier handling.

```c
/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);
```

This is a forward reference, to be used later.
```c
/* Create the elements */
data.source = gst_element_factory_make ("uridecodebin", "source");
data.convert = gst_element_factory_make ("audioconvert", "convert");
data.resample = gst_element_factory_make ("audioresample", "resample");
data.sink = gst_element_factory_make ("autoaudiosink", "sink");
```

We create the elements as usual. uridecodebin will internally instantiate all the necessary elements (sources, demuxers and decoders) to turn a URI into raw audio and/or video streams. It does half the work that playbin does. Since it contains demuxers, its source pads are not initially available and we will need to link to them on the fly.

audioconvert is useful for converting between different audio formats, making sure that this example will work on any platform, since the format produced by the audio decoder might not be the same that the audio sink expects.

audioresample is useful for converting between different audio sample rates, similarly making sure that this example will work on any platform, since the audio sample rate produced by the audio decoder might not be one that the audio sink supports.

The autoaudiosink is the equivalent of autovideosink seen in the previous tutorial, for audio. It will render the audio stream to the audio card.

```c
if (!gst_element_link_many (data.convert, data.resample, data.sink, NULL)) {
  g_printerr ("Elements could not be linked.\n");
  gst_object_unref (data.pipeline);
  return -1;
}
```

Here we link the elements converter, resample and sink, but we DO NOT link them with the source, since at this point it contains no source pads. We just leave this branch (converter + sink) unlinked, until later on.

```c
/* Set the URI to play */
g_object_set (data.source, "uri",
    "https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm", NULL);
```

We set the URI of the file to play via a property, just like we did in the previous tutorial.

### Signals

```c
/* Connect to the pad-added signal */
g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);
```

GSignals are a crucial point in GStreamer. They allow you to be notified (by means of a callback) when something interesting has happened. Signals are identified by a name, and each GObject has its own signals.

In this line, we are attaching to the "pad-added" signal of our source (an uridecodebin element). To do so, we use g_signal_connect() and provide the callback function to be used (pad_added_handler) and a data pointer. GStreamer does nothing with this data pointer, it just forwards it to the callback so we can share information with it. In this case, we pass a pointer to the CustomData structure we built specially for this purpose.

The signals that a GstElement generates can be found in its documentation or using the gst-inspect-1.0 tool as described in Basic tutorial 10: GStreamer tools.

We are now ready to go! Just set the pipeline to the PLAYING state and start listening to the bus for interesting messages (like ERROR or EOS), just like in the previous tutorials.
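As mentioned above, gst-inspect-1.0 can list the signals an element exposes. A quick way to confirm that uridecodebin really emits "pad-added" (assuming the GStreamer command-line tools are installed; exact output varies by version):

```shell
# Print uridecodebin's pads, properties and signals.
# "pad-added" appears under the "Element Signals" section of the output.
gst-inspect-1.0 uridecodebin
```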
### The callback

When our source element finally has enough information to start producing data, it will create source pads and trigger the "pad-added" signal. At this point our callback will be called:

```c
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
```

src is the GstElement which triggered the signal. In this example, it can only be the uridecodebin, since it is the only signal to which we have attached. The first parameter of a signal handler is always the object that has triggered it.

new_pad is the GstPad that has just been added to the src element. This is usually the pad to which we want to link.

data is the pointer we provided when attaching to the signal. In this example, we use it to pass the CustomData pointer.

```c
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
```

From CustomData we extract the converter element, and then retrieve its sink pad using gst_element_get_static_pad(). This is the pad to which we want to link new_pad. In the previous tutorial we linked element against element, and let GStreamer choose the appropriate pads. Now we are going to link the pads directly.

```c
/* If our converter is already linked, we have nothing to do here */
if (gst_pad_is_linked (sink_pad)) {
  g_print ("We are already linked. Ignoring.\n");
  goto exit;
}
```

uridecodebin can create as many pads as it sees fit, and for each one, this callback will be called. These lines of code will prevent us from trying to link to a new pad once we are already linked.

```c
/* Check the new pad's type */
new_pad_caps = gst_pad_get_current_caps (new_pad);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
new_pad_type = gst_structure_get_name (new_pad_struct);
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
  g_print ("It has type '%s' which is not raw audio. Ignoring.\n", new_pad_type);
  goto exit;
}
```

Now we will check the type of data this new pad is going to output, because we are only interested in pads producing audio. We have previously created a piece of pipeline which deals with audio (an audioconvert linked with an audioresample and an autoaudiosink), and we will not be able to link it to a pad producing video, for example.

gst_pad_get_current_caps() retrieves the current capabilities of the pad (that is, the kind of data it currently outputs), wrapped in a GstCaps structure. All possible caps a pad can support can be queried with gst_pad_query_caps(). A pad can offer many capabilities, and hence GstCaps can contain many GstStructure, each representing a different capability. The current caps on a pad will always have a single GstStructure and represent a single media format, or if there are no current caps yet NULL will be returned.

Since, in this case, we know that the pad we want only had one capability (audio), we retrieve the first GstStructure with gst_caps_get_structure().
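To get a feel for what such caps look like in practice, gst-launch-1.0 can print the negotiated caps of every pad when run with -v (a sketch, assuming the GStreamer tools are installed and the sample media is reachable; exact caps depend on the media and decoder):

```shell
# -v makes gst-launch-1.0 print the caps negotiated on each pad.
# Look for lines whose caps string begins with "audio/x-raw" --
# that is the same media-type prefix checked in pad_added_handler().
gst-launch-1.0 -v playbin \
  uri=https://gstreamer.freedesktop.org/data/media/sintel_trailer-480p.webm
```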
Finally, with gst_structure_get_name() we recover the name of the structure, which contains the main description of the format (its media type, actually).

If the name is not audio/x-raw, this is not a decoded audio pad, and we are not interested in it.

Otherwise, attempt the link:

```c
/* Attempt the link */
ret = gst_pad_link (new_pad, sink_pad);
if (GST_PAD_LINK_FAILED (ret)) {
  g_print ("Type is '%s' but link failed.\n", new_pad_type);
} else {
  g_print ("Link succeeded (type '%s').\n", new_pad_type);
}
```

gst_pad_link() tries to link two pads. As it was the case with gst_element_link(), the link must be specified from source to sink, and both pads must be owned by elements residing in the same bin (or pipeline).

And we are done! When a pad of the right kind appears, it will be linked to the rest of the audio-processing pipeline and execution will continue until ERROR or EOS. However, we will squeeze a bit more content from this tutorial by also introducing the concept of State.

### GStreamer States

We already talked a bit about states when we said that playback does not start until you bring the pipeline to the PLAYING state. We will introduce here the rest of the states and their meaning. There are 4 states in GStreamer:

- NULL: the NULL state or initial state of an element.
- READY: the element is ready to go to PAUSED.
- PAUSED: the element is PAUSED, it is ready to accept and process data. Sink elements however only accept one buffer and then block.
- PLAYING: the element is PLAYING, the clock is running and the data is flowing.

You can only move between adjacent ones, this is, you can't go from NULL to PLAYING, you have to go through the intermediate READY and PAUSED states. If you set the pipeline to PLAYING, though, GStreamer will make the intermediate transitions for you.

```c
case GST_MESSAGE_STATE_CHANGED:
  /* We are only interested in state-changed messages from the pipeline */
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
    GstState old_state, new_state, pending_state;
    gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
    g_print ("Pipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state),
        gst_element_state_get_name (new_state));
  }
  break;
```

We added this piece of code that listens to bus messages regarding state changes and prints them on screen to help you understand the transitions. Every element puts messages on the bus regarding its current state, so we filter them out and only listen to messages coming from the pipeline.

Most applications only need to worry about going to PLAYING to start playback, then to PAUSED to perform a pause, and then back to NULL at program exit to free all resources.

## Exercise

Dynamic pad linking has traditionally been a difficult topic for a lot of programmers. Prove that you have achieved its mastery by instantiating an autovideosink (probably with a videoconvert in front) and linking it to the demuxer when the right pad appears. Hint: You are already printing on screen the type of the video pads.
You should now see (and hear) the same movie as in Basic tutorial 1: Hello world!. In that tutorial you used playbin, which is a handy element that automatically takes care of all the demuxing and pad linking for you. Most of the Playback tutorials are devoted to playbin.

## Conclusion

In this tutorial, you learned:

- How to be notified of events using GSignals.
- How to connect GstPads directly instead of their parent elements.
- The various states of a GStreamer element.

You also combined these items to build a dynamic pipeline, which was not defined at program start, but was created as information regarding the media was available.

You can now continue with the basic tutorials and learn about performing seeks and time-related queries in Basic tutorial 4: Time management, or move to the Playback tutorials and gain more insight about the playbin element.

Remember that attached to this page you should find the complete source code of the tutorial and any accessory files needed to build it. It has been a pleasure having you here, and see you soon!
admin
April 23, 2024, 23:59