Goal
This tutorial shows how to integrate GStreamer into a Graphical User Interface (GUI) toolkit like GTK+. Basically, GStreamer takes care of media playback while the GUI toolkit handles user interaction. The most interesting parts are those where both libraries have to interact: instructing GStreamer to output video to a GTK+ window and forwarding user actions to GStreamer.
In particular, you will learn:
- How to tell GStreamer to output video to a particular window (instead of creating its own window).
- How to continuously refresh the GUI with information from GStreamer.
- How to update the GUI from GStreamer's multiple threads, an operation forbidden by most GUI toolkits.
- A mechanism to subscribe only to the messages you are interested in, instead of being notified of all of them.
1. Introduction
We are going to build a media player using the GTK+ toolkit, but the concepts apply to other toolkits like Qt, for example. A minimal knowledge of GTK+ will help you understand this tutorial.
The main point is telling GStreamer to output the video to a window of our choice. The specific mechanism depends on the operating system (or rather, on the windowing system), but GStreamer provides a layer of abstraction for the sake of platform independence. This independence comes through the GstVideoOverlay interface, which allows the application to tell a video sink the handle of the window that should receive the rendering.
GObject interfaces
A GObject interface (which GStreamer uses) is a set of functions that an element can implement. If it does, it is said to support that particular interface. For example, video sinks usually create their own windows to display video, but, if they are also capable of rendering to an external window, they can choose to implement the GstVideoOverlay interface and provide functions to specify this external window. From the application developer's point of view, if a certain interface is supported, you can use it, regardless of which kind of element is implementing it. Moreover, if you are using playbin, it automatically exposes some of the interfaces supported by its internal elements: you can use the interface functions directly on playbin without knowing who is implementing them!
Another issue is that GUI toolkits usually only allow manipulating the graphical "widgets" from the main (or application) thread, whereas GStreamer usually spawns multiple threads to take care of different tasks. Calling GTK+ functions from within callbacks will usually fail, because callbacks execute in the calling thread, which does not need to be the main thread. This problem can be solved by posting a message on the GStreamer bus from the callback: the message will be received by the main thread, which will then react accordingly.
Finally, so far we have registered a handle_message function that got called every time a message appeared on the bus, which forced us to parse every single message to see whether it was of interest to us. In this tutorial a different method is used that registers a callback for each kind of message, so there is less parsing and less code overall.
2. A media player in GTK+
Let's write a very simple media player based on playbin, this time with a GUI!
Copy this code into a text file named basic-tutorial-5.c (or find it in your GStreamer installation).
basic-tutorial-5.c
#include <string.h>
#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only pipeline */
GtkWidget *slider; /* Slider widget to keep track of current position */
GtkWidget *streams_list; /* Text widget to display info about the streams */
gulong slider_update_signal_id; /* Signal ID for the slider update signal */
GstState state; /* Current state of the pipeline */
gint64 duration; /* Duration of the clip, in nanoseconds */
} CustomData;
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
* At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
* and pass it to GStreamer through the VideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
GdkWindow *window = gtk_widget_get_window (widget);
guintptr window_handle;
if (!gdk_window_ensure_native (window))
g_error ("Couldn't create native window needed for GstVideoOverlay!");
/* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
window_handle = GDK_WINDOW_XID (window);
#endif
/* Pass it to playbin, which implements VideoOverlay and will forward it to the video sink */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}
/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}
/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
stop_cb (NULL, data);
gtk_main_quit ();
}
/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
* rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
* we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
if (data->state < GST_STATE_PAUSED) {
GtkAllocation allocation;
/* Cairo is a 2D graphics library which we use here to clean the video window.
* It is used by GStreamer for other reasons, so it will always be available to us. */
gtk_widget_get_allocation (widget, &allocation);
cairo_set_source_rgb (cr, 0, 0, 0);
cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
cairo_fill (cr);
}
return FALSE;
}
/* This function is called when the slider changes its position. We perform a seek to the
* new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
(gint64)(value * GST_SECOND));
}
/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void create_ui (CustomData *data) {
GtkWidget *main_window; /* The uppermost window, containing all other windows */
GtkWidget *video_window; /* The drawing area where the video will be shown */
GtkWidget *main_box; /* VBox to hold main_hbox and the controls */
GtkWidget *main_hbox; /* HBox to hold the video_window and the stream info text widget */
GtkWidget *controls; /* HBox to hold the buttons and the slider */
GtkWidget *play_button, *pause_button, *stop_button; /* Buttons */
main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
g_signal_connect (G_OBJECT (main_window), "delete-event", G_CALLBACK (delete_event_cb), data);
video_window = gtk_drawing_area_new ();
gtk_widget_set_double_buffered (video_window, FALSE);
g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
g_signal_connect (video_window, "draw", G_CALLBACK (draw_cb), data);
play_button = gtk_button_new_from_icon_name ("media-playback-start", GTK_ICON_SIZE_SMALL_TOOLBAR);
g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb), data);
pause_button = gtk_button_new_from_icon_name ("media-playback-pause", GTK_ICON_SIZE_SMALL_TOOLBAR);
g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb), data);
stop_button = gtk_button_new_from_icon_name ("media-playback-stop", GTK_ICON_SIZE_SMALL_TOOLBAR);
g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb), data);
data->slider = gtk_scale_new_with_range (GTK_ORIENTATION_HORIZONTAL, 0, 100, 1);
gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
data->slider_update_signal_id = g_signal_connect (G_OBJECT (data->slider), "value-changed", G_CALLBACK (slider_cb), data);
data->streams_list = gtk_text_view_new ();
gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);
controls = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);
main_hbox = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);
main_box = gtk_box_new (GTK_ORIENTATION_VERTICAL, 0);
gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
gtk_container_add (GTK_CONTAINER (main_window), main_box);
gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);
gtk_widget_show_all (main_window);
}
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
gint64 current = -1;
/* We do not want to update anything unless we are in the PAUSED or PLAYING states */
if (data->state < GST_STATE_PAUSED)
return TRUE;
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
g_printerr ("Could not query current duration.\n");
} else {
/* Set the range of the slider to the clip duration, in SECONDS */
gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
}
}
if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
/* Block the "value-changed" signal, so the slider_cb function is not called
* (which would trigger a seek the user has not requested) */
g_signal_handler_block (data->slider, data->slider_update_signal_id);
/* Set the position of the slider to the current pipeline position, in SECONDS */
gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
/* Re-enable the signal */
g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
}
return TRUE;
}
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
/* We are possibly in a GStreamer working thread, so we notify the main
* thread of this event through a message in the bus */
gst_element_post_message (playbin,
gst_message_new_application (GST_OBJECT (playbin),
gst_structure_new_empty ("tags-changed")));
}
/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
GError *err;
gchar *debug_info;
/* Print error details on the screen */
gst_message_parse_error (msg, &err, &debug_info);
g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
g_clear_error (&err);
g_free (debug_info);
/* Set the pipeline to READY (which stops playback) */
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when an End-Of-Stream message is posted on the bus.
* We just set the pipeline to READY (which stops playback) */
static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
g_print ("End-Of-Stream reached.\n");
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when the pipeline changes states. We use it to
* keep track of the current state. */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
GstState old_state, new_state, pending_state;
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
data->state = new_state;
g_print ("State set to %s\n", gst_element_state_get_name (new_state));
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
/* For extra responsiveness, we refresh the GUI as soon as we reach the PAUSED state */
refresh_ui (data);
}
}
}
/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void analyze_streams (CustomData *data) {
gint i;
GstTagList *tags;
gchar *str, *total_str;
guint rate;
gint n_video, n_audio, n_text;
GtkTextBuffer *text;
/* Clean current contents of the widget */
text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
gtk_text_buffer_set_text (text, "", -1);
/* Read some properties */
g_object_get (data->playbin, "n-video", &n_video, NULL);
g_object_get (data->playbin, "n-audio", &n_audio, NULL);
g_object_get (data->playbin, "n-text", &n_text, NULL);
for (i = 0; i < n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
if (tags) {
total_str = g_strdup_printf ("video stream %d:\n", i);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
total_str = g_strdup_printf (" codec: %s\n", str ? str : "unknown");
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
g_free (str);
gst_tag_list_free (tags);
}
}
for (i = 0; i < n_audio; i++) {
tags = NULL;
/* Retrieve the stream's audio tags */
g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
if (tags) {
total_str = g_strdup_printf ("\naudio stream %d:\n", i);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
total_str = g_strdup_printf (" codec: %s\n", str);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
g_free (str);
}
if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
total_str = g_strdup_printf (" language: %s\n", str);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
g_free (str);
}
if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
total_str = g_strdup_printf (" bitrate: %d\n", rate);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
}
gst_tag_list_free (tags);
}
}
for (i = 0; i < n_text; i++) {
tags = NULL;
/* Retrieve the stream's subtitle tags */
g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
if (tags) {
total_str = g_strdup_printf ("\nsubtitle stream %d:\n", i);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
total_str = g_strdup_printf (" language: %s\n", str);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
g_free (total_str);
g_free (str);
}
gst_tag_list_free (tags);
}
}
}
/* This function is called when an "application" message is posted on the bus.
* Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
/* If the message is the "tags-changed" (only one we are currently issuing), update
* the stream info GUI */
analyze_streams (data);
}
}
int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
GstBus *bus;
/* Initialize GTK */
gtk_init (&argc, &argv);
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
data.duration = GST_CLOCK_TIME_NONE;
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);
/* Connect to interesting signals in playbin */
g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);
/* Create the GUI */
create_ui (&data);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
gst_object_unref (bus);
/* Start playing */
ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
g_printerr ("Unable to set the pipeline to the playing state.\n");
gst_object_unref (data.playbin);
return -1;
}
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
/* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
gtk_main ();
/* Free resources */
gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin);
return 0;
}
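If you want to build and run it, a command along these lines usually works on Linux (this assumes the GTK+ 3 and GStreamer 1.x development packages, including their pkg-config files, are installed; adjust the package names to your platform):
gcc basic-tutorial-5.c -o basic-tutorial-5 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-video-1.0 gtk+-3.0`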
3. Walkthrough
Regarding this tutorial's structure, we are not going to use forward function definitions anymore: functions will be defined before they are used. Also, for clarity of explanation, the order in which the snippets of code are presented will not always match the program order. Use the line numbers to locate the snippets in the complete code.
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif
The first thing worth noticing is that we are no longer completely platform-independent. We need to include the appropriate GDK headers for the windowing system we are going to use. Fortunately, there are not that many supported windowing systems, so these three lines often suffice: X11 for Linux, Win32 for Windows and Quartz for Mac OSX.
This tutorial is composed mostly of callback functions, which will be called from GStreamer or GTK+, so let's review the main function, which registers all these callbacks.
int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
GstBus *bus;
/* Initialize GTK */
gtk_init (&argc, &argv);
/* Initialize GStreamer */
gst_init (&argc, &argv);
/* Initialize our data structure */
memset (&data, 0, sizeof (data));
data.duration = GST_CLOCK_TIME_NONE;
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);
Standard GStreamer initialization and playbin pipeline creation, plus GTK+ initialization. Not much new.
/* Connect to interesting signals in playbin */
g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);
We are interested in being notified when new tags (metadata) appear on the stream. For simplicity, we are going to handle all kinds of tags (video, audio and text) from the same callback, tags_cb.
/* Create the GUI */
create_ui (&data);
All the GTK+ widget creation and signal registration happens in this function. It contains only GTK-related function calls, so we will skip over its definition. The signals it registers convey user commands, as shown below when reviewing the callbacks.
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
gst_object_unref (bus);
In Playback tutorial 1: Playbin usage, gst_bus_add_watch() is used to register a function that receives every message posted to the GStreamer bus. We can achieve a finer granularity by using signals instead, which allow us to register only to the messages we are interested in. By calling gst_bus_add_signal_watch() we instruct the bus to emit a signal every time it receives a message. This signal has the name message::detail, where detail is the message that triggered the signal emission. For example, when the bus receives an EOS message, it emits a signal with the name message::eos.
This tutorial uses the signal's details to register only to the messages we care about. If we had registered to the generic message signal instead, we would be notified of every single message, just like gst_bus_add_watch() does.
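For comparison, here is a minimal sketch (with a hypothetical callback name, not part of this tutorial's code) of what that alternative would look like; every message type would arrive at a single callback, which would have to discriminate on the message type itself:
/* Hypothetical handler for the generic "message" signal: every message
 * type ends up here, and we have to dispatch on it manually */
static void generic_message_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: /* handle errors */ break;
    case GST_MESSAGE_EOS:   /* handle end-of-stream */ break;
    default: break;
  }
}
/* ...and the corresponding registration, instead of the message::detail ones:
 * g_signal_connect (G_OBJECT (bus), "message", (GCallback)generic_message_cb, &data); */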
Keep in mind that, in order for the bus watches to work (be it gst_bus_add_watch() or gst_bus_add_signal_watch()), there must be a GLib main loop running. In this case, it is hidden inside the GTK+ main loop.
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
Before transferring control to GTK+, we use g_timeout_add_seconds() to register yet another callback, this time with a timeout, so it gets called every second. We are going to use it to refresh the GUI from the refresh_ui function.
After this, we are done with the setup and can start the GTK+ main loop. We will regain control from our callbacks when interesting things happen. Let's review the callbacks. Each callback has a different signature, depending on who calls it. You can look up the signature (the meaning of the parameters and the return value) in the documentation of the signal.
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
* At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
* and pass it to GStreamer through the VideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
GdkWindow *window = gtk_widget_get_window (widget);
guintptr window_handle;
if (!gdk_window_ensure_native (window))
g_error ("Couldn't create native window needed for GstVideoOverlay!");
/* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
window_handle = GDK_WINDOW_XID (window);
#endif
/* Pass it to playbin, which implements VideoOverlay and will forward it to the video sink */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}
The comments in the code speak for themselves. At this point in the application's life cycle, we know the handle (be it an X11 XID, a Window's HWND or a Quartz NSView) of the window in which GStreamer should render the video. We simply retrieve it from the windowing system and pass it to playbin through the GstVideoOverlay interface using gst_video_overlay_set_window_handle(). playbin will locate the video sink and pass the handle to it, so the sink does not create its own window and uses this one instead.
There is not much more to see here: playbin and the GstVideoOverlay interface really simplify this process a lot!
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}
/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}
/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_READY);
}
These three little callbacks are associated with the PLAY, PAUSE and STOP buttons in the GUI. They simply set the pipeline to the corresponding state. Note that in the STOP state we set the pipeline to READY. We could have brought the pipeline all the way down to the NULL state, but the transition would then be a little slower, since some resources (like the audio device) would need to be released and re-acquired.
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
stop_cb (NULL, data);
gtk_main_quit ();
}
gtk_main_quit() will eventually make the call to gtk_main() in main return, which, in this case, finishes the program. We call it here when the main window is closed, after having stopped the pipeline (just for the sake of tidiness).
/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
* rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
* we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
if (data->state < GST_STATE_PAUSED) {
GtkAllocation allocation;
/* Cairo is a 2D graphics library which we use here to clean the video window.
* It is used by GStreamer for other reasons, so it will always be available to us. */
gtk_widget_get_allocation (widget, &allocation);
cairo_set_source_rgb (cr, 0, 0, 0);
cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
cairo_fill (cr);
}
return FALSE;
}
When there is data flow (in the PAUSED and PLAYING states) the video sink takes care of refreshing the content of the video window. In the other cases, however, it does not, so we have to do it ourselves. In this example, we just fill the window with a black rectangle.
/* This function is called when the slider changes its position. We perform a seek to the
* new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
(gint64)(value * GST_SECOND));
}
This is an example of how a complex GUI element like a seeker bar (a slider that allows seeking) can be easily implemented thanks to GStreamer and GTK+ collaborating. If the slider has been dragged to a new position, tell GStreamer to seek to that position with gst_element_seek_simple() (as seen in Basic tutorial 4: Time management). The slider has been set up so its value represents seconds.
It is worth mentioning that some performance (and responsiveness) can be gained by doing some throttling, that is, by not responding to every single user request to seek. Since a seek operation is bound to take some time, it is often nicer to wait half a second (for example) after a seek before allowing another one. Otherwise, the application might look unresponsive if the user drags the slider frantically, which would not allow any seek to complete before a new one is queued.
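As an illustration only, here is one way such throttling could be sketched; it assumes a hypothetical last_seek field of type GstClockTime added to CustomData (initialized to 0), and simply drops seek requests that arrive less than half a second after the previous one:
/* Hypothetical throttled variant of slider_cb (not part of the tutorial code).
 * data->last_seek would be a new GstClockTime field in CustomData. */
static void throttled_slider_cb (GtkRange *range, CustomData *data) {
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  GstClockTime now = gst_util_get_timestamp ();
  if (data->last_seek != 0 && now - data->last_seek < GST_SECOND / 2)
    return; /* A seek was issued very recently: ignore this request */
  data->last_seek = now;
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, (gint64)(value * GST_SECOND));
}
A more complete version would also remember the last requested position and issue that seek once the interval has elapsed, so the final slider position is always honored.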
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
gint64 current = -1;
/* We do not want to update anything unless we are in the PAUSED or PLAYING states */
if (data->state < GST_STATE_PAUSED)
return TRUE;
This function will move the slider to reflect the current position of the media. First off, if we are not in the PAUSED or PLAYING state, we have nothing to do here (plus, position and duration queries will normally fail).
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
g_printerr ("Could not query current duration.\n");
} else {
/* Set the range of the slider to the clip duration, in SECONDS */
gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
}
}
If we did not know it yet, we recover the duration of the clip, so we can set the range of the slider.
if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
/* Block the "value-changed" signal, so the slider_cb function is not called
* (which would trigger a seek the user has not requested) */
g_signal_handler_block (data->slider, data->slider_update_signal_id);
/* Set the position of the slider to the current pipeline position, in SECONDS */
gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
/* Re-enable the signal */
g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
}
return TRUE;
We query the current pipeline position and set the position of the slider accordingly. This would normally trigger the emission of the value-changed signal, which we use to know when the user is dragging the slider. Since we do not want seeks happening unless the user requested them, we disable the value-changed signal emission during this operation with g_signal_handler_block() and g_signal_handler_unblock().
Returning TRUE from this function will keep it being called in the future. If we return FALSE, the timer will be removed.
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
/* We are possibly in a GStreamer working thread, so we notify the main
* thread of this event through a message in the bus */
gst_element_post_message (playbin,
gst_message_new_application (GST_OBJECT (playbin),
gst_structure_new_empty ("tags-changed")));
}
This is one of the key points of this tutorial. This function will be called when new tags are found in the media, from a streaming thread, that is, from a thread other than the application (or main) thread. What we want to do here is update a GTK+ widget to reflect this new information, but GTK+ does not allow operating from threads other than the main one.
The solution is to make playbin post a message on the bus and return to the calling thread. When appropriate, the main thread will pick up this message and update GTK+.
gst_element_post_message() makes a GStreamer element post the given message to the bus. gst_message_new_application() creates a new message of the APPLICATION type. GStreamer messages have different types, and this particular one is reserved to the application: it goes through the bus unaffected by GStreamer. The list of types can be found in the GstMessageType documentation.
Messages can deliver additional information through their embedded GstStructure, which is a very flexible data container. Here, we create a new structure with gst_structure_new_empty() and give it a name, tags-changed, to avoid confusion in case we wanted to send other application messages.
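As a side note (this is not part of the tutorial's code), the structure could also carry extra fields, for example the index of the stream whose tags changed, built with gst_structure_new() and read back with gst_structure_get_int():
/* Hypothetical variant of tags_cb: attach the stream index to the message */
gst_element_post_message (playbin,
    gst_message_new_application (GST_OBJECT (playbin),
        gst_structure_new ("tags-changed", "stream", G_TYPE_INT, stream, NULL)));
/* ...and, in application_cb, read it back from the received message */
gint stream_index;
if (gst_structure_get_int (gst_message_get_structure (msg), "stream", &stream_index))
  g_print ("Tags changed for stream %d\n", stream_index);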
Later, in the main thread, the bus will receive this message and emit the message::application signal, which we have associated with the application_cb function:
/* This function is called when an "application" message is posted on the bus.
* Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
/* If the message is the "tags-changed" (only one we are currently issuing), update
* the stream info GUI */
analyze_streams (data);
}
}
Once we are sure the message is the tags-changed one, we call the analyze_streams function, which is also used in Playback tutorial 1: Playbin usage and is described there in more detail. It basically recovers the tags from the streams and writes them into a text widget in the GUI.
The error_cb, eos_cb and state_changed_cb callbacks are not really worth explaining, since they do the same as in all previous tutorials, only from their own functions this time.
And this is it! The amount of code in this tutorial might seem daunting, but the required concepts are few and simple. If you have followed the previous tutorials and have a little knowledge of GTK, you probably understood this one and can now enjoy your very own media player!
4. Exercise
If this media player is not good enough for you, try to change the text widget that displays the stream information into a proper list view (or tree view). Then, when the user selects a different stream, make GStreamer switch streams! To switch streams, you will need to read Playback tutorial 1: Playbin usage.
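As a hedged hint (the details are covered in Playback tutorial 1: Playbin usage): playbin exposes current-video, current-audio and current-text properties, so reacting to a selection in your list view could look roughly like this:
/* Hypothetical reaction to the user selecting audio stream 1 in the GUI */
g_object_set (data->playbin, "current-audio", 1, NULL);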
5. Conclusion
This tutorial has shown:
- How to output the video to a particular window handle using gst_video_overlay_set_window_handle().
- How to refresh the GUI periodically by registering a timeout callback with g_timeout_add_seconds().
- How to convey information to the main thread by means of application messages through the bus with gst_element_post_message().
- How to be notified only of interesting messages by making the bus emit signals with gst_bus_add_signal_watch() and discriminating among all message types using the signal details.
This lets you build a somewhat complete media player with a proper graphical user interface. The following basic tutorials keep focusing on other individual GStreamer topics. Thank you for reading, and see you in the next one!