CN103905878A - Video data and audio data synchronized playing method and device and equipment


Info

Publication number: CN103905878A
Application number: CN201410093301.3A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 李典
Assignee (original and current): Beijing QIYI Century Science and Technology Co Ltd
Legal status: Pending
Classification: Two-Way Televisions, Distribution Of Moving Picture Or The Like

Abstract

The embodiment of the invention provides a method, a device and equipment for synchronously playing video data and audio data. The method includes: receiving multimedia data at a display terminal side, wherein the multimedia data comprises video data and audio data and the video data carries one or more video timestamps; sending the audio data to a mobile device side, the display terminal side being connected with the mobile device side in a wireless transmission mode; when the video data is played, generating a playing target timestamp according to the currently played video timestamp; and sending the playing target timestamp to the mobile device side, the mobile device side playing the audio data corresponding to the playing target timestamp. With the method, device and equipment, a user is freed from the constraint of connecting wired earphones directly to the display terminal, operation is convenient, and synchronous playing of the audio data and the video data is achieved.

Description

Method, device and equipment for synchronously playing video data and audio data
Technical Field
The embodiment of the invention relates to the technical field of multimedia data processing, in particular to a method for synchronously playing video data and audio data, a device for synchronously playing video data and audio data and equipment.
Background
With the popularization of players that support a variety of media, more and more media files are played together to obtain better viewing and artistic effects; playing videos, enjoying music and browsing pictures are the most widely used ways of consuming media.
Taking smart television equipment as an example, which includes smart televisions and smart set-top boxes, such equipment can play multimedia data. To avoid disturbing other people, for example when family members are resting at night, a user may wish to listen to the sound of the video played by the smart television equipment through earphones rather than through the loudspeaker of the equipment.
There are generally two ways to use earphones: one is to connect a wired earphone to the smart television equipment; the other is to link a Bluetooth headset to a smart television that supports one.
The first method plugs the wired earphone into the smart television equipment. Since the equipment is usually far from where the user sits, the earphone needs a long cable, plugging and unplugging is inconvenient, and the user has to drag a relatively long cable when doing other things, such as pouring water, which makes operation inconvenient.
The second method requires purchasing an additional Bluetooth headset, which is costly; moreover, with a Bluetooth headset the sound is usually delayed relative to the picture, so the user experience is very poor.
Disclosure of Invention
The technical problem to be solved by the embodiments of the present invention is to provide a method for synchronously playing video data and audio data to solve the problems of inconvenient operation and high cost.
Correspondingly, the embodiment of the invention also provides a device and equipment for synchronously playing the video data and the audio data, which are used for ensuring the realization and the application of the method.
In order to solve the above problem, an embodiment of the present invention discloses a method for synchronously playing video data and audio data, including:
receiving multimedia data at a display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
Preferably, before the step of generating a playing target timestamp according to a currently playing video timestamp when the video data is played, the method further includes:
and buffering the video data.
The embodiment of the invention also discloses a method for synchronously playing the video data and the audio data, which comprises the following steps:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, after the step of receiving the play target timestamp sent by the display terminal side, the method further includes:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Preferably, the audio data carries one or more audio time stamps;
the step of playing the audio data corresponding to the playing target timestamp includes:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Preferably, before the step of playing the audio data corresponding to the playing target timestamp, the method further includes:
and carrying out buffering processing on the audio data.
The embodiment of the invention also discloses a device for synchronously playing the video data and the audio data, which comprises the following modules:
the multimedia data receiving module is used for receiving the multimedia data at the display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
the audio data sending module is used for sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
the playing target timestamp generating module is used for generating a playing target timestamp according to the currently played video timestamp when the video data is played;
a playing target timestamp sending module, configured to send the playing target timestamp to the mobile device side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
Preferably, the device further comprises:
and the first buffer module is used for buffering the video data.
The embodiment of the invention also discloses a device for synchronously playing the video data and the audio data, which comprises the following modules:
the audio data receiving module is used for receiving audio data sent by the display terminal side at the mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
a playing target timestamp receiving module, configured to receive a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
the audio data playing module is used for playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Preferably, the wireless transmission means comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Preferably, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal side when the video data is played.
Preferably, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Preferably, the device further comprises:
the delay time value acquisition module is used for acquiring a delay time value;
and the delay time value increasing module is used for increasing the delay time value on the time value indicated by the playing target timestamp.
Preferably, the delay time value is a delay time value obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Preferably, the audio data carries one or more audio time stamps; the audio data playing module comprises:
the playing pause submodule is used for pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp when the currently played audio timestamp is greater than the playing target timestamp;
and/or,
the searching submodule is used for searching the audio time stamp which is equal to the playing target time stamp when the currently played audio time stamp is less than or equal to the playing target time stamp;
and the corresponding playing submodule is used for playing the audio data corresponding to the audio time stamp.
Preferably, the device further comprises:
and the second buffer module is used for carrying out buffer processing on the audio data.
The embodiment of the invention also discloses a device, which comprises:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have functionality to:
receiving multimedia data at a display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
The embodiment of the invention also discloses a device, which comprises:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have functionality to:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Compared with the background art, the embodiment of the invention has the following advantages:
according to the embodiment of the invention, the display terminal sends the audio data to the mobile device after receiving the multimedia data, sends the playing target timestamp to the mobile device while playing the video data, and the mobile device plays the audio data according to the playing target timestamp. Because the display terminal and the mobile device are connected in a wireless transmission mode, the user is freed from the constraint of connecting a wired earphone directly to the display terminal and operation is convenient; at the same time, the obvious loss of synchronization caused by the accumulation of tiny differences during playing of the audio data and the video data is avoided, so the audio data and the video data are played synchronously. In addition, the mobile device is a product the public uses frequently; the embodiment of the invention reuses the mobile device for an additional function, avoids the purchase of an extra Bluetooth headset, is highly practical and greatly reduces cost.
According to the embodiment of the invention, the delay time value is added in the playing target timestamp, so that the influence of the delay of the display terminal and the mobile equipment in transmitting the playing target timestamp is eliminated, and the synchronous playing precision of the audio data and the video data is further improved.
Drawings
FIG. 1 is a diagram illustrating an exemplary process for a display terminal to synchronously play video data and audio data;
FIG. 2 is a diagram illustrating an exemplary process of a display terminal playing video data and audio data synchronously with a Bluetooth headset;
fig. 3 is a flowchart illustrating steps of embodiment 1 of a method for synchronously playing video data and audio data according to the present invention;
FIG. 4 is a flow chart illustrating the steps of embodiment 2 of the method for playing video data and audio data synchronously;
FIG. 5 is a flow chart illustrating the steps of embodiment 3 of the method for playing video data and audio data synchronously;
FIG. 6 is a flow chart illustrating the steps of embodiment 4 of the method for playing video data and audio data synchronously;
fig. 7 is a block diagram showing the structure of an embodiment 1 of the apparatus for synchronously playing video data and audio data according to the present invention;
fig. 8 is a block diagram showing the structure of an embodiment 2 of the apparatus for synchronously playing video data and audio data according to the present invention;
fig. 9 is a schematic structural diagram of a smart television according to an embodiment of the present invention;
fig. 10 shows a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the embodiments of the present invention more comprehensible, embodiments of the present invention are described in detail below with reference to the accompanying drawings and the detailed description.
Referring to fig. 1, a flowchart illustrating a process of synchronously playing video data and audio data by a display terminal is shown.
As shown in fig. 1, an audio/video data reading module in a display terminal reads multimedia data and then decodes the multimedia data to obtain video data and audio data, and the video data and the audio data have time stamps for synchronization.
And then sending the video data to a video output module and sending the audio data to an audio output module. When the multimedia data is played, the playing time point control module synchronizes the timestamps of the video data and the audio data; the video output module then plays the video data on the display according to the synchronized timestamps, and the audio output module plays the audio data on the loudspeaker according to the synchronized timestamps. In this way the video data and the audio data are played synchronously, and the multimedia data is played as a whole.
However, in some cases, such as at night when family members rest, to avoid disturbing others, the user may wish to listen to the audio data through the earphones when the smart television device plays the multimedia data.
In order to solve the problems, the Bluetooth headset can be used for remotely listening to audio data when the intelligent television equipment plays multimedia data.
In bluetooth applications, bluetooth products differentiate between device types and service types.
Generally, the device type includes a main device type and an auxiliary device type, which specify what kind of device a Bluetooth device is, such as a headset, a mobile phone or a printer. Taking a mobile phone as an example, whether it is a smart phone or an ordinary mobile phone is specified by the auxiliary device type.
The service type specifies the services that a Bluetooth device can provide. Taking a mobile phone as an example, some mobile phones support two file transfer services, the Object Push Profile (OPP) service and the File Transfer Protocol (FTP) service, while some mobile phones only provide the OPP service. When two Bluetooth devices need to communicate with each other, their device types may differ, such as a mobile phone and an earphone, but their service protocols must match: the earphone is required to provide a voice service, so when the mobile phone finds the earphone it must inquire which services the earphone can provide before connecting, and only then does communication take place.
Although a mobile device, such as a smart phone or a smart tablet, has a Bluetooth function and is a Bluetooth product, its device type is not a headset and it cannot provide the Bluetooth headset service; therefore it cannot be connected, as a headset, to the Bluetooth module on a display terminal to receive pushed audio data.
Therefore, if the user wants to listen to the audio data of the smart television device playing the multimedia data through the bluetooth headset, the bluetooth headset needs to be additionally purchased.
Referring to fig. 2, a flowchart illustrating a process of playing video data and audio data synchronously by a display terminal and a bluetooth headset is shown.
As shown in fig. 2, the display terminal is connected to the bluetooth headset through a bluetooth link. And an audio and video data reading module in the display terminal reads the multimedia data and then decodes the multimedia data to obtain video data and audio data, wherein the video data and the audio data are provided with time stamps for synchronization.
And then sending the video data to a video output module and sending the audio data to the Bluetooth headset.
And a Bluetooth audio receiving module in the Bluetooth earphone receives audio data sent by the display terminal, then transmits the audio data to an audio output module, and then outputs the audio data to an earphone loudspeaker for playing.
The audio data and the video data are played on two different devices, and before the audio data is played it goes through the Bluetooth transmission process and the data processing in that process, which inevitably introduces a time delay.
In addition, the clock used by the Bluetooth headset for timing differs slightly in frequency from the clock of the display terminal, so the audio data and the video data are played at slightly different rates. This small difference accumulates continuously, and as the playing time increases the audio data and the video data become more and more noticeably out of sync.
Based on the above requirements, the inventor creatively proposes one of the core concepts of the embodiments of the present invention: the display terminal decodes the multimedia data into video data and audio data, transmits the audio data to the mobile device, and sends the corresponding timestamp to the mobile device when the video data is played; the mobile device then plays the audio data synchronously according to the timestamp.
Referring to fig. 3, a flowchart illustrating steps of embodiment 1 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 301, receiving multimedia data at a display terminal side; the multimedia data may include video data and audio data; the video data may carry one or more video time stamps;
it should be noted that the Display terminal may include a smart television, a personal computer, a palm computer, a mobile device, and the like, and the smart television may include a Liquid Crystal Display (LCD) television, a Light Emitting Diode (LED) television, a 3D television, a plasma television, and the like, which is not limited in this embodiment of the present invention.
The multimedia data may be a digital television signal, may be multimedia data stored on a display terminal or a magnetic disk of an external device, may be streaming media data, and the like, which is not limited in this embodiment of the present invention.
In practical applications, a time stamp indicating information of a time point may be added at the time of multimedia data production.
The display terminal can decode the multimedia data after receiving the multimedia data to obtain video data and audio data.
The audio data may carry one or more audio time stamps and the video data may carry one or more video time stamps. The audio time stamp may refer to time point data at which a piece of audio will be played, the video time stamp may refer to time point data at which a frame of picture will be played, and at a certain time point of the multimedia data, the audio time stamp and the video time stamp may be equal, which may be substantially a time stamp added when the multimedia data is produced. For example, an audio time stamp may indicate how many milliseconds a certain piece of audio should be output for playback, and a video time stamp may indicate how many milliseconds a certain frame of picture should be output for playback. In practical applications, the video time stamp of the first frame picture and the audio time stamp of the first section of audio may be 0, and the following video time stamp and audio time stamp may be increased at the same interval time.
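As a purely illustrative sketch (not part of the patent text), the following Python fragment shows one way the timestamped video frames and audio segments described above could be represented, with the first timestamp at 0 and later timestamps increasing at a fixed interval; all names are assumptions.

```python
# Illustrative sketch: decoded frames/segments each carry a timestamp that
# says at how many milliseconds they should be output for playing.
from dataclasses import dataclass

@dataclass
class VideoFrame:
    timestamp_ms: int   # video timestamp of this frame
    pixels: bytes

@dataclass
class AudioSegment:
    timestamp_ms: int   # audio timestamp of this segment
    samples: bytes

def make_timestamps(count: int, interval_ms: int = 40) -> list[int]:
    """First timestamp is 0; later timestamps grow by the same interval."""
    return [i * interval_ms for i in range(count)]

print(make_timestamps(5))  # [0, 40, 80, 120, 160]
```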
Step 302, sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
by applying the embodiment of the invention, the display terminal can establish a data transmission link with the mobile equipment.
It should be noted that the mobile device may be various mobile devices such as a tablet computer, a Personal Digital Assistant (PDA), a mobile phone, and the like, and the embodiment of the present invention is not limited thereto.
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
Ethernet;
in this example, the transmission link between the display terminal and the mobile device may be an Ethernet (Ethernet) link, and the display terminal side may transmit the audio data to the mobile device side through the Ethernet transmission manner.
Ethernet is a computer local area networking technology that uses passive media to broadcast information. It specifies the physical layer and data link layer protocols, the interfaces between the physical layer and the data link layer, and the interfaces between the data link layer and higher layers. The standard topology is a bus topology, but current fast Ethernet (the 100BASE-T and 1000BASE-T standards) uses switching hubs to connect and organize the network in order to minimize collisions and maximize network speed and efficiency, so the physical topology of Ethernet is a star; logically, however, Ethernet still uses a bus topology and the CSMA/CD (Carrier Sense Multiple Access with Collision Detection) bus contention technique.
In this example, the Ethernet may be WiFi (a wireless local area network technology based on the IEEE 802.11 standard). After the display terminal and the mobile device are connected to the same local area network, the connection may be initiated over the TCP/IP protocol suite using the peer's IP address (a minimal connection sketch is given after this list of transmission modes).
Bluetooth;
in this example, a transmission link between the display terminal and the mobile device may be a bluetooth link, and the display terminal side may transmit the audio data to the mobile device side by using a bluetooth transmission manner.
Bluetooth is a radio technology that supports short-range communication between devices (typically within 10 m). Wireless information exchange can be carried out among devices such as mobile phones, PDAs, wireless earphones, notebook computers and related peripherals.
By using Bluetooth technology, communication between mobile communication terminal devices can be simplified effectively, and communication between such devices and the Internet is also simplified, so that data transmission becomes faster and more efficient, broadening the road for wireless communication.
Bluetooth adopts a distributed network structure together with fast frequency hopping and short packet technology, supports point-to-point and point-to-multipoint communication, and works in the globally universal 2.4 GHz ISM (industrial, scientific and medical) band. The data rate is 1 Mbps, and full-duplex transmission is achieved by a time-division duplex scheme.
In this example, according to the bluetooth protocol, the mobile device may search for surrounding devices, list device IDs and names, and select a display terminal to be connected to perform connection.
It should be noted that, in this example, a mobile device such as a smart phone or a smart tablet has a Bluetooth function and is a Bluetooth product; although it is not a headset, it can provide Bluetooth data transmission with the display terminal, so the mobile device can be connected, as an audio data receiving terminal, to the Bluetooth module on the display terminal, which then pushes the audio data to it.
2.4G wireless networks;
in this example, the transmission link between the display terminal and the mobile device may be a 2.4G wireless network link, and the display terminal side may transmit the audio data to the mobile device side through a transmission manner of the 2.4G wireless network.
The 2.4G wireless network frequency band belongs to the ISM band, an ultra-low-radiation, environmentally friendly band widely used worldwide. It provides 125 communication channels, so 2.4G wireless communication is smooth and multiple communications do not interfere with each other. The maximum bandwidth of a 2.4G wireless network can reach 108 Mbps, so the transmission speed is high; its transmission distance is relatively long (an effective distance of about 200 m in open areas), it is not affected by the direction of transmission, and it supports two-way communication.
Infrared rays;
in this example, the transmission link between the display terminal and the mobile device may be an infrared link, and the display terminal side may transmit the audio data to the mobile device side by an infrared transmission manner.
Infrared is a wireless communication mode that can transmit wireless data. Infrared communication has the characteristics of low cost, convenient connection, simplicity, ease of use and compact structure, so it is widely used in small mobile devices. Through an infrared interface, various mobile devices can exchange data freely.
The wireless network protocol ZigBee.
In this example, a transmission link between the display terminal and the mobile device may be a ZigBee link, and the display terminal side may transmit the audio data to the mobile device side in a transmission manner of a wireless network protocol ZigBee.
ZigBee is a wireless network protocol for low-rate, short-range transmission based on the IEEE 802.15.4 standard. From bottom to top, its layers are the physical layer (PHY), the medium access control layer (MAC), the transport layer (TL), the network layer (NWK), the application layer (APL), and so on, where the physical layer and the medium access control layer comply with the IEEE 802.15.4 standard.
The main characteristics of a ZigBee network are low power consumption, low cost, low data rate, support for a large number of nodes and for various network topologies, low complexity, speed, reliability and safety. The devices in a ZigBee network can take three roles: a coordinator (Coordinator), a sink node (Router), and a sensor node (End Device).
Of course, the above transmission manner is only an example, and when implementing the embodiment of the present invention, other transmission manners may be set according to actual situations as long as the connection of the wireless transmission between the display terminal and the mobile device can be achieved, which is not limited in this embodiment of the present invention. In addition, besides the above transmission modes, those skilled in the art may also adopt other transmission modes according to actual needs, and the embodiment of the present invention is not limited thereto.
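To make the WiFi/TCP/IP example above concrete, here is a minimal sketch assuming the display terminal listens on an arbitrarily chosen TCP port and the mobile device connects to the terminal's IP address once both are on the same local area network; the port number and function names are assumptions, not part of the patent.

```python
import socket

AUDIO_SYNC_PORT = 50007  # assumed port; any agreed-upon port would do

def display_terminal_listen() -> socket.socket:
    """Display terminal side: wait for the mobile device to connect."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("0.0.0.0", AUDIO_SYNC_PORT))
    server.listen(1)
    conn, _addr = server.accept()   # blocks until the mobile device connects
    return conn

def mobile_device_connect(terminal_ip: str) -> socket.socket:
    """Mobile device side: connect to the terminal's IP address on the shared LAN."""
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect((terminal_ip, AUDIO_SYNC_PORT))
    return client
```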
Step 303, when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
in practical application, once the audio data has been sent to the mobile device, the display terminal may continuously send video playing information (i.e., the playing target timestamp) to the mobile device while the video data is played.
There may be some delay in data transmission between the display terminal and the mobile device, and in one case, in order to improve the accuracy of synchronous playing, the embodiment of the present invention may consider the delay in transmission when synchronizing the video data and the audio data.
In this embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
In the embodiment of the present invention, assume the video timestamp corresponding to the current video data is t_a and the delay time value is Δt; then the playing target timestamp is t_a' = t_a + Δt.
In the embodiment of the invention, the delay time value between the display terminal and the mobile equipment can be measured in advance or at present; the display terminal may be used for active measurement, or may be obtained from a mobile device, which is not limited in this embodiment of the present invention. For example, when the display terminal and the mobile device are connected for the first time, the display terminal or the mobile device actively initiates measurement of the delay time value, and after the measurement is finished, the identifier of the display terminal, the identifier of the mobile device, the transmission mode and the delay time value are stored in the display terminal and/or the mobile device. When the display terminal and the mobile device are connected again, and when the identifier of the display terminal, the identifier of the mobile device and the transmission mode are successfully matched, the previously measured delay time value can be directly obtained from the display terminal and/or the mobile device.
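The storing-and-reusing behaviour described above might look like the following sketch, in which the measured delay is cached under the display terminal identifier, the mobile device identifier and the transmission mode; the cache structure and names are illustrative assumptions.

```python
# Sketch of reusing a previously measured delay time value: measure on the
# first connection, store it with both identifiers and the transmission mode,
# and reuse it when the same pair reconnects over the same transport.
delay_cache: dict[tuple[str, str, str], float] = {}

def get_delay_ms(terminal_id: str, device_id: str, transport: str, measure) -> float:
    key = (terminal_id, device_id, transport)
    if key not in delay_cache:        # first connection: measure actively
        delay_cache[key] = measure()
    return delay_cache[key]           # matching reconnection: reuse the stored value
```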
In a preferred example of the embodiment of the present invention, the delay time value may be a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
in practical applications, the simulation data may be data in any format. Because the delay of data transmission is related to the size of the transmitted data, the simulation data should have the same size as the time point data that is actually transmitted each time; in this example, the simulation data may therefore be the same size as the audio data between two audio timestamps.
After the display terminal sends the simulation data to the mobile device, the mobile device must immediately return the simulation data to the display terminal, and the display terminal takes half of the time difference between sending and receiving the simulation data (i.e., half of the difference between the second system time and the first system time) as the delay time value.
If the first system time is T_1 and the second system time is T_2, the delay time value is ΔT = (T_2 - T_1)/2.
Of course, the embodiment of the present invention may also measure half of the round-trip time of the simulation data multiple times and use the results to obtain the delay time value, so as to reduce the error.
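A minimal sketch of this measurement, assuming hypothetical send_data and recv_data helpers that transmit the preset simulation data over the established link and averaging over several rounds to reduce the error:

```python
import time

def measure_delay_ms(send_data, recv_data, simulation_data: bytes, rounds: int = 3) -> float:
    """Return the one-way delay estimate: half the round-trip time, averaged over rounds."""
    total = 0.0
    for _ in range(rounds):
        t1 = time.monotonic()        # first system time value
        send_data(simulation_data)   # send the preset simulation data to the peer
        recv_data()                  # the peer returns it immediately
        t2 = time.monotonic()        # second system time value
        total += (t2 - t1) / 2.0     # delay = (T_2 - T_1) / 2
    return (total / rounds) * 1000.0
```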
According to the embodiment of the invention, the delay time value is added in the playing target timestamp, so that the influence of the delay of the display terminal and the mobile equipment in transmitting the playing target timestamp is eliminated, and the synchronous playing precision of the audio data and the video data is further improved.
In another preferred example of the embodiment of the present invention, the delay time value may be a delay time value sent by the mobile device.
In this example, the delay time value may be a delay time value obtained by the mobile device by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value.
The temporal resolution of the human eye is typically about 1/24 second, so if there is only a slight difference between the video data and the audio data during playing, the human eye will not perceive it. Therefore, in another case, in order to reduce the resource occupation of the display terminal or the mobile device, when the delay time value is less than a preset threshold (e.g., 40 ms), the embodiment of the present invention may ignore the transmission delay when synchronizing the video data and the audio data.
In this embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Assume the video timestamp corresponding to the current video data is t_a; then the playing target timestamp is t_a' = t_a.
Step 304, sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Corresponding to step 302, the display terminal side may send the play target timestamp to the mobile device side through the transmission mode of the ethernet, or may send the play target timestamp to the mobile device side through the transmission mode of the bluetooth, or may send the play target timestamp to the mobile device side through the transmission mode of the 2.4G wireless network, or may send the play target timestamp to the mobile device side through the transmission mode of the infrared ray, or may send the play target timestamp to the mobile device side through the transmission mode of the wireless network protocol ZigBee.
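As an illustration of steps 303 and 304 (not the patent's own code), the following sketch takes the currently played video timestamp, optionally adds the delay time value, and sends the resulting playing target timestamp over the connection established earlier; the JSON message format and function names are assumptions.

```python
import json

def send_play_target(conn, current_video_timestamp_ms: int, delay_ms: float = 0.0) -> None:
    """Send the playing target timestamp for the currently played video frame."""
    target_ms = current_video_timestamp_ms + delay_ms   # t_a' = t_a + Δt (Δt may be 0)
    # conn: the wireless link to the mobile device (e.g. the TCP socket above)
    conn.sendall(json.dumps({"play_target_ms": target_ms}).encode() + b"\n")
```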
According to the embodiment of the invention, the display terminal sends the audio data to the mobile device after receiving the multimedia data, sends the playing target timestamp to the mobile device while playing the video data, and the mobile device plays the audio data according to the playing target timestamp. Because the display terminal and the mobile device are connected in a wireless transmission mode, the user is freed from the constraint of connecting a wired earphone directly to the display terminal and operation is convenient; at the same time, the obvious loss of synchronization caused by the accumulation of tiny differences during playing of the audio data and the video data is avoided, so the audio data and the video data are played synchronously. In addition, the mobile device is a product the public uses frequently; the embodiment of the invention reuses the mobile device for an additional function, avoids the purchase of an extra Bluetooth headset, is highly practical and greatly reduces cost.
Referring to fig. 4, a flowchart illustrating steps of embodiment 2 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 401, receiving multimedia data at a display terminal side; the multimedia data may include video data and audio data; the video data may carry one or more video time stamps;
step 402, sending the audio data to a mobile device side; the display terminal side and the mobile equipment side can be connected in a wireless transmission mode;
step 403, buffering the video data;
in a specific implementation, the buffering process may be that playing of the video data or the audio data starts only after a buffering time value has elapsed. The buffering time may be preset to a fixed value, for example 5 seconds.
In the display terminal, the video data starts to be played after enough video data has been buffered for the buffering time; in the mobile device, the audio data likewise starts to be played after enough audio data has been buffered for the same buffering time.
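A very small sketch of this buffering step, using the 5-second value given above as the fixed buffering time (the helper names receive_chunk and play are assumptions):

```python
import time

BUFFER_SECONDS = 5  # the fixed buffering time value given as an example above

def buffer_then_play(receive_chunk, play) -> None:
    buffered = []
    deadline = time.monotonic() + BUFFER_SECONDS
    while time.monotonic() < deadline:
        buffered.append(receive_chunk())   # keep receiving during the buffering time
    for chunk in buffered:                 # then start playing
        play(chunk)                        # (a real player keeps receiving while playing)
```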
Step 404, when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
in this embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Assume the video timestamp corresponding to the current video data is t_a; then the playing target timestamp is t_a' = t_a.
Step 405, sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Referring to fig. 5, a flowchart illustrating steps of embodiment 3 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
step 501, receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
by applying the embodiment of the invention, the display terminal can establish a data transmission link with the mobile equipment.
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
In this example, the transmission link between the display terminal and the mobile device may be an ethernet link, a bluetooth link, a 2.4G wireless network link, an infrared link, a ZigBee link, or other transmission link.
Specifically, the mobile device side may receive audio data sent by the display terminal side through a transmission mode of an ethernet, or may receive audio data sent by the display terminal side through a transmission mode of a bluetooth, or may receive audio data sent by the display terminal side through a transmission mode of a 2.4G wireless network, or may receive audio data sent by the display terminal side through a transmission mode of an infrared ray, or may receive audio data sent by the display terminal side through a transmission mode of a wireless network protocol ZigBee.
Step 502, receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
corresponding to step 501, the mobile device side may receive the play target timestamp sent by the display terminal side through a transmission method of an ethernet, or may receive the play target timestamp sent by the display terminal side through a transmission method of a bluetooth, or may receive the play target timestamp sent by the display terminal side through a transmission method of a 2.4G wireless network, or may receive the play target timestamp sent by the display terminal side through a transmission method of an infrared ray, or may receive the play target timestamp sent by the display terminal side through a transmission method of a wireless network protocol ZigBee.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
In the embodiment of the present invention, assume the video timestamp corresponding to the current video data is t_a; then the playing target timestamp is t_a' = t_a.
In another preferred embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value may be a time of data transmission delay between the display terminal side and the mobile device side.
In the embodiment of the present invention, assume the video timestamp corresponding to the current video data is t_a and the delay time value is Δt; then the playing target timestamp is t_a' = t_a + Δt.
Step 503, playing the audio data corresponding to the playing target timestamp; the audio data and the video data may be multimedia data received by the display terminal.
When the mobile device receives the video playing information (i.e. the playing target timestamp), the mobile device can synchronously play the audio data by using a speaker of the mobile device or accessing a wired earphone and the like according to the video playing information.
In a preferred embodiment of the present invention, the audio data may carry one or more audio time stamps, and step 503 may include the following sub-steps:
a substep S11, when the currently played audio timestamp is greater than the playing target timestamp, pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp;
if the currently played audio timestamp is greater than the playing target timestamp, the audio data is ahead of the video data. For example, if the playing target timestamp is 50000 ms and the currently played audio timestamp is 50040 ms, the playing of the audio data may be paused, for example by repeatedly playing the current audio data, and normal playing does not resume until the audio and the video are synchronized.
Of course, the embodiment of the present invention may also, without pausing the playing of the audio data, directly search for the audio timestamp equal to the playing target timestamp and play the audio data corresponding to that audio timestamp, which is not limited in this embodiment of the present invention.
And/or,
a substep S12, when the currently played audio timestamp is less than or equal to the playing target timestamp, searching for an audio timestamp equal to the playing target timestamp;
and a substep S13, playing the audio data corresponding to the audio time stamp.
If the currently played audio timestamp is less than or equal to the playing target timestamp, the audio data lags behind, or is synchronized with, the video data. For example, if the playing target timestamp is 50000 ms and the currently played audio timestamp is 49960 ms, the audio timestamp of the synchronization point can be searched for and the audio data at the synchronization point can be played directly, without playing the lagging audio data in between. A minimal sketch of this synchronization logic is given below.
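The following sketch covers sub-steps S11 to S13 on the mobile device side, using the 50000 ms example above; the function and variable names are assumptions, not patent-defined APIs.

```python
def sync_audio(current_audio_ts_ms: int, play_target_ms: int,
               audio_timestamps: list[int]) -> str:
    if current_audio_ts_ms > play_target_ms:
        # Sub-step S11: audio is ahead of the video, so pause until they match.
        return "pause"
    # Sub-steps S12/S13: audio is behind (or in sync); jump to the segment whose
    # timestamp equals the playing target timestamp and play from there.
    if play_target_ms in audio_timestamps:
        return f"seek to {play_target_ms} ms and play"
    return "play next segment"

print(sync_audio(50040, 50000, [49960, 50000, 50040]))  # -> pause
print(sync_audio(49960, 50000, [49960, 50000, 50040]))  # -> seek to 50000 ms and play
```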
It should be noted that, since method embodiment 3 corresponds to method embodiment 1, the description is relatively simple, and for the relevant points, reference may be made to part of description of method embodiment 1, and the embodiment of the present invention is not described in detail herein.
Referring to fig. 6, a flowchart illustrating steps of embodiment 4 of a method for playing video data and audio data synchronously according to the present invention is shown, and an embodiment of the present invention may include the following steps:
Step 601, receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
step 602, performing buffering processing on the audio data;
in the display terminal, the video data starts to be played after enough video data has been buffered for the buffering time; in the mobile device, the audio data likewise starts to be played after enough audio data has been buffered for the same buffering time.
Step 603, receiving a playing target timestamp sent by the display terminal side; the playing target timestamp may be a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
in a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Then, in the embodiment of the present invention, assume the video timestamp corresponding to the current video data is t_a; the playing target timestamp is t_a' = t_a.
Step 604, obtaining a delay time value;
in the embodiment of the invention, the delay time value between the display terminal and the mobile equipment can be measured in advance or at present; the display terminal may be used for active measurement, or may be obtained from a mobile device, which is not limited in this embodiment of the present invention.
In a preferred example of the embodiment of the present invention, the delay time value is a delay time value obtained by sending preset simulation data to the display terminal and recording a current third system time value, receiving the simulation data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
after the mobile device sends the analog data to the display terminal, the display terminal needs to immediately return the analog data to the mobile device, and the mobile device calculates half of the time difference (i.e. the difference between the third system time and the fourth system time) between sending and receiving of the analog data to obtain the delay time value.
If the third system time is T_3 and the fourth system time is T_4, the delay time value is ΔT = (T_4 - T_3)/2.
Of course, the embodiment of the present invention may also measure half of the round-trip time of the simulation data multiple times and use the results to obtain the delay time value, so as to reduce the error.
In another preferred example of the embodiment of the present invention, the delay time value may be a delay time value sent by the display terminal.
In this example, the delay time value may be a delay time value obtained by a display terminal sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value.
Step 605, adding the delay time value to the time value indicated by the playing target timestamp;
in the embodiment of the invention, the playing target timestamp is t_a' = t_a; if the delay time value is Δt, the updated playing target timestamp is t_a'' = t_a' + Δt = t_a + Δt.
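A one-line sketch of step 605, in which the mobile device adds the locally obtained delay time value to the received playing target timestamp (the function name is an assumption):

```python
def updated_play_target_ms(received_target_ms: float, delay_ms: float) -> float:
    # t_a'' = t_a' + Δt = t_a + Δt
    return received_target_ms + delay_ms

print(updated_play_target_ms(50000, 15.0))  # 50015.0
```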
Step 606, playing the audio data corresponding to the playing target timestamp; the audio data and the video data may be multimedia data received by the display terminal.
It should be noted that, since method embodiment 4 corresponds to method embodiment 2, the description is relatively simple, and for the relevant points, reference may be made to the partial description of method embodiment 2, and the embodiment of the present invention is not described in detail herein.
For simplicity of explanation, the method embodiments are described as a series of acts or combinations, but those skilled in the art will appreciate that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently with other steps in accordance with the embodiments of the invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 7, a block diagram of an apparatus embodiment 1 for synchronously playing video data and audio data according to the present invention is shown, and the embodiment of the present invention may include the following modules:
a multimedia data receiving module 701, configured to receive multimedia data at a display terminal side; the multimedia data comprises video data and audio data; the video data may carry one or more video time stamps;
an audio data sending module 702, configured to send the audio data to a mobile device side; the display terminal side and the mobile equipment side can be connected in a wireless transmission mode;
a playing target timestamp generating module 703, configured to generate a playing target timestamp according to a currently played video timestamp when the video data is played;
a playing target timestamp sending module 704, configured to send the playing target timestamp to the mobile device side; the mobile device side may be configured to play the audio data corresponding to the play target timestamp.
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
In a preferred embodiment of the present invention, the delay time value may be a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value may be a delay time value transmitted by the mobile device.
In a preferred embodiment of the present invention, the embodiment of the present invention may further include the following modules:
and the first buffer module is used for buffering the video data.
Referring to fig. 8, a block diagram of an embodiment 2 of the apparatus for synchronously playing video data and audio data according to the present invention is shown, and the embodiment of the present invention may include the following modules:
an audio data receiving module 801, configured to receive, at a mobile device side, audio data sent by a display terminal side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
a playing target timestamp receiving module 802, configured to receive a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
an audio data playing module 803, configured to play the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
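Correspondingly, modules 801 to 803 on the mobile device side might be arranged roughly as in the sketch below. The MobileLink and AudioSink interfaces are assumed, not specified by the embodiment; for brevity the sketch alternates between receiving audio data and receiving playing target timestamps on one thread, whereas a real implementation would typically handle them independently.

```java
/** Hypothetical receive side of the wireless link to the display terminal. */
interface MobileLink {
    byte[] receiveAudioChunk() throws Exception;          // module 801: audio data
    long receivePlayTargetTimestamp() throws Exception;   // module 802: playing target timestamp
}

/** Hypothetical audio sink that buffers timestamped audio data and plays it on demand. */
interface AudioSink {
    void enqueue(byte[] audioChunk);
    void playUpTo(long playTargetTimestampMillis);        // module 803: play audio matching the target
}

public final class MobileSideReceiver implements Runnable {
    private final MobileLink link;
    private final AudioSink sink;

    public MobileSideReceiver(MobileLink link, AudioSink sink) {
        this.link = link;
        this.sink = sink;
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                sink.enqueue(link.receiveAudioChunk());            // buffer incoming audio data
                sink.playUpTo(link.receivePlayTargetTimestamp());  // align playback with the target timestamp
            }
        } catch (Exception e) {
            Thread.currentThread().interrupt();                    // in this sketch, simply stop on link failure
        }
    }
}
```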
In a preferred example of the embodiment of the present invention, the manner of wireless transmission may include one or more of the following:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
In a preferred embodiment of the present invention, the playing target timestamp may include a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
In a preferred embodiment of the present invention, the embodiment of the present invention may further include the following modules:
the delay time value acquisition module is used for acquiring a delay time value;
and the delay time value increasing module is used for increasing the delay time value on the time value indicated by the playing target timestamp.
In a preferred embodiment of the present invention, the delay time value may be a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value may be a delay time value transmitted by the display terminal.
In a preferred embodiment of the present invention, the audio data may carry one or more audio time stamps; the audio data playing module 803 may include the following sub-modules:
the playing pause submodule is used for pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp when the currently played audio timestamp is greater than the playing target timestamp;
and/or,
the searching submodule is used for searching the audio time stamp which is equal to the playing target time stamp when the currently played audio time stamp is less than or equal to the playing target time stamp;
and the corresponding playing submodule is used for playing the audio data corresponding to the audio time stamp.
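Only as an assumed sketch with invented names, the pause/search behaviour of these submodules could look roughly as follows in Java:

```java
/** Hypothetical playback control surface for the audio data on the mobile device side. */
interface AudioPlayer {
    long currentAudioTimestampMillis();          // audio timestamp of the frame currently being played
    void pause();
    void resume();
    void seekToTimestamp(long timestampMillis);  // jump to the audio frame carrying this timestamp
}

public final class AudioSyncController {
    private final AudioPlayer player;

    public AudioSyncController(AudioPlayer player) {
        this.player = player;
    }

    /** Called each time a playing target timestamp arrives from the display terminal side. */
    public void onPlayTargetTimestamp(long targetMillis) {
        long current = player.currentAudioTimestampMillis();
        if (current > targetMillis) {
            // Audio is ahead of the video: pause until the playing target timestamp catches up.
            player.pause();
        } else {
            // Audio is at or behind the target: search for the audio timestamp equal to the
            // playing target timestamp and continue playback from there.
            player.seekToTimestamp(targetMillis);
            player.resume();
        }
    }
}
```

A subsequent call with a larger target timestamp then falls into the second branch and resumes playback, which realizes the "pause until equal" behaviour described above.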
In a preferred embodiment of the present invention, the embodiment of the present invention may further include the following modules:
and the second buffer module is used for carrying out buffer processing on the audio data.
For the device embodiment, since it is basically similar to the method embodiment, the description is relatively simple, and the relevant points can be referred to the partial description of the method embodiment.
An embodiment of the present invention further provides an apparatus, where the apparatus may include:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the functionality to:
receiving multimedia data at a display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Optionally, the wireless transmission means includes one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the delay time value is obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
Optionally, the one or more modules may also have the following functions:
and buffering the video data.
An embodiment of the present invention further provides a non-volatile readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device with a display, the one or more modules may cause the device to execute the following instructions:
sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
Optionally, the wireless transmission means includes one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the delay time value is obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
Optionally, the one or more modules may also have the following functions:
and buffering the video data.
Referring to fig. 9, a schematic structural diagram of a smart television according to an embodiment of the present invention is shown. The electronic device may be used to implement the method for synchronously playing video data and audio data provided in the above embodiments. Specifically:
The electronic device 800 may include RF (Radio Frequency) circuitry 810, memory 820 including one or more computer-readable storage media, input unit 830, display unit 840, sensor 850, audio circuitry 860, short-range wireless transmission module 870, processor 880 including one or more processing cores, and power supply 890. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 9 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 810 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, for receiving downlink information from a base station and then processing the received downlink information by the one or more processors 880; in addition, data relating to uplink is transmitted to the base station. In general, RF circuit 810 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a transceiver, a coupler, an LNA (low noise Amplifier), a duplexer, and the like. In addition, the RF circuit 810 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), and the like. The memory 820 may be used to store software programs and modules, for example, the memory 820 may be used to store a software program for collecting voice signals, a software program for realizing keyword recognition, a software program for realizing continuous voice recognition, a software program for realizing setting reminders, and the like. The processor 880 executes various functional applications and data processing such as a function of "receiving multimedia data on the display terminal side," a function of transmitting the audio data to the mobile device side, "a function of generating a play target time stamp from a currently played video time stamp when the video data is played," a function of transmitting the play target time stamp to the mobile device side, "and the like in the embodiment of the present invention by running software programs and modules stored in the memory 820. The memory 820 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device 800, and the like. Further, the memory 820 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 820 may also include a memory controller to provide the processor 880 and the input unit 830 access to the memory 820.
The input unit 830 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 830 may include a touch-sensitive surface 831 as well as other input devices 832. The touch-sensitive surface 831, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 831 (e.g., operations by a user on or near the touch-sensitive surface 831 using a finger, a stylus, or any other suitable object or attachment) and drive the corresponding connection device according to a predefined program. Alternatively, the touch-sensitive surface 831 can include two portions, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts it to touch point coordinates, and sends the touch point coordinates to the processor 880, and can receive and execute commands from the processor 880. In addition, the touch-sensitive surface 831 can be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 830 may include other input devices 832 in addition to the touch-sensitive surface 831. In particular, other input devices 832 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 840 may be used to display information input by or provided to a user and various graphical user interfaces of the electronic device 800, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 840 may include a Display panel 841, and the Display panel 841 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (organic light-Emitting Diode), or the like, as an option. Further, touch-sensitive surface 831 can overlie display panel 841 such that when touch-sensitive surface 831 detects a touch operation thereon or thereabout, it can be relayed to processor 880 to determine the type of touch event, and processor 880 can then provide a corresponding visual output on display panel 841 in accordance with the type of touch event. Although in FIG. 9, touch-sensitive surface 831 and display panel 841 are implemented as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 831 may be integrated with display panel 841 to implement input and output functions.
The electronic device 800 may also include at least one sensor 850, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 841 based on the brightness of ambient light, and a proximity sensor that may turn off the display panel 841 and/or backlight when the electronic device 800 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor may detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the mobile phone is stationary, and may be used for applications of recognizing gestures of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor that may be further configured to the electronic device 800, which are not described herein again.
The audio circuitry 860, speaker 861, microphone 862 may provide an audio interface between a user and the electronic device 800. The audio circuit 860 can transmit the electrical signal converted from the received audio data to the speaker 861, and the electrical signal is converted into a sound signal by the speaker 861 and output; on the other hand, the microphone 862 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 860, and outputs the audio data to the processor 880 for processing, and then transmits the audio data to another terminal via the RF circuit 810, or outputs the audio data to the memory 820 for further processing. The audio circuitry 860 may also include an earbud jack to provide communication of a peripheral headset with the electronic device 800.
The short-distance wireless transmission module 870 may be a WiFi (Wireless Fidelity) module or a Bluetooth module, etc. Through the short-range wireless transmission module 870, the electronic device 800 can help the user send and receive e-mail, browse web pages, access streaming media, and the like, providing the user with wireless broadband Internet access. Although fig. 9 shows the short-range wireless transmission module 870, it is understood that it does not belong to the essential constitution of the electronic device 800 and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 880 is a control center of the electronic device 800, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device 800 and processes data by operating or executing software programs and/or modules stored in the memory 820 and calling data stored in the memory 820, thereby monitoring the electronic device as a whole. Optionally, processor 880 may include one or more processing cores; preferably, the processor 880 may integrate an application processor, which mainly handles operating systems, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 880.
The electronic device 800 also includes a power supply 890 (e.g., a battery) for powering the various components, which may be logically coupled to the processor 880 via a power management system that may be used to manage charging, discharging, and power consumption. Power supply 890 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the electronic device 800 may further include a camera, a bluetooth module, and the like, which are not described in detail herein. Specifically, in the present embodiment, the display unit of the electronic device 800 is a touch screen display.
An embodiment of the present invention further provides an apparatus, where the apparatus may include:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have the functionality to:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Optionally, the wireless transmission means includes one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the one or more modules may also have the following functions:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Optionally, the delay time value is a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Optionally, the audio data carries one or more audio time stamps, and the one or more modules may have the following functions:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Optionally, the one or more modules may also have the following functions:
and carrying out buffering processing on the audio data.
An embodiment of the present invention further provides a non-volatile readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device with an audio playing function, the one or more modules may cause the device to execute the following instructions:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Optionally, the wireless transmission means includes one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the one or more modules may also have the following functions:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Optionally, the delay time value is a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Optionally, the audio data carries one or more audio time stamps, and the one or more modules may have the following functions:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Optionally, the one or more modules may also have the following functions:
and carrying out buffering processing on the audio data.
Fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present invention. Referring to fig. 10, the terminal device may be used to implement the method for synchronously playing video data and audio data provided in the above embodiment. The terminal device may be a mobile phone, a tablet, a wearable mobile device (such as a smart watch), or the like.
The terminal device 700 may include components such as a communication unit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a WiFi (wireless fidelity) module 170, a processor 180 including one or more processing cores, and a power supply 190. Those skilled in the art will appreciate that the terminal device configuration shown in fig. 10 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the communication unit 110 may be used for receiving and transmitting information or signals during a call, and the communication unit 110 may be an RF (Radio Frequency) circuit, a router, a modem, or other network communication devices. In particular, when the communication unit 110 is an RF circuit, downlink information of the base station is received and then processed by the one or more processors 180; in addition, data relating to uplink is transmitted to the base station. Generally, the RF circuit as a communication unit includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the communication unit 110 may also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short messaging Service), and the like. The memory 120 may be used to store software programs and modules, and the processor 180 executes various functional applications and data processing by operating the software programs and modules stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal device 700, and the like. Further, the memory 120 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
The input unit 130 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Optionally, the input unit 130 may include a touch-sensitive surface 131 as well as other input devices 132. The touch-sensitive surface 131, also referred to as a touch display screen or a touch pad, may collect touch operations by a user on or near the touch-sensitive surface 131 (e.g., operations by a user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface 131 may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 180, and can receive and execute commands sent by the processor 180. Additionally, the touch-sensitive surface 131 may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input devices 132. Alternatively, other input devices 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 140 may be used to display information input by or provided to a user and various graphic user interfaces of the terminal device 700, which may be configured by graphics, text, icons, video, and any combination thereof. The Display unit 140 may include a Display panel 141, and optionally, the Display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (organic light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141, and when a touch operation is detected on or near the touch-sensitive surface 131, the touch operation is transmitted to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in FIG. 10, touch-sensitive surface 131 and display panel 141 are shown as two separate components to implement input and output functions, in some embodiments, touch-sensitive surface 131 may be integrated with display panel 141 to implement input and output functions.
The terminal device 700 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Alternatively, the light sensor may include an ambient light sensor that adjusts the brightness of the display panel 141 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 141 and/or the backlight when the terminal device 700 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor may detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the mobile phone is stationary, and may be used for applications of recognizing gestures of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and tapping), and other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor that are further configured to the terminal device 700, and are not described herein again.
The audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between the user and the terminal device 700. The audio circuit 160 may transmit the electrical signal converted from the received audio data to the speaker 161, and convert the electrical signal into a sound signal for output by the speaker 161; on the other hand, the microphone 162 converts the collected sound signal into an electric signal, converts the electric signal into audio data after being received by the audio circuit 160, and outputs the audio data to the processor 180 for processing, and then transmits the audio data to, for example, another terminal device via the RF circuit 110, or outputs the audio data to the memory 120 for further processing. The audio circuit 160 may also include an earbud jack to provide communication of peripheral headphones with the terminal device 700.
To implement wireless communication, a wireless communication unit 170 may be configured on the terminal device, and the wireless communication unit 170 may be a WiFi module. WiFi belongs to a short-range wireless transmission technology, and the terminal device 700 can help a user to send and receive e-mail, browse a web page, access streaming media, and the like through the wireless communication unit 170, which provides the user with wireless broadband internet access. Although fig. 10 shows the wireless communication unit 170, it is understood that it does not belong to the essential constitution of the terminal device 700 and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 180 is a control center of the terminal device 700, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal device 700 and processes data by operating or executing software programs and/or modules stored in the memory 120 and calling data stored in the memory 120, thereby performing overall monitoring of the mobile phone. Optionally, processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 180.
The terminal device 700 further includes a power supply 190 (e.g., a battery) for supplying power to the various components, which may preferably be logically connected to the processor 180 via a power management system, so as to manage charging, discharging, and power consumption via the power management system. The power supply 190 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal device 700 may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, the terminal device further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
Optionally, the wireless transmission means includes one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
Optionally, the playing target timestamp includes a video timestamp corresponding to current video data extracted by the display terminal when the video data is played.
Optionally, the playing target timestamp includes a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when the video data is played; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
Optionally, the one or more modules may also have the following functions:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
Optionally, the delay time value is a delay time value obtained by sending preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
Optionally, the audio data carries one or more audio time stamps, and the one or more modules may have the following functions:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
Optionally, the one or more modules may also have the following functions:
and carrying out buffering processing on the audio data.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts in the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, mobile devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing mobile device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing mobile device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or mobile device that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or mobile device. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or mobile device that comprises the element.
The method for synchronously playing video data and audio data, the device for synchronously playing video data and audio data, and the equipment provided by the embodiments of the present invention are described in detail above, a specific example is applied in the present document to explain the principle and the implementation manner of the embodiments of the present invention, and the description of the above embodiments is only used to help understanding the method and the core idea of the embodiments of the present invention; meanwhile, for a person skilled in the art, according to the idea of the embodiment of the present invention, there may be a change in the specific implementation and application scope, and in summary, the content of the present specification should not be construed as a limitation to the embodiment of the present invention.

Claims (30)

1. A method for synchronously playing video data and audio data, comprising:
receiving multimedia data at a display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
2. The method of claim 1, wherein the manner of wireless transmission comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
3. The method according to claim 1, wherein the playback target timestamp comprises a video timestamp corresponding to current video data extracted by the display terminal when the video data is played back.
4. The method according to claim 1, wherein the play target timestamp comprises a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
5. The method according to claim 4, wherein the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference value between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
6. The method according to any one of claims 1 to 5, wherein before the step of generating a playing target timestamp from a currently playing video timestamp when playing the video data, the method further comprises:
and buffering the video data.
7. A method for synchronously playing video data and audio data, comprising:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
8. The method of claim 7, wherein the manner of wireless transmission comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
9. The method according to claim 7, wherein the playback target timestamp comprises a video timestamp corresponding to current video data extracted by the display terminal when the video data is played back.
10. The method according to claim 7, wherein the play target timestamp comprises a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
11. The method according to claim 9, wherein after the step of receiving the play target time stamp transmitted from the display terminal side, the method further comprises:
obtaining a delay time value;
adding the delay time value to the time value indicated by the play target timestamp.
12. The method according to claim 11, wherein the delay time value is a delay time value obtained by transmitting preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned from the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
13. The method according to any one of claims 7 to 12, wherein the audio data carries one or more audio time stamps;
the step of playing the audio data corresponding to the playing target timestamp includes:
when the audio time stamp played currently is larger than the playing target time stamp, the audio data playing is paused until the audio time stamp played currently is equal to the playing target time stamp;
and/or,
when the audio time stamp played currently is smaller than or equal to the playing target time stamp, searching for the audio time stamp which is equal to the playing target time stamp;
and playing the audio data corresponding to the audio time stamp.
14. The method according to any one of claims 7 to 12, wherein before the step of playing the audio data corresponding to the playing target timestamp, the method further comprises:
and carrying out buffering processing on the audio data.
15. An apparatus for synchronously playing video data and audio data, comprising:
the multimedia data receiving module is used for receiving the multimedia data at the display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
the audio data sending module is used for sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
the playing target timestamp generating module is used for generating a playing target timestamp according to the currently played video timestamp when the video data is played;
a playing target timestamp sending module, configured to send the playing target timestamp to the mobile device side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
16. The apparatus of claim 15, wherein the means for wirelessly transmitting comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
17. The apparatus according to claim 15, wherein the play target timestamp comprises a video timestamp corresponding to current video data extracted by the display terminal when playing the video data.
18. The apparatus according to claim 15, wherein the play target timestamp comprises a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
19. The apparatus of claim 18, wherein the delay time value is a delay time value obtained by sending preset simulation data to the mobile device and recording a current first system time value, receiving the simulation data returned by the mobile device and recording a current second system time value, and calculating a half of a difference between the second system time value and the first system time value;
or,
the delay time value is the delay time value sent by the mobile equipment.
20. The apparatus of any one of claims 15 to 19, further comprising:
and the first buffer module is used for buffering the video data.
21. An apparatus for synchronously playing video data and audio data, comprising:
the audio data receiving module is used for receiving audio data sent by the display terminal side at the mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
a playing target timestamp receiving module, configured to receive a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
the audio data playing module is used for playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
22. The apparatus of claim 21, wherein the means for wirelessly transmitting comprises one or more of:
Ethernet, Bluetooth, 2.4G wireless network, infrared, and the wireless network protocol ZigBee.
23. The apparatus according to claim 21, wherein the play target timestamp comprises a video timestamp corresponding to current video data extracted by the display terminal when playing the video data.
24. The apparatus according to claim 21, wherein the play target timestamp comprises a video timestamp and a delay time value corresponding to current video data extracted by the display terminal side when playing the video data; the delay time value is the time of data transmission delay between the display terminal side and the mobile equipment side.
25. The apparatus of claim 23, further comprising:
the delay time value acquisition module is used for acquiring a delay time value;
and the delay time value increasing module is used for increasing the delay time value on the time value indicated by the playing target timestamp.
26. The apparatus of claim 25, wherein the delay time value is a delay time value obtained by transmitting preset analog data to the display terminal and recording a current third system time value, receiving the analog data returned by the display terminal and recording a current fourth system time value, and calculating a half of a difference between the fourth system time value and the third system time value;
or,
the delay time value is the delay time value sent by the display terminal.
27. The apparatus according to any one of claims 21 to 26, wherein the audio data carries one or more audio time stamps; the audio data broadcasting module comprises:
the playing pause submodule is used for pausing the playing of the audio data until the currently played audio timestamp is equal to the playing target timestamp when the currently played audio timestamp is greater than the playing target timestamp;
and/or,
the searching submodule is used for searching the audio time stamp which is equal to the playing target time stamp when the currently played audio time stamp is less than or equal to the playing target time stamp;
and the corresponding playing submodule is used for playing the audio data corresponding to the audio time stamp.
28. The apparatus of any one of claims 21 to 26, further comprising:
and the second buffer module is used for carrying out buffer processing on the audio data.
29. An apparatus, comprising:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have functionality to:
receiving multimedia data at a display terminal side; the multimedia data comprises video data and audio data; the video data carries one or more video time stamps;
sending the audio data to a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
when the video data is played, generating a playing target timestamp according to the currently played video timestamp;
sending the playing target timestamp to the mobile equipment side; and the mobile equipment side is used for playing the audio data corresponding to the playing target timestamp.
30. An apparatus, comprising:
one or more processors;
a memory; and
one or more modules stored in the memory and configured to be executed by the one or more processors, wherein the one or more modules have functionality to:
receiving audio data sent by a display terminal side at a mobile equipment side; the display terminal side is connected with the mobile equipment side in a wireless transmission mode;
receiving a playing target timestamp sent by the display terminal side; the playing target timestamp is a timestamp generated by the display terminal according to the currently played video timestamp when the video data is played;
playing the audio data corresponding to the playing target timestamp; the audio data and the video data are multimedia data received by the display terminal.
CN201410093301.3A 2014-03-13 2014-03-13 Video data and audio data synchronized playing method and device and equipment Pending CN103905878A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410093301.3A CN103905878A (en) 2014-03-13 2014-03-13 Video data and audio data synchronized playing method and device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410093301.3A CN103905878A (en) 2014-03-13 2014-03-13 Video data and audio data synchronized playing method and device and equipment

Publications (1)

Publication Number Publication Date
CN103905878A 2014-07-02

Family

ID=50996994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410093301.3A Pending CN103905878A (en) 2014-03-13 2014-03-13 Video data and audio data synchronized playing method and device and equipment

Country Status (1)

Country Link
CN (1) CN103905878A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104837046A (en) * 2015-01-08 2015-08-12 腾讯科技(北京)有限公司 Multi-media file processing method and device
WO2016008131A1 (en) * 2014-07-17 2016-01-21 21 Vianet Group, Inc. Techniques for separately playing audio and video data in local networks
TWI556157B (en) * 2015-11-17 2016-11-01 九齊科技股份有限公司 Method and system for playing audio data
CN106488281A (en) * 2016-10-26 2017-03-08 Tcl集团股份有限公司 A kind of player method of television audio and control system, TV, communication system
CN107040848A (en) * 2017-03-07 2017-08-11 建荣半导体(深圳)有限公司 Synchronization parameter transmission method, device and the equipment of AVDTP agreements
CN107135413A (en) * 2017-03-20 2017-09-05 福建天泉教育科技有限公司 A kind of audio and video synchronization method and system
CN107181506A (en) * 2017-04-13 2017-09-19 深圳市金立通信设备有限公司 The method and bluetooth earphone of a kind of control terminal playing resource
CN107734378A (en) * 2017-10-31 2018-02-23 维沃移动通信有限公司 A kind of audio and video synchronization method, device and mobile terminal
CN109168059A (en) * 2018-10-17 2019-01-08 上海赛连信息科技有限公司 A kind of labial synchronization method playing audio & video respectively on different devices
CN109314631A (en) * 2016-06-24 2019-02-05 雅马哈株式会社 Synchronization settings device, conveyer system, synchronization settings method and program
CN111885555A (en) * 2020-06-08 2020-11-03 安凯(广州)微电子技术有限公司 TWS earphone based on monitoring scheme and implementation method thereof
CN112105005A (en) * 2019-08-30 2020-12-18 炬力(珠海)微电子有限公司 Method and device for controlling Bluetooth equipment to play
CN112423028A (en) * 2020-10-26 2021-02-26 深圳Tcl新技术有限公司 Multimedia file transmission method, device, multimedia terminal and storage medium
CN115802087A (en) * 2022-11-03 2023-03-14 深圳创维-Rgb电子有限公司 Sound and picture synchronous processing method and related equipment thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1784674A (en) * 2004-05-03 2006-06-07 微软公司 Fast startup for streaming media
EP1860866A1 (en) * 2006-05-26 2007-11-28 British Telecommunications Public Limited Company Audio-visual reception
CN103297824A (en) * 2013-05-29 2013-09-11 华为技术有限公司 Video processing method, dongle, control terminal and system
EP2672721A1 (en) * 2012-06-08 2013-12-11 LG Electronics Inc. Image display apparatus, mobile terminal and method for operating the same

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1784674A (en) * 2004-05-03 2006-06-07 微软公司 Fast startup for streaming media
EP1860866A1 (en) * 2006-05-26 2007-11-28 British Telecommunications Public Limited Company Audio-visual reception
EP2672721A1 (en) * 2012-06-08 2013-12-11 LG Electronics Inc. Image display apparatus, mobile terminal and method for operating the same
CN103297824A (en) * 2013-05-29 2013-09-11 华为技术有限公司 Video processing method, dongle, control terminal and system

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016008131A1 (en) * 2014-07-17 2016-01-21 21 Vianet Group, Inc. Techniques for separately playing audio and video data in local networks
CN104837046A (en) * 2015-01-08 2015-08-12 腾讯科技(北京)有限公司 Multi-media file processing method and device
TWI556157B (en) * 2015-11-17 2016-11-01 九齊科技股份有限公司 Method and system for playing audio data
CN109314631A (en) * 2016-06-24 2019-02-05 雅马哈株式会社 Synchronization settings device, conveyer system, synchronization settings method and program
CN106488281A (en) * 2016-10-26 2017-03-08 Tcl集团股份有限公司 A kind of player method of television audio and control system, TV, communication system
CN107040848A (en) * 2017-03-07 2017-08-11 建荣半导体(深圳)有限公司 Synchronization parameter transmission method, device and the equipment of AVDTP agreements
CN107135413A (en) * 2017-03-20 2017-09-05 福建天泉教育科技有限公司 A kind of audio and video synchronization method and system
CN107181506A (en) * 2017-04-13 2017-09-19 深圳市金立通信设备有限公司 The method and bluetooth earphone of a kind of control terminal playing resource
CN107734378B (en) * 2017-10-31 2019-11-01 维沃移动通信有限公司 A kind of audio and video synchronization method, device and mobile terminal
CN107734378A (en) * 2017-10-31 2018-02-23 维沃移动通信有限公司 A kind of audio and video synchronization method, device and mobile terminal
CN109168059A (en) * 2018-10-17 2019-01-08 上海赛连信息科技有限公司 A kind of labial synchronization method playing audio & video respectively on different devices
CN113286184A (en) * 2018-10-17 2021-08-20 上海赛连信息科技有限公司 Lip sound synchronization method for respectively playing audio and video on different devices
CN113286184B (en) * 2018-10-17 2024-01-30 上海赛连信息科技有限公司 Lip synchronization method for respectively playing audio and video on different devices
CN112105005A (en) * 2019-08-30 2020-12-18 炬力(珠海)微电子有限公司 Method and device for controlling Bluetooth equipment to play
CN112105005B (en) * 2019-08-30 2024-05-03 炬力(珠海)微电子有限公司 Method and device for controlling Bluetooth equipment to play
CN111885555A (en) * 2020-06-08 2020-11-03 安凯(广州)微电子技术有限公司 TWS earphone based on monitoring scheme and implementation method thereof
CN111885555B (en) * 2020-06-08 2022-05-20 广州安凯微电子股份有限公司 TWS earphone based on monitoring scheme and implementation method thereof
CN112423028A (en) * 2020-10-26 2021-02-26 深圳Tcl新技术有限公司 Multimedia file transmission method, device, multimedia terminal and storage medium
CN115802087A (en) * 2022-11-03 2023-03-14 深圳创维-Rgb电子有限公司 Sound and picture synchronous processing method and related equipment thereof

Similar Documents

Publication Publication Date Title
CN103905879B (en) The method, apparatus and equipment that a kind of video data and audio data are played simultaneously
CN103905881B (en) The method, apparatus and equipment that a kind of video data and audio data are played simultaneously
CN103905876A (en) Video data and audio data synchronized playing method and device and equipment
CN103905878A (en) Video data and audio data synchronized playing method and device and equipment
US11582791B2 (en) PUCCH collision processing method and terminal
US20140354441A1 (en) System and constituent media device components and media device-based ecosystem
CN106254903B (en) A kind of synchronous broadcast method of multi-medium data, apparatus and system
CN103391473B (en) Method and device for providing and acquiring audio and video
CN105208056B (en) Information interaction method and terminal
RU2608328C2 (en) Connecting element for earphones jack plug, earphones and terminal device
JP7198944B2 (en) SEARCH SPACE ARRANGEMENT METHOD AND DEVICE, COMMUNICATION DEVICE
US20150304701A1 (en) Play control method and device
CN111245854B (en) Media transmission method, media control method and device
CN108810860B (en) Audio transmission method, terminal equipment and main earphone
CN107360318B (en) Voice noise reduction method and device, mobile terminal and computer readable storage medium
WO2020143658A1 (en) Method and apparatus for monitoring pdcch, terminal, base station, and storage medium
EP4021087A1 (en) Downlink data cache indication method and apparatus, and downlink data acquisition method and apparatus
WO2017215661A1 (en) Scenario-based sound effect control method and electronic device
CN106205657B (en) A kind of lyric display method and device
CN103491421B (en) Content displaying method, device and intelligent television
CN113805837A (en) Audio processing method, mobile terminal and storage medium
WO2019242633A1 (en) Measurement interval processing method, terminal and network node
CN111081283A (en) Music playing method and device, storage medium and terminal equipment
WO2024082906A1 (en) Information acquisition method and apparatus, bluetooth device, terminal device, and storage medium
WO2019191996A1 (en) Data transmission method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20140702)